"When people in a group repeatedly share content that has been rated false by independent fact-checkers, we will reduce that group's overall News Feed distribution. Starting today, globally," said Guy Rosen, Vice President of Integrity at Facebook, in a blog post late on Wednesday.
Facebook said that, starting in the coming weeks, when it reviews a Group to decide whether or not to take it down, it will look at content violations by that Group's admins and moderators -- including member posts they have approved -- as a stronger signal that the Group violates its standards.
"We're also introducing a new feature called Group Quality, which offers an overview of content removed and flagged for most violations, as well as a section for false news found in the group," added Tessa Lyons, Head of News Feed Integrity at Facebook.
The company has incorporated a "Click-Gap" signal into News Feed ranking.
"Click-Gap" looks for domains with a disproportionate number of outbound Facebook clicks compared to their place in the web graph.
"This can be a sign that the domain is succeeding on News Feed in a way that doesn't reflect the authority they've built outside it and is producing low-quality content," said Facebook.
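Facebook has not published how Click-Gap is actually computed, but the idea it describes -- comparing a domain's share of outbound News Feed clicks against its standing in the wider web graph -- can be illustrated with a minimal sketch. Everything below is hypothetical: the domain names, the use of inbound-link counts as a proxy for web-graph authority, and the flagging threshold are all assumptions for illustration, not Facebook's method.

```python
# Hypothetical sketch of a "Click-Gap"-style signal. Facebook has not
# disclosed its real algorithm; this only illustrates the stated idea:
# flag domains whose share of Facebook outbound clicks far exceeds
# their share of authority in the web graph (proxied here by inbound
# link counts).

def click_gap_score(fb_clicks, inbound_links, total_clicks, total_links):
    """Ratio of a domain's share of Facebook clicks to its share of
    web-graph inbound links. Values well above 1 suggest the domain
    performs on News Feed out of proportion to its web authority."""
    click_share = fb_clicks / total_clicks
    link_share = inbound_links / total_links
    return click_share / link_share

# Made-up example data: domain -> (facebook_outbound_clicks, web_inbound_links)
domains = {
    "established-news.example": (50_000, 40_000),
    "clickbait-farm.example": (45_000, 500),
}
total_clicks = sum(clicks for clicks, _ in domains.values())
total_links = sum(links for _, links in domains.values())

for domain, (clicks, links) in domains.items():
    score = click_gap_score(clicks, links, total_clicks, total_links)
    flagged = score > 5.0  # hypothetical threshold
    print(f"{domain}: score={score:.1f} flagged={flagged}")
```

In this sketch the low-authority domain receiving nearly half of all clicks scores far above 1 and is flagged, while the established domain, whose clicks roughly match its link authority, is not.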
The company is also expanding the Context Button to images on Instagram.
Launched in April 2018, the Context Button gives people more background information about the publishers and articles they see in News Feed so they can better decide what to read, trust and share.
"We're testing enabling this feature for images that have been reviewed by third-party fact-checkers," said Facebook.
Facebook also said it will bring the Verified Badge to its Messenger service.
"This tool will help people avoid scammers that pretend to be high-profile people by providing a visible indicator of a verified account," said the company.
(This story has not been edited by Business Standard staff and is auto-generated from a syndicated feed.)