Facebook has announced new measures to prevent its interest-based forums called Groups from spreading harmful content, like hate speech and misinformation.
The measures come after the social networking platform faced criticism for its groups being linked to protests that led up to the Capitol riot in the US earlier this year.
"We know we have a greater responsibility when we are amplifying or recommending content," Tom Alison, Vice President of Engineering at Facebook, wrote in a blog post on Wednesday.
These new changes will roll out globally over the coming months, Facebook said.
The social networking giant said that when a group starts to violate its rules, it will now show that group lower in recommendations, making it less likely that people will discover it.
This is similar to its approach in News Feed, where the platform shows lower quality posts further down, so fewer people see them.
"We believe that groups and members that violate our rules should have reduced privileges and reach, with restrictions getting more severe as they accrue more violations, until we remove them completely," Alison said.
"And when necessary in cases of severe harm, we will outright remove groups and people without these steps in between."
Facebook said that it will start to let people know when they are about to join a group that has "Community Standards" violations, so they can make a more informed decision before joining.
"We'll limit invite notifications for these groups, so people are less likely to join," Alison said.
For existing members, the platform will reduce the distribution of that group's content so that it is shown lower in News Feed.
"We think these measures as a whole, along with demoting groups in recommendations, will make it harder to discover and engage with groups that break our rules," Alison said.
Facebook said it will also start temporarily requiring admins and moderators to approve all posts when a group has a substantial number of members who have violated its policies or were part of other groups that were removed for breaking its rules.
This means that content would not be shown to the wider group until an admin or moderator reviews and approves it.
If an admin or moderator repeatedly approves content that breaks its rules, Facebook will take the entire group down.
"When someone has repeated violations in groups, we will block them from being able to post or comment for a period of time in any group," Alison said.
"They also won't be able to invite others to any groups, and won't be able to create new groups."
--IANS
(Only the headline and picture of this report may have been reworked by the Business Standard staff; the rest of the content is auto-generated from a syndicated feed.)