The Oversight Board, an independent body set up by Facebook, said on Thursday that Facebook and Instagram users can now submit appeals to the Board for an independent review of decisions to remove content from the two platforms.
Facebook, which has been facing heat in various countries, including India, over its handling of hate speech and similar content on the platform, can also refer cases to the Board on whether particular content should be allowed or removed.
The Board expects to reach case decisions and Facebook to act on these decisions within a maximum of 90 days.
Instagram, the popular photo-sharing platform, is owned by Facebook.
"First, individuals who believe Facebook should not have removed their content, and whose appeal to the company was rejected, can now appeal their case to the Oversight Board.
We see this as a significant step forward for free expression and for human rights for Facebook and Instagram users around the world," Oversight Board co-chair Jamal Greene told reporters in a virtual briefing.
Facebook, which has drawn flak in many parts of the world over various issues including data breaches, announced plans in 2018 to create an independent board to oversee content moderation transparently on its platform as well as on Instagram.
In May this year, the Oversight Board announced the names of 20 members, including former judges, journalists and human rights activists. The number of members, which includes Sudhir Krishnaswamy, vice-chancellor of the National Law School of India University, will rise to 40 over time.
Members of the Oversight Board are not Facebook employees, and cannot be removed by the social networking company. Facebook has established a USD 130 million trust for the Oversight Board, which funds all operations and cannot be revoked.
Greene noted that Facebook can also refer cases to the Oversight Board that could include "significant and difficult" types of decisions under Facebook's own criteria.
"Significant means that the content in question involves real-world impact and issues that are severe, large-scale or important for public discourse.
"Difficult means the content raises questions about current policies or their enforcement, where there are strong arguments for either removing or leaving up the content under review," he said.
He added that the Board is trying to pick cases that are representative of a broader sample of cases.
"The decisions we make are designed to be precedential in the sense that we will try to honour previous decisions that have been made by the Board, so that we can kind of develop a set of guiding principles over time," he added.
Thomas Hughes, director of the Oversight Board Administration, said the Board has set up a case selection committee that will select cases for review over the coming weeks.
Users can submit an eligible case for review through the Oversight Board website, once they have exhausted their content appeals with Facebook.
Facebook can also refer cases to the Board on an ongoing basis, including in emergency circumstances under the Expedited Review procedure.
Over the coming months, users will also have the opportunity to appeal to the Board on content they want Facebook to remove.
After selection, cases will be assigned to a five-member panel with at least one member from the region implicated in the content. No single Board member will be able to make a decision alone.
Cases will be decided upon using both Facebook's Community Standards and Values and international human rights standards. In addition to the cases accepted, the Board is also able to recommend changes to Facebook's Community Standards alongside its decisions.
Each case will have a public comment period to allow third parties to share their insights with the Board. Case descriptions will be posted on the Board website with a request for public comment before the Board begins deliberations. These descriptions will not include any information which could potentially identify the users involved in a case.
"For the first time ever, nearly 3 billion people on these services will have access to independent review of difficult content decisions.
"As with all Facebook products, we will be rolling out the ability to appeal to the Oversight Board to people across the world in waves to ensure stability of the product experience for users," Facebook's Brent Harris said, adding that users should be able to do so in the coming few weeks.
Harris said a case management tool has been created to ensure the Board members have a secure and privacy protected way of selecting, reviewing, hearing and making decisions on cases from anywhere in the world.
"We designed the system in consultation with the Board, and kept the independence of the Board, as well as privacy and data security, top of mind throughout the process. As such, the case management tool will only show user information that is relevant to the Board's operations," he added.
Facebook will only share user information relevant to the case and will do so securely through the tool, taking into account all relevant legal and privacy considerations, Harris said.
He added that those at Facebook, who review information within the case management tool, will be a small group of employees responsible for supporting the Board's proceedings.
The Board members have also participated in mock proceedings, looking at various types of cases including hate speech and nudity.
All decisions made by the Oversight Board will be made public, and Facebook will have to respond publicly to them.
The Board will publish decisions on its website, and also issue a public annual report to evaluate its work and how Facebook is meeting its commitments.
(Only the headline and picture of this report may have been reworked by the Business Standard staff; the rest of the content is auto-generated from a syndicated feed.)