Facebook has apologised after an investigation exposed inconsistencies by moderators in removing offensive posts reported by the social network's users.
The investigation, reported by ProPublica this week, showed that in one case Facebook censors, called content reviewers, approved a picture of a corpse captioned "the only good Muslim is a f...... dead one", while another post stating "death to the Muslims!!!" was removed.
In an analysis of 900 posts, the US-based non-profit investigative newsroom found that content reviewers at Facebook often make different calls on items with similar content, and do not always abide by the company's guidelines.
The posts were submitted to ProPublica as part of a crowd-sourced investigation into how Facebook implements its hate-speech rules.
ProPublica asked Facebook to explain its decisions on a sample of 49 items.
Facebook admitted that its reviewers had made a mistake in 22 cases, but the social network defended its rulings in 19 instances.
In six cases, Facebook said the users had not flagged the content correctly or the author had deleted it. In the remaining two cases, it said it did not have enough information to respond.
"We're sorry for the mistakes we have made... They do not reflect the community we want to help build," Facebook Vice President Justin Osofsky was quoted as saying by ProPublica.
"We must do better," he added.
According to Osofsky, Facebook will double the size of its safety and security team, which includes content reviewers and other employees, to 20,000 people in 2018 in an effort to enforce its rules better, the report said on Thursday.