Last Updated: Nov 18, 2021 | 1:02 AM IST
Meta, the rebranded parent company of the Facebook family of apps, has invested in setting up one of the largest networks of third-party fact checkers in India, a top executive has said.
In an email interview with Business Standard, Meta’s Monika Bickert, Head of Global Policy Management, said a network of 10 fact checkers working in 11 Indian languages covers India. An equal number of fact checkers is involved for the US market.
Meta also has people reviewing potentially problematic content in 20 Indian languages, and it runs hate speech classifiers (automated detection technology) in Hindi, Bengali, Tamil and Urdu.
Meta is currently in the middle of a controversy involving whistle-blower Frances Haugen, a former employee who has filed a complaint with the US Securities and Exchange Commission. One of Haugen’s accusations is that Facebook (Meta) does not do enough to prevent the spread of hate speech and misinformation on its platforms.
"We believe large organizations should be scrutinized - good faith criticism helps us get better. But what we've seen recently is a coordinated effort to selectively use leaked documents to paint a false picture of our company," said Bickert.
She added: "At the heart of these allegations is a premise which is plainly false: that we fail to put people who use our service first and that we conduct research which we then systematically ignore. Yes, we're a business and we make profit, but the idea that we do so at the expense of people's safety or well-being misunderstands where our own commercial interests lie."
One of Haugen’s other allegations, based on what she says are internal documents, is that the company allocated only 13 per cent of its budget for countering misinformation and hate speech to markets outside the United States, including India, during the financial year 2020.
"This number is out of context and does not give a comprehensive view of our efforts or investment to curb misinformation and hate speech outside of the US," said Bickert.
"The majority of the resources we have in place to prevent misinformation on our platform focus on content originating outside the United States. For misinformation, one of the primary ways we combat it is through our third-party fact-checking programme, which includes over 80 partners globally, reviewing content in over 60 languages. The majority of those partners review content originating from outside the United States.”
On Thursday, Meta India executives are expected to appear before the Delhi Assembly’s peace and harmony committee. This is in connection with last year’s riots in north-east Delhi, triggered by protests over the Citizenship Amendment Act, when hateful content that went viral on social media platforms, including Facebook, was alleged to have caused significant harm.
Asked what Meta considers a ‘viral’ post and what action Facebook takes on hateful content, Bickert said there was no specific number that could define a viral post.
“Instead, we try to remove hateful content before anyone sees it. We do not want to see hate on our platform, nor do our users or advertisers. And while we will never take down 100 per cent of hate speech, our goal is to keep reducing the prevalence of it, which is the amount of it that people actually see. We report on prevalence to show how much hate speech we missed, so that we can continue to improve..."