Social media giant Facebook has said it has initiated a series of measures, including blocking fake accounts, combating fake news, proactively monitoring abuse and increasing advertisement transparency, as part of its efforts to prevent outside interference in general elections across the world.
Facebook has been facing criticism for helping to spur violence and for its policies surrounding misinformation in general.
Mark Zuckerberg, Facebook's founder and CEO, faced grilling in the US Congress and the European Parliament over the Cambridge Analytica scandal, in which Facebook was accused of unduly influencing the 2016 US electoral process.
In the first half of this year, the social media giant has worked to combat fake news around elections worldwide, including in Italy, Colombia, Turkey and the primaries for the US midterms. By the end of the year it will have focused on more than 50 national elections around the world, said Samidh Chakrabarti, Facebook's head of civic engagement.
Using machine learning, Facebook has become more effective at blocking fake accounts or quickly removing them.
"That's important since they're often the root of a lot of the bad activity that we see on our platform. We're now at the point that we stop more than a million accounts per day at the point of creation," Chakrabarti said.
Facebook continues to expand its efforts to combat fake news, with 27 third-party fact-checking partners in 17 countries.
"We continue to bring increased levels of transparency to online political advertising," he said.
And finally, Facebook is proactively monitoring for abuse. "We've been able to apply powerful computational techniques that are traditionally used to fight spam and apply those to the civic space. And that's really made great advancements recently in our ability to precisely detect and disrupt organized information operations," Chakrabarti said.
To make this happen, Facebook is investing both in technology and in people. The number of people dedicated to this work recently passed 15,000 and will be closer to 20,000 by the end of the year. The social media giant reviews reports of bad content round the clock in over 50 languages.
According to Chakrabarti, Facebook is working with groups like the Digital Forensic Research Lab at the Atlantic Council, which works closely with its teams to get them real-time insights and intelligence on emerging threats and disinformation campaigns from all around the world.
The effort also includes collaboration with governments, he said, adding that Facebook is working with superior and regional electoral courts and electoral law-enforcement authorities, particularly around online ads.
Responding to questions, Facebook officials did not confirm whether Russia is meddling in the US midterm elections.
"We know that Russians and other bad actors are going to continue to try to abuse our platform... during the US midterms and around other events and elections," said Nathaniel Gleicher, Facebook's head of cybersecurity policy.
"We are continually looking for that type of activity, and as and when we find things, which we think is inevitable, we'll notify law enforcement, and where we can, the public," he said.
"We are looking for this sort of activity and these are ongoing investigations. And one of the things we have to be really careful with here is that as we think about how we answer these questions, we need to be careful that we aren't compromising investigations that we might be running or investigations the government might be running," Gleicher said.
Facebook CEO Zuckerberg had promised last April to ensure security and transparency on the platform ahead of elections in different countries.