In a recent announcement, Mark Zuckerberg said that Facebook would make important changes to its treatment of news. First, it intends to reduce news content from about 5 per cent of the average news feed to 4 per cent. Second, it will survey Facebook users to get a handle on the trustworthiness of news sources. In an open post, Facebook’s chief executive officer stated: “I’ve asked our product teams to make sure we prioritise news that is trustworthy, informative, and local...There’s too much sensationalism, misinformation and polarisation in the world today. Social media enables people to spread information faster than ever before, and if we don’t specifically tackle these problems, then we end up amplifying them”. But there is a fly in the ointment: rather than attempting to verify the credibility and trustworthiness of news sources internally, Facebook intends to crowd-source this task by surveying its user community. It will start with the US and then roll out such surveys elsewhere. India, home to the largest Facebook user base, is likely to see such a survey fairly soon. This exercise has quite a few implications.
First, it may be possible to game such surveys in multiple ways, and purveyors of fake news and propagandists have every incentive to do so. Many of these organisations are already experts at gaming Facebook’s algorithms by setting up fake accounts, Facebook pages, and the like, and they will make every effort to secure a high ranking in any credibility survey. By its own estimate, Facebook has over 270 million fake accounts, so this could be a significant problem. Even assuming the social media giant works past this issue, it will still have to control and correct for inherent user biases. Social media creates and amplifies bubbles for every user, and there is a critical difference between “truth” and “trustworthiness”. For example, the statement “The earth moves around the sun” is true regardless of the trustworthiness of the source, while the statement “Cows exhale oxygen” is false. Yet a “trustworthy” source, such as a website run by a religious leader, may indeed claim that cows exhale oxygen. More nuanced issues arise with a statement like “Mr X, the leader of Y nation, went to Z city where he talked rubbish”. The first part of the statement is verifiably true (or false); the second part is likely opinion (assuming the individual in question did not claim cows exhale oxygen). Hence, controlling for user biases becomes extremely important in any such survey.
Media organisations and academic institutions have often conducted readership surveys on the trustworthiness and credibility of news sources. The Reynolds Journalism Institute at the University of Missouri conducted one in July 2017 that attempted to control for political orientation, race, levels of financial support for mainstream media, education, and other factors. Facebook is well placed to undertake such an exercise with a high degree of efficiency if it chooses to, as it already knows a great deal about its users, including what engages them and what their biases are.
Moreover, will branding a news source as trustworthy help drive engagement for the social media platform? Maybe. It can certainly help the platform judge which sources should feature more often in the personalised news feeds it serves up to individual users. Facebook is presumably hoping that this exercise will help it become more credible and trusted in the wake of accusations that the platform was used to spread vast amounts of disinformation during the last US presidential election and the Brexit referendum. Given that it is arguably the largest news platform in the world, a “Trusted by Facebook users” badge would clearly have huge potential for monetisation. It will be interesting to see how the social media giant carries out this exercise without diluting its laser focus on driving user engagement.