The addition of 'trust' and 'distrust' buttons on social media, alongside standard 'like' buttons, could help to reduce the spread of misinformation, finds a new experimental study.
Incentivising accuracy cut in half the reach of false posts, according to the findings published in eLife.
"Over the past few years, the spread of misinformation, or 'fake news', has skyrocketed, contributing to the polarisation of the political sphere and affecting people's beliefs on anything from vaccine safety to climate change to tolerance of diversity. Existing ways to combat this, such as flagging inaccurate posts, have had limited impact," said Professor Tali Sharot at University College London.
"Part of why misinformation spreads so readily is that users are rewarded with 'likes' and 'shares' for popular posts, but without much incentive to share only what's true.
"Here, we have designed a simple way to incentivise trustworthiness, which we found led to a large reduction in the amount of misinformation being shared," Sharot said.
For the study, the team tested a potential solution using a simulated social media platform, with 951 participants taking part across six experiments.
On the platform, users shared news articles, half of which were inaccurate. Other users could react with 'like' or 'dislike' and repost stories; in some versions of the experiment, they could also react with 'trust' or 'distrust'.
The researchers found that the incentive structure was both popular, with people using the trust/distrust buttons more than the like/dislike buttons, and effective, with users sharing more true than false information in order to gain 'trust' reactions.
Further analysis using computational modelling revealed that after the introduction of trust/distrust reactions, participants were also paying more attention to how reliable a news story appeared to be when deciding whether to repost it.
Additionally, the researchers found that participants who used the versions of the platform with trust/distrust buttons ended up with more accurate beliefs afterwards.
"Buttons indicating the trustworthiness of information could easily be incorporated into existing social media platforms, and our findings suggest they could be worthwhile to reduce the spread of misinformation without reducing user engagement," said doctoral student Laura Globig from Massachusetts Institute of Technology.
"While it's difficult to predict how this would play out in the real world with a wider range of influences, given the grave risks of online misinformation, this could be a valuable addition to ongoing efforts to combat misinformation," she added.
--IANS
(Only the headline and picture of this report may have been reworked by the Business Standard staff; the rest of the content is auto-generated from a syndicated feed.)