You should not trust Wikipedia blindly on every topic, as entries on politically controversial scientific subjects may be subject to "information sabotage", a study says.
Study co-author Gene E. Likens, a distinguished research professor at the University of Connecticut and co-discoverer of acid rain in North America, has monitored Wikipedia's acid rain entry since 2003.
"In the scientific community, acid rain is not a controversial topic. Its mechanics have been well understood for decades," Likens said.
"Yet, despite having 'semi-protected' status to prevent anonymous changes, Wikipedia's acid rain entry receives near-daily edits, some of which result in egregious errors and a distortion of consensus science," he said.
In an effort to see how Wikipedia's acid rain entry compared to other scientific topics, Likens partnered with Adam M. Wilson, a geographer at the University at Buffalo.
Together, they analysed Wikipedia edit histories for three politically controversial scientific topics (acid rain, evolution, and global warming) and four non-controversial scientific topics (the standard model in physics, heliocentrism, general relativity, and continental drift).
Using nearly a decade of data, Likens and Wilson teased out daily edit rates, the mean size of edits (words added, deleted, or edited), and the mean number of page views per day.
While the edit rate of the acid rain article was lower than that of the evolution and global warming articles, it was significantly higher than the rates for the non-controversial topics.
Across the board, politically controversial scientific topics were edited more heavily and viewed more often, the researchers said.
"Wikipedia's global warming entry sees two to three edits a day, with more than 100 words altered, while the standard model in physics has around 10 words changed every few weeks," Wilson said.
"The high rate of change observed in politically controversial scientific topics makes it difficult for experts to monitor their accuracy and contribute time-consuming corrections," he said.
"As society turns to Wikipedia for answers, students, educators, and citizens should understand its limitations when researching scientific topics that are politically charged," Likens said.
Wikipedia does employ algorithms to help identify and correct blatantly malicious edits, such as profanity.
However, Likens and Wilson urged users to cast a critical eye on Wikipedia source material, which is found at the bottom of each entry.
The study was published in the journal PLOS ONE.