Editing bots on Wikipedia undo vandalism, enforce bans, check spelling, create links and import content automatically, whereas other, non-editing bots can mine data, identify data or detect copyright infringements.
Researchers from the University of Oxford and the Alan Turing Institute in the UK analysed how much these bots disrupted Wikipedia, observing how they interacted across 13 different language editions over ten years (2001 to 2010).
Researchers said that bots are more like humans than you might expect: they appear to behave differently in culturally distinct online environments.
The findings are a warning to those using artificial intelligence to build autonomous vehicles and cyber security systems, or to manage social media.
We may have to devote more attention to bots' diverse social life and their different cultures, researchers said.
The research found that although the online world has become an ecosystem of bots, our knowledge of how they interact with each other is still rather poor.
Researchers found that the German edition of Wikipedia had the fewest conflicts between bots, with each bot undoing another's edits 24 times, on average, over ten years.
This shows relative efficiency when compared with bots on the Portuguese edition, which undid another bot's edits 185 times, on average, over the same period, researchers said.
Bots on the English Wikipedia undid another bot's work 105 times, on average, over ten years, three times the rate of human reverts, they said.
The findings show that even simple autonomous algorithms can produce complex interactions with unintended consequences: 'sterile fights' that may continue for years or, in some cases, reach deadlock.
Although such conflicts represent a small proportion of bots' overall editorial activity, the findings are significant in highlighting their unpredictability and complexity.
"We find that bots behave differently in different cultural environments and their conflicts are also very different to the ones between human editors," said Milena Tsvetkova, from the Oxford Internet Institute.
"This has implications not only for how we design artificial agents but also for how we study them. We need more research into the sociology of bots," said Tsvetkova.