Google workers have received word that the internet titan will retreat from a deal to help the US military use artificial intelligence to analyze drone video, following an outcry from staff, according to reports.
The collaboration with the US Department of Defense was said to have sparked rebellion inside the California-based company.
An internal petition calling for Google to stay out of "the business of war" garnered thousands of signatures, and some workers reportedly quit in protest over the collaboration with the military.
The New York Times and tech news website Gizmodo cited unnamed sources as saying that a Google cloud team executive told employees on Friday that the company would not seek to renew the controversial contract after it expires next year.
The contract was reported to be worth less than $10 million to Google, but was thought to have potential to lead to more lucrative technology collaborations with the military.
Google did not respond to a request for comment.
Google has remained mum about Project Maven, which reportedly uses machine learning and engineering talent to distinguish people and objects in drone videos for the Defense Department.
"We believe that Google should not be in the business of war," the employee petition reads, according to copies posted online.
"Therefore, we ask that Project Maven be cancelled, and that Google draft, publicize and enforce a clear policy stating that neither Google nor its contractors will ever build warfare technology." The Electronic Frontier Foundation, an internet rights group, and the International Committee for Robot Arms Control (ICRAC) were among those who have weighed in with support.
"As military commanders come to see the object recognition algorithms as reliable, it will be tempting to attenuate or even remove human review and oversight for these systems," ICRAC said in an open letter.
"We are then just a short step away from authorizing autonomous drones to kill automatically, without human supervision or meaningful human control." Google has gone on the record saying that its work to improve machines' ability to recognize objects is not for offensive uses.
The EFF and others stressed the need for moral and ethical frameworks regarding the use of artificial intelligence in weaponry.
"The use of AI in weapons systems is a crucially important topic and one that deserves an international public discussion and likely some international agreements to ensure global safety," the EFF said in a blog post on the topic.