NGOs, activists call for ban on 'Killer robots' at UN meet

Press Trust of India Geneva
Last Updated : Apr 14 2015 | 1:32 PM IST
Calling for a preemptive ban on killer robots at a UN conference, NGOs and activists said that fully autonomous weapons could be as lethal as a nuclear bomb and could change the face of warfare.
The debate was part of a five-day Convention on Conventional Weapons (CCW) meeting of experts on Lethal Autonomous Weapons Systems, which began in Geneva yesterday.
"It (the issue of killer robots) is so critically important because it will change the face of warfare. It is as revolutionary as nuclear bombs, as gunpowder," political activist and Nobel Peace Laureate Jody Williams said.
"It is a huge leap backward, in our view, morally and ethically. The US and the UK would argue it is a huge leap forward," said Williams.
Though robotic weapons, including armed drones, are already used around the world with alarming regularity, military experts say that autonomous killer robots could become the next big thing in modern warfare within two decades.
While China said that the "cold blooded killing" of humans by autonomous machines is not too distant a prospect, Japan said that it will not make weapons that "commit murder". The US had said last year that it was premature to consider a prohibition.
Williams said that it was completely unacceptable that a handful of nations was holding the world captive on the issue when 80 per cent of the world was against it.
"Right now, when drones are used, humans are the ones looking at the computer screen through the machine's software, assessing the area, selecting a particular target and pressing the button to fire the missile," said Thomas Nash of Article 36, a leading UK-based non-profit organisation working to prevent the unintended, unnecessary or unacceptable harm caused by certain weapons.
"But here we are talking about systems where it wouldn't necessarily be a human looking at the target or pressing the button. Those things would be pre-programmed based on complicated algorithms inside the machine and software," he said.
Human Rights Watch, Article 36 and other NGOs have called for a preemptive ban on such weapons.
"There is precedent for this preemptive ban, as with blinding lasers in the 1990s. That is the model we are looking for," said Bonnie Docherty of Human Rights Watch, pointing to the lack of accountability in using killer robots.
"One thing that strikes me about this issue is that there are interdisciplinary arguments (against this). There is moral, there is legal (arguments), accountability being a legal argument. There are also the issues of arms race, issues of proliferation. It is the range of arguments that makes this a particularly compelling issue," said Docherty.

First Published: Apr 14 2015 | 1:32 PM IST
