Experts are currently gathering in Geneva for a conference to look at the issue amid warnings of the threat posed by fully autonomous weapons systems.
In a description that called to mind the Terminator, the sci-fi killing machine portrayed by Arnold Schwarzenegger, Rasha Abdul Rahim, an advisor on Artificial Intelligence and Human Rights at Amnesty International, painted a nightmarish vision of a world dominated by devastatingly destructive machines.
She warned: “Killer robots are no longer the stuff of science fiction.
“From artificially intelligent drones to automated guns that can choose their own targets, technological advances in weaponry are far outpacing international law.
“We are sliding towards a future where humans could be erased from decision-making around the use of force.”
Ms Abdul Rahim added: “It’s not too late to change course. A ban on fully autonomous weapons systems could prevent some truly dystopian scenarios, like a new high-tech arms race between world superpowers which would cause autonomous weapons to proliferate widely.
“We are calling on states present in Geneva this week to act with the urgency this issue demands, and come up with an ambitious mandate to address the numerous risks posed by autonomous weapons.”
States party to the Convention on Certain Conventional Weapons (CCW) gathered for the four-day conference yesterday.
The conference follows a meeting in April which emphasised the importance of retaining human control over weapons systems and the use of force.
Twenty-six nations, including Austria, Brazil and Egypt, called for a total ban – but Amnesty said the UK, along with France, Israel, Russia, South Korea and the USA, was trying to develop autonomous weapons systems.
Quite apart from the risk that such technology will eventually be used in warfare, Ms Abdul Rahim added that there was also grave concern about it being made available to law enforcement agencies.
She said: “So far, the likelihood that autonomous weapons will be used in police operations, with all the risks that entails, has been largely overlooked.
“But drones capable of shooting electric-shock darts, tear gas and pepper balls already exist.
“Israel recently deployed semi-autonomous drones to fire tear gas at protesters in Gaza, and we are likely to see more use by law enforcement agencies of this kind of technology in future.
“The use of fully autonomous weapons in law enforcement without effective and meaningful human control would be incompatible with international human rights law, and could lead to unlawful killings, injuries and other violations of human rights.
“We are calling on states to take concrete steps to halt the spread of these dangerous weapons, both on the streets and on the battlefield, before it’s too late.”
Amnesty International and its partners in the Campaign to Stop Killer Robots are calling for a complete ban on the development, production and use of fully autonomous weapon systems, in light of the human rights and humanitarian risks they pose.
The concept of deadly machines capable of operating independently of human command has been the subject of considerable discussion for decades.
In his 1942 short story Runaround, science fiction author Isaac Asimov set out his “Three Laws of Robotics”, which have generally been held up ever since as the standard to which all artificial intelligence should adhere.
These state that:
1) A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2) A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
3) A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.