Killer Robots Must Be Stopped, Say Campaigners
'Autonomous weapons', which could be ready within a decade, pose grave risk to international law, claim activists
By Tracy McVeigh
February 25, 2013 "Information Clearing House" - (The Guardian) - A new global campaign to persuade nations to ban "killer robots" before they reach the production stage is to be launched in the UK by a group of academics, pressure groups and Nobel peace prize laureates.
A scene from the 2003 film Terminator 3: Rise of the Machines. Scientists say killer robots are not science fiction. Photograph: Observer
Robot warfare and autonomous weapons, the next step from unmanned drones, are already being worked on by scientists and will be available within the decade, said Dr Noel Sharkey, a leading robotics and artificial intelligence expert and professor at Sheffield University. He believes that development of the weapons is taking place in an effectively unregulated environment, with little attention being paid to moral implications and international law.
The Stop the Killer Robots campaign will be launched in April at the House of Commons and includes many of the groups that successfully campaigned to have international action taken against cluster bombs and landmines. They hope to get a similar global treaty against autonomous weapons.
"These things are not science fiction; they are well into development," said Sharkey. "The research wing of the Pentagon in the US is working on the X47B [unmanned plane] which has supersonic twists and turns with a G-force that no human being could manage, a craft which would take autonomous armed combat anywhere in the planet.
"In America they are already training more drone pilots than real aircraft pilots, looking for young men who are very good at computer games. They are looking at swarms of robots, with perhaps one person watching what they do."
Sharkey insists he is not anti-war but deeply concerned about how quickly science is moving ahead of the presumptions underlying the Geneva Convention and the international laws of war.
"There are a lot of people very excited about this technology, in the US, at BAE Systems, in China, Israel and Russia, very excited at what is set to become a multibillion-dollar industry. This is going to be big, big money. But actually there is no transparency, no legal process. The laws of war allow for rights of surrender, for prisoner of war rights, for a human face to take judgments on collateral damage. Humans are thinking, sentient beings. If a robot goes wrong, who is accountable? Certainly not the robot."
He disputes the justification that deploying robot soldiers would potentially save the lives of human soldiers. "Autonomous robotic weapons won't get tired, they won't seek revenge if their colleague is killed, but neither will my washing machine. No one on your side might get killed, but what effect will you be having on the other side, not just in lives but in attitudes and anger?
"The public is not being invited to have a view on the morals of all of this. We won't hear about it until China has sold theirs to Iran. That's why we are forming this campaign to look at a pre-emptive ban.
"The idea is that it's a machine that will find a target, decide if it is the right target and then kill it. No human involvement. Article 36 in the Geneva Convention says that any new weapon has to take into account whether it can distinguish and discriminate between combatant and civilian, but the problem here is that an autonomous robot is not a weapon until you clip on the gun."
At present, Sharkey says, there is no mechanism in a robot's "mind" to distinguish between a child holding up a sweet and an adult pointing a gun. "We are struggling to get them to distinguish between a human being and a car. We have already seen utter incompetence in the use of drones, operators making a lot of mistakes and not being properly supervised."
Last November the international campaign group Human Rights Watch produced a 50-page report, Losing Humanity: the Case Against Killer Robots, outlining concerns about fully autonomous weapons.
"Giving machines the power to decide who lives and dies on the battlefield would take technology too far," said Steve Goose, arms division director at Human Rights Watch. "Human control of robotic warfare is essential to minimising civilian deaths and injuries."
US political activist Jody Williams, who won a Nobel peace prize for her work at the International Campaign to Ban Landmines, is expected to join Sharkey at the launch at the House of Commons. Williams said she was confident that a pre-emptive ban on autonomous weapons could be achieved in the same way as the international embargo on anti-personnel landmines. "I know we can do the same thing with killer robots. I know we can stop them before they hit the battlefield," said Williams, who chairs the Nobel Women's Initiative.
"Killer robots loom over our future if we do not take action to ban them now," she said. "The six Nobel peace laureates involved in the Nobel Women's Initiative fully support the call for an international treaty to ban fully autonomous weaponised robots."