
Terminator-like killer robots worry UN



Serpo
30th May 2013, 03:35 PM
Should robots be allowed to take a human life, without direct supervision or command?
Science fiction met reality at the United Nations in Geneva overnight, where this question was debated at a meeting of the Human Rights Council.
UN special rapporteur Christof Heyns told the council that countries are developing armed robots that can kill without the need for human choice or intervention, and they need to call a halt before it's too late.
"The possible introduction of LARs (lethal autonomous robots) raises far-reaching concerns about the protection of life during war and peace," Heyns said. "If this is done, machines and not humans, will take the decision on who is alive or dies."
Heyns presented a report on his research and called for a worldwide moratorium on the production and deployment of such machines, while nations figured out the knotty legal and ethical issues.
"War without reflection is mechanical slaughter," he said. "In the same way that the taking of any human life deserves - as a minimum - some deliberation, a decision to allow machines to be deployed to kill human beings deserves a collective pause worldwide."
Heyns warned that if humans are taken "out of the loop" then it could make war more likely.
It was also unclear how these killer robots could be programmed to distinguish the enemy from innocent civilians.
And because they lacked the ability to act "out of compassion or grace" and understand the bigger picture, a robot would never decide that some specific situations required greater leniency, even in wartime.
In his report, Heyns said robots will be "the next major revolution in military affairs, on a par with the introduction of gunpowder and nuclear bombs".
Officially, governments that are capable of producing "Lethal Autonomous Robots" are not currently planning to use them.
Some argue that, "as a matter of principle, robots should not be granted the power to decide who should live and die," the report said – though others say that, used well, they could "even make armed conflict more humane and save lives on all sides".
Heyns acknowledged that future generations of robots could be able to employ less lethal force, causing fewer unnecessary deaths, with their greater ability to immobilise or disarm a target.
"LARs will not be susceptible to some of the human shortcomings that may undermine the protection of life," the report said. "Typically they would not act out of revenge, panic, anger, spite, prejudice or fear.
"Robots also do not rape."
During the debate Pakistan's council delegate – speaking on behalf of 56 Islamic states – said the international community should consider a complete ban, not just national moratoria.
Lethal autonomous robots would fundamentally change the nature of war, she said.
Pakistan has been the focus of anti-terrorism drone strikes. "The experience with drones shows that once such weapons are in use, it is impossible to stop them," she said.
Most of the delegates said they found the report interesting and worthy of further debate, though several said it would be better negotiated outside of a human rights forum.
The European Union delegate said the question would be more appropriately dealt with by arms control negotiations between states. Germany supported the idea of an international register for all unmanned systems.
Argentina warned of a potential killer-robot arms race, and possible use by terrorists.
The US delegate pointed out that some systems, such as the Aegis and Patriot surface-to-air missile defence systems, already have an "autonomous mode" that acts when a split-second response is needed.
Last November the US Department of Defense issued a policy directive for autonomous weapon systems, highlighting technical dangers such as "unintended engagements" (ie, killing the wrong person) and "loss of control of the system to unauthorised parties" (ie, enemies hacking your robots and turning them against you).
France said the "role of humans in the decision to fire" must be retained; however, the UK said existing law and treaties were sufficient to govern lethal autonomous robotics.
Russia said such machines could "undermine legal order" but did not comment on the report's recommendations.
No 'killer robots' as such are yet known to exist, but precursor technology is already used in the US, UK, Israel and South Korea (see breakout) – and possibly in Russia and China.
Unmanned drones have their weapon systems controlled remotely by humans.
As the potential for autonomous weapons has grown, several organisations have started arguing for a ban or moratorium.
Last year Human Rights Watch issued a report on "Losing Humanity: the case against killer robots".
HRW employee Mary Wareham is co-ordinating the Campaign to Stop Killer Robots.
She said this was a "day of firsts", including the first time governments have publicly discussed the issue.
"People have been concerned about this for quite a while now and it's come to fruition … and it's had a really excellent response," she said. "One of our fears was that they would say 'why are we discussing this, is it really a problem'. But nobody said that. Many were asking how are we taking this forward, who's going to take this forward."
HRW will be campaigning for governments including Australia, New Zealand and Canada, who did not take part in the debate, to make their position clear.
"There is a debate going on between the technology people and the more traditional warriros, and it reflects an unease with the trend towards autonomy in warfare," Ms Wareham said. "There are quite a few military who are not happy about this."
Some country needed to "champion" the issue on the world stage, to move towards an international treaty, she said.
REAL-LIFE KILLER ROBOTS
Current "semi-autonomous" lethal weapons include:
- the US Phalanx anti-air system that automatically detects, tracks and engages anti-ship missiles and aircraft
- Israel’s Harpy, a "fire-and-forget" weapon that destroys radar emitters
- The US Navy’s X-47B, a fighter-size drone prototype that can launch, navigate and land autonomously
- The UK's Taranis, a jet-propelled combat drone prototype that can autonomously search, identify and locate enemies, though it only opens fire after human authorisation.
- The Samsung Techwin guard robots deployed in the demilitarised zone between North and South Korea, which detect targets through infrared sensors. They are operated by humans but also have an "automatic mode".
- FFX Aus

http://www.stuff.co.nz/technology/gadgets/8740534/Terminator-like-killer-robots-worry-UN

Santa
30th May 2013, 04:25 PM
No 'killer robots' as such are yet known to exist, but precursor technology is already used in the US, UK, Israel and South Korea (see breakout) – and possibly in Russia and China.

Oy vey! I'm so relieved that it's only the "good" guys that are developing killer robots.


"Robots also do not rape."

That's nice to know. Unless of course they're programmed to rape too. And why not program killer robots to rape? Just add on big metal dildos and send them out. Easy Shmeezy.

Serpo
30th May 2013, 04:35 PM
Oy vey! I'm so relieved that it's only the "good" guys that are developing killer robots.



That's nice to know. Unless of course they're programmed to rape too. And why not program killer robots to rape? Just add on big metal dildos and send them out. Easy Shmeezy.

This is meant to give you a warm fuzzy feeling, they can't rape, only kill...

Santa
30th May 2013, 04:48 PM
This is meant to give you a warm fuzzy feeling, they can't rape, only kill...

Yeah, wtf! Load em up with Obama sperm, or even better, Henry Kissinger sperm and send a couple thousand of em to Boise Idaho.

gunDriller
1st June 2013, 11:47 AM
ah, yes. worry about the sci-fi killer robots and ignore the REAL LIVE killer robots - Jews who continue to practice ritual human sacrifice, through their positions at the Pentagon & elsewhere.

BrewTech
1st June 2013, 01:12 PM
It was also unclear how these killer robots could be programmed to distinguish the enemy from innocent civilians.

What difference does it make? Human warmakers can't be bothered to make the distinction, so why bother programming robots to do it?

gunDriller
2nd June 2013, 12:13 PM
What difference does it make? Human warmakers can't be bothered to make the distinction, so why bother programming robots to do it?

"their family was killed by insurgents" - a line from an episode of NCIS.

not, "their family was killed by the US military, conducting military operations because they're Israel's bitch", which would actually be true.


the NCIS writing staff is supervised by Don Belisario, retired US military & Super-Shabbas-Goyim.

NCIS and other shows, like "Person of Interest", are all post-9-11 shows created to push the War on Terror theme.


i wonder which TV show will push the "Killer American Robots Keep us Safe" theme, first ?

actually, i think Wired Magazine & Popular Mechanics will try to win that contest.

Wired magazine used to be genuinely radical, now they are Super Butt-kissing.