
Human Rights Watch: Petition Against Robots On the Battle Field

Posted by samzenpus
from the why-was-I-programmed-to-feel-pain? dept.
New submitter KublaCant writes "'At this very moment, researchers around the world – including in the United States – are working to develop fully autonomous war machines: killer robots. This is not science fiction. It is a real and powerful threat to humanity.' These are the first words of a Human Rights Watch petition to President Obama to keep robots off the battlefield. The argument is that robots possess neither common sense, 'real' reason, nor any sense of mercy, and, most importantly, lack the option to disobey illegal commands. With the fast-spreading use of drones and similar systems, we are allegedly a long way from Asimov's famous Three Laws of Robotics being implanted in autonomous fighting machines, or in any (semi-)autonomous robot. A 'Stop the Killer Robots' campaign will also be launched in April at the British House of Commons; it includes many of the groups that successfully campaigned for international action against cluster bombs and landmines, and they hope to secure a similar global treaty against autonomous weapons. The Guardian has more, including quotes from the well-known robotics researcher Noel Sharkey of Sheffield University."
  • by Anonymous Coward on Monday February 25, 2013 @10:03AM (#43002117)
    He is already deep in robot drone murders, and you morons think he will sign something? Obama, the Nobel Peace Prize winner who has killed the most innocent women and children yet!
  • by concealment (2447304) on Monday February 25, 2013 @10:17AM (#43002245) Homepage Journal

    I don't mean to be the dark figure in this conversation, but I think it's inevitable that robots will be used on the battlefield, just like people are going to continue to use cluster bombs, land mines, dum-dum bullets and other horrible devices. The reason is that they're effective.

    War is a contest of who can most effectively hold territory. It is often fought between uneven sides; for example, the Iraqi army went out in its 40-year-old tanks against American Apaches, which promptly slaughtered them. Sometimes there are seeming upsets, but often there is an uneven balance behind the scenes there as well.

    Robots are going to make it to the battlefield because they are effective not as killing machines but as defensive machines. They're an improvement over land mines, actually. The reason is that you can programmatically define "defense," whereas offense requires far more complexity.

    South Korea is already deploying robotic machine-gun-equipped sentries on its border [cnet.com]. Why put a human out there to die from sniper fire when you can have armored robots watching the whole border?
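The parent's claim that "defense" is easy to state programmatically can be illustrated with a toy sentry. Everything here (the zone shape, the track format, the three responses) is invented for illustration; the point is only that defense reduces to a declarative predicate over a fixed perimeter, with no planning step at all:

```python
# Toy border sentry: "defense" as a predicate over a fixed exclusion zone.
# All names and thresholds are hypothetical, for illustration only.

def inside_zone(pos, zone):
    """Axis-aligned exclusion zone: zone = (xmin, ymin, xmax, ymax)."""
    x, y = pos
    xmin, ymin, xmax, ymax = zone
    return xmin <= x <= xmax and ymin <= y <= ymax

def sentry_response(track, zone):
    """track: successive (x, y) positions of one contact, oldest first."""
    if not inside_zone(track[-1], zone):
        return "ignore"
    if len(track) > 1 and inside_zone(track[-2], zone):
        return "engage"      # contact is lingering inside the perimeter
    return "warn"            # contact has just crossed in

DMZ = (0.0, 0.0, 10.0, 2.0)
print(sentry_response([(5.0, 5.0)], DMZ))              # ignore
print(sentry_response([(5.0, 5.0), (5.0, 1.0)], DMZ))  # warn
print(sentry_response([(5.0, 1.5), (5.0, 1.0)], DMZ))  # engage
```

An offensive robot, by contrast, has no fixed perimeter to test against: it must plan routes, identify targets in open terrain, and weigh collateral damage, which is where the complexity the parent mentions comes in.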

    Eventually, robots may make it to offensive roles. I think this is more dubious because avoiding friendly fire is difficult, and using transponders just gives the enemy homing beacons. In the meantime, they'll make it to the battlefield, no matter how many teary people sign petitions and throw flowers at them.

  • by Dr. Tom (23206) <tomh@nih.gov> on Monday February 25, 2013 @10:23AM (#43002315) Homepage

    How many times must it be said? Asimov's 3 "laws" have nothing to do with real robotics, future or present. They were a _plot device_, designed to make his (fictional) stories more interesting. Even mentioning them at all in this context implies ignorance of actual robotics in reality. In reality, robot 'brains' are computers, programmed with software. Worry more about bugs in that software, and lack of oversight on the people controlling them.

  • by arcite (661011) on Monday February 25, 2013 @10:32AM (#43002387)

    All indications are that the coming robotic revolution will usher in a new era of human peace and prosperity. Robots have no emotion, no bias. Imagine deploying a few hundred (or thousand) semi-autonomous robotic peacekeepers into a conflict zone. They maintain the peace 24/7, they never tire, and they are alert and objective in their duties. War is traditionally an incredibly wasteful and expensive exercise. Look at Iraq and Afghanistan! $1 trillion and thousands of allied casualties. Deploy a robot army and watch the costs come down. No need for living quarters, no need for food or water; logistics becomes cheaper in every respect.

    Like them or loathe them, drones are incredibly efficient at what they do. They are very lethal, but they are precise. How many innocents died in the decades of embargo on Iraq and the subsequent large-scale bombings under Bush? Estimates run to over 100,000. The use of drones in Libya, Mali, Yemen, and Pakistan has reduced costs by hundreds of millions and prevented thousands of needless casualties. Drones are the future, and the US has an edge it will not give up.

  • by Anonymous Coward on Monday February 25, 2013 @10:37AM (#43002437)

    He is already deep in robot drone murders, and you morons think he will sign something? Obama, the Nobel Peace Prize winner who has killed the most innocent women and children yet!

    I believe Yasser Arafat, Henry Kissinger, Yitzhak Rabin, Shimon Peres, Menachem Begin, and Le Duc Tho all currently lead Obama in the "Number of Innocents Killed by a Nobel Peace Prize Winner" race.

  • by ultranova (717540) on Monday February 25, 2013 @10:52AM (#43002599)

    Robots have no emotion, no bias. Imagine deploying a few hundred (or thousand) semi-autonomous robotic peacekeepers into a conflict zone. They maintain the peace 24/7, they never tire, they are alert and objective in their duties.

    An autonomous robot needs to form a model of what's happening around it, use that to figure out what its possible long- and short-term actions will be, and finally decide how desirable various outcomes are relative to each other. All of these steps are prone to bias, especially since whoever designed the robot and its initial database is going to have their own biases.

    Also, a robot acting in the real world cannot carefully think everything through; there is simply not enough time. This necessitates some kind of emotion analogue to provide context for reflexes and simple actions, just as it does in living beings.

    Look at Iraq and Afghanistan! $1 trillion and thousands of allied casualties. Deploy a robot army and watch the costs come down. No need for living quarters, no need of food or water, logistics becomes cheaper in every aspect.

    So there will be a lot more "interventions", since the cost (to you) is lower. I think that's part of what worries HRW.
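The three steps ultranova describes (form a model of the situation, enumerate possible actions, rank the outcomes) can be sketched in a few lines. This toy example uses entirely made-up names, priors, and weights; its only point is that the designer-supplied prior and utility function bake bias into the robot before it makes its first observation:

```python
# Toy decision pipeline: model -> candidate actions -> outcome scoring.
# All names, priors, and weights are hypothetical, for illustration only.
# The designer-chosen `prior` and utility weights are where bias enters.

from dataclasses import dataclass

@dataclass
class Observation:
    heat: float   # thermal signature, 0..1
    armed: bool   # a classifier's guess, itself biased by its training data

def model(obs: Observation, prior: float) -> float:
    """Step 1: estimate P(contact is hostile); `prior` encodes designer bias."""
    p = prior
    if obs.armed:
        p = min(1.0, p + 0.4)
    if obs.heat > 0.8:
        p = min(1.0, p + 0.2)
    return p

def utility(action: str, p_hostile: float) -> float:
    """Step 3: designer-chosen scoring; shifting the weights shifts behavior."""
    weight = {"hold": -1.0, "warn": 0.5, "engage": 1.5}
    cost = {"hold": 0.0, "warn": -0.1, "engage": -0.6}
    return p_hostile * weight[action] + cost[action]

def decide(obs: Observation, prior: float = 0.1) -> str:
    """Step 2: enumerate candidate actions and pick the highest-scoring one."""
    p = model(obs, prior)
    return max(["hold", "warn", "engage"], key=lambda a: utility(a, p))

# Identical observation, different designer prior, different decision:
print(decide(Observation(heat=0.9, armed=False)))             # warn
print(decide(Observation(heat=0.9, armed=False), prior=0.6))  # engage
```

The same contact is warned or engaged depending only on a constant the designer typed in, which is the sense in which "no bias" is too optimistic.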

  • by Caffinated (38013) on Monday February 25, 2013 @10:57AM (#43002687) Homepage
    Well, that raises the "who controls the robots" question, doesn't it? Presuming they'd be as effective as you outline (I quite doubt it), they'd be great for making it domestically painless to invade and occupy places one doesn't like for whatever reason, and I doubt that's a good thing. Iraq and Afghanistan only happened, and went on as long as they did, because even with the casualties the pain was almost entirely borne by military families; heck, we didn't even raise taxes to actually pay for it. In short, I'd imagine you might have a bit of a concern about autonomous foreign peacekeeping robots patrolling your neighborhood, and I'd expect that people in other places feel the same way.
  • Re:Deal with it. (Score:4, Insightful)

    by Darth Snowshoe (1434515) on Monday February 25, 2013 @11:04AM (#43002813)

    You are describing your own fantasy rather than a reasoned prediction.

    Surely once the robots break through the curtain of defenders, they will begin, quite efficiently, to destroy the civilian population and its infrastructure. How would robots even distinguish between them? (In fact, this is a difficulty for human soldiers today.) Is it not likely that civilians would attempt, at the last, to defend themselves and their families?

    The hope for humanity is not that the winners will somehow be more virtuous than the losers. Our only hope is that, as the consequences of armed conflict escalate, the number and severity of conflicts will dwindle.
