Robotics | The Military | Technology

Weapons Systems That Kill According To Algorithms Are Coming. What To Do? 514

Posted by Soulskill
from the have-them-fight-the-decepticons dept.
Lasrick writes "Mark Gubrud has another great piece exploring the slippery slope we seem to be traveling down when it comes to autonomous weapons systems: Quote: 'Autonomous weapons are robotic systems that, once activated, can select and engage targets without further intervention by a human operator. Advances in computer technology, artificial intelligence, and robotics may lead to a vast expansion in the development and use of such weapons in the near future. Public opinion runs strongly against killer robots. But many of the same claims that propelled the Cold War are being recycled to justify the pursuit of a nascent robotic arms race. Autonomous weapons could be militarily potent and therefore pose a great threat.'"


Comments Filter:
  • Skynet (Score:5, Insightful)

    by ackthpt (218170) on Wednesday January 08, 2014 @07:25PM (#45902517) Homepage Journal

    Yet another predictor.

    Bring on the Terminators.

  • by jjeffries (17675) on Wednesday January 08, 2014 @07:28PM (#45902539)

    They're not "coming" as if from space. We just need to choose for them not to exist and they won't. These things will (or won't) be made by individuals who can make moral decisions.

    Don't be a terrible individual; don't make or participate in the making of terrible things.

  • Sci-Fi to watch... (Score:2, Insightful)

    by Anonymous Coward on Wednesday January 08, 2014 @07:30PM (#45902557)

    Terminator

    ST TNG: Arsenal of Freedom

    Etc...

  • by geekoid (135745) <dadinportlandNO@SPAMyahoo.com> on Wednesday January 08, 2014 @07:36PM (#45902621) Homepage Journal

    Except looking at history, they will probably lead to fewer soldier deaths, fewer bystander deaths, and more accurate targeting.

    I don't know why people think they are bad.

  • by mi (197448) <slashdot-2012@virtual-estates.net> on Wednesday January 08, 2014 @07:44PM (#45902687) Homepage

    Weapons Systems That Kill According To Algorithms Are Coming

    I don't get this... Aren't human soldiers killing based on algorithms too? Or is it that their implementations are coded in vague human languages, and that makes them feel somehow warm and fuzzy? Well, the Pentagon's Ada may be considered similar, but only in jest...

    I'd say, whether such systems are bad or good is still up to the algorithms, not the hardware (nor pinkware), that executes them.

  • by Opportunist (166417) on Wednesday January 08, 2014 @07:52PM (#45902797)

    We have more accurate weapons than ever. Compare the average cruise missile to the average arrow and tell me:

    1. Which one is more accurate?
    2. Which one causes more deaths?

    You will notice that they are NOT mutually exclusive. Quite the opposite.

  • One guy'll be making a computer vision system to recognize faces "to make it easier to log in to your cellphone".

    Another guy'll be making a robot painting system that aims at cars "to make a more profitable assembly line".

    Yet another'll make a self-driving car "so you won't have to worry about drunk drivers anymore".

    Once those pieces are all there (hint: today), it doesn't take much for the last guy to glue the three together, hand it a gun instead of spraypaint, and load it with a database of faces you don't like.

  • by Anonymous Coward on Wednesday January 08, 2014 @08:14PM (#45902987)

    Except looking at history, they will probably lead to fewer soldier deaths, fewer bystander deaths, and more accurate targeting.

    I don't know why people think they are bad.

    Extra-judicial killings of US citizens.

  • by TiggertheMad (556308) on Wednesday January 08, 2014 @08:15PM (#45903007) Homepage Journal
    I think that there is a difference, though. It is one thing to create unrelated technology that is dangerous when linked together. It is another thing to create technology that doesn't have an application outside of killing people. By your argument, every invention all the way back to using flint and tinder to create fire is nothing but a weapon, and why should we even have bothered?

    My prediction is that this technology will float about the edge of popular awareness, until an unbalanced individual sets up a KILLMAX(tm) brand 'smartgun perimeter defense turret' in an elementary school and murders a bunch of children and escapes because he didn't have to be on the scene. Then national outrage will lead to mass bans on such weapons.

    Should we be making such weapons? I don't know; I suppose the argument can be made that they fill the same role as land mines, but with the upside that there is less of a problem getting rid of them when the fighting stops. I find the glee we as a species have in building better ways of killing each other to be really depressing on the whole.
  • by Charliemopps (1157495) on Wednesday January 08, 2014 @08:25PM (#45903097)

    Or... and I know this sounds crazy... we could just not kill people anymore. I know we like to be the super heroes of the world, running around fighting everyone's wars and everything... hell, I used to think that way too. But at a certain point you just have to stand back and say "you know what? Fuck it. I'm done blowing 1/3rd of our budget dropping bombs on people I don't know for a cause I barely understand, just to have any and all progress erased in a few years, because the real problems in other parts of the world have little to do with their totalitarian leaderships."

  • by zaft (597194) on Wednesday January 08, 2014 @08:27PM (#45903117) Homepage

    Except looking at history, they will probably lead to fewer soldier deaths, fewer bystander deaths, and more accurate targeting.

    I don't know why people think they are bad.

    Extra-judicial killings of US citizens.

    Let's call it what it is: murder of innocent US citizens.

    (don't think they are innocent? They are innocent until proven guilty!)

  • by Hatta (162192) on Wednesday January 08, 2014 @08:40PM (#45903209) Journal

    No. These things *will* be made, by people who make immoral decisions. The people who get to make those sorts of decisions are already mostly terrible people.

  • by Anonymous Coward on Wednesday January 08, 2014 @08:45PM (#45903237)

    Useless comment.

    If you live in certain countries and pay taxes in certain countries, one day you WILL participate in the making of terrible things.

  • by sjames (1099) on Wednesday January 08, 2014 @08:57PM (#45903343) Homepage

    Far too easy for all humans involved to disavow any responsibility when the thing shoots up a busload of children. No ability to decide the CO has gone nutsy cuckoo and report up the chain of command. No ability to decide the CO's order is just plain illegal and refuse.

    Nobody to report back home about how ugly and unnecessary it all is. Killing people, especially lots of people, should NOT be cost effective.

    Other than that, it's just great.

  • by Anonymous Coward on Wednesday January 08, 2014 @09:17PM (#45903445)

    How about: Murder of innocent citizens.

    95% of them aren't Americans (me included). Why would the distinction be important?

  • by Anonymous Coward on Wednesday January 08, 2014 @09:42PM (#45903595)

    To all the engineers working on this: you're responsible. You are doing this. You are a terrible person.

  • by ozmanjusri (601766) <aussie_bob&hotmail,com> on Wednesday January 08, 2014 @10:08PM (#45903731) Journal

    We still rely on chemical energy to power our weapons, and as such they all have the ultimate fail-safe system.

    Brace yourself before clicking the link. This may come as a surprise to you.

    http://en.wikipedia.org/wiki/Nuclear_weapon [wikipedia.org]

  • Blame Canada (Score:2, Insightful)

    by Anonymous Coward on Wednesday January 08, 2014 @10:13PM (#45903763)

    Failsafe system will be contracted out to the people who profited by writing and then fixing the Affordable Healthcare websites.

    So the Canadians will be responsible for SkyNet?

  • by dryeo (100693) on Wednesday January 08, 2014 @10:28PM (#45903835)

    How about: Murder of innocent citizens.

    95% of them aren't Americans (me included). Why would the distinction be important?

    Americans don't seem to think that non-Americans are people, and therefore not deserving of rights.

  • Re:Skynet (Score:4, Insightful)

    by farble1670 (803356) on Wednesday January 08, 2014 @11:03PM (#45903997)

    These are only a problem if they are built and used.

    how do you know that smart weapons won't result in fewer deaths, and fewer deaths of non-combatants?

    humans have a pretty poor track record and it wouldn't take much to improve upon it. if you think the man in the trenches is making good judgments about when and whom to kill, you should talk to a vietnam vet.

  • by GodfatherofSoul (174979) on Wednesday January 08, 2014 @11:19PM (#45904069)

    I can't remember the documentary; maybe The Fog of War, starring Satan's favorite child Robert McNamara. But they figured out that in combat, 25% of soldiers weren't actually shooting at other people. They were intentionally shooting up in the air to avoid killing. So part of the Army's training post-WWII was to get soldiers to fire without thinking. The outcome was that soldiers were more effective in battle. The consequence was that soldiers weren't evaluating the act of taking lives until AFTER they'd done it, which contributed to the increased mental issues Vietnam-era soldiers endured.

  • by Charliemopps (1157495) on Thursday January 09, 2014 @01:17AM (#45904453)

    Yeah, but they CAN'T destroy the US. It's not possible. It's like we live in a mansion and a rat ran in and shit on our floor. So now we have the entire staff chopping up the floorboards and tearing the plaster off the walls looking for the fucking thing. We're doing far more damage than the stupid rat ever could. Some pests just don't go away, so you have to keep the cheese in the fridge, put out some traps, and deal with it. Don't burn the house down around you just to win.

  • by MtHuurne (602934) on Thursday January 09, 2014 @02:35AM (#45904675) Homepage

    To kill someone with a knife, you have to stand very close to them and thrust the weapon into their body. To kill them with a gun, you need them in your line of sight and a pull of the trigger. To kill them with a drone, you need them on live camera and the push of a button. To kill them with an autonomous robot, you need a description of what they look like and what area they are located in, and you program that into the robot. Every step becomes more indirect, more emotionally detached.

    "Guns don't kill people" is just a slogan. A gun is a tool. For killing people. The real questions include "Do guns deter crime or make it more violent?" and "Does home gun ownership help prevent a government from turning on its own people?", but those have no simple answers, so they are not as useful in propaganda.

  • by sociocapitalist (2471722) on Thursday January 09, 2014 @07:07AM (#45905335)

    Weapons Systems That Kill According To Algorithms Are Coming

    I don't get this... Aren't human soldiers killing based on algorithms too? Or is it that their implementations are coded in vague human languages, and that makes them feel somehow warm and fuzzy? Well, the Pentagon's Ada may be considered similar, but only in jest...

    I'd say, whether such systems are bad or good is still up to the algorithms, not the hardware (nor pinkware), that executes them.

    For me the big difference is that if you activate the military to suppress its own populace when it demonstrates, the soldiers can at least choose not to follow orders.

    The idea of the US (for example), with its ever-increasing suppression of constitutional rights, having robots that kill whoever they're activated against is terrifying.

  • Re:Skynet (Score:5, Insightful)

    by YttriumOxide (837412) <yttriumox@gma[ ]com ['il.' in gap]> on Thursday January 09, 2014 @08:38AM (#45905633) Homepage Journal

    because all evidence shows that the weak point always lies with the soldier that has to pull the trigger and decide to kill a fellow human being.

    All evidence that I've seen shows that a large number - possibly even the majority - of soldiers have been brainwashed into following orders unconditionally and will commit the most horrendous crimes against humanity when ordered to do so. And - even when not ordered - that same brainwashing includes training in not thinking of 'the enemy' as human, because that causes you to delay in the critical moment. So they dehumanise the enemy to the point that further atrocities can be committed even when not under orders to do so.

    Note that I don't blame the soldiers themselves in a lot of these situations - they are often good people who given time to think and reason it through would not behave that way, but their training has so messed with them that some actions they'll take don't reflect on the person they are.

    Also note that I did say "a large number of soldiers" and not all. There are plenty of cases you can find of soldiers going against orders they believe to be morally reprehensible, but the fact that OTHER soldiers then do it is a testament for the argument and not against it.
