AI Robotics The Military

Elon Musk Backs Call For A Global Ban On Killer Robots (cnn.com) 214

An anonymous reader quotes CNN: Tesla boss Elon Musk is among a group of 116 founders of robotics and artificial intelligence companies who are calling on the United Nations to ban autonomous weapons. "Lethal autonomous weapons threaten to become the third revolution in warfare. Once developed, they will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend," the experts warn in an open letter released Monday...

"Unlike other potential manifestations of AI, which still remain in the realm of science fiction, autonomous weapons systems are on the cusp of development right now and have a very real potential to cause significant harm to innocent people along with global instability," said Ryan Gariepy, the founder of Clearpath Robotics and the first person to sign the letter. More than a dozen countries -- including the United States, China, Israel, South Korea, Russia and Britain -- are currently developing autonomous weapons systems, according to Human Rights Watch.


Comments Filter:
  • When I saw this post the first thing that popped into my mind was:

              https://en.wikipedia.org/wiki/... [wikipedia.org]

    This was one of my favorite comic books back in the 60s

  • Land mines (Score:5, Insightful)

    by Anonymous Coward on Monday August 21, 2017 @02:50AM (#55055361)

    Why don't you start with banning land mines first?

    A land mine is the simplest "autonomous weapon" you can have: its definition is clear and well understood, it is already in active use (much more than "on the cusp of development"), and it is causing harm to civilians.

    We all know why -- the US won't stop using land mines while most other countries have already stopped.

    So, instead of calling for a ban (on land mines) that might actually change something, Elon is calling for a ban (on some fantasy weapon) that is mere posturing and makes him feel good.

    And of course we already knew that all countries, including the good old US of A, would continue the development of such weapons regardless of such a ban.

    • Re:Land mines (Score:5, Informative)

      by ShanghaiBill ( 739463 ) on Monday August 21, 2017 @03:16AM (#55055399)

      the US won't stop using land mines.

      The US does not employ landmines anywhere other than the Korean DMZ. They are used there because North Korea also uses them, and removing them would require increases in other capabilities. Any such capabilities could be used offensively and would be destabilizing, while landmines are purely defensive.

      If other countries really feel that these landmines are unjustified, they are welcome to come and defend the DMZ without mines, and the 28,000 American troops in South Korea can come home.

    • Why don't you start with banning land mines first?

      You can't order a bunch of land mines to clear the streets of starving, rioting serfs. Yet...
      But a solar-powered terminator won't bat an eye when you tell it to commit war crimes on your behalf.

  • Good luck with that one. These already pretty well exist in current military lineups, and none of the major powers are going to agree to this. Also, the current weapons that do the most harm to innocents are the ones WITHOUT any sort of AI: cluster bombs, fuel-air bombs, guns, etc. A good AI, I think, would likely improve the situation.
    • It will be much like the ban on "machine guns" a century or so ago. The Great Powers will all agree that it is only OK to use them against "terrorists" (or "savages", as the excuse was phrased back then) until there is an actual real war, and then the ban will go out the window.
  • by xski ( 113281 ) on Monday August 21, 2017 @03:10AM (#55055391)
    Does anyone seriously think this isn't going to happen? Even after they sleep off whatever they're on?

    They'll be made. They'll be deployed. They'll go wrong. They'll be refined and we'll be assured it will never happen again. It will.

    Lather, rinse, repeat.

    • Re: (Score:3, Interesting)

      by AmiMoJo ( 196126 )

      People seem to misunderstand what these bans are for. Nukes are banned, but North Korea made them anyway, so might as well not ban them? Is the ban really totally ineffective, or has it allowed us to prevent many more countries from getting nukes and put immense pressure on NK (including sanctions) to stop its own programme?

      Banning killer robots will make it harder to build them, and create negative consequences for having them. Every country will have to decide if it is worth the sanctions and economic fallout.

      • by swb ( 14022 )

        I would say there are no bans on nuclear weapons, just anti-proliferation strategies that make it a requirement to build your own from scratch, and materials controls that prevent key industrial components from being obtained by prohibited nations.

        But really, any country with a sufficiently developed industrial base and focus can build a nuclear weapon, and there's no way to stop them short of military intervention. That's how Britain, France, Israel, India and Pakistan wound up with them.

      • Unfortunately, there are a lot of problems regulating dual-use technologies at any scale. We're seeing this at the small scale now, as terrorists learn that they can simply fill cars full of propane cylinders and improvise an explosive missile.

        The real problem with a ban on autonomous weapons is that they're basically only useful to wealthy industrialised nations: i.e. the ones that can easily violate this kind of ban without fearing too much threat from sanctions. There's little need for, for example...

      • Is the ban really totally ineffective, or has it allowed us to prevent many more countries from getting nukes and put immense pressure on NK (including sanctions) to stop its own programme?

        The ban is totally ineffective. Any country that wants to bother developing nukes isn't going to pay attention to the Ban, and any country that doesn't want to develop nukes isn't going to care that the Ban exists.

        Now, if the Ban included "if you develop nuclear weapons, we'll nuke your country till it glows in the dark"...

  • Sure, there is going to be a period in which drones and robots can kill humans. But then, as countermeasures, they will develop anti-robot robots, and before you know it, entire wars will be fought without a single human life lost.

    Bring in the robot soldiers, I say.

    • by mark-t ( 151149 )

      ... And when lives aren't lost, there will be no real incentive for either side to offer surrender, mirroring in no small measure what was happening in A Taste of Armageddon [wikipedia.org].

      You suggest that no human lives would be lost, and that may be true, but what about human rights? Or do you seriously think that being at war wouldn't impact those?

  • by OpenSourced ( 323149 ) on Monday August 21, 2017 @04:05AM (#55055529) Journal

    Killer autonomous robots are one of those things that will inevitably exist once the tech is there.

    ISIS is already using small drones to bomb their enemies. You can watch the videos online. Those are small commercial drones like the ones you can find in your local store, modified to hold a grenade. Can anybody think of a way of preventing the obvious, namely that once the proper intelligence is as easy to buy or download as small drones are today, ISIS or their offspring will use it in the same way? Once the first swarm of killer autonomous drones is let loose by terrorists in an American city, does anybody really think that any government is going to stand by that (possible) treaty?

    This is not something like chemical weapons. You could say that the sarin attack in Tokyo did not destroy the agreement on chemical weapons. But chemical weapons cannot be used for defense, and they are not really a useful weapon in general. Killer autonomous robots (let's create the obvious KILLAR acronym here and now) are going to be precise, and probably the only defense against other KILLARs. Nobody is going to renounce that.

    Also, everything about a KILLAR will be dual use. If you think that dual-use equipment is a nightmare to control, as it has been in the Iran embargo, just wait until you have to decide whether a particular neural network program can be used to detect armed people instead of drowning people. Good luck with that.

  • by myid ( 3783581 ) on Monday August 21, 2017 @04:21AM (#55055581)

    After a tornado or earthquake, people are sometimes trapped in collapsed buildings. So I invent a robot that can force its way through building walls, by shooting laser beams and by punching holes in the walls. That makes it easier for rescuers to get to victims, right? Therefore it's a peaceful robot, right?

    What's to stop me from using this robot in a war, to get to enemy soldiers who are hiding in a bunker?

    How do you define "killer robot"? Do you define it as a robot that can only be used to hurt people, not to help people? Just about every invention can be used both for helping and also for hurting people.

    • by Misagon ( 1135 )

      The critical difference between a "drone" and a "killer robot" is in who makes the decision to kill: a human operator, or a computer program.
      The keyword is in the quote in the description: "autonomous".

      The difference between a peaceful robot and a war-robot is in which kinds of weapons you employ and what kind of programming you give it. Does it only break walls, or does it break down walls to find humans to kill? Does it have guns, that are only useful for killing?

      And... the difference between a landmine/explosive rocket and autonomous robot is that the killer robot is active in how it searches for its targets.

      Seriously, making the distinction is not that hard. Don't muddle it up!

      • by Nidi62 ( 1525137 )

        The critical difference between a "drone" and a "killer robot" is in who makes the decision to kill: a human operator, or a computer program. The keyword is in the quote in the description: "autonomous".

        The difference between a peaceful robot and a war-robot is in which kinds of weapons you employ and what kind of programming you give it. Does it only break walls, or does it break down walls to find humans to kill? Does it have guns, that are only useful for killing?

        And... the difference between a landmine/explosive rocket and autonomous robot is that the killer robot is active in how it searches for its targets.

        Seriously, making the distinction is not that hard. Don't muddle it up!

        How about an autonomous UAV that flies combat air patrol (CAP) under rules of engagement (ROE) that only allow it to engage targets when something it is assigned to protect comes under attack? It could be guarding an outpost, or be dispatched to watch over a group of civilians that are fleeing occupied territory that ground troops can't reach. It's only killing to save lives.

    • by AmiMoJo ( 196126 ) on Monday August 21, 2017 @06:44AM (#55055881) Homepage Journal

      A robot that can shoot lasers and punch through walls should probably require human oversight, especially in an unpredictable and volatile emergency situation. If such a robot were used in war under close human direction, it would be like any guided missile or drone.

      What Musk is talking about is robots with enough AI to go into an area, decide on targets for themselves and decide whether or not to kill them. There isn't really any reasonable civilian use for such a robot. A robot that is designed to look for disaster survivors and then definitely avoid killing them at all costs would need a lot of modification to be more than an improvised booby-trap type device, which again is little different to existing improvised weapons.

  • It's a bit late... (Score:2, Insightful)

    by Anonymous Coward

    ... given his company already built and sold a killer robot that drove a guy straight into a truck and decapitated him.

  • by Subm ( 79417 ) on Monday August 21, 2017 @06:27AM (#55055815)

    The Nash equilibrium of a prisoners' dilemma is that everyone defects. This game isn't exactly a prisoners' dilemma, but the equilibrium is that everyone builds the robots (see the sketch below this comment). A ban won't change the nature of the game. It may partly solve it, but not completely.

    Political leaders need only tell their constituents that building the robots saves their lives and that the other side will do it even if they don't.
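
    A rough numerical sketch of that equilibrium claim. The payoff numbers and the symmetric two-player framing here are illustrative assumptions, not from the comment: whatever the other side does, "build" yields the higher payoff, so mutual building is the only stable outcome. In Python:

        # Illustrative arms-race game with assumed payoffs; higher numbers are better for "me".
        from itertools import product

        ACTIONS = ["refrain", "build"]
        payoff = {
            ("refrain", "refrain"): 3,  # mutual restraint
            ("refrain", "build"):   0,  # unilateral restraint leaves you exposed
            ("build",   "refrain"): 4,  # unilateral advantage
            ("build",   "build"):   1,  # costly arms race
        }

        def best_response(their_action):
            # The action that maximizes my payoff, given what the other side does.
            return max(ACTIONS, key=lambda mine: payoff[(mine, their_action)])

        # A profile is a Nash equilibrium when both sides are already best-responding.
        equilibria = [(a, b) for a, b in product(ACTIONS, repeat=2)
                      if best_response(b) == a and best_response(a) == b]
        print(equilibria)  # [('build', 'build')]

    With these assumed payoffs "build" strictly dominates; a ban works, if at all, by changing the payoffs (sanctions, stigma) rather than the structure of the game.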

  • There's no stopping it until it stops. Then the survivors will pick up the pieces and do it all over again.
  • by asylumx ( 881307 ) on Monday August 21, 2017 @07:13AM (#55055983)
    If you make killer robots illegal, only criminals will have killer robots! Then, us law-abiding killer robot owners will be at a disadvantage in the streets!
  • Ban them. After the ban, it would be just easy peasy to enforce it.

    We have had such great success enforcing diesel engine emissions rules on small passenger cars from well-established, reputable companies. That gives us some great confidence that the ban can be easily enforced.

  • Good luck with that, Elon. It is already trivial to cobble together "autoturrets" that can lock onto living targets using an Arduino, a modern firearm, a camera, and a few thermal sensors. Want it to shoot only at specific people? Hack in a bit of facial recognition software. Personally, I love the idea of mobile land mines (killer robots) that only kill specific targets rather than indiscriminately murdering anything that happens upon them.
    • So my Guy Fawkes mask now protects me from killer robots using facial recognition? Awesome!
      • by Shotgun ( 30919 )

        Unless some of the guys attacked in Charlottesville are the ones doing the programming. In that case, they'll be looking for you specifically.

  • This is about as effective as gun control. Ban killer robots so that only the countries who ignore the ban will have killer robots. You can't uninvent things, sadly. What you can do is try to be strong enough to destroy those who use them against you.
  • That's a relief.

    I was afraid Elon was going to join the "pro killer robot" faction.

  • by hey! ( 33014 )

    I thought killer robots were a pretty neat idea, but if Elon Musk says they're bad I must have got things wrong somewhere..

    • Musk just realized what an awesome killing machine a Tesla car would make? 0 to 60 in 2.3 seconds... straight into a crowd of pedestrians!
  • See Samsung SGR-1. It does require an OK from a human to fire, but really, how hard would it be to bypass that switch? As someone else said, it is trivial to cobble together a system that connects a gun to sensors and have an Arduino fire as needed. The Roomba people probably have something that walks with weapons. Sorry Elon, you are way behind the curve on this one.

  • Let us get a group of people to ban the creation of Ice-Nine.
    It might not be such a great idea.
  • There's nothing wrong with killer robots (from Venus).

    You might disagree, but I think they're A-Ok.

    • "Her name is Yoshimi She's a black belt in karate Working for the city She has to discipline her body Cause she knows that It's demanding To defeat those evil machines I know she can beat them Oh Yoshimi, they don't believe me But you won't let those robots eat me Yoshimi, they don't believe me But you won't let those robots defeat me Those evil-natured robots They're programmed to destroy us She's gotta be strong to fight them So she's taking lots of vitamins Cause she knows that It'd be tragic If thos
    • "I, for one, welcome our new killer robot overlords!"
    • FYI, for those that don't get the reference: https://www.youtube.com/watch?... [youtube.com]

  • Horizon Zero Dawn with his kids. ;)
  • Since his cars don't "see" motorcycles, his own autopilot - a car-based, basic robot system - is a killer for fellow users of the road...
  • Build portable EMP devices, powered by the new Tesla 2170 cells.

  • who are calling on the United Nations to ban autonomous weapons.

    Wait... Elon... don't you remember Autopilot? What are you smoking? Your own Tesla products qualify as autonomous weapons.

    Because a car/vehicle is definitely a weapon if operated by someone drunk/incompetent, or operated when a mistake is made, or if something goes wrong mechanically causing a loss of control, or if a malfunction mistakes a pedestrian for a non-obstacle.

    Or when your Autopilot fails to recognize a hazard, and the inattentive...

    • Oh shit, valid point. It's only a matter of time before a terrorist removes the serial numbers from a self-driving car and programs it to run down pedestrians until it can no longer move. But then, drone-mounted weapons are already cheap and easy to make.
  • The problem with restricting their use is that some countries already have these things in wide use.

    For example: Missiles are an autonomous weapon. You program the computer with a guidance system, load up a nuclear warhead, and you can launch and annihilate a target from a continent away.

    So what exactly do they want to ban; smarter weapons that specifically target opposing forces or key individuals with lower collateral damage?

    Hard-to-detect intelligence weapons that kill small targets with high precision...

    • Simply put, international bans on killer robots are completely unenforceable. Look at how well the ban on use of chemical weapons is working...
  • "When killer robots are outlawed, only outlaws will have killer robots!"
  • Aren't fully autonomous drones already banned? So this is a no brainer.

    People will confuse things, but it's undoubtedly a treaty that should be made, much like several others already in place.

    This isn't dissimilar to treaties around land mines, chemical weapons, biological warfare and others. Yes, there will be countries that won't adhere to it, killer robots will end up being developed, and we'll have violations of treaties over the years... but this is a call for a coalition against development and deployment...

  • I see a conundrum arising.

    NASA has been working on autonomous systems for years. It is hard to control a robot in real time, when there is a delay of minutes to hours inserted into the message loop. Some autonomy greatly improves the ability of probes to gather interesting data.

    But, it is not a far leap to move from "Select and drill a rock" to "Select and drill a head".

    It is not a far leap from designing a car that will detect and drive in a given lane, to designing a car that will select and drive in a...
