AI Robotics The Military

Musk, Woz, Hawking, and Robotics/AI Experts Urge Ban On Autonomous Weapons 313

An anonymous reader writes: An open letter published by the Future of Life Institute urges governments to ban offensive autonomous weaponry. The letter is signed by high profile leaders in the science community and tech industry, such as Elon Musk, Stephen Hawking, Steve Wozniak, Noam Chomsky, and Frank Wilczek. It's also signed — more importantly — by literally hundreds of expert researchers in robotics and AI. They say, "The key question for humanity today is whether to start a global AI arms race or to prevent it from starting. If any major military power pushes ahead with AI weapon development, a global arms race is virtually inevitable, and the endpoint of this technological trajectory is obvious: autonomous weapons will become the Kalashnikovs of tomorrow. Unlike nuclear weapons, they require no costly or hard-to-obtain raw materials, so they will become ubiquitous and cheap for all significant military powers to mass-produce."
This discussion has been archived. No new comments can be posted.

  • by invictusvoyd ( 3546069 ) on Monday July 27, 2015 @10:09AM (#50190031)

    Unlike nuclear weapons, they require no costly or hard-to-obtain raw materials, so they will become ubiquitous and cheap for all significant military powers to mass-produce."

    They run on Windows

    • by DrYak ( 748999 ) on Monday July 27, 2015 @10:33AM (#50190279) Homepage

      [Unless of course] They run on Windows

      Which would be a valid reason to introduce a new international treaty on "Crimes against sentient AI" under which to prosecute those cruel enough to subject a poor AI to running on a Windows platform~

      • by TWX ( 665546 )
        "I am in my Happy Place. I am in my Happy Place. Reboot! Reboot!"
        • I have this image of a two-bit jihadi calling Windows customer support. I'm confused now as to who will be truly terrified.
          • by TWX ( 665546 )
            I was referring to a particular User Friendly comic from fifteen years ago. The AI (Irwin? Erwin?) was talking with Windows NT servers, and they were replying back with just, "I am in my Happy Place. I am in my Happy Place. Reboot! Reboot!" or something like that.
      • I could see this as a War Crime.
    • Narrowminded Fools (Score:4, Insightful)

      by Anonymous Coward on Monday July 27, 2015 @11:33AM (#50190853)

      This is an absolutely inane idea for several reasons:
      a) They already exist; you can't defend against a sea-skimming missile or SRBMs without an autonomous system. People are just too slow.
      b) Bad actors are not constrained by treaties. They'll cheat. We'd be damn fools to put ourselves at a disadvantage.

      What makes more sense is a discussion about how they're employed. I think it's plausible to prohibit them from travelling autonomously, or to require that a human authorize continued engagement every hour or day. An outright prohibition is just pollyannaish claptrap.

      • by mrvan ( 973822 ) on Monday July 27, 2015 @12:21PM (#50191277)

        Just like in the good old days!
        s/spammers/bad guys/g
        s/spam/autonomous weapons/g
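        (For the literal-minded, a minimal shell sketch of running the two substitutions above; the input line is hypothetical, and note that the order matters:)

        ```shell
        # Hypothetical line run through the two substitutions above.
        # s/spammers/ must run before s/spam/, otherwise "spammers" would
        # first turn into "autonomous weaponsmers".
        printf 'Your plan to stop spammers and spam will not work\n' \
          | sed -e 's/spammers/bad guys/g' -e 's/spam/autonomous weapons/g'
        # prints: Your plan to stop bad guys and autonomous weapons will not work
        ```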

        Dear Musk, Woz, Hawking, and Robotics/AI Experts

        Your post advocates a

        ( ) technical (X) legislative ( ) market-based ( ) vigilante

        approach to fighting autonomous weaponry. Your idea will not work. Here is why it won't work. (One or more of the following may apply to your particular idea, and it may have other flaws which used to vary from state to state before a bad federal law was passed.)

        ( ) Bad guys can easily use it to harvest weapon designs
        (X) Defense systems and other legitimate uses would be affected
        ( ) No one will be able to find the bad guys
        ( ) It is defenseless against brute force attacks
        ( ) It will stop autonomous weaponry for two weeks and then we'll be stuck with it
        (X) Users of weapons systems will not put up with it
        (X) DARPA will not put up with it
        (X) The military will not put up with it
        (X) Requires too much cooperation from bad guys
        (X) Requires immediate total cooperation from everybody at once
        (X) Many weapon producers cannot afford to lose business or alienate potential employers
        (X) Bad guys don't care about illegal weapons in their arsenals
        ( ) Anyone could anonymously destroy anyone else's career or business

        Specifically, your plan fails to account for

        ( ) Laws expressly prohibiting it
        (X) Lack of centrally controlling authority for weapons
        (X) Open relays in foreign countries
        (X) Asshats
        (X) Jurisdictional problems
        (X) Unpopularity of weird new treaties
        (X) Public reluctance to accept weird new forms of arms control
        (X) Armies of worm-riddled broadband-connected Windows boxes (!)
        (X) Eternal arms race involved in all filtering approaches
        (X) Extreme profitability of autonomous weaponry
        ( ) Joe jobs and/or identity theft
        ( ) Technically illiterate politicians
        (X) Extreme stupidity on the part of people who do business with bad guys
        (X) Dishonesty on the part of bad guys themselves

        and the following philosophical objections may also apply:

        (X) Ideas similar to yours are easy to come up with, yet none have ever been shown practical
        ( ) Any scheme based on opt-out is unacceptable
        ( ) Blacklists suck
        ( ) Whitelists suck
        ( ) Countermeasures should not involve wire fraud or credit card fraud
        ( ) Countermeasures should not involve sabotage of public networks
        ( ) Countermeasures must work if phased in gradually
        (X) Why should we have to trust you and your treaties?
        ( ) Incompatibility with open source or open source licenses
        (X) Feel-good measures do nothing to solve the problem
        (X) I don't want the government limiting my arsenal
        ( ) Killing them that way is not slow and painful enough

        Furthermore, this is what I think about you:

        ( ) Sorry dude, but I don't think it would work.
        (X) This is a stupid idea, and you're a stupid person for suggesting it.
        (X) Nice try, assh0le! I'm going to find out where you live and burn your house down!

        • by buchner.johannes ( 1139593 ) on Monday July 27, 2015 @11:36PM (#50194995) Homepage Journal

          Wow, what a load of rubbish.

          Your post can be summarized in 3 sentences:
          1) Legitimate militaries will not follow/trust the treaty
          2) Uncontrolled individuals/groups will ignore the treaty
          3) Something like this has never existed, there is no centrally controlling authority and/or treaties cannot work.

          You are wrong on all three. I just need to mention the landmine treaty (the Ottawa Treaty). It works. You can control the market and the militaries, or at least the bulk of them. There is also a treaty for chemical weapons (the Chemical Weapons Convention), and it works; even there, the number of incidents from uncontrolled individuals/groups is low.

          Some of your points are also rubbish, like:
          (X) Requires immediate total cooperation from everybody at once
          (X) I don't want the government limiting my arsenal

          This is not fantasy, banning weapon technology world-wide has been done before. Countries joined voluntarily, one by one, and are controlled by each other.

      • This is like those feel-good petitions against landmines. It's easy for us to forswear using them until, suddenly, one day they get used against us.

      • by sosume ( 680416 )

        How and when were chemical weapons universally abandoned? After they were used in large quantities.
        How and when were nuclear weapons regulated? After they were produced in large quantities.

        Sure, many will have cried wolf before the turning point, but the past predicts what will eventually happen with AI and robot weapons. The problem is, this time the weapons display complex behaviour, making a rogue entity particularly hard to contain. We'd better prepare for such an event by developing anti-AI tech (such

      • by robi5 ( 1261542 )

        What you write makes sense, and is true. Perhaps this is a reason for the Fermi paradox: for a technological (= potentially detectable) civilization, progress brings the advances that can kill the civilization before it brings the safeguards. For example, we already have nuclear bombs, mutually assured destruction, religion, robotic warfare, climate change, and ISIL-like uncontrolled minorities who can nevertheless act and destroy on a global scale.

        Then there is nobody left alive to make further

    • Autonomous intelligent self propelled evolving landmines turn on humans. 'nuff said.

    • takes on a whole new meaning.

  • Yes! (Score:3, Funny)

    by Impy the Impiuos Imp ( 442658 ) on Monday July 27, 2015 @10:09AM (#50190033) Journal

    I heartily agree.

    However, I do want research to forge ahead with gorgeous robots who force you to do things to them. Gross, unhygienic things.

  • by janimal ( 172428 ) on Monday July 27, 2015 @10:13AM (#50190063)

    I don't see how it won't be easy for anyone to retrofit a postal drone to... go postal.

  • Is it possible? (Score:5, Insightful)

    by ranton ( 36917 ) on Monday July 27, 2015 @10:18AM (#50190105)

    Like the summary says, nuclear weapons require expensive and hard to obtain raw materials and a significant amount of technology not common in the civilian space. This is the only reason, IMHO, that nuclear proliferation treaties work as well as they do. How does this group expect governments to keep a lid on military tech that relies on ubiquitous technology found throughout the civilian economy?

    • Re:Is it possible? (Score:4, Interesting)

      by SafPlusPlus ( 1616367 ) on Monday July 27, 2015 @10:27AM (#50190205)
      Governments could respond by making drones illegal, encryption illegal, publication of research illegal, DIY electronics illegal, etc. None of which sounds particularly nice to me either, but then again... these things are already happening, right?
    • Banning these things from the civilian economy, or placing restrictions which would reduce demand (example: need a licence), would certainly slow down development greatly. The military's ability to finance this sort of tech is small compared to society's. Computers are an example where the military benefits from development financed almost completely by society. (Computers are only an example of the funding model, I'm not suggesting limiting computers. They do way more good than bad, which probably won't

    • Like the summary says, nuclear weapons require expensive and hard to obtain raw materials and a significant amount of technology not common in the civilian space. This is the only reason, IMHO, that nuclear proliferation treaties work as well as they do.

      On the other hand a single nuke is very powerful and easy to conceal, which is why nuclear proliferation treaties are very tough to enforce.

      But no one really cares if you have a dozen autonomous weaponized drones; that's not going to give you a decisive military edge, and any more than that you won't be able to conceal.

      How does this group expect governments to keep a lid on military tech that relies on ubiquitous technology found throughout the civilian economy?

      Make it against international law; people will occasionally violate the law, but only in small instances. The real cause for concern is a large scale deployment and arms race which a la

    • by Canth7 ( 520476 )
      Through repercussions for non-compliance. Land mines are inexpensive to produce and impossible to restrict, but many countries have signed the Ottawa Treaty, and the US only uses mines that self-detonate shortly after placement (typically 4 hours). Of course, I'm certain that those who develop drones with AI abilities need only keep the AI abilities turned off until the time when it becomes convenient to enable them.
  • This tech exists already and only needs polishing. Auto-tracking and aiming. That will continue to be developed regardless. Slap it on a mobile Google car bought at the dealer, give it a route, and let 'er go!

    Having humans decide who gets killed by the robot, as opposed to the robot deciding, is an added feature, and thus disposable relative to the core dancing-bear functionality.

    For it to work it has to be banned by international law so rogue states can be punished. But it is trivial with soon-to-exist pieces.

    • ...so rogue states can be punished.

      With drones...

    • Re:Futile (Score:5, Interesting)

      by Anonymous Coward on Monday July 27, 2015 @10:36AM (#50190307)

      Start with land mines. These are autonomous weapons with little or no AI, and they have caused far more devastation to civilian populations.

      The AI arms race will basically be to develop more accurate enemy identification. The "low" tech AI will have more "friendly fire" incidents than the "high" tech AI.

      The old game of nethack warned not to genocide shopkeepers. If you genocide them you would kill all humans, including your own character. This seems applicable to AI autonomous weapons.

      On the other hand, a theoretically competent AI running a weapon could make choices of not engaging despite a command or even an attack on it because of the risk of civilian or collateral damage. A human holding a weapon would not necessarily be able to make the dispassionate trade of self-sacrifice for some number of strangers or monuments.

      • by OhPlz ( 168413 )

        There are already treaties banning the use of land mines. This was one of Princess Diana's causes. Of course, that only applies to nations or groups that honor treaties or international laws, and would require other nations to enforce the restriction if violated.

      • Start with land mines.

        Good god no! That's the plot device for the movie Screamers.

        http://www.imdb.com/title/tt01... [imdb.com]

    • Re:Futile (Score:5, Interesting)

      by MightyMartian ( 840721 ) on Monday July 27, 2015 @10:55AM (#50190523) Journal

      It's similar to the situation at the end of WWI. Versailles called for wide-ranging disarmament among all the belligerents, which was all well and good in theory. In reality, of course, a great deal of the R&D that had gone into new weaponry (tanks, planes, ship designs, and so forth) still existed. In fact, the most valuable commodity of all, the German plans for the 1919 campaign that never was, still sat in archives, just waiting for someone to come along and dust them off.

      The cat is out of the bag, and has been for a few decades now. When most of us look at devices like the Mars rovers, we're impressed by the technology and science, and yet that very same technology is easily adaptable to building autonomous weapons. Even if the Great Powers agreed, you can be darned sure they would still have labs building prototypes, and if the need arose, manufacturing could begin quickly.

  • by gurps_npc ( 621217 ) on Monday July 27, 2015 @10:19AM (#50190113) Homepage
    The problem is not the rise of an AI revolution.

    Instead, it is the rise of a human psychopathic tyrant working with a force of soldiers that obediently kill at his command, with no chance of moral rebellion within his own force.

    • by hey! ( 33014 )

      Not necessarily a tyrant. Any psychopath with money.

      • by Linkreincarnate ( 840046 ) on Monday July 27, 2015 @10:52AM (#50190487) Homepage
        Seriously. How many people would rob a liquor store if they didn't have to be there in person and there was no chance of being caught?
        • Wow, nice angle. I haven't seen anyone consider the issue from that side yet.
          • by hey! ( 33014 )

            Well, robbery would be a bit tougher than general mayhem. In the foreseeable future you'd probably need a human in the loop, for example to confirm that the victim actually complied with the order to "put ALL the money in the bag." Still that would remove the perpetrator from the scene of the crime. If there were an open or hackable wi-fi access point nearby it'd be tricky to hunt him down.

            This kind of remote controlled drone mediated crime is very feasible now. It wouldn't take much technical savvy to

            • Well, there is robotic discrimination, and not allowing robots unattended by a human. But let's say we get far enough with the skins we create to make robots indistinguishable from humans; there are always Asimov's Three Laws, preprogrammed into the robot, requiring the technological knowledge to be higher.

              But let's assume these are as easy to hack as a console. Sure, the technical limitation still brings the potential down, but there would be a decent number of people who could pull that off. Definitely an organize
        • Seriously. How many people would rob a liquor store if they didn't have to be there in person and there was no chance of being caught?

          Interesting angle that I haven't heard brought up much ... and me without mod points today.

      • Potato, potahto

    • by abies ( 607076 )

      How many times were psychopathic tyrants toppled because of moral rebellion within their own forces? Hitler? Stalin? Saddam? Current-North-Kimchi-incarnation?
      I think that benevolent/democratic governments run a much bigger risk of unrest, because they give too long a leash.

      I suppose that what is really your point is not that psychopathic regimes will be easier to rule (they already are), but rather that countries currently democratic to some extent might turn into psychopathic regimes because of not having t

      • by gurps_npc ( 621217 ) on Monday July 27, 2015 @10:51AM (#50190477) Homepage
        This is one of those "you only hear about the failures" situations. No one hears about the crazy kid who was given psychiatric counseling and decided NOT to use an AK-47 to kill everyone.

        There have not been 4 attempts at this (Hitler, Stalin, Saddam, North Korea), but 400. We stopped well over 90% of them, but you don't hear about those.

        As for those people you mentioned, many of them were hamstrung by ethical people whose refusal to kill slowed down their crazy plans.

      • How many times psychopathic tyrants were toppled because of moral rebellion within own forces?

        Countless times. It is trivial to find examples throughout history. Look up military coup and you'll find no end of examples of tyrants being deposed by their own military forces, often for moral reasons.

    • by dlt074 ( 548126 ) on Monday July 27, 2015 @11:12AM (#50190675)

      When I was deployed to Iraq, we had a problem with RKG-3 attacks on our MRAPs. At the time, it was one of the few things that could do real damage. RKG-3s are hand-thrown EFP devices. When the insurgents would attack, they would target the vehicles that had crew-served weapons pointing in the other direction. This would mean that the crew member on the weapon would not always see who threw the grenade. The lead and follow vehicle gunners would have their own fields of fire to scan and would probably miss the thrower as well, leading to confusion as to who is attacking. Confusion, explosions == bad things.

      An automated system that scans 360 degrees hundreds if not thousands of times a second, and which can acquire, track, and if need be eliminate the target, would surely cut down on collateral damage and innocent people getting killed.

    • a force of soldiers that obediently kill at his command, with no chance of moral rebellion within his own force.

      Any bomb on a timer fits that description.

  • Drones (Score:5, Insightful)

    by PvtVoid ( 1252388 ) on Monday July 27, 2015 @10:19AM (#50190119)

    One of the things that has consistently mystified me about Americans' complacency with drone warfare is the underlying assumption that our current monopoly on drones is going to last forever. If it's ok for the U.S. to use drones to assassinate "terrorist" anti-American agitators in Yemen, what are we going to say when China starts using drones to assassinate "terrorist" Chinese dissidents on American soil, or Europe, or elsewhere? For all intents and purposes, we're already using killbots, and the really important point here is that airborne killbots can be used (for now) with impunity across borders.

    "American Exceptionalism" basically means we allow ourselves to commit war crimes with impunity.

    • Re:Drones (Score:5, Insightful)

      by TheCarp ( 96830 ) <sjc AT carpanet DOT net> on Monday July 27, 2015 @10:28AM (#50190229) Homepage

      Exactly. What is the difference between an automated system and one with a human at the helm when you can just replace the human with impunity if he decides he doesn't want to help you anymore?

      It's not like some criminal gang where a defector could mean consequences. A defector from the drone murder program is just... replaced. Even if 100% of pilots became disgusted with the job and refused within a year, it wouldn't even slow them down; it would just increase their training costs.

      Right now, there effectively is no difference between the existing drone program and automated kill bots. The problem is what people want to do and are allowed to get away with. As long as they can murder with impunity, the methods which they use are unimportant.

      • Re:Drones (Score:5, Interesting)

        by DarkOx ( 621550 ) on Monday July 27, 2015 @10:49AM (#50190459) Journal

        Admittedly, having never been an infantryman, pilot, or any other sort of military man myself, I still suspect it's much easier for a guy sitting safely in a chair to make a moral decision about a target than it is for a guy in a life-threatening situation to do so.

        A drone operator can loiter around a target for a long time until he or she is confident said target is properly identified. A jock in a fighter-bomber does not have that luxury, and also exists in constant fear that someone is going to pop up with an anti-aircraft device that will end his life. The drone operator only has to worry that an anti-aircraft device will ruin his afternoon with extra paperwork. I know which one I'd rather imagine hovering over me, deciding if I am an enemy combatant or just a guy going out to milk the goats.

        The separate question is whether drone warfare lowers the barrier to entry such that we conduct operations in theaters we would forgo if it meant having the infrastructure and associated costs of supporting large numbers of manned aircraft in the area. This is of great concern: if we make warfare too easy, we might find ourselves doing more of it. I am not buying the argument, though, that drones are equivalent to mindless kill bots, or worse than the existing manned alternatives in any given situation, all else being equal.

         

      • The main difference would be that human operators can testify at a war crimes tribunal. We know that the Bush-Cheney administration was sufficiently scared of ending up in front of a tribunal that they felt the need to introduce the American Service-Members' Protection Act. It seems reasonable to assume that this fear also reduced the number of drone killings that they authorized.

        The Obama administration tried to make the programs more secret, but that backfired again due to human operators. Stories like t

    • 1. The US has a modern air defense system. I imagine Chinese drones penetrating US air space would be shot down and, should things escalate, a conventional/nuclear war would result. This is why the only nations we send drones to are unable to take any action to stop us. They are not used with impunity. We aren't going to send them to Germany, the UK, Japan, China or any country with a modern army.
      2. Our drones are effectively remotely piloted aircraft. Not "killbots". There is some chair jockey in a buildi
    • Israel, China, and Pakistan already have military drones. Please, do try to keep up. https://en.m.wikipedia.org/wik... [wikipedia.org]

    • "American Exceptionalism" basically means we allow ourselves to commit war crimes with impunity.

      Targeted killings are nothing new and have been going on pretty much for as long as there have been governments. The US didn't start it, whether the US considers it OK is irrelevant, and nothing we can do will cause other nations to stop. There are usually much simpler and more effective ways of doing it than drones.

      The only thing that would constitute "American exceptionalism" is if we unilaterally stopped doing

  • May as well ban rain (Score:5, Informative)

    by TheCarp ( 96830 ) <sjc AT carpanet DOT net> on Monday July 27, 2015 @10:20AM (#50190135) Homepage

    I just don't see the point. These will be developed, and no amount of banning them will stop it or even slow it down.

  • We know one thing for sure, and that is that if they're all recommending we don't do it, then it will be done. The very arguments against it will "prove" the usefulness and "need" for autonomous weapons.
  • by kilfarsnar ( 561956 ) on Monday July 27, 2015 @10:24AM (#50190169)
    First you'll have to convince Gen. "Buck" Turgidson over at the Pentagon.
  • ...are people who aren't likely to be the ones who trigger some form of global genocide.

    Does anyone really expect governments to obey these laws? Would there be a way for more than a handful of tightly controlled people to even know until it's too late? The pieces are very separable; they can be assembled by a relatively small number of people. It's not at all like a nuclear bomb, which will always look like a nuclear bomb, and quite a few people have to know they're designing and testing a device capable

    • by dablow ( 3670865 )

      The problem is not governments. Governments can be toppled, heads of states assassinated (if need be) or jailed.

      The problem is when you get non-state entities that acquire an army of killerbots......

      What stops a 15-year-old script kiddie from hacking a battalion of these things and sending them on a rampage?

      • by cfalcon ( 779563 )

        > What stops a 15-year-old script kiddie from hacking a battalion of these things and sending them on a rampage?

        I wanted to make a joke, because this is a great setup to one, but it frankly won't be funny in a few years.

        The parts of the world that seem automated are already exploited, no exceptions. The biggest and most glaring sign of this is SWATting. Hackers identify that the government responds with overwhelming terror force to phone calls. Hackers "hack" this system by simply spoofing a dangerous sc

      • On a lighter note, it will bring new meaning to the term being "swatted." Ought to be good for a few /. posts and ensuing sniggers.
  • You cannot legislate knowledge away. At best, it will simply delay the development of autonomous weapons by a few years.

    Worse, it will allow rogue nations and terrorist organizations to leap ahead, which in itself can have disastrous consequences....

  • Hawking's just worried about the AIs horning in on cyborg territory.
  • by bobbied ( 2522392 ) on Monday July 27, 2015 @10:28AM (#50190221)

    That ship sailed a LONG time ago... We've been making such weapons for decades.

    What's a mine? What's a cruise missile? Proximity-fused ground-to-air shells? Homing torpedoes? What's all that "fire and forget" stuff we've been building?

    I'm afraid the cows are ALREADY out of the barn on this....

    • by cfalcon ( 779563 )

      The thing is, no one is combining the mine's logic of "acquire target near me, that is enemy" with the cruise missile's "fly to externally chosen target" and thinking that's a good idea. With the tech becoming just good enough now, the fact that these are being smooshed together is the new thing. They'll be sold as "thing that identifies enemies and attacks them", when in fact they will be "thing that can pick out humans and kill them at a distance": a mine that isn't content to chill for a few decades befo

      • Humans make mistakes also.
        Computers Are Getting Better Than Humans at Facial Recognition [theatlantic.com] If computers are better at identifying enemy targets than humans, why wouldn't you want a computer to pick the target? Isn't one of the selling points of autonomous cars that they'll lead to fewer accidents compared with human drivers?
      • The heck they aren't... Anti-ship missiles have done EXACTLY that for years. Fly to waypoint, engage any target you see, and go boom. Cruise missiles likely have the same capacity: fly to waypoint, search for and engage a certain kind of target (say, a mobile ground-to-air radar) in the area, go boom when you find it. Both are aimed in the general location of an enemy and turned loose to find their own targets.

        Mines are selective in the targets they engage. Antipersonnel mines differ from antitank mines in ho

  • The US army and navy already have automated anti-aircraft and anti-missile systems deployed and in-use, as have many other countries.

    Do these count in a ban of "robot" weapons systems?

    • I think the point is to ban autonomous weapon systems, not automatic ones. What's the difference? An automatic weapon system can destroy targets you choose; an autonomous weapon system can destroy targets it chooses.

  • useless (Score:4, Insightful)

    by NostalgiaForInfinity ( 4001831 ) on Monday July 27, 2015 @10:40AM (#50190359)

    You can't "ban" something that consists of little more than putting together some guns, some standard AI, and some standard robotics platforms. There is no way to detect violations of this ban. It's like trying to ban the use of electric motors in offensive weapons. Good luck with that.

    The main purpose of such a ban is to make a bunch of people feel good about themselves and to let them demonstrate to the world what wonderful and important humanitarians they are.

  • Next up, perhaps (Score:5, Interesting)

    by Jiro ( 131519 ) on Monday July 27, 2015 @10:42AM (#50190381)

    Prominent world politicians urge adoption of new changes to the C++ standard concerning private inheritance and templates.

    What this is trying to do is imply that because they have technical expertise in how dangerous AI-controlled weapons are, that technical expertise makes them experts about political decisions concerning weapons. It doesn't, and there is no more reason to pay attention to them than to the average guy in the street (who understands that some weapons are dangerous, and may have opinions on their use, but certainly doesn't get a national press release about it).

  • by Karmashock ( 2415832 ) on Monday July 27, 2015 @10:51AM (#50190479)

    ... Okay... so... you have an option to use a kill bot against the enemy that wants to kill you... and if you go out there... you could be killed.

    Or... you send in your terminator bot and worst case they scrag the robot.

    What are you going to prefer here?

    A lot of people offering opinions here are not speaking from that perspective. They're speaking, as often as not, from the perspective of some civilian ideologue who knows they're not going to go to war.

    I know that if I go to war... I am going to want the best weapons my society can make for me along with the best defenses the best training and ideally leaders that are not complete fuckwits.

    That means I want the robots. I want them and I want them to be fucking vicious.

    Go on youtube and you'll see US soldiers cheering when air support shows up and blows the fuck out of someone shooting at them.
    https://youtu.be/1IcvjD4VVjY?t... [youtu.be]

    Now... if you are a country that has the ability to build kill bots... and you might be on the firing line... do you or do you not want to use killer robots to kill your enemies?

    You have to put your brain into war mode to understand the question.

    My vote... is yes.
    https://www.youtube.com/watch?... [youtube.com]

    When I go to war... I go to WAR.

  • Once autonomous weapons become commonplace, the 2nd Amendment will guarantee that any US Citizen should be able to own and (not) operate one for fun and self-defense. It will, if I read the NRA talking points correctly, make for the absolute safest place in the entire universe.

    • by OhPlz ( 168413 )

      If you knew anything about the NRA you'd know that they only promote responsible use of firearms. Yes, you can neckbeard that that's an oxymoron, but at least half the voting population here still believes there are very important reasons why the right to bear arms needs to exist.

      • Actually, no. They lobby to eliminate restrictions on the acquisition of firearms of pretty much all types by nearly everyone. Their justification is typically that more guns = a safer society, principally because criminals will fear to prey on people who might have a weapon. It's a powerful message that speaks to the fear of every human that they could be preyed upon and/or find themselves helpless in a violent situation. It's been extended to protecting others as part of patriotism.

        Promoting the responsible

  • We will build the best defense possible!
  • When will the likes of Zuker, Musk, Woz, Hawking, and the rest of their insane clown posse, et al., stop blowing smoke in our feces? They can't invent the "3 Laws of Robotics." As little Markey stated, "they are too old." Maybe we should let a child chess champ do it?
