Killer Robots Would Be 'Dangerously Destabilizing' Force in the World, Tech Leaders Warn (washingtonpost.com) 163

Thousands of artificial intelligence experts are calling on governments to take preemptive action before it's too late. The list is extensive and includes some of the most influential names in the overlapping worlds of technology, science and academia. From a report: Among them are billionaire inventor and OpenAI founder Elon Musk, Skype co-founder Jaan Tallinn, artificial intelligence researcher Stuart Russell, as well as the three founders of Google DeepMind -- the company's premier machine learning research group. In total, more than 160 organizations and 2,460 individuals from 90 countries promised this week to not participate in or support the development and use of lethal autonomous weapons. The pledge says artificial intelligence is expected to play an increasing role in military systems and calls upon governments and politicians to introduce laws regulating such weapons "to create a future with strong international norms."

"Thousands of AI researchers agree that by removing the risk, attributability, and difficulty of taking human lives, lethal autonomous weapons could become powerful instruments of violence and oppression, especially when linked to surveillance and data systems," the pledge says. "Moreover, lethal autonomous weapons have characteristics quite different from nuclear, chemical and biological weapons, and the unilateral actions of a single group could too easily spark an arms race that the international community lacks the technical tools and global governance systems to manage," the pledge adds.

  • by xxxJonBoyxxx ( 565205 ) on Thursday July 19, 2018 @02:40PM (#56976144)
    >> introduce laws...with strong international norms

    Why not structure it like the international mine treaty? You know, the one that pretty much everyone except the USA has agreed to.

    (rolls eyes at idea that Russians/Chinese won't use AI-augmented weapons to their full advantage)
    • (rolls eyes at idea that Russians/Chinese won't use AI-augmented weapons to their full advantage)

      They won't have to.

    • It's different. Someone in control of the area laid the mine. You can mostly figure out who laid the mines and I can't easily surprise you with a mine field and let someone else take the blame. A very small autonomous drone equipped with a nerve agent could easily assassinate any world leader and tracing the attack back with 100% certainty to a state actor would be almost impossible.
    • by Anonymous Coward

      The Korean DMZ is the reason why the US isn't on board with the mine treaty; other than that, the US is fully compliant. Civilians aren't going to accidentally wander through it. Quite frankly, the mines are nowhere near the most dangerous thing in the area.

    • Why not structure it like the international mine treaty? You know, the one that pretty much everyone except the USA has agreed to.

      You know that you're full of shit when your "almost everyone" doesn't include China, India, Pakistan, Russia, and both Koreas, amongst a list of 30+ countries. You know, only about half of the planet.

      You know you're doubly full of shit when out of that list of 30 countries the only one you specifically name is also the only one which has publicly stated that they will abide by the terms of the treaty even though they will not officially ratify it.

      Amazingly enough, despite all that, you're still not as full

  • Do these dummies have any idea that the words they're saying are not going to discourage anybody making decisions, but simply encourage them to increase investments?!

    If that is the goal, they could also probably do it honestly by talking about the benefits.

    If you use a word like "disrupt" or "destabilizing" to an average person on the street, they might think the word sounds bad, but people who dream of ruling the world don't feel the same way about these words.

    • Do these dummies have any idea that the words they're saying are not going to discourage anybody making decisions, but simply encourage them to increase investments?!

      If that is the goal, they could also probably do it honestly by talking about the benefits.

      If you use a word like "disrupt" or "destabilizing" to an average person on the street, they might think the word sounds bad, but people who dream of ruling the world don't feel the same way about these words.

      Just a snippet of my stream of thought, here. Just some random thoughts, not saying I believe or disbelieve anything, simply some fanciful "what if?".

      Could Biblical descriptions of Armageddon and the protagonists actually be foretelling the crescendo of an apocalyptic war between the two most powerful super-AIs rather than nation-states as has traditionally been thought? Gog and Magog as warring super-AIs? A warning from the distant past, possibly because it's happened before?

      Discuss! :)

      Strat

      • Darmok and Jalad, at Tanagra.

        Obviously.

        • Darmok and Jalad, at Tanagra.

          This is what happens when you let English majors try to write science fiction. A warp-capable civilization whose language is based on folklore and metaphor? Absurd to the point of stupidity. One of the more pathetic attempts to justify the writer's own feeble existence. You can't do engineering with memes.

          (I now wait for Slashdot to attempt it.)

          Also, obligatory XKCD [xkcd.com].

          • Yeah, your comment is what happens when you let computer nerds talk about storytelling, though.

            So they could easily have done worse.

          • I'm with you. That was one of the weakest scripts in Star Trek history. If not for a spectacular performance by Patrick Stewart, that's how it would be remembered.

      • Alien astronaut theorists say "yes."
      • by cas2000 ( 148703 )

        No. Biblical descriptions of Armageddon are bullshit stories by some guy who ate too many magic mushrooms on Patmos.

        Making references to Armageddon is the crackpot's argument from authority. Doesn't matter how insane the Bible is, there's enough other cretins who value it that it bolsters many lunatic arguments.

  • what about a no-nuke pact? but then Skynet can just wipe out
    USSR
    China

  • by Ol Olsoc ( 1175323 ) on Thursday July 19, 2018 @02:47PM (#56976194)
    "Thousands of artificial intelligence experts are calling on governments to take preemptive action before it's too late."

    It is too late.

    • HAL-lelujah and pass the robot oil.
    • You're not wrong. Perhaps in first-world countries we'll say 'no' to this, but countries like China or Russia? They won't refrain from it, they'll mass-produce them, and considering the shitty excuse for 'AI' everyone keeps trotting out, they'll all screw up and run rampant as likely as not. Guess we'll have to develop strong EMP weapons against them.
      • You're not wrong. Perhaps in first-world countries we'll say 'no' to this, but countries like China or Russia? They won't refrain from it, they'll mass-produce them, and considering the shitty excuse for 'AI' everyone keeps trotting out, they'll all screw up and run rampant as likely as not. Guess we'll have to develop strong EMP weapons against them.

        Yes - something will need to be done. I can imagine civilian deaths will be higher than WW2.

        • Worse, it'll completely insulate people from the horror that is war -- all except the victims, of course. When you don't even have a drone pilot pushing the button to drop a bomb, who is there to have any sort of conscience? My main complaint about the poor excuse for 'AI' everyone keeps trotting out to do this-or-that is that it's incapable of actual thought, and everything is just an object to it; it doesn't know the difference between a living being (like a human) and a rock or fencepost -- therefore 'co
          • Worse, it'll completely insulate people from the horror that is war -- all except the victims, of course. When you don't even have a drone pilot pushing the button to drop a bomb, who is there to have any sort of conscience

            Machines with no conscience, ordered by leaders who possibly have no conscience. A nasty-ass recipe. Coupled with, say, a country that is staring down its own defeat. Oh yeah - they will be used. And lest any readers think I want these things - I do not. I just don't see any way they won't be built.

            And used.

              Coupled with, say, a country that is staring down its own defeat.

              In more than one science fiction story I've read or watched (and I suspect, you as well) the 'doomsday weapon' gets used out of spite. Think Saberhagen's planet-killing, life-destroying Berserkers. Or everyone's more recent favorite, Skynet/Terminators. Or The Matrix.

              There are reasons I have well beyond the fascinating idea of having a conversation with a truly sentient, self-aware, fully cognitive and reasoning machine with a real personality: The current poor excuse for 'AI' they keep trotting out ca

            • Machines with no conscience, ordered by leaders who possibly have no conscience.

              More likely ordered by leaders advised by AI on how to most efficiently achieve their desired outcome. Maybe even with a choice of civilian casualty levels, and the difference in outcome for each one.

              "Humm, do I pick 20% casualties and it ends in three weeks, or do I go for 50% and end it in one week. Well, two weeks from now I'm supposed to be going on that trip to Hawaii, so I guess I'll go for 50% and wrap this up faster."

              "Siri, please order the 50% casualty strategy."

          • Worse, it'll completely insulate people from the horror that is war -- all except the victims, of course.

            I keep seeing people say this as if it were some horrible thing to be avoided at all costs but, other than their gut feeling that it's "bad" and "will lead to more wars", I've seen nobody offer any actual evidence that it's likely to lead to a negative outcome.

            The fact of the matter is that we have been progressively "insulating people from the horrors of war" for centuries now, yet the number of war deaths per capita has steadily declined. This isn't inherently causative, of course, but it certainly

            • The fact of the matter is that we have been progressively "insulating people from the horrors of war" for centuries now

              Are you somehow saying this is a GOOD thing? Because if you are then you're a horribly broken excuse for a human being. We should hit everyone full-on in the face with what war really is. No more desensitization!

              • Are you somehow saying this is a GOOD thing? Because if you are then you're a horribly broken excuse for a human being. We should hit everyone full-on in the face with what war really is. No more desensitization!

                That's a rather psychotic line of reasoning. I've argued that isolating people from the horrors of war seems to have either led to a decrease in the lethality of war, or have at least been correlated with it. In response to that, you've basically said "oh yeah? Well I WANT people to be scarred by war!"

                Why? Because it will make you feel better about your preexisting beliefs?

                That's more than a little nuts, and insanely inhumane.

          • Comment removed based on user account deletion
      • My essay: https://www.pdfernhout.net/rec... [pdfernhout.net]
        "Military robots like drones are ironic because they are created essentially to force humans to work like robots in an industrialized social order. Why not just create industrial robots to do the work instead?
        Nuclear weapons are ironic because they are about using space age systems to fight over oil and land. Why not just use advanced materials as found in nuclear missiles to make renewable energy sources (like windmills or solar panels

    • Or we could just not do it. It really is that simple. Of course, let's ask ourselves why we want killer robots. We're pretty much past the stage of "defense". Nukes & MAD make that pointless. And hell, so did globalization. You don't shit in your own backyard and for the same reason we're not going to go off and start blowing each other away. The damage done would outweigh the benefit. There'll be brush fires here and there but big scale wars are a thing of the past, if only because the rich won't let
      • Or we could just not do it. It really is that simple. Of course, let's ask ourselves why we want killer robots. We're pretty much past the stage of "defense". Nukes & MAD make that pointless. And hell, so did globalization. You don't shit in your own backyard and for the same reason we're not going to go off and start blowing each other away. The damage done would outweigh the benefit. There'll be brush fires here and there but big scale wars are a thing of the past, if only because the rich won't let us wreck their stuff anymore.

        Oh, I wish. But I surely do not trust leadership. And there is at least one - but no doubt a few more - who are itching to use nukes. Crazy thing is there are a fair number of people who are actively wishing for a world-ending conflict.

        https://www.amazon.com/Have-Ni... [amazon.com]

        https://www.livescience.com/14... [livescience.com]

        http://www.signs-of-end-times.... [signs-of-end-times.com] Dr. Thomas B. Slater, Professor of New Testament at Mercer University On end times date:

        “The end of times is something that we all expect and hope for and look forw

      • That leaves the other reason for killer robots: so a small group of people can police the impoverished population without risking them turning to a charismatic strongman.

        You missed the big reason: So a faceless group of people can sow turmoil and remain anonymous. Not being able to set up your industry in the African country you've got your eye on? Whoops, leader just got liquefied by a kill-bot. That 3rd world dictator stirring shit up again? His wife just had a terrible accident. Wife going to divorce you? Damn, she accidentally got her skull cracked by a drone that lost power and fell out of the sky! Tragic!

        It's malware all over again. This time, however, it impacts hard

  • And Boston Dynamics happily carries on creating robots that will kill us all. They don't care about treaties or anything. They have way too many billions invested to stop.
  • by olsmeister ( 1488789 ) on Thursday July 19, 2018 @02:53PM (#56976226)
    Is a good robot with a gun. If you take the guns away from the good robots, then only the bad robots will have the guns.
    • by Anonymous Coward

      What if the gun is the robot? Why does the robot have to have the gun? Why not the other way around?

  • These folks have at least figured out that if it isn't humans directly killing other humans, there isn't much point to warfare.

    Humans have an innate need to kill other humans. Call some group the "other" and it's off to war. We've done it since we came down from the trees. We've never stopped doing it. We will continue to do it. It will probably be the source of our extinction, as our lizard brain uses our higher thought processes and the technology they enable to send us out in a blaze of

    • by Anonymous Coward

      Have you killed other humans? I imagine the vast majority of all humans to ever live never killed anyone.

      Seems to me that's a pretty broad brush you are painting with, and all it does is serve to normalize the behaviour of those that benefit from it.

      • Have you killed other humans?

        Certainly not directly. A fair bit of my work has been military oriented, so it depends on one's definition.

        I imagine the vast majority of all humans to ever live never killed anyone.

        You are analyzing this incorrectly. Direct killing is not the main metric. Support of it is. Most females are not terribly interested in making war. Most old men are not interested - the exception being old politicians that seem to look for excuses to send young men off as cannon fodder. Regardless, it isn't the specific numbers of people doing the actual killing - it's the acceptance of it, and the fact

  • by u19925 ( 613350 ) on Thursday July 19, 2018 @03:00PM (#56976256)

    The biggest autonomous "robots" deployment is going to be fully autonomous cars. Many pundits are predicting that autonomous cars are the future and that once you have autonomous cars, owning a car would be a lot more expensive than ride-sharing services. They are predicting that US car ownership could fall by 50-80%, with ride-sharing services running millions of autonomous cars. Imagine if someone could break into a large ride-sharing service, start all the autonomous cars, and direct them at pedestrians... It could be far worse than 9/11.

    • death race!

      • by BranMan ( 29917 )

        And if *I* were the one to do the hacking..... I'd ... get ALL the points (!!!)

        Cue the evil cackling - I think we're on to something here!

      Imagine if someone could break into a large ride-sharing service, start all the autonomous cars, and direct them at pedestrians... It could be far worse than 9/11.

      In the US anywhere outside of New York City it's not a problem at all. First there would have to be pedestrians for them to be in any danger.

      But I guess 9/11 happened in New York City, so there's that.

    • I've been saying this all along. We can't even seem to manage to secure our basic data systems, personal computers, and smartphones, do they really think they'll be able to keep anyone from hacking SDCs and taking remote control of them? Especially considering that, in the SDC Future Utopia they're trying to sell us, remote control capability will be baked right into the cars' software, so that law enforcement can take control any time they want, and the occupants won't have any say in it, they'll be locked
    • When people say autonomous car I think of all the times I've seen GPS glitch. If this is used in autonomous cars they could be turning early or late and hitting sidewalks, buildings, and pedestrians all the time.

    • Many pundits are predicting that autonomous cars are the future and once you have autonomous cars, owning a car would be lot more expensive than ride sharing services.

      Many pundits are also stupid.

    • there's plenty of countermeasures if we just bother with them, which post-9/11 we will. 9/11 wasn't an inside job, but that doesn't mean we didn't let it happen. The various agencies knew there were terrorists preparing for 9/11 and let them swim. The only question is did they let them swim hoping to catch bigger fish or so they could get a rise out of the public. If it was the latter, well, they got their rise all right. Post-9/11 we threw away our rights and acted like "everything changed" when nothing chang
    • Comment removed based on user account deletion
    • Google "roomba dog shit". I guarantee people fear that far more than an autonomous car.

  • Comment removed based on user account deletion
    • I can fight killer robots, but I cannot fight a nuke. Just saying.

      And a killer robot with a nuke? Just saying.

    • I look forward to fighting something that can fly and see in the dark with superhuman agility and accuracy.
      Fuck yeah!!
      Since these can be more localized, it would be really easy to cover up their use. Also the local area and resources are still usable. So I can see govts more willing to use these than nukes.
  • by Zorro ( 15797 ) on Thursday July 19, 2018 @03:14PM (#56976356)

    Create killer Robots.

    Putin pinkie promised!

  • Right now, today,
    the most 'dangerously destabilizing force in the world' is
    American Democracy.
    And the technology that manipulates it.

    Not killer robots.
    Or genetically modified dinosaurs.
    Or sharks with friggin lasers.

    It's the tech that makes US voters stupid.

  • Ultimately futile (Score:4, Insightful)

    by OpenSourced ( 323149 ) on Thursday July 19, 2018 @03:25PM (#56976444) Journal

    You could rewrite that 70 years in the past, substituting "Nuclear weapons" for "Killer Robots", make the same arguments, and even see some of those arguments proven right in hindsight.

    But technology will mature when it's ready to do so, and no amount of hand-wringing will change that. And anyway, if they want to choose a dangerous tech to get all anxious about, hands down it should be "genetically engineered viruses" rather than "killer robots". At least with a robot you can shoot back.

  • I mean random hackable little killers wandering the planet? I LOVE Black Mirror
  • I'm not even kidding or trying to be funny. We, as a species, should go back to warfare using only hand-to-hand weapons like swords, daggers, pikes, and bows and arrows. No more guns, explosives, bombs (conventional or nuclear), planes, weaponized ships of war, and so on. You can have ships that carry troops. It'll be more civilized.

    But Rick, hand-to-hand combat was brutal and bloody and and and.. so uncivilized! Gaping bloody wounds, severed limbs and heads, OMFG THE HORROR!!!

    EXACTLY. Let's make war so horrifying, so extreme, so UGLY, that no one will be able to stand it. Since there won't be modern things like cannons or other explosive projectil

  • War is war.. you can put all the rules on it that you like, but when the chips are down war has no rules.. period. Someone WILL do this.. and the technology will get out. The world changes and we must adapt to it.
    • War. War never changes.
      The Romans waged war to gather slaves and wealth. Spain built an empire from its lust for gold and territory. Hitler shaped a battered Germany into an economic superpower.
      But war never changes.

  • Captain Obvious spoke true words once again.

  • Doom and gloom (Score:4, Insightful)

    by taustin ( 171655 ) on Thursday July 19, 2018 @03:36PM (#56976512) Homepage Journal

    Warnings about military and political catastrophe by people who know nothing about the military, or politics, and automatically assume that anyone who does must be evil and stupid.

    How could we possibly survive without the infinite wisdom of Silicon Valley! Just ask them.

    In other words, it's a day that ends in "y."

    This is a complete non-story. There's no content. They're not wrong, so much, as they have nothing to say at all.

  • by rsilvergun ( 571051 ) on Thursday July 19, 2018 @03:44PM (#56976586)
    FTFY.

    Seriously, the rich are already thinking about a post-capitalist society where the working class doesn't factor in. Automated weapons are the way to go. You keep a few engineers on staff to monitor them and pay the engineers well. Unlike the captains of your private military, they lack the ambition and charisma to overthrow you.

    Meanwhile the rest of us will just be screwed. Think living like the American Indians on the res except without the casinos. If you want to prevent that now's the time. Start demanding a decent quality of life for everyone. Establish it as a basic human right. Or cast your eyes to the reservations circa the 1900s because that's your future. Me? I'm 40 and come from a short lived family. I'll be long gone.
    • The NSA has a hard time finding recruits... Because too many of the people with talent are too weird for them, and/or on drugs. Consequently, the hackers are more talented than they are. The ultra-wealthy can no more interface with those people than the NSA can, and their weapons will simply be hacked and used against them. Killer humans barely care who they kill, killer robots do not care even that much.

      • it'll be done by private individuals. And it's not about humans caring who they kill. In fact, that makes humans worse. One charismatic leader can turn your entire security force against you. Happened to the Tsars. Happened to the Chinese. Happens all the damn time. So much so we have a word for it (coup).

        The machines might get taken over, but not if you pay your engineers well. And engineers aren't a charismatic lot. They'll collect their pay for the killer robots without ever bothering to overthrow the
    • by mentil ( 1748130 )

      I'm planning on being a pet for our killbot overlords. I've had a fuzzy collar fitted and everything!

  • grade arguments.

    ....a circlejerk of celebrity scientists someone else might say.

    I'd suggest to all of them that they'd better fix their products before dreaming up some sci-fi movie concept from the 80s.

  • We know it is going to happen so just build IG-88 already
  • There are already sentry guns. We can already program drones to drop payloads autonomously. We already have killer robots, and we do not even need AI to make them.

    • by mjwx ( 966435 )

      There are already sentry guns. We can already program drones to drop payloads autonomously. We already have killer robots, and we do not even need AI to make them.

      This.

      I do not fear artificial intelligence... I fear the natural idiots who are giving them orders.

  • by RhettLivingston ( 544140 ) on Thursday July 19, 2018 @05:58PM (#56977238) Journal
    Very few devices today are truly free of open source software. The open source community should start rolling anti-weaponization provisions into all open source licenses. Problem solved.
    • by zenbi ( 3530707 )

      Except that violates a fundamental reason behind the philosophy.

      Freedom #0 of the Four Freedoms of the Free Software Definition [gnu.org]:

      The freedom to run the program as you wish, for any purpose (freedom 0).

      Item #6 of the Open Source Definition [opensource.org]:

      6. No Discrimination Against Fields of Endeavor

      The license must not restrict anyone from making use of the program in a specific field of endeavor. For example, it may not restrict the program from being used in a business, or from being used for genetic research.

  • Unless AI robots get vastly more flexible rapidly, seems to me that large armies of nasty humans are still a much bigger threat (albeit one we've lived with since time immemorial).

    But I suppose in the end, folks think really clever, self improving robots will win the day: https://en.wikipedia.org/wiki/... [wikipedia.org]

    Years ago Bill Joy warned everyone about self replication ("grey goo"). Self replicating *and* self improving seem like much worse ideas than simply arming them.

    Use case; Consider some sort of waste reposit

    • "Unless AI robots get vastly more flexible rapidly, seems to me that large armies of nasty humans are still a much bigger threat (albeit one we've lived with since time immemorial)."

      Robots don't have to live up to our abilities, because they don't have to cope with our particular drawbacks. They can be smaller and more numerous, for example. They don't have to be individually more capable.
