What Will Happen When Killer Robots Get Hijacked? (marketwatch.com) 157

"Imagine an artificial-intelligence-driven military drone capable of autonomously patrolling the perimeter of a country or region and deciding who lives and who dies, without a human operator. Now do the same with tanks, helicopters and biped/quadruped robots." A United Nations conference recently decided not to ban these weapons systems outright, but to revisit the topic in November.

So a MarketWatch columnist looked at how these weapons systems could go bad -- and argues the risks are greater than simply fooling the AI into malfunctioning. What about hijacking...? In warfare, AI units can function autonomously, but in the end they need a way to communicate with one another and to transfer data to a command center. This makes them vulnerable to hacking and hijacking. What would happen if one of these drones or robots was hijacked by an opposing faction and started firing on civilians? A hacker would laugh. Why? Because he wouldn't hijack just one. He would design a self-propagating virus that would spread throughout the AI network and infect all units in the vicinity, as well as those communicating with them. In a split second, an entire squad of lethal autonomous weapons systems would be under enemy control... Every machine can be overridden, tricked, hijacked and manipulated with an efficiency that's unheard of in the realm of human-operated traditional weaponry.

However, the U.S. government remains oblivious. DARPA (the Defense Advanced Research Projects Agency) has already announced a $2 billion development campaign for the next wave of technologically advanced AI (dubbed "AI Next"). One of the goals is to have the machines "acquire human-like communication and reasoning capabilities, with the ability to recognize new situations and environments and adapt to them." I may be overreaching here, but the UN meeting on one end and this announcement on the other make me think that the U.S. government isn't just pro-robotic -- it may already have a lethal autonomous weapons ace up its sleeve.

The article ends with a question: What do you think about killer robots replacing human combatants?

And what would happen if killer robots got hijacked?
Comments Filter:
  • by Krishnoid ( 984597 ) on Saturday September 29, 2018 @11:54PM (#57397942) Journal

    A hacker who identifies with Thanos' agenda could start 'solving' the overpopulation problem by themselves.

  • by Anonymous Coward

    So any reasonable person would say that we should be able to design and implement inexpensive and ubiquitous secure systems before trying our hand at weapons? That we shouldn't run the Wintel ecosystem as-is on those killer bots, and shouldn't always prefer the feature creep of insufficiently engineered, overly complex features over security?

    • Security is so far from a priority for most companies that it's not even an afterthought. Talk to administrators and you'll hear people say, "we're just not a big enough target for hackers." They will keep that attitude even when the company is a big enough target. No one cares if their system gets hacked, even credit agencies. They just put a vague fix in and move on. No questions about whether their problem is more systemic, and how to avoid those problems in the future.
  • Or cars (Score:4, Insightful)

    by Anonymous Coward on Saturday September 29, 2018 @11:58PM (#57397960)

    You don't need to steal weapons from the military to get killer robots. You just need to botnet cars. They already kill people without help.

    Let me driver-assist you to safety.

  • skynet will just get too smart and nuke us all

  • let's play global thermonuclear war!

    • We should designate an official war zone over an uninhabited part of the Pacific Ocean where million-dollar robots can fight each other to the death. At least that way, instead of having an expensive chess match where real people die, we just have the expensive chess match.

      • by Anonymous Coward

        We do that with the economic model already, it's giant economic robots tackling each other.

      • by NicBenjamin ( 2124018 ) on Sunday September 30, 2018 @02:01AM (#57398196)

        Like most ideas about revolutionizing warfare, this one is very old. Many, many cultures would have champions fight to the death so the army did not have to.

        For that to work, a couple of things have to be true:
        1) Both sides have to have a roughly equal chance of success. That is, the champions have to have comparable combat power. Back then, nobody was going to agree to wager their war on a naked guy with a pointy stick vs. a knight in full plate. With robots, many countries won't have a robot capable of going toe-to-toe against us.
        2) Both sides have to want whatever they're fighting over enough to raise an army to fight for it, but not so much that they'll actually die for it. If they're gonna die for it regardless of the outcome of the fight, the fight is pointless. If the result of losing a robo-war is that the President gets executed, his top guys imprisoned, everyone else gets fired, and your worst enemies get to run the country, then when the champion loses you fight anyway.
        3) Both sides have to trust that the other side will abide by the deal. You're not spending $1 billion on a robo-champion if you think those bastards will fight even if their robot loses.

        Just go through recent US-involved wars in your head. For how many were all three things true of both sides?

        The Nazis and Japanese could have made some pretty cool robots, but we'd have fought the actual war regardless of who won the robo-battle. The Koreans, Vietnamese, Iraqis, Taliban wouldn't have robots anywhere near our league, so they'll lose the war and then fight anyway.

        • The Nazis and Japanese could have made some pretty cool robots, but we'd have fought the actual war regardless of who won the robo-battle. The Koreans, Vietnamese, Iraqis, Taliban wouldn't have robots anywhere near our league, so they'll lose the war and then fight anyway.

          This touches upon the final question of the article.

          A war of robot versus robot is pointless other than as entertainment.

          War is specifically designed to kill humans. Humans, by virtue of their genetic based hyper aggressiveness, have an innate need to define an "other", then to do what they can to end the lives of the other.

          There are often reasons - resources, different Gods, but not to worry - the reason is just a casus belli, the excuse for killing the other. One will always be found.

          The only thin

      • I have this as item #3 on my list of "Top Geek Myths".

        Counterargument: People don't submit to perceived tyranny because their material stuff got destroyed; rather, the opposite.

        Also: "What robot soldiers could do is just as scary, though: Make outright colonialism a practical option again." War Nerd, 2014 [pando.com].

    • I'll just leave this here.

      https://youtu.be/wv6pfQTl-d4 [youtu.be]

      "My boy!"

      Strat

  • Wrong question (Score:5, Interesting)

    by rsilvergun ( 571051 ) on Sunday September 30, 2018 @12:03AM (#57397978)
    the question we should be asking ourselves is: what will happen when the ruling class doesn't need the working class to keep the military class in check? Right now we've got a bit of a balance going on. The Army protects the ruling class, but the working class keeps an eye on the army, and the prospect of a decent civilian life after some time in the army gives soldiers something to do besides run a junta. Because of that, there's a floor on how badly the ruling class can treat the army and the working class.

    All that goes out the window when they've got a robot army. The robot army will never betray them. Sure, there's some engineers keeping it running, but they're nerds and typically lack the drive and charisma to overthrow the ruling class. Those kinds of coups are pulled off by charismatic generalissimos. So you're gonna have the ruling class, a small, well-paid merchant class to keep the killer robots going, and everybody else. That's you and me, btw. And the ruling class won't need us to buy their crap to be rich either. They'll own everything and have factories to build it. I suppose there'll be a few positions for their doctors and sex slaves. The rest of us get abandoned, sorta like how we ignore starving people in Africa.
    • Do they have more of an incentive to abandon people, and look like jackasses, or provide for them, with their abundance of resources, and be seen as heroes? If the premise is that wage slaves are now obsolete, then the ruling class no longer needs to keep them poor. That was only necessary when the only way to get someone to do the less desirable jobs was to keep them poor enough that they'd otherwise starve. In this scenario, there are robots to fill this gap, so the ruling class may as well keep people
      • because if nobody is poor then nobody is _rich_. Being wealthy isn't just about material wealth. It's about the political power that comes from deciding how resources are distributed and, more often than not, who lives and who dies.
        • In the scenario I described, there was a 'ruling class' with political power who controlled how resources were distributed, and could potentially control who lives and who dies. My question is why would they use that power in the way you're describing? What do you think their goals are exactly, and how does 'abandoning' people help toward these goals? Do you not think you have more power over people if they're dependent on you than if you cut yourself off from them?
          • and I don't mean materially. I mean being treated like a God. To have the entirety of human civilization bent not just to your will, but to exist for the sole purpose of improving your personal quality of life.

            It's a level of narcissism and power that's hard to really imagine. It's like trying to get a grasp on a googol (the number, not the company). The human brain isn't well equipped for it.

            And they're not going to cut themselves off from everyone, just most everyone. Also, they're not intentional
          • Sex slaves. Monuments built in your name and image. Worship. Scientific advancements to improve your life and longevity. Good 'ole sadism, or maybe just the opposite: feeling like a white knight when you swoop in to save the day with food donations.

            Power isn't a means to an end, it's an end in itself. If you're not smart enough to immerse yourself in the wonder of the universe like Einstein did and you don't need to work for a living there's not much else left.
    • Re: (Score:2, Insightful)

      by Anonymous Coward

      This is a great point - it's unclear to me what would stop them. By now we know a few things about the bulk of the owners of capital:
      1. They don't believe in Democracy
      2. They are motivated, and don't allow pesky things like laws to stand in their way
      3. They have the patience to work across generations

      Think of Trump and his children. These people want to bring about a new feudalism and bring us all back to Game of Thrones, and the Dragons are the killer robots. Human nature doesn't change.

      We need to outla

  • CARNAGE... Lots and lots of carnage. It's so easy to kill people. :)
  • There is no new risk here. Virtually every modern weapon system is robotic and many are mostly autonomous with heavy usage of data channels for control. If hackers could take them over, we'd have long ago suffered these imagined catastrophes.

    • by mentil ( 1748130 )

      But... but... all it takes is 2 cans of Mountain Dew and 30 seconds of rapid-fire typing and then you say "I'm in!" and now your bar tab is in the negative and Terminatrices warm your bed at night.

    • Comment removed based on user account deletion
  • <tars>Nothing Good</tars>

  • There's nothing that can be done, and I'll be dead by then, so have fun with this.
  • Robo-Coup (Score:5, Insightful)

    by mentil ( 1748130 ) on Sunday September 30, 2018 @12:45AM (#57398060)

    And what would happen if killer robots got hijacked?

    When killer humans (i.e. a military) get 'hijacked', it's called a coup.
    It's unlikely the autonomous weapons would be hacked to fire on civilians. It'd be MUCH more effective to turn them into robotic sleeper agents that target military officials/visiting politicians (thanks to facial recognition). OTOH, firing on civilians would ensure a) civilians demand their removal, ensuring no further hacking can take place; b) their hacking is discovered immediately, ensuring the vulnerability is quickly patched, and c) an opportunity to take out more-important targets is squandered.

    • by umghhh ( 965931 )

      Is it all that simple?
      As for the idea that people (the targets, etc.) can have control - that is a fairy tale. This does not work properly in the majority of so-called democracies in the West, where direct democracy has been replaced (for good reasons) by the representative type. The result in the US, for instance, is that the majority probably knows that the war on drugs is senseless and rather counterproductive. It goes on, however, even if in some places you can now sell and buy ganja without fearing immediate legal consequences. There are other

    • by Kjella ( 173770 )

      It'd be MUCH more effective to turn them into robotic sleeper agents that target military officials/visiting politicians

      How do you figure that? The first time you use it there'll probably be eye witnesses, security cameras, ballistics etc. pinning the robot as the guilty one. After that they will take the model out of service and scour through everything until they find out how you did it.

      Compare that to a thousand units going berserk all at once at many different locations. Most of them will probably be stationed at military bases, everyone there is a target as either military service personnel or NGOs. Lots of expensive eq

      • by mentil ( 1748130 )

        All good points. OTOH, if you're able to put them going berserk on a timer, why not time it to coincide with when your intelligence says a VIP is going to be present?
        Bonus points if the killbot has IoT flaws that allow for backdoor access to the military network.

    • by pikine ( 771084 )

      At least when a human military is the target of a coup, there will be some resistance. Robots don't know right from wrong, so creating an army of killer robots is just a roundabout way of voluntarily giving away all your military power to someone else. The money the US spends on AI military now is the money the Russians will save when they turn the robots back against the US.

      They might as well save everyone the hassle and kiss up to the Russian overlords now.

  • by Todd Knarr ( 15451 ) on Sunday September 30, 2018 @12:47AM (#57398062) Homepage

    Replacing human combatants with robots is a bad policy decision in general. The reason is simple: one of the main things restraining governments from waging war is the cost in terms of their own citizens killed in the fighting, and removing humans from the fighting removes that restraint. We already have enough problems with governments that don't care about their own citizens; we don't need to add every other government to that list until we figure out another way to discourage them from starting wars any time they don't get their way.

    On top of that, the possibility of robots being subverted by attackers isn't in any way overstated, nor are the reactions to that possibility overreactions. Look at our computer networks today and try to convince me that we can somehow make botnets and malware vanish overnight, then picture a world where "distributed denial of service attack" translates to "security guards shooting any human who enters the shopping mall".

  • People keep talking about this like it's some futuristic threat, yet we have been living with this threat (and managing it) for a while.
    We have had weaponized drones and other remotely controlled platforms for many years. I am aware of one incident where the Iranians managed to hijack a US drone and force it to land. But this was presumably done with GPS spoofing and did not give them sufficient control to attack anything.

    Air to air missile systems have fully autonomous modes of operation, and we trust them not to sho

  • "Imagine an artificial-intelligence-driven military drone capable of autonomously patrolling the perimeter of a country or region and deciding who lives and who dies, without a human operator. Now do the same with tanks, helicopters and biped/quadruped robots." A United Nations conference recently decided not to ban these weapons systems outright

    Forgetting for a moment the logistics of actually enforcing such a ban, what would such a ban actually entail?

    Presumably the ban would not apply to the mere act of

  • wrong (Score:5, Insightful)

    by Tom ( 822 ) on Sunday September 30, 2018 @01:35AM (#57398146) Homepage Journal

    Every machine can be overridden, tricked, hijacked and manipulated with an efficiency that's unheard of in the realm of human-operated traditional weaponry.

    It isn't really all that difficult to buy, bribe, threaten or convince a human.

    The difference is scale. Humans are polymorphic, so they are not exact copies of each other and the identical exploit will work on one, but not others. So you need to customize your exploit for each of them, which makes mass hacks difficult. That is the reason social engineering works, but is rarely used large-scale.

    • Every machine can be overridden, tricked, hijacked and manipulated with an efficiency that's unheard of in the realm of human-operated traditional weaponry.

      It isn't really all that difficult to buy, bribe, threaten or convince a human.

      Yes, a human. OP said every machine. Learn to subvert one, and you've got them all. Efficiency. That's the problem.

  • Who really thinks that any one country has any say over whether such weapons will be developed? BTW: how is this different from worrying about the same kind of scenarios involving, say, a bunch of F-35s? It isn't really, I would say, as those killer robots wouldn't get far either without ample human arming and maintenance.

  • by Max_W ( 812974 ) on Sunday September 30, 2018 @02:15AM (#57398210)
    What if an SUV with 500 horsepower that weighs three tons gets hijacked? One needs to study for years to pilot a UAV, but everybody can drive.

    In my opinion there should be a legal limit to personal car power and weight; say 100 hp and 1500 kg.

    And it is not "maybe", it is happening. About one and a half million(!) people are killed on roads each year by cars globally, and many times more are badly injured. These are WW3-scale figures. These accidents are related to drinking, suicide, mental illness, terrorism, drugs, etc.

    And practically nothing is done about it. Even more powerful and massive cars hit the market. So, please, stop blaming drones. Let us first learn how to handle the real problem at hand.
  • Basically exactly fucking that.

  • Comment removed based on user account deletion
  • What about cars, trucks, trains, airplanes, medical devices, home automation, mobile phones, smart TVs ... you name it?

  • by Hognoxious ( 631665 ) on Sunday September 30, 2018 @03:15AM (#57398294) Homepage Journal

    They will kill something. It'll just be a different thing than if they hadn't been hijacked.

  • There is one simple general rule I've learned about life in general over 70+ years. It reads, "If it can be done, it will be done." It's one of Joanne's laws, I guess. I have many I've derived over the years. This one is the most generally applicable. If there is a market of people willing to pay for something and any hole of any size through which to slip product, the market will be fulfilled. Drugs are the most obvious such. Guns are another obvious market filled because there was a way to fill it, a way

  • Here in Germany, our population is only about 80 million.
    Regrettably, as we have shown in the past, this number is sufficient to achieve world, or even regional domination for the long term.
    It seems to me these robots you speak of are the answer to our past failings.

    Sign up for Duolingo my friends and start learning German now.

  • Vehicles can be used as lethal weapons. Imagine all those self-driving vehicles which suddenly flip one "if" statement, and rather than avoiding pedestrians they aim to hit them. Worse yet, imagine a more advanced hack which performs face recognition and only targets specific pedestrians, or even a group of pedestrians. Organized car attacks could be used to attack infrastructure too. 100 million cars suddenly used as weapons might present more danger to the people than a few thousand border patrol robo

  • Comment removed based on user account deletion
  • Locking down a platform to respond only to authorized access is pretty easy. Microsoft is, ironically, a good example: thanks to cryptographic signing, its mass-update feature has almost never been compromised. The same robust authentication mechanisms are almost certainly in place for military hardware, making it statistically improbable that anyone can "hijack" the controls without physically taking them over. And if the enemy can take over your command center, then all bets are off anyway.
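    The signing pattern this comment describes boils down to "verify before apply." Here is a minimal, hypothetical sketch of that idea - the key, payload, and function names are all invented, not any real update client. Real mass-update systems use asymmetric signatures (RSA/ECDSA/Ed25519) so the signing key never ships with the device; HMAC is used below only to keep the sketch dependency-free:

    ```python
    import hashlib
    import hmac

    # Hypothetical device-provisioned key; in a real system the device would
    # hold only a public verification key, never the signing key itself.
    TRUSTED_KEY = b"device-provisioned-secret"

    def sign_update(payload: bytes, key: bytes = TRUSTED_KEY) -> bytes:
        """Compute the authentication tag the vendor would ship with an update."""
        return hmac.new(key, payload, hashlib.sha256).digest()

    def apply_update(payload: bytes, tag: bytes, key: bytes = TRUSTED_KEY) -> bool:
        """Install the update only if the tag verifies; reject anything else."""
        # compare_digest is constant-time, defeating timing attacks on the check
        if not hmac.compare_digest(sign_update(payload, key), tag):
            return False  # unsigned or tampered update: refuse it
        # ... install payload here ...
        return True

    firmware = b"v2.1: patrol logic"
    good_tag = sign_update(firmware)
    assert apply_update(firmware, good_tag)             # authorized update accepted
    assert not apply_update(firmware + b"!", good_tag)  # tampered payload rejected
    assert not apply_update(firmware, b"\x00" * 32)     # forged tag rejected
    ```

    The essential property is that the platform refuses any payload whose tag fails verification, so an attacker without the signing key can't push code no matter how much network access they have.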

  • The most startling fact I've heard about the robot army scare is that Boston Dynamics, or whatever they're called now, has a goal of making that mule robot so fast that a human will need a strobe light to see its movements clearly - it'll be a blur to the human eye otherwise. Keep that in mind when you consider the idea of a human army fighting a robot army. Now make those robots autonomous and more intelligent than humans.

  • There's little difference between a hijacked "killer" robot and today's method of calling the police for the sole purpose of sending a SWAT team to the target's home.

    One is directly under your control, the other indirectly. The outcome is pretty much the same.

  • They will fight Chuck Norris and lose!
  • I, for one, welcome our new robot overlords.
  • Comment removed based on user account deletion
