What Will Happen When Killer Robots Get Hijacked? (marketwatch.com) 157
"Imagine an artificial-intelligence-driven military drone capable of autonomously patrolling the perimeter of a country or region and deciding who lives and who dies, without a human operator. Now do the same with tanks, helicopters and biped/quadruped robots." A United Nations conference recently decided not to ban these weapons systems outright, but to revisit the topic in November.
So a MarketWatch columnist looked at how these weapons systems could go bad -- and argues the risks are greater than simply fooling the AI into malfunctioning. What about hijacking...? In warfare, AI units can function autonomously, but in the end they need a way to communicate with one another and to transfer data to a command center. This makes them vulnerable to hacking and hijacking. What would happen if one of these drones or robots was hijacked by an opposing faction and started firing on civilians? A hacker would laugh. Why? Because he wouldn't hijack just one. He would design a self-propagating virus that would spread throughout the AI network and infect all units in the vicinity, as well as those communicating with them. In a split second, an entire squad of lethal autonomous weapons systems would be under enemy control... Every machine can be overridden, tricked, hijacked and manipulated with an efficiency that's unheard of in the realm of human-operated traditional weaponry.
However, the U.S. government remains oblivious. DARPA (Defense Advanced Research Projects Agency) has already announced a $2 billion development campaign for the next wave of technologically advanced AI (dubbed "AI Next"). One of the goals is to have the machines "acquire human-like communication and reasoning capabilities, with the ability to recognize new situations and environments and adapt to them." I may be overreaching here, but the UN meeting on one end and this announcement on the other make me think that the U.S. government isn't just pro-robotic -- it may already have a lethal autonomous weapons ace up its sleeve.
The article ends with a question: What do you think about killer robots replacing human combatants?
And what would happen if killer robots got hijacked?
Well, there's the one thing (Score:4, Interesting)
A hacker who identifies with Thanos' agenda could start 'solving' the overpopulation problem by themselves.
Re: (Score:3)
IIRC the Iranians forced one down. The Russians have managed to jam everything, including GPS, over Syria several times. But in terms of actually hacking the drone, so you can take control of it and use it against its official owner?
Nope.
Re: (Score:2)
Jamming prevents it from contacting its base. It doesn't allow you to control it.
Re: (Score:2)
"Never" been hacked before, so must be unhackable.
Famous last words...
Re: (Score:2)
I didn't say they were unhackable. However, the nature of those hacks makes seizing control of a combat drone from a human operator and shooting at its own troops much harder than most hacks.
The drone is loitering above the combat zone for a limited period of time before it runs out of fuel/weapons/etc. and has to return to base. It has multiple systems you'd need to crack. You'd probably need access to its cameras to aim its weapons properly, which means you have to hack the drone's transmission hardware.
Re: (Score:2)
Someone wanting to take over an autonomous army wouldn't be hacking individual drones (unless maybe they're all in the same spot). The lowest barrier to takeover is to hack central command. This could be done from the inside or out. Protecting against all avenues of attack is incredibly difficult.
Your rationale is exactly how companies get into trouble with security. Consider only the hardest avenues of attack and claim it's secure. Hackers don't go after the hardest vectors; they go after the most vulnerable.
Re: (Score:2)
Central computer?
Pretty sure the technical name for that is "the pilot's head."
By hacking the central computers of Creech Air Force Base, you'd be able to play hell with the contracting system, but that's about it.
Re: (Score:1)
Security Time (Score:1)
So any reasonable person would say that we should be able to design and implement inexpensive and ubiquitous secure systems before trying our hand at weapons? That we shouldn't run the Wintel ecosystem as-is on those killer bots, and shouldn't keep preferring the feature creep of insufficiently engineered, overly complex features over security?
Re: Security Time (Score:3)
Or cars (Score:4, Insightful)
You don't need to steal weapons from the military to get killer robots. You just need to botnet cars. They already kill people without help.
Let me driver-assist you to safety.
Re: (Score:2)
Yeah, apart from the tiny detail of having to be physically there, those are exactly the same thing.
skynet will just get too smart and nuke us all (Score:1)
skynet will just get too smart and nuke us all
let's play global thermonuclear war (Score:2)
let's play global thermonuclear war!
Re: let's play global thermonuclear war (Score:2)
We should designate an official war zone over an uninhabited part of the Pacific Ocean where million-dollar robots can fight each other to the death. At least that way, instead of having an expensive chess match where real people die, we just have the expensive chess match.
Re: (Score:1)
We do that with the economic model already; it's giant economic robots tackling each other.
Re: let's play global thermonuclear war (Score:5, Interesting)
Like most ideas about revolutionizing warfare, this is something that's very old. Many, many cultures would have champions fight to the death so the army did not have to.
For that to work, a couple of things have to be true:
1) Both sides have to have a roughly equal chance of success. That is, the champions have to have comparable combat power. Back then, nobody was going to agree to wager their war on a naked guy with a pointy stick vs. a knight in full plate. With robots, many countries won't have a robot capable of going toe-to-toe with ours.
2) Both sides have to want whatever they're fighting over enough to raise an army to fight for it, but not so much that they'll actually die for it. If they're gonna die for it regardless of the outcome of the fight, the fight is pointless. If the result of losing a robo-war is that the President gets executed, his top guys imprisoned, everyone else gets fired, and your worst enemies get to run the country, then when the champion loses you fight anyway.
3) Both sides have to trust that the other side will abide by the deal. You're not spending $1 billion on a robo-champion if you think those bastards will fight even if their robot loses.
Just go through recent US-involved wars in your head. For how many are all three things true of both sides?
The Nazis and Japanese could have made some pretty cool robots, but we'd have fought the actual war regardless of who won the robo-battle. The Koreans, Vietnamese, Iraqis, Taliban wouldn't have robots anywhere near our league, so they'll lose the war and then fight anyway.
Re: (Score:1)
The Nazis and Japanese could have made some pretty cool robots, but we'd have fought the actual war regardless of who won the robo-battle. The Koreans, Vietnamese, Iraqis, Taliban wouldn't have robots anywhere near our league, so they'll lose the war and then fight anyway.
This touches upon the final question of the article.
A war of robot versus robot is pointless other than as entertainment.
War is specifically designed to kill humans. Humans, by virtue of their genetically based hyper-aggressiveness, have an innate need to define an "other", then to do what they can to end the lives of the other.
There are often reasons (resources, different gods), but not to worry: the reason is just a casus belli, the excuse for killing the other. One will always be found.
The only thin
Re: (Score:2)
I have this as item #3 on my list of "Top Geek Myths".
Counterargument: People don't submit to perceived tyranny because their material stuff got destroyed; rather, the opposite.
Also: "What robot soldiers could do is just as scary, though: Make outright colonialism a practical option again." War Nerd, 2014 [pando.com].
Re: (Score:2)
I'll just leave this here.
https://youtu.be/wv6pfQTl-d4 [youtu.be]
"My boy!"
Strat
Wrong question (Score:5, Interesting)
All that goes out the window when they've got a robot army. The robot army will never betray them. Sure, there are some engineers keeping it running, but they're nerds and typically lack the drive and charisma to overthrow the ruling class. Those kinds of coups are pulled off by charismatic generalissimos. So you're gonna have the ruling class, a small, well-paid merchant class to keep the killer robots going, and everybody else. That's you and me, btw. And the ruling class won't need us to buy their crap to be rich either. They'll own everything and have factories to build it. I suppose there'll be a few positions for their doctors and sex slaves. The rest of us get abandoned, sorta like how we ignore starving people in Africa.
Re: (Score:2)
The ruling class always needs the poor (Score:2)
Re: (Score:2)
To live like Gods (Score:2)
It's a level of narcissism and power that's hard to really imagine. It's like trying to get a grasp on a googol (the number, not the company). The human brain isn't well equipped for it.
And they're not going to cut themselves off from everyone, just most everyone. Also, they're not intentional
The pleasures that come with power (Score:2)
Power isn't a means to an end; it's an end in itself. If you're not smart enough to immerse yourself in the wonder of the universe like Einstein did, and you don't need to work for a living, there's not much else left.
Re: (Score:2, Insightful)
This is a great point - it's unclear to me what would stop them. By now we know a few things about the bulk of the owners of capital:
1. They don't believe in Democracy
2. They are motivated, and don't allow pesky things like laws to stand in their way
3. They have the patience to work across generations
Think of Trump and his children. These people want to bring about a new feudalism and bring us all back to Game of Thrones, and the Dragons are the killer robots. Human nature doesn't change.
We need to outla
Same as today (Score:1)
the military has much experience with this risk (Score:3)
There is no new risk here. Virtually every modern weapon system is robotic and many are mostly autonomous with heavy usage of data channels for control. If hackers could take them over, we'd have long ago suffered these imagined catastrophes.
Re: (Score:3)
But... but... all it takes is 2 cans of Mountain Dew and 30 seconds of rapid-fire typing and then you say "I'm in!" and now your bar tab is in the negative and Terminatrices warm your bed at night.
Re: (Score:1)
Re: (Score:2)
What about all of our fighters? Most, like F-22s, can be slaved to other fighters right now, and the new ones have no fully mechanical controls. They have extensive communications so that they can fight as automated teams, utilize each other's sensor suites, etc. If a hacker could get in, they could theoretically have a squadron at their control with the pilots along as helpless hostages. I guess the pilots could destroy the flyability by punching out. It would be a very expensive loss if a whole squadron punched out.
Re: (Score:2)
For the time being, I think the risk of such hacking is only a major concern when it comes to other nation-states. Sure, an individual or small group of hackers might find their way into a .mil system every once in a while, but they aren't likely to find a way into anything mission-critical that way. Other nation-states, though, have the resources both to develop intelligence-gathering networks and possibly to acquire example systems. I suppose if such systems were involved in conflicts in enough numbers with enough
That's easy (Score:2)
<tars>Nothing Good</tars>
ha ha (Score:2)
Robo-Coup (Score:5, Insightful)
And what would happen if killer robots got hijacked?
When killer humans (i.e., a military) get 'hijacked', it's called a coup.
It's unlikely the autonomous weapons would be hacked to fire on civilians. It'd be MUCH more effective to turn them into robotic sleeper agents that target military officials/visiting politicians (thanks to facial recognition). OTOH, firing on civilians would ensure: a) civilians demand their removal, ensuring no further hacking can take place; b) the hack is discovered immediately, ensuring the vulnerability is quickly patched; and c) an opportunity to take out more-important targets is squandered.
Re: (Score:1)
Is it all that simple?
As for the idea that people (being the targets, etc.) can have control: that is a fairy tale. This does not work properly in the majority of so-called democracies in the West, where direct democracy has been replaced (for good reasons) by the representative type. The result in the US, for instance, is that the majority probably knows that the war on drugs is senseless and rather counterproductive. It goes on, however, even if in some places you can now sell and buy ganja without fearing immediate legal consequences. There are other
Re: (Score:2)
It'd be MUCH more effective to turn them into robotic sleeper agents that target military officials/visiting politicians
How do you figure that? The first time you use it, there'll probably be eyewitnesses, security cameras, ballistics, etc. pinning the robot as the guilty one. After that, they will take the model out of service and scour through everything until they find out how you did it.
Compare that to a thousand units going berserk all at once at many different locations. Most of them will probably be stationed at military bases, where everyone is a target as either military service personnel or NGOs. Lots of expensive equipment
Re: (Score:2)
All good points. OTOH, if you're able to set them to go berserk on a timer, why not time it to coincide with when your intelligence says a VIP is going to be present?
Bonus points if the killbot has IoT flaws that allow for backdoor access to the military network.
Coup (Score:2)
At least when humans are the target of a coup, there will be some resistance. Robots don't know right from wrong, so creating an army of killer robots is just a roundabout way of voluntarily giving away all your military power to someone else. The money the US spends on military AI now is money the Russians will save when they turn the robots back against the US.
They might as well save everyone the hassle and kiss up to the Russian overlords now.
Re: (Score:2)
So you're saying that in Russia, killer robots hack YOU?!
Bad official policy (Score:5, Insightful)
Replacing human combatants with robots is a bad policy decision in general. The reason is simple: one of the main things restraining governments from waging war is the cost in terms of their own citizens killed in the fighting, and removing humans from the fighting removes that restraint. We already have enough problems with governments that don't care about their own citizens; we don't need to add every other government to that list until we figure out another way to discourage them from starting wars any time they don't get their way.
On top of that, the possibility of robots being subverted by attackers isn't in any way overstated, nor are the reactions to that possibility overreactions. Look at our computer networks today and try to convince me that we can somehow make botnets and malware vanish overnight, and then picture a world where "distributed denial of service attack" translates to "security guards shooting any human who enters the shopping mall".
We already have plenty of killer robots (Score:2)
People keep talking about this like it's some futuristic threat, yet we have been living with this threat (and managing it) for a while.
We have had weaponized drones and other remotely controlled platforms for many years. I am aware of one incident where the Iranians managed to hijack a US drone and force it to land, but this was presumably done with GPS spoofing and did not give them sufficient control to attack anything.
Air-to-air missile systems have fully autonomous modes of operation, and we trust them not to shoot
Re: (Score:2)
Not only is the future here, so is the video [youtube.com].
What does it mean to 'ban' them? (Score:2)
Forgetting for a moment the logistics of actually enforcing such a ban, what would such a ban actually entail?
Presumably the ban would not apply to the mere act of
wrong (Score:5, Insightful)
Every machine can be overridden, tricked, hijacked and manipulated with an efficiency that's unheard of in the realm of human-operated traditional weaponry.
It isn't really all that difficult to buy, bribe, threaten or convince a human.
The difference is scale. Humans are polymorphic, so they are not exact copies of each other, and an identical exploit will work on one but not on others. So you need to customize your exploit for each of them, which makes mass hacks difficult. That is the reason social engineering works but is rarely used large-scale.
Re: (Score:2)
Every machine can be overridden, tricked, hijacked and manipulated with an efficiency that's unheard of in the realm of human-operated traditional weaponry.
It isn't really all that difficult to buy, bribe, threaten or convince a human.
Yes, a human. OP said every machine. Learn to subvert one, and you've got them all. Efficiency. That's the problem.
Re: (Score:2)
Uh, you did read the part of my posting that you didn't quote, where I'm literally saying the exact same thing?
Re: (Score:2)
Re: (Score:2)
Back up and think about why there are migrants in the first place, as a small example of what is a far larger issue than them and Europe. When has the West missed a chance to ruin what little they have, make enemies with "collateral damage" that from
Re: (Score:2)
It's a bit more complicated than that.
During the Cold War, the West meddled in various countries' affairs to prevent them from becoming too friendly with Russia. This was a huge part of why Afghanistan, Pakistan, Iran, and a bit later other countries as well, slid into Islamism.
Once Islam had taken over, some smart people understood just to what degree they had fucked up, and that Islamic countries could never be allowed to become rich, prosperous or influential. That is why Saudi Arabia... oh... wait... I'm afraid th
Re: (Score:2)
Everything in the ME is complicated... it seems everyone involved likes it that way. Which is yet another mistake.
Re: (Score:2)
I disagree. The Middle East isn't complicated at all.
It's an area where several different tribes, cultures and religions are all mixed up. As long as everyone is tolerant of everyone else, they all profit from the exchange of ideas. Whenever some intolerant version of someones religion becomes dominant, things go to hell.
The primary problem being that the dominant religion for the last thousand years explicitly hates another major religion, and is explicitly intolerant of all the others, so it takes a bit m
Re: (Score:2)
The fact that pretty much any of the interested parties has at some point controlled that same spot of land, and considers it all-important to get control of it back, is a complication... perhaps unwillin
Re: (Score:2)
Though it is interesting that various factions can't seem to let go of the idea that one particular spot there was the "holy home of their religion"
Because when those religions came into existence, all religions were tribal religions. The Christian and Islamic idea that everyone can become a member is an innovation in the sphere of religion.
I agree that taking it all too seriously - self-important hubris - is probably the largest part of the problem. Overcompensated inferiority complexes?
I'm quite sure there is a strong correlation between strength of religious feelings and dick size, yes.
Re: (Score:1)
Re: (Score:2)
That was not social engineering; it was propaganda. Similar, but not the same thing.
Re: (Score:2)
Re: (Score:2)
Checksums are trivial to defeat. Hashes, a little harder.
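For the curious, here's a minimal sketch of why that is (Python; the command payload is hypothetical): transposing bytes slips past a simple additive checksum unchanged, while a cryptographic hash such as SHA-256 flags it.

    import hashlib

    def additive_checksum(data: bytes) -> int:
        # Classic 8-bit additive checksum: just the byte sum modulo 256.
        return sum(data) % 256

    original = b"TARGET GRID 12,34"  # hypothetical command payload
    tampered = b"TARGET GRID 21,43"  # digits transposed; byte sum unchanged

    # The checksum is fooled by the transposition...
    assert additive_checksum(original) == additive_checksum(tampered)
    # ...but the cryptographic hash detects the change.
    assert hashlib.sha256(original).digest() != hashlib.sha256(tampered).digest()

Even a strong hash only detects tampering if the attacker can't also replace the stored hash, which is where the signing discussed elsewhere in this thread comes in.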
The gov remains oblivious to what? (Score:2)
Who really thinks that any one country has any say over whether such weapons will be developed? BTW: how is this different from worrying about the same kind of scenarios involving, say, a bunch of F-35s? It isn't really, I would say, as those killer robots wouldn't get far either without ample human arming and maintenance.
More realistic scenario (Score:4, Insightful)
In my opinion there should be a legal limit to personal car power and weight; say 100 hp and 1500 kg.
And it is not "maybe"; it is happening. About one and a half million(!) people are killed on roads by cars each year globally, and many times more are badly injured. These are WW3-scale casualty figures. These accidents are related to drinking, suicides, mental illness, terrorism, drugs, etc.
And practically nothing is done about it. Even more powerful and massive cars keep hitting the market. So, please, stop blaming drones. Let us first learn how to handle the real problem at hand.
Horizon Zero Dawn (Score:2)
Basically exactly fucking that.
Re: (Score:2)
Why just killer robots? (Score:2)
What about cars, trucks, trains, airplanes, medical devices, home automation, mobile phones, smart TVs ... you name it?
They will kill something (Score:3)
They will kill something. It'll just be a different thing than if they hadn't got hijacked.
Think a little (Score:2)
There is one simple general rule I've learned about life in general over 70+ years. It reads, "If it can be done, it will be done." It's one of Joanne's laws, I guess. I have many I've derived over the years. This one is the most generally applicable. If there is a market of people willing to pay for something and any hole of any size through which to slip product, the market will be fulfilled. Drugs are the most obvious such. Guns are another obvious market filled because there was a way to fill it, a way
Start learning German !! (Score:2)
Here in Germany, our population is only about 80 million.
Regrettably, as we have shown in the past, this number is sufficient to achieve world, or even regional domination for the long term.
It seems to me these robots you speak of are the answer to our past failings.
Sign up for Duolingo my friends and start learning German now.
Already did. They voted for Trump. (Score:2)
Sorry, couldn't help but join in. Even though it seems there are more paid trolls posting on Trump's side... robots?
Who is regulating killer cars? (Score:2)
Vehicles can be used as lethal weapons. Imagine all those self-driving vehicles suddenly having one "if" statement flipped, so that rather than avoiding pedestrians they aim to hit them. Better yet, imagine a more advanced hack that performs face recognition and only targets a specific pedestrian, or even a group of pedestrians. Organized car attacks could be used to attack infrastructure too. 100 million cars suddenly used as weapons might present more danger to the people than a few thousand border patrol robots.
Re: (Score:2)
This is easy (Score:2)
Locking down a platform to respond only to authorized access is pretty easy. Microsoft is, ironically, a perfect example. Using cryptographic signing, its mass-update feature has never been compromised. The same robust authentication mechanisms are almost certainly in place for military hardware, making it statistically improbable that anyone can "hijack" the controls without physically taking them over. And if the enemy can take over your command center, then all bets are off anyway.
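As a rough illustration of that signed-update pattern (a sketch only, assuming Python with the third-party "cryptography" package; the key names and payload are hypothetical, not any vendor's actual mechanism):

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # Vendor side: the private key never leaves the vendor's signing infrastructure.
    vendor_key = Ed25519PrivateKey.generate()
    update_blob = b"firmware v2.1 (hypothetical payload)"
    signature = vendor_key.sign(update_blob)

    # Device side: only the matching public key is baked into the platform.
    pinned_public_key = vendor_key.public_key()

    def apply_update(blob: bytes, sig: bytes) -> bool:
        # Accept the update only if the signature verifies against the pinned key.
        try:
            pinned_public_key.verify(sig, blob)
        except InvalidSignature:
            return False
        # ... install the verified blob here ...
        return True

    assert apply_update(update_blob, signature)             # genuine update accepted
    assert not apply_update(update_blob + b"!", signature)  # tampered blob rejected

The design point: as long as the private key stays off the network, owning the communication channel isn't enough to forge an update; compromising the signing infrastructure itself is the "all bets are off" case mentioned above.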
Strobe Lights (Score:2)
The most startling fact I've heard about the robot-army scare is that Boston Dynamics, or whatever they're called now, has a goal of making that mule robot so fast that a human will need to use a strobe light to see its movements clearly; otherwise it'll be a blur to the human eye when it's done. Keep that in mind when you consider the idea of a human army fighting a robot army. Now make those robots autonomous and more intelligent than humans.
Not much difference (Score:2)
Between a hijacked "killer" robot and today's method of calling the police for the sole purpose of sending a SWAT team to the target's home.
One is directly under your control, the other indirectly. The outcome is pretty much the same.
easy (Score:1)
First things first (Score:2)
Re: (Score:1)
Re: (Score:2)
We begin our counter-offensive on Patch Tuesday.