The Struggle To Ban Killer Robots
Lasrick (2629253) writes "The Campaign to Stop Killer Robots is a year old; the same month it was founded, the UN's special rapporteur on extrajudicial, summary or arbitrary executions called for a moratorium on the development and deployment of autonomous lethal weapons while a special commission considered the issue. The campaign is succeeding at bringing attention to the issue, but it may be too late, and if governments don't come to a common understanding of what the problems and solutions are, the movement is doomed. As this article points out, one of the most contentious issues is the question of what constitutes an autonomous weapons system: 'Setting the threshold of autonomy is going to involve significant debate, because machine decision-making exists on a continuum.' Another, equally important issue, of course, is whether a ban is realistic."
Too late. (Score:5, Insightful)
Even if such technology is never deployed, its existence represents a bargaining chip for that nation at the negotiating table. See nuclear weapons for precedent. This is essentially trying to stuff the genie back into the bottle; not gonna happen, no matter who says what.
Machine logic (Score:5, Insightful)
'... because machine decision-making exists on a continuum.'
No kidding. Depending on how you define it, a cruise missile could be considered a one-use killer robot. It executes its program as set on launch.
Now consider making it more sophisticated. We now provide it with some criteria to apply against its sensors when it reaches the target location. If criterion A is met, dive and explode on target; if B, pull up and detonate more or less harmlessly in the air. If neither criterion is met, it depends on whether it's set to fail safe or fail deadly.
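The branching just described can be sketched in a few lines of code. This is a minimal illustration only; the function name, criteria, and the fail-safe/fail-deadly flag are all hypothetical, not drawn from any real weapons system.

```python
# Hypothetical sketch of the terminal-phase decision logic described above.
# All names and criteria here are illustrative, not any real system's API.

from enum import Enum

class Action(Enum):
    DIVE_AND_DETONATE = "dive and explode on target"
    AIRBURST_SAFE = "pull up and detonate harmlessly in the air"

def terminal_decision(criterion_a_met: bool,
                      criterion_b_met: bool,
                      fail_deadly: bool) -> Action:
    """Decide what happens when the missile reaches the target location."""
    if criterion_a_met:
        return Action.DIVE_AND_DETONATE
    if criterion_b_met:
        return Action.AIRBURST_SAFE
    # Neither criterion met: fall back to the fail-safe/fail-deadly setting.
    return Action.DIVE_AND_DETONATE if fail_deadly else Action.AIRBURST_SAFE
```

Even in this toy form, the point about a continuum is visible: each added criterion moves more of the kill decision from the launch operator into the machine.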
This is a mixed bag: on the one hand, properly programmed, it can reduce innocent casualties; on the other, it encourages firing missiles on shakier intelligence. But then again, Predators armed with Hellfires are a heck of a lot more selective than WWII gravity bombs. As long as you presume that at least some violence/warfare can be justified, you have to consider these things.
On the whole, I like weapons being more selective; it tends to cut down on civilian casualties. But I think it's a topic more deserving of careful scrutiny than a reflexive ban.
It's going to be driven by reaction time (Score:5, Insightful)
A robot is going to (or will eventually) react much faster to a threat or other adverse conditions than a human can. If you've got a hypersonic missile heading toward a carrier, are you going to put a human in the loop? Nope.
There are simply going to be many, many situations where a robot will neutralize a threat faster than a human can, and those situations will multiply when fighting against another autonomous army.
Is this a good thing? No; it's like atomic weapons. We're heading toward another arms race that will lead us to the brink, or over it. We barely survived the MAD era.
When Killer Robots are illegal... (Score:4, Insightful)