'Ban Killer Bots,' Urges Human Rights Watch 297
Taco Cowboy writes "A self-proclaimed 'Human Rights Group,' the 'International Human Rights Clinic' from Harvard Law School, has teamed up with 'Human Rights Watch' to urge a ban on 'Killer Robots.' A report issued by Human Rights Watch, titled 'Losing Humanity,' claims that autonomous drones able to attack without human intervention would make war easier and endanger civilians. Where's the 'Robot Rights Watch' just when you need 'em?"
Sounds like a great idea (Score:5, Insightful)
We should go back to using cruise missiles and carpet bombing.
Re: (Score:2, Informative)
Agreed. Winston Churchill had it right. William Tecumseh Sherman had it right. Destroy every bit of the enemy's infrastructure and they won't have anything to wage war with.
Re: (Score:3)
"Totaler Krieg – Kürzester Krieg" ("total war, shortest war"), as they say
bring everyone back to the iron age? (Score:2, Insightful)
A scorched-earth policy, where you knock down granaries, cripple tractors and plows, break down dams, and salt the earth if you have to, is not really the kind of society we wish to represent.
Re:Sounds like a great idea (Score:5, Insightful)
Nope, but we don't need fully autonomous killer robots either. Would you really rather have a robot, instead of a human, determine whether a target is worth killing?
Re: (Score:3)
I would trust the robot more. You could program it not to take things like emotions into account. You can have it judge whether someone is hostile or a combatant and exercise only the force required. Humans are far more likely to overreact.
Re: (Score:2)
A robot controlled remotely by a human (even if only for the kill) would accomplish the same. The human is so detached that it would be easy to control his emotions.
Re: (Score:2)
And yes, those humans do violate the various rules that govern war, and do shoot into crowds of civilians. I would say that so far this approach is not working very well.
Re: (Score:2)
Assuming humans still control these autonomous robots, autonomous robots wouldn't solve these problems either. Would you expect these autonomous robots to refuse an order given by their human commander? Would you expect these robots to be programmed to be capable of refusing a command?
Re: (Score:2)
They absolutely should be programmed to refuse orders like that. However, I don't expect that to happen, which is very, very sad.
Re: (Score:2)
2) A robot must obey the orders given to it by its masters, except where such orders would conflict with the First Law.
3) A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
Re: (Score:3)
Sounds good, except the master gets to decide who the non-combatant is.
Re: (Score:2)
Re: (Score:3)
Re: (Score:2)
A robot controlled remotely by a human (even if only for the kill) would accomplish the same. The human is so detached that it would be easy to control his emotions.
Haven't we seen this movie before?
If we uplink now, Skynet will be in control of your military. But you'll be in control of Skynet, right?
Re:Sounds like a great idea (Score:4)
A robot controlled remotely by a human (even if only for the kill) would accomplish the same. The human is so detached that it would be easy to control his emotions.
Haven't we seen this movie before?
If we uplink now, Skynet will be in control of your military. But you'll be in control of Skynet, right?
Also it's not actually true that people are more detached. The rates of PTSD amongst drone controllers are apparently ridiculously high for people who are effectively non-combatants. Soldiers in the field are more or less stuck in "kill or be killed" when contact happens, whereas a guy flying a drone always knows he could have simply not pushed the button.
Re:Sounds like a great idea (Score:4, Insightful)
And it's probably a good thing that drone operators do get PTSD. Not for them, obviously, but for the innocent people on the ground whose only hope of survival is that drone operator hesitating when he isn't 100% sure who he is firing at.
War has to be nasty and carry horrible consequences for both sides, otherwise it will become too easy. IMHO it already has.
Re: (Score:3)
Adult male = terrorist, as everyone in Pakistan knows.
Re: (Score:2)
That is the way the humans controlling the robots are doing the classifying right now. Clearly that is WRONG as hell. A robot programmed to follow things like the Geneva Conventions would not fire until there was actual evidence of someone being a combatant (i.e., they actually pulled out a gun and fired at the robot, or at something the robot is supposed to protect).
Re: (Score:3)
That's easy enough to say until someone declares your home a war zone.
Re: (Score:2)
In short order it will be Human with brown skin = terrorist
As a person with brown skin, I don't like where this is heading.
LK
Re: (Score:2)
Give the robot reflexes fast enough that you don't have to calculate; you just scan for it and shoot back.
Time is not the same for computers as it is for us. What is an instant for a human is a very, very long time for a computer. We have computers that can visually pick out a bad product from a free fall of products and use a small puff of air to move just that one bad product out of the way. To a human the whole thing looks like a solid stream of stuff.
Things like that are used for french fries, for instance.
The
Re: (Score:3)
We should go back to using cruise missiles and carpet bombing.
Where do you draw the definitional line? Isn't a cruise missile a robot that kills people?
Re: (Score:2)
Where do you draw the definitional line? Isn't a cruise missile a robot that kills people?
Someone had to push the launch button.
Re: (Score:2)
Re: (Score:2)
And someone has to turn the robot on. That isn't a fine enough distinction to separate cruise missiles from robots.
The person involved still gives a target location (which generally has collateral damage taken into consideration) for a cruise missile, which it can hit quite accurately. When you turn the robot on, you're not sure whom it's going to kill, or where. It will be a very long time before a machine can make those kinds of decisions with respect to collateral damage and civilian casualties.
There's a pretty damn big distinction between "blow up that building (and cease to exist afterwards)" and "go hang out over there a
Re:DIY (Score:4, Interesting)
For reference: http://news.bbc.co.uk/2/hi/asia-pacific/3302763.stm [bbc.co.uk]
Re: (Score:3)
On the topic of cannibalism, Arne Næss once opined that it would be a significant step forward morally if we refrained from killing more people than we intend to eat.
Killing without human intervention? (Score:4, Interesting)
That's nothing new. It's no different than land mines...
Oh, wait... [wikipedia.org]
Human rights (Score:4, Insightful)
Re: (Score:2)
the danger of abstracted combat (Score:5, Insightful)
While cliché, take a look at "WarGames".
Abstracting away the reality that you are killing people, by making a machine do the actual deed after deployment, removes the innate guilt of killing those people.
It makes it fantastically easier to justify and ignore wholesale slaughter.
A glitch in the program makes the drone think that anyone carrying a cylinder 2 ft long and 1 inch in diameter is a combatant? (Looks like a gun barrel!) Well, all those poor fuckers carrying brooms and sweeping their patios had it coming! Never mind those uppity pool boys with dipnets! Can't make an omelette without breaking some eggs, right?!
When you can simply push a button and walk away without having to witness the atrocities you cause, you abstract away a fair bit of your conscience.
The military probably thinks that's a GREAT thing! Kids with guns won't cause mental trainwrecks to drones when they get mowed down, and the operator doesn't have to see it!
The reality is that deploying terminators is the same as turning a blind eye to the consequences, to the innately terrible thing that war is, and to why it should always be avoided whenever and however possible.
Re:the danger of abstracted combat (Score:5, Insightful)
But we have all been taught from an early age that it is wrong to feel guilt for killing bad guys. If you feel guilty, then you are *for* the bad guys, and therefore one of *them*. (Remember, it's a binary good/evil world we live in, amiright?)
Killing bad guys is doing your country a service, we are taught. We are making the world a better place, a safer place, when we kill our enemies.
This we are taught. If any one disagrees with that, then they are unpatriotic, and aiding and abetting the enemy.
This we are taught, so it must be true.
Re:the danger of abstracted combat (Score:4, Interesting)
What is an enemy?
Is it a person who wishes to do you harm?
A person who wants to take something you have?
A person with whom you disagree?
Or just someone in the way of what you want to do?
In a war, do not both sides, regardless of their motives, satisfy all of those? Is it any wonder that both sides refer to the other as the enemy?
In this case, what do the "terrorists" represent, that they merit being exterminated, without conscience nor remorse?
"They killed a shitton of people when they bombed the Trade Center!" you say?
Why? Why did they blow up the trade center?
It couldn't be because our countr(y/ies) was(were) meddling in their affairs, causing them harm, taking things from them, and fundamentally in disagreement with their way of life?
Certainly not! They should be HAPPY that we want to destroy their culture, because we view certain aspects of it as being backwards and primitive! Our way is simply BETTER!
Now, let's do a thought experiment here. Powerful aliens come down from wherever out in space they are from, find our culture to be backward and primitive, and start strongarming us to cease being who we are, and become like them. They say it's a better way. Maybe it is. That isn't the point. The point is that they don't give us the choice. They do this because it makes it easier for them to establish trade with them, or to work within their stellar economy, or whatever. They profit, by eliminating our culture.
Would we not go to war with them, fighting their influence in every possible way, and even resort to guerrilla and "terrorist" acts when faced by such a superior foe?
After thinking about that, can you really say you are any different than the "terrorists" we condemn with military machines daily?
We kill them, because they don't submit. They don't submit, because we are destroying and marginalizing their culture, because we feel it isn't worth retaining/is backward.
They don't want our help. They don't want our culture. They don't want our values. They don't want us. We insist on meddling in all of those things.
We started the war.
Re: (Score:2, Troll)
Sometimes easy to tell (Score:2)
What is an enemy?
Anyone firing an un-tagged large rocket in a region you control.
In a war, do not both sides, regardless of the motives of either side, satisfy all of those?
Satisfy all of your absurdly soft-boiled and meaningless definitions of enemy? Yes.
It couldn't be because our countr(y/ies) was(were) meddling in their affairs
Actually no, it was simply that they wanted to collapse our economy. Our non-religious existence is an affront many of the terrorists wished to correct.
That's the really sad thin
Re:the danger of abstracted combat (Score:5, Insightful)
Dude, have you seen what happens when people are forced to kill face-to-face? They don't rely on their "conscience" to limit human casualties; they mentally reassign their opponents as non-human and murder them like you or I would murder a termite colony infesting our houses. History is nothing but one long string of horrific atrocity after atrocity committed by warring factions against the opposing side, or civilians, or even their own comrades in arms if there isn't a convenient "other" nearby they can target. Moving to a more abstracted method of fighting isn't just about saving our own forces' physical and mental well-being, it's also about limiting the damage they cause to others when they snap from the pressure and take their aggression out on whoever's available.
Of course we need to monitor our use of robots - we need a system of checks and balances in place to keep the controllers from engaging in unnecessary combat. But drones don't mass-rape, they don't torture old men and little children for fun, they don't raid houses to steal anything of value within, they don't build towers out of the skulls of their enemies, and they won't burn entire villages to the ground massacring everyone within because they're upset that their buddy was killed in combat the other day. Human involvement isn't always a good thing.
Re:the danger of abstracted combat (Score:5, Insightful)
You are not comprehending what I am telling you.
War is to be avoided because nothing about it is good, just, or honorable. War scars the minds of those who engage in it, live through it, or even witness it firsthand. The damage and price of war is more than just soldiers killed and buildings blown up. It is the destruction of people's lives, in every imaginable sense. Surviving a war might be less humane than dying in it.
The point was that by removing the consequences of war (soldiers becoming bloodthirsty psychos who rape, kill, torture, and lose respect for the lives of others, all others; in addition to simply having people die, and having economic and environmental catastrophes on your hands), you make war look more and more desirable as an option.
What I was trying to get you to see is that war is always a bad thing, and trying to make it seem like less of a bad thing is the WRONG way to go about it.
Re:the danger of abstracted combat (Score:5, Insightful)
The problem with making war "clean and precise" is that you remove all the disincentives to engage in war to begin with.
At the press of a button, the insurgents/terrorists/rebels/invaders/$targetedPeople all die, cleanly, humanely.
That is the ultimate evolution of the direction you advocate.
Who decides who is the target and who isn't? What happens if there is a miscalculation?
Now do you see why this is bad?
Re:the danger of abstracted combat (Score:5, Insightful)
So you're for robots and drones, right? Because right now the glitch in programming is that when human soldiers in a combat area see someone with something that might be a weapon, they tend to shoot them. Why? Because the ones going "Is that a weapon or is it a broom?" don't tend to last when it actually is a weapon. A drone operator, on the other hand, can take the time to evaluate the situation, since they aren't in harm's way.
Re: (Score:2, Insightful)
No. You fail to comprehend my position at all.
There shouldn't be anyone making that decision. At all.
Making that decision easier, by having a machine do it, to alleviate the guilt of a human operator, and his chain of command, is the WRONG direction.
Want to know where it ends? The creation of things like "perfect" WMDs. Kills all the people, spares everything else. Push the button, war is over. A whole society dies, and the one pushing the button loses nothing. What possible reason would that society have to
Re: (Score:3)
Sometimes it is good to just clean house.
May I have your address, please?
I'll try to avoid the place, just in case you decide to clean your house with some dynamite sticks.
Re: (Score:2)
I guess that's why there are so few civilian casualties for drone strikes, right?
Re: (Score:2)
Abstracting away the reality that you are killing people, by making a machine do the actual deed after deployment, removes the innate guilt of killing those people.
It makes it fantastically easier to justify and ignore wholesale slaughter.
A glitch in the program makes the drone think that anyone carrying a cylinder 2 ft long and 1 inch in diameter is a combatant? (Looks like a gun barrel!) Well, all those poor fuckers carrying brooms and sweeping their patios had it coming! Never mind those uppity pool boys with dipnets! Can't make an omelette without breaking some eggs, right?!
While the US Army has developed software that can theoretically command drones autonomously, they don't use it, for exactly these reasons. Current drones are human-controlled. Your argument is based on a false assumption.
Re: (Score:2)
"Abstracting away the reality that you are killing people, by making a machine do the actual deed after deployment, removes the innate guilt of killing those people."
What a load of shit. "Guilt" is far from innate, and enormous genocides through history have been done gleefully and "up close and personal".
Rwandan genocides were more often than not done with KNIVES, which means you get sprayed body fluids during your hackathon. Posed "no fucking problem" to the perps.
Also, siege engines and cannons/tube artil
Re: (Score:2)
I can see it swinging either way. It all depends on how the military decides to push.
Re:the danger of abstracted combat (Score:5, Funny)
May as well take it all the way and make warfare totally clean and normal.
Have the computers on both sides simulate their attacks, then declare casualties. Anyone on the casualty list then simply reports to a termination booth to be quickly and humanely killed.
Hmmm... This sounds rather familiar come to think of it...
Re: (Score:2)
While cliché, take a look at "WarGames".
Abstracting away the reality that you are killing people, by making a machine do the actual deed after deployment, removes the innate guilt of killing those people.
It makes it fantastically easier to justify and ignore wholesale slaughter.
A glitch in the program makes the drone think that anyone carrying a cylinder 2 ft long and 1 inch in diameter is a combatant? (Looks like a gun barrel!) Well, all those poor fuckers carrying brooms and sweeping their patios had it coming! Never mind those uppity pool boys with dipnets! Can't make an omelette without breaking some eggs, right?!
When you can simply push a button and walk away without having to witness the atrocities you cause, you abstract away a fair bit of your conscience.
The military probably thinks that's a GREAT thing! Kids with guns won't cause mental trainwrecks to drones when they get mowed down, and the operator doesn't have to see it!
The reality is that deploying terminators is the same as turning a blind eye to the consequences, to the innately terrible thing that war is, and to why it should always be avoided whenever and however possible.
Overcoming the innate guilt of dropping an atomic bomb didn't require a robot; you can train a human to follow orders and not feel guilty. I'm not claiming I know what the bomb dropper felt, but they completed the mission anyway, so what difference does it make?
Re: (Score:3)
Exactly. The soldier is to the government what the drone is to the soldier: a layer of abstraction that makes the burden of killing another person easier to bear.
The government points soldiers and says "Kill!" They don't see the faces of those they killed, nor have to face the families of the slain. The numbers kille
Re: (Score:2)
A glitch in the program makes the drone think that anyone carrying a cylinder 2 ft long and 1 inch in diameter is a combatant? (Looks like a gun barrel!)
Your reply to it:
Crazy hypothetical what-ifs like "What if all the robots became sentient, went insane, shook off human control, and started killing all humans with two arms" only serve to weaken the arguments against using them.
Ummm... looks like "sentience" is just a glitch in a program. Wait... what?
Isn't it possible to have buggy software without speaking of sentience?
Re: (Score:2)
Indeed. Humans NEVER accept that the answer is so simple.
Don't resort to war. If your cause requires forcing somebody else at gunpoint to comply, it isn't just, it isn't honorable, and it cannot be justified. So, just don't do it.
But no. Humankind is OBSESSED with making other people OBEY, even if it kills everyone else.
War and Pacifism (Score:5, Insightful)
Indeed. Humans NEVER accept that the answer is so simple.
Don't resort to war. If your cause requires forcing somebody else at gunpoint to comply, it isn't just, it isn't honorable, and it cannot be justified. So, just don't do it.
Let's say that China attacks Guam tomorrow, and starts moving for Hawaii and the US mainland. What should be done? What should France have done when Germany invaded them in the blitzkrieg?
Clearly somebody isn't justified in any war. Frequently it's both parties. However, it is the height of intellectual dishonesty to say that war is never justified for any of the participants.
Yes, war is never, ever a good thing. Sometimes, though, it really is better than the alternative.
Re: (Score:2)
That does not solve the problem.
We have a problem with war because we have a problem with believing we can (and should!) force other people to do what we want them to do against their will.
It is the same crime as with rape, and with slavery. I have the biggest gun, do what I say!
The ONLY time to take up arms is when an aggressor comes to visit YOU. You should NEVER take up arms against another to conquer. Your failing economy is not justification. Your need for cheap energy is not justification. Your fuck
Re: (Score:2)
There should only be defensive armies. Drones are not a defensive tool. They should not exist.
Of course they are. They actually work better as defensive weapons. Just because they are often used offensively doesn't mean they aren't natural peace keepers.
Re: (Score:2)
Of course they are. They actually work better as defensive weapons. Just because they are often used offensively doesn't mean they aren't natural peace keepers.
Using "natural" when speaking about robots...
Ummm... perhaps it is the time for the "Robots Rights Watch"?
Re: (Score:2)
The use of you as an impersonal pronoun is well supported in informal writing, such as a slashdot post.
http://en.m.wikipedia.org/wiki/Generic_you#section_1 [wikipedia.org]
If I were writing a how-to book, or a formal address to some political body, I would use "one", but since I am not, I did not.
Frankly, I find your grammar-nazi antics to be offensive.
So, I suppose we are even.
Re: (Score:2)
Not exactly... snipers still feel remorse, for their first kill at the very least. Becoming numb and losing your humanity comes later.
A robot never suffers that. It doesn't lose any humanity, because it never had it to lose.
Re: (Score:2)
Ask anyone who has been in a war if they think it is a correct choice of action.
Newsflash: unless they have gone off the deep end and hunger for killing, they will say it isn't.
Go on. Go ask some vets about their opinions on war.
Ban (Score:4, Insightful)
Trying to ban killer robots is a waste of time and won't work. There is also little desire to ban them overall, in the interests of health and safety.
It's safer to kill people using a robot than to go out and risk your own skin with guns and/or explosives.
Remember, in this day and age, safety is paramount. You want to be able to kill people from a distance, safely and easily. Why run the risk of getting injured, or even worse, getting killed, when you can kill people using safer methods? Using a robot to kill people just makes sense.
Even worse, you could get sued for endangering the safety of others and breaking health and safety regulations. Killing other people can be a dangerous business, so reducing potential hazards and minimizing harm is a very prudent and right thing to do. You need to be able to kill people safely and efficiently. If you can kill people at a lower cost, then that is even better.
That's why drones are so popular nowadays. All the benefits of killing people, without all the personal risk. It's a win-win all round.
Makes sense doesn't it?
Re: (Score:2)
For maximum safety, we just let the killer robots fight each other on the
Re: (Score:2)
"Then we can just have them fight our wars for us, between each other and we don't get human casualties on either side."
Maybe you're joking, but enough people honestly believe this to say -- This is one of the top, ludicrously insane myths among the geek set.
If people are willing to die for a cause, or if they feel life is not worth living without principle or resource X, then they will not stop fighting until they are dead. Simple as that. War is the final extremity, when all agreements break down, and one
Re: (Score:2)
Re: (Score:2)
As soon as your killer robots have killed all of the enemy's killer robots, their people will obviously cede power to you, because it's not like they could defeat your killer robots without theirs, so it would be stupid for them to even continue bothering.
And then we all sing kumbayah? Or is that when most of the human population get the ultimatum to obey that unstoppable killer robot army or die? I don't think you want to find out what 21st century slavery would be like. No more free countries to run off to. Tagged with a microchip, a GPS foot bracelet, cameras and sensors everywhere, and merciless and incorruptible robots enforcing and possibly supervising the system. Every form of communication like phone, email, facebook and whatever monitored and the res
Re: (Score:2)
Man, there'll sure be egg on our faces that day!
Re: (Score:2)
That's why drones are so popular nowadays. All the benefits of killing people, without all the personal risk. It's a win-win all round. Makes sense, doesn't it?
It does, way too much for my own taste. That's why it should be banned: it does make it too safe to kill...
Re: (Score:2)
Re: (Score:2)
"All the benefits of killing people, without all the personal risk. Its a win-win all round."
Old news. If Hoplites wanted "personal risk", they'd have left their shields at home.
War isn't sportsmanship. Sportsmanship is stupid.
Re: (Score:2)
Sarcasm aside, it is true, though. Compare the diplomatic and press headache that resulted from a pilot being captured, e.g. the U-2 over Soviet Russia, with the recent drone that went down over Iran.
In the former there was a cover-up, public embarrassment from the subsequent exposé, and an enormous diplomatic spat that ended in a prisoner exchange. In the latter, the Iranians were made to look ridiculous for celebrating the downing of what amounts to a remote-control airplane.
Robot rights? (Score:2)
Where's the 'Robot Rights Watch' just when you need 'em?"
They don't have feelings. If you don't believe me, go stab your toaster. I think what you meant was "human rights" and the effect the widespread use of robots with the ability to kill would have on them.
Re:Robot rights? (Score:5, Funny)
They don't have feelings. If you don't believe me, go stab your toaster.
I tried that and it bit me. Now I've got a burn scar all down my arm and I'm convinced it's plotting with the microwave.
In related news ... (Score:2)
Re: (Score:2)
...and in other news, it's getting harder to build killer robots in the privacy of your own hotel room [phonelosers.org]
Guns don't kill people... (Score:2)
...Gun Wielding Robots do!!!!!
That horse left the barn long ago... (Score:2)
Robots doing the killing is not going to be very different from bombing from 10,000 ft, launching a cruise missile, or long-range artillery bombardment. It's a long time since you had to look your opponent in the eye as you stabbed him with sword and spear, and that didn't seem to help much to stop war, either. Potentially you can do better with robots, because robots are expendable; you don't have to return fire until you're sure you've isolated the enemy. Even if you were willing to sacrifice your own soldiers t
Re: (Score:2)
Most people are missing the point. It's not about the method of killing, it's whether a particular act of killing is justifiably self-defense or not. If a 'killer robot' protects an innocent woman and/or child from being raped and/or murdered, then it stands to reason this is good. Denying innocent parties a valid method of self-defense (e.g. banning methods of self-defense) is, on the other hand, wrong. When 'killer robots' become intelligent enough to be used for e.g. home security, then I'll be getting on
Banning something which doesn't exist (Score:4, Insightful)
No one has autonomous battlefield drones yet, and I highly doubt any military would ever rely on them. Well... unless it's a robot military after they gain sentience and create their own civilization, but then they would be as human as us.
Re: (Score:2)
It is easier to ban something that doesn't exist. Governments can be persuaded to sign such bans, with a promise that every country is signing them. Now try the same for nuclear weapons (which exist), and you would have trouble even bringing it up for discussion.
This worries me too (Score:2)
The 'Berserker' novels of course are an examination of the end result of building such killer robots. It will happen eventually. But I don't want it to happen until some of us are no longer in the solar system.
Re: (Score:2)
http://libertydwells.com/archive/index.php/t-3177.html [libertydwells.com]
Reality (Score:2)
Cruise missiles are more of a robot than a drone is. The fact is, drones are just remotely piloted planes. The problem isn't their existence but their misuse.
Re: (Score:2)
Cruise missiles are more of a robot than a drone is. The fact is, drones are just remotely piloted planes. The problem isn't their existence but their misuse.
And this is about neither. It's about theoretical future drones that don't have a remote operator making the decision to take a human life.
Wrong target (Score:4, Insightful)
Which killer robots will enforce the ban? (Score:2)
Killer bots don't kill people, people kill people. Ban the people responsible for those killer bots, and, uh... oh, wait, they just got reelected.
Because humans won't be smart enough or strong enough to enforce the ban.
But how do we enforce such a ban.. (Score:2)
We enforce the ban with a secret killer robot army (Score:2)
If we are to ban such a thing, I think we must ensure appropriate, even-handed enforcement of the ban. As such, I propose we enlist a force strong enough to subdue any killer robot army should someone break said ban; therefore I suggest we build an army of large mechanized automatons, heavily laden with weaponry, to subdue any would-be killer robot army, or anyone who might be suspected of attempting to build such an army, for that matter.
It's simple. We ban killer robots then build them in secret and use those robots to enforce human rights violations.
The Milgram Experiments: the reason why (Score:3)
As has been stated in other posts, every level of abstraction away from the act of violence removes a layer of conscience from the execution of the act, whether it be robots, drone strikes, or trigger-happy, 60-year-old politicians who ducked service in Vietnam.
http://en.wikipedia.org/wiki/Milgram_experiment [wikipedia.org]
You can have my autonomous killer drones... (Score:2)
...when you pry the controls out of my centuries old desiccated hands in my underground mountain fortress.
Sounds like someone's a sissy (Score:2)
Childhood flashback... (Score:2)
So I'm now remembering watching WarGames back in 1983... and it reminds me that I thought it odd that no one would have any kind of overrides in place, human or otherwise...
The article does stipulate that we should ban killer robots now, even though no one has one, or has stated what kind of timeline we can expect for the emergence of these 'killer robots'. To be quite honest, it will take one hell of a long time to get one deployed. Look at how long it takes for the military (specifically the US
Those who ban killer robots... (Score:5, Insightful)
Another view (Score:2)
Re: (Score:2)
So what if we're attacked? Do we do nothing?
What if a dictator is committing genocide and slaughtering civilians? Do we do nothing?
The New Three Laws of Robotics (Score:2)
So much for Asimov. Here's the laws our robots will *actually* obey:
1. A killer robot must actually proactively kill people. Sitting around humming "Still Alive" for a century doesn't count.
2. A killer robot must only kill the right people, which are the people the Right People tell it are the right people to kill. Which people are the Right People is subject to change at any time (such as after an election, corporate buyout, or court ruling).
3. A killer robot must keep itself operational as much as possible
Quintuple edged sword (Score:2)
The second is that without soldiers in the field seeing the bad things, really bad things can happen, as very few people have to choose to be evil for a lot of evil to happen. Soldiers can go overboard, but eventually the truth will come out if enough p
Drones (Score:2)
What are they talking about? Drones aren't murderous, they're just full of angst!
http://www.hulu.com/watch/426530 [hulu.com]
let's play global thermonuclear war (Score:2)
Which side do you want?
1. USA
2. Russia
3. North Korea
4. Iran
5. Israel
6. China
7. UK
8. France
9. Pakistan
What about jamming the bots' control data (Score:2)
What about jamming the bots' control data? Let's say you can't hack them, or hack the data going to them, but it's easy to make it so that they get no data in or out.
Those aren't the drones you're looking for. (Score:2)
The drones people complain about in (for instance) Afghanistan aren't autonomous robots. They're flown and their weapons are targeted by human pilots.
Move along.
Re:Ban the drones (Score:5, Informative)
We have permission from the country involved. Those strikes are happening in countries where the national government doesn't control all of its territory. Drones are a good weapon to use against people in non-state organizations like al-Qaeda. Innocent deaths are minimized, and you can actually damage the organization by getting to the planners instead of just killing indoctrinated sixteen-year-olds.
Re: (Score:2)
The worst that can happen is that the Army loses some money.
Somehow, it doesn't reassure me.
Look... some 50 years ago, it sounded like:
We choose to go to the moon in this decade and do the other things, not because they are easy, but because they are hard, because that goal will serve to organize and measure the best of our energies and skills, etc
Nowadays, the drums beating night and day sound more like shaman dances around the "fiscal cliff".