Human Rights Watch: Petition Against Robots On the Battlefield
New submitter KublaCant writes "'At this very moment, researchers around the world – including in the United States – are working to develop fully autonomous war machines: killer robots. This is not science fiction. It is a real and powerful threat to humanity.' These are the first words of a Human Rights Watch petition to President Obama to keep robots from the battlefield. The argument is that robots possess neither common sense, 'real' reason, nor any sense of mercy, and, most important, lack the option to disobey illegal commands. With the fast-spreading use of drones et al., we are allegedly a long way off from Asimov's famous Three Laws of Robotics being implemented in autonomous fighting machines, or in any (semi-)autonomous robot.
A 'Stop the Killer Robots' campaign will also be launched in April at the British House of Commons and includes many of the groups that successfully campaigned for international action against cluster bombs and landmines. They hope to secure a similar global treaty against autonomous weapons. The Guardian has more about this, including quotes from the well-known robotics researcher Noel Sharkey of Sheffield University."
Recommended Reading (Score:5, Interesting)
http://en.wikipedia.org/wiki/Berserker_(Saberhagen) [wikipedia.org]
Fred Saberhagen's "Berserker" series.
Aside from touching on the subject at hand, it's just some crackin' good sci-fi. :)
I don't know if we'd ever reach that point ourselves, but in that series, an unknown (and now extinct) alien race, losing a war and desperate, created "doomsday" machines that were simply programmed to kill all life. They were self-replicating, self-aware AIs that took their task seriously, too.
Then again, I ask myself what some jihadist might do, if given half the chance...
Re: (Score:3, Informative)
Fear of robots is a red herring (Score:4, Insightful)
All indications are that the coming robotic revolution will usher in a new era of human peace and prosperity. Robots have no emotion and no bias. Imagine deploying a few hundred (or thousand) semi-autonomous robotic peacekeepers into a conflict zone. They maintain the peace 24/7, they never tire, and they stay alert and objective in their duties. War is traditionally an incredibly wasteful and expensive exercise. Look at Iraq and Afghanistan: $1 trillion and thousands of allied casualties. Deploy a robot army and watch the costs come down. No need for living quarters, no need for food or water; logistics becomes cheaper in every respect.
Like them or loathe them, drones are incredibly efficient at what they do. They are very lethal, but they are precise. How many innocents died in the decades of embargo on Iraq and the subsequent large-scale bombings under Bush? Estimates run to over 100,000. The use of drones in Libya, Mali, Yemen, and Pakistan has reduced costs by hundreds of millions and prevented thousands of needless casualties. Drones are the future, and the US has an edge that it will not give up.
Re: (Score:3)
Re:Fear of robots is a red herring (Score:5, Insightful)
An autonomous robot needs to form a model of what's happening around it, use that to figure out what its possible long- and short-term actions will be, and finally decide how desirable various outcomes are relative to each other. All of these steps are prone to bias, especially since whoever designed the robot and its initial database is going to have their own biases.
Also, a robot acting in real life cannot carefully think everything through. There's simply not enough time for that. This necessitates some kind of emotion-analogue to provide context for reflexes and simple actions, just as it does in living beings.
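To make that concrete, here's a minimal sketch of the pipeline described above (world model, candidate actions, desirability scoring) in Python. Every name, weight, and threshold here is hypothetical, invented for illustration; the point is just that the designer's bias lives in the scoring weights and the initial data rather than in any single auditable line, and that the frozen weights are exactly the kind of "emotion-analogue" shortcut mentioned above: precomputed context so the robot doesn't have to think everything through in real time.

    # Hypothetical sketch: model the world, enumerate candidate actions,
    # score desirability. The designer's bias is baked into the feature
    # weights and the initial database, not into any single auditable line.
    from dataclasses import dataclass

    @dataclass
    class WorldModel:
        threat_level: float      # sensor-derived estimate: already bias-prone
        civilians_nearby: float  # classifier output: also bias-prone

    def candidate_actions(model):
        actions = ["hold_position", "withdraw", "warn"]
        if model.threat_level > 0.5:
            actions.append("engage")
        return actions

    # The "desirability" weights are the designer's values, frozen at build time.
    WEIGHTS = {"hold_position": 0.1, "withdraw": 0.0, "warn": 0.3, "engage": 0.6}

    def choose(model):
        def score(action):
            penalty = model.civilians_nearby if action == "engage" else 0.0
            return WEIGHTS[action] - penalty
        return max(candidate_actions(model), key=score)

    print(choose(WorldModel(threat_level=0.8, civilians_nearby=0.2)))  # -> engage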
So there will be a lot more "interventions", since the cost (to you) is lower. I think that's part of what worries the HRW.
Re: (Score:3)
Also, a robot acting in real life cannot carefully think everything through. There's simply not enough time for that. This necessitates some kind of emotion-analogue to provide context for reflexes and simple actions, just as it does in living beings.
Why does it need to think at all? It's applying rules, not creating them.
"Sir, my scans have detected unauthorized weapons. Please put them down or I will apply force."
Re: (Score:3)
I'm going to suggest that creating a machine with a comprehensive list of rules & priorities for enforcement is going to be much more feasible than creating a machine with the ability to think and apply reason.
It also has a better chain of responsibility in the case of things going FUBAR. "The CEO signed off on the robot applying this rule" vs. "the robot may have had a glitch in its Reason module, who knows?".
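A minimal sketch of that idea, with hypothetical rule names: an ordered, first-match-wins rule table where every rule carries a sign-off field, so the chain of responsibility is auditable when things go FUBAR.

    # Hypothetical first-match-wins rule table. Each rule records who
    # signed off on it, so responsibility is traceable after the fact.
    RULES = [  # highest priority first
        {"when": "weapon_aimed_at_unit", "do": "apply_force",       "approved_by": "CEO"},
        {"when": "unauthorized_weapon",  "do": "issue_warning",     "approved_by": "CEO"},
        {"when": "curfew_violation",     "do": "record_and_report", "approved_by": "Legal"},
    ]

    def decide(observations):
        """Return (action, sign-off) for the highest-priority matching rule."""
        for rule in RULES:
            if rule["when"] in observations:
                return rule["do"], rule["approved_by"]
        return "no_action", "n/a"

    action, signed_off = decide({"unauthorized_weapon"})
    print(action, "-- rule approved by:", signed_off)  # issue_warning -- rule approved by: CEO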
Re:Fear of robots is a red herring (Score:4, Funny)
"Not your pants, sir."
"Mooning a Robotic Law Enforcement Unit is a Class I misdemeanor. Applying taser."
Re:Fear of robots is a red herring (Score:4, Insightful)
Re:Fear of robots is a red herring (Score:5, Interesting)
It is a red herring. The problem is not robots in war. There's no big difference between using robots and drones in wars and using cruise missiles, and only a slight difference from using soldiers who have been conditioned/brainwashed to follow orders unquestioningly.
The real problem is the ease of starting wars that only benefit a very few people. Hence my proposal: http://slashdot.org/~TheLink/journal/208853 [slashdot.org]
In the old days kings used to lead their soldiers into battle. In modern times this is impractical and counterproductive.
But you can still have leaders lead the frontline in spirit.
Basically, if leaders are going to send troops on an _offensive_ war/battle (not defensive war) there must be a referendum on the war.
If there are not enough votes for the war, those leaders get put on deathrow.
At a convenient time later, a referendum is held to redeem each leader. Leaders who do not get enough votes are executed; for example, if too many people stay at home and don't bother voting, the leaders get executed.
If it turns out later that the war was justified, a fancy ceremony is held, and the executed leaders are awarded a purple heart or equivalent, and you have people say nice things about them, cry and that sort of thing.
If it turns out later that the leaders tricked the voters, a referendum can be held (need to get enough signatories to start such a referendum, just to prevent nutters from wasting everyone else's time).
This proposal has many advantages:
1) Even leaders who don't really care about those "young soldiers on the battlefield" will not consider starting a war lightly.
2) The soldiers will know that the leaders want a war enough to risk their own lives for it.
3) The soldiers will know that X% of the population want the war.
4) Those being attacked will know that X% of the attackers believe in the war. So they want a war, they get a war; for sufficiently high X, collateral damage becomes insignificant. They might even be justified in using WMD and other otherwise dubious tactics. If > 90% of the country attacking you want to kill you and your families, what is so wrong with using WMD, as long as it does not affect neighbouring countries?
I think if this was implemented it would be much better than banning robots. I'm biased of course ;).
Re: (Score:3)
Re:Fear of robots is a red herring (Score:5, Interesting)
A couple of issues.
1) Software can be hacked... either partially or totally. Maybe just putz with the friend-or-foe logic, maybe take direct control, etc. Sure, humans can be blackmailed and extorted, but usually on an individual basis. Mass-putz with a regiment or squad and you have serious issues. Such as, perhaps, with those drones protecting the US (if they ever become truly robotic).
2) It does make war a bit more meaningless. If you aren't facing emotional losses, then there's little reason NOT to go to war. If it's not personalized... then who cares? Sure, even now we have sympathy for the other side, and protests and such... but the majority of the people who care do so mostly because our brothers / sisters / sons / daughters / etc. are out there possibly dying. That helps push back on the question "should we actually GO to war with them?"
3) There ARE concerns about self-aware armed robots. Make them too self-aware, and maybe they realize that the never-ending violent slaughter of humans is contradictory to their goal of preserving their owners' lives. In which case they take an OVERLY logical route to preserve the FUTURE "needs of the many" by doing PLOTLINE X. Sure, it sounds like bad sci-fi... but as you say, they have no emotions, only logic. Take away emotion, and we become like cattle... where they cull the herd over a few random mad-cow cases to save the majority.
Re: (Score:2)
Really? Given modern analysis techniques, how hard do you think it would be to program such a robot to have bias based on factors like skin color, facial structure, attire, presence of RFID/radio-ident/FoF/etc. tags, language or even accent?
You think people WOULDN'T add this logic in?
Re: (Score:2)
There are all indications that the coming robotic revolution will usher in a new era of human peace and prosperity... War is traditionally an incredibly wasteful and expensive exercise.
So you are making the argument that making war cheaper and easier to launch will result in more peace and prosperity? It's so bad already that we don't even count civilian casualties accurately.
Re: (Score:2)
Re: (Score:3)
Re: (Score:3)
They've taken the Soviet Union's place as American Boogeyman #1, which is a pretty darn impressive accomplishment on their side and just plain sad on America's.
You are worrying about a bunch of third-world priests and their followers building a high-tech weapon that the American Army - or any first-world country - can't out-high-tech. And it got modded +5 Interesting. Come on.
Re: (Score:2)
> You are worrying about a bunch of third-world priests ...
Give those "priests," say, a nuclear weapon, or (to stay on topic) a self-replicating doomsday robot programmed to kill all infidels, and yes, I would worry about that. Advanced weaponry is the Great Equalizer(tm). :)
Besides, I would rather believe that I was modded "interesting" because of my recommendation of Saberhagen. His stories are much better than anything I could say. :)
-- Stephen
Deal with it. (Score:2)
Re: (Score:3)
It'd be like a real-life game of StarCraft, with humans controlling the groups of robots remotely.
Re:Deal with it. (Score:4, Insightful)
You are describing your own fantasy rather than a reasoned prediction.
Surely once the robots break through the curtain of defenders, they will begin quite efficiently to kill the civilian population and destroy their infrastructure. How would robots even distinguish between them? (In fact, this is a difficulty for human soldiers today.) Is it not likely that civilians would attempt, at the last, to defend themselves and their families also?
The hope for humanity is not that the winners will somehow be more virtuous than the losers. Our only hope is that, as the consequences of armed conflict escalate, the number and severity of conflicts will dwindle.
Re: (Score:2)
" with no intentional lost of human life."
Yeah, as long as the wining side chooses not to wipe out the humans on the losing side, since they'll have no robot protection anymore.
I'm sure that'll never happen.
Endorsements (Score:2)
This message is sponsored by Sarah and John Connor. With special consideration from Morpheus, Trinity, and Neo.
These are not the droids you're looking for (Score:5, Informative)
Hey, James Cameron, are you the submitter??
The autonomous Terminator-style robots the summary refers to are far from becoming a battlefield standard, much to the disappointment of the /. crowd and sci-fi nerds.
Predator drones et al., like all current robotic devices in the battlefield, still have a human being in charge making all the decisions, so the points raised are completely moot.
Re: (Score:2)
Are they really just remote controlled devices rather than autonomous "robots"?
Re:These are not the droids you're looking for (Score:5, Interesting)
Yes and no: especially sophisticated autonomous robots, either self-driving vehicles or biomimetic killbots of some sort, are sci-fi stuff; but land mines 'That's bi-state autonomous area denial agent sir to you, cripple!' and more sophisticated devices like the Mark 60 CAPTOR [wikipedia.org] are autonomous killer robots.
And, so far, they've proven deeply unpopular in bleeding-heart circles. The fancier naval and anti-vehicle mines are still on the table; but the classic land mine enjoys a sense of ethical distaste only slightly less than just hacking off children's limbs yourself...
Re: (Score:3)
Landmines are the perfect example of existing autonomous technology. Next steps would be, I imagine, drones that fly themselves home if jammed. Still pretty innocuous but a step into automation.
Also imagine a first-generation turret. Automated target acquisition based on stereo imaging and stereo microphones. The first models would require an operator to approve the target. But the systems are so much faster than us - soon you'd want to be able to approve a target area, hold down the "OK" button, and have the turret do the rest.
Re: (Score:2)
That Mark 60 CAPTOR is quite interesting with its audio detection of submarines.
How long before someone relaxing in their boat discovers the right song to trigger a false positive?
Then how many more blown up civilians before they figure out which song it is?
Re: (Score:3)
At least we know definitively that the trigger song isn't Margaritaville. That would have been a disaster.
Re: (Score:2)
But we shouldn't wait until the autonomous drones arrive.
Re: (Score:3)
Disappointment or not, the problem is somewhat different. In fact, it has existed for quite a while, ever since people invented time bombs, remote-controlled bombs, suicide bombers, and such.
The issue at hand is the following: war is about killing people and destroying stuff. People on the battlefield facing each other turned out to be counter-productive in this regard, as exemplified on many occasions toward the end of the 1st World Massacre. After a long period of constant threat of death, patriotism, religious fana
samson (Score:4, Interesting)
http://en.wikipedia.org/wiki/Samson_RCWS [wikipedia.org]
These turrets count, I think. Israel has at times said it is keeping a man in the loop, but the technology doesn't require it, and at times they have said the turrets are in "see-shoot" mode. This is essentially indiscriminate area denial that is easier to turn off than mines. It does have the computer-vision and targeting aspects of a killer robot, just not the path-finding and obstacle-avoidance parts.
Drones are Piloted (Score:2)
Re: (Score:2)
The drones America uses are piloted by humans. The other robot in use by the military is the one that disables bombs; it is also remote-controlled by a human. I don't think the military has any non-piloted robots deployed in combat. Even a turret would be too dangerous: an automated turret could kill our own troops. The closest thing we have is landmines.
Are you sure about that?
Re: (Score:3)
I don't think the military has any non-piloted robots deployed in combat. Even a turret would be too dangerous.
Ever hear of the Phalanx CIWS? [wikipedia.org] Automated turrets placed on aircraft carriers and on bases in the Middle East to shoot down incoming mortars and rockets. Something capable of firing 4,500 20mm rounds per minute can be very deadly. Because human reaction time is too slow, these turrets DO fire automatically.
Re: (Score:3)
But please, only when used by a well-regulated militia.
Re: (Score:3)
Bought one last week. My well-regulated militia is very interested in not being killed by a Hellfire missile fired on Obama's orders. We are American citizens, after all, and subject to assassination orders from the President.
And in case anyone is too dense to recognize the sarcasm.... /sarcasm
Re: (Score:3)
also...
Re: (Score:2)
Re: (Score:3)
They are set to guard an area against any radar-detectable objects, and most importantly, they do NOT have IFF. They have only trajectory limits and min/max target speeds; anything traveling in the area that is heading in the wrong direction and is within the set speed range is fired upon. I believe they have already shot down one friendly aircraft, which entered the kill zone while towing a target drone.
They're as close to indiscriminate killing machines as we have. They're self-contained weapon
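The gate described is simple enough to sketch. This is only a guess at the shape of the logic (the real CIWS doctrine is classified) and every threshold below is invented: a speed envelope plus an inbound-heading check, and, pointedly, no check of what the object actually is.

    # Hypothetical CIWS-style engagement gate: no IFF, just a speed
    # envelope and an inbound-trajectory check. All numbers invented.
    MIN_SPEED = 150.0   # m/s: ignore slow clutter (birds, small boats)
    MAX_SPEED = 1500.0  # m/s: ignore tracks faster than plausible threats

    def should_engage(track_speed, track_heading, bearing_to_ship, cone_deg=15.0):
        """Fire on anything in the speed envelope whose heading points at
        the ship to within cone_deg. Note what is missing: any notion of
        WHAT the object is. A friendly towing a target drone through the
        kill zone passes this gate just as well as an incoming missile."""
        if not (MIN_SPEED <= track_speed <= MAX_SPEED):
            return False
        # Smallest angle between the track's heading and the bearing to us.
        off_axis = abs((track_heading - bearing_to_ship + 180.0) % 360.0 - 180.0)
        return off_axis <= cone_deg

    print(should_engage(300.0, track_heading=92.0, bearing_to_ship=90.0))   # True
    print(should_engage(300.0, track_heading=150.0, bearing_to_ship=90.0))  # False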
Re: (Score:3)
What you will probably see at first is one human in charge of 5-10 drones. The drones act 'autonomously' and the controller can take over any of them. Then, as they get comfortable with the tech, you will see something like 1 to 50. Then they will take the 'commander' out of the loop and put it in the hands of 'strategy committees'. Then they will let the computers fight out what, from our point of view in the command 'bunker', is a large RTS game.
It's already one pilot to multiple drones. Given that one of the major features of the things is long endurance/loiter times, and that they possess some limited automation of basic flight functions (i.e. unlike a 'basic' RC aircraft, where every control surface is directly mapped to a joystick on the controller and the pilot has to compute the control-surface configuration that gets the path he wants), a single person can watch over multiple drones at a time, and (so long as the standing order is some variation of
Re: (Score:2)
I had a friend in the AF at Barksdale. He said he was kinda weirded out when a drone landed and taxied over to his fueling station. He made a joke that he expected it to say "Feed me, meatbag." I told him the feeling he had was foreboding.
It's the same as bio-warfare (Score:5, Interesting)
Depending on how one defines "robot"... (Score:2)
Depending on how one defines "robot", this will be extremely unlikely.
Effectiveness trumps morality every time. (Score:5, Insightful)
I don't mean to be the dark figure in this conversation, but I think it's inevitable that robots will be used on the battlefield, just like people are going to continue to use cluster bombs, land mines, dum-dum bullets and other horrible devices. The reason is that they're effective.
War is a measurement of who is most effective at holding territory. It is often fought between uneven sides: for example, the Iraqi army in their 40-year-old tanks going out against the American Apaches, which promptly slaughtered them. Sometimes there are seeming upsets, but often there's an uneven balance behind the scenes there as well.
Robots are going to make it to the battlefield because they are effective, not as killing machines but as defensive machines. They're actually an improvement over land mines. The reason is that you can programmatically define "defense," whereas offense requires more complexity.
South Korea is already deploying robotic machine-gun-equipped sentries on its border [cnet.com]. Why put a human out there to die from sniper fire when you can have armored robots watching the whole border?
Eventually, robots may make it to offensive roles. I think this is more dubious because avoiding friendly fire is difficult, and using transponders just gives the enemy homing beacons. In the meantime, they'll make it to the battlefield, no matter how many teary people sign petitions and throw flowers at them.
Re: (Score:2)
This is not so different from the "limited air war" doctrine the US practiced for 20+ years between Vietnam and Desert Storm, or the drone war today. I don't like it, either.
The 3 laws are fiction (Score:5, Insightful)
How many times must it be said? Asimov's three "laws" have nothing to do with real robotics, present or future. They were a _plot device_, designed to make his (fictional) stories more interesting. Even mentioning them in this context implies ignorance of actual robotics. In reality, robot 'brains' are computers, programmed with software. Worry more about bugs in that software, and about the lack of oversight of the people controlling them.
Re: (Score:3)
Quite.
The day we get a robot that can understand, interpret, and infallibly carry out the "three laws," we won't need the three laws: it will have surpassed average human ability and could probably reason for itself better than we ever could. We would literally have created a "moral" robot with proper intelligence. At that point, it would be quite capable of providing any justification for its actions, and even of deciding that the three laws themselves were wrong (like the "0th law" used as a plot device
Re: (Score:3)
Re: (Score:2)
When I was a young science fiction reader, I liked the three laws because they added complex intellectual puzzles to science fiction short stories.
After I grew up, I realized that nowadays they function as an off switch for human imagination. They subtract value when they are mentioned in the context of real robots that we are currently building.
Some people get mentally stuck in a particular fictional universe like Star Trek or I, Robot, but these are just ideas about the future. They are not predictions
As long as citizens can have them we are cool (Score:2)
Re: (Score:3)
Killer robots can't be a government only option =D
"Killer robots don't kill people, people with killer robots kill people! Wait, um, no, actually, killer robots do kill people!"
Re: (Score:2)
Re: (Score:2)
Killer robots are people*!
*sentient beings seeking equal rights.
Re: (Score:2)
Probably only billionaire citizens.
You want a Bill Gates model?
The shotgun was outlawed by the Geneva Convention (Score:3)
This led to clever people developing submachine guns.
Give it a couple of decades and you'll be able to download plans for your own battlebot and then create it on your printer.
Total Garbage. (Score:5, Interesting)
This article is absolute garbage. Almost everything in that Guardian article is misinformed and sensationalist.
"fully autonomous war machines"? Care to give an example? I've follow this stuff pretty closely in the news on top of researching AI myself. And from what I have seen no one is working on this. Hell, we've only just started to crack autonomous vehicles. They site X-37 space plane for gods' sake. Everything about that is classified so how do they know it is autonomous?
My favourite gem has to be this one: "No one on your side might get killed, but what effect will you be having on the other side, not just in lives but in attitudes and anger?". Pretty sure that keeping your side alive while attacking your opponent has been the point of every weapon that has ever been developed.
Re: (Score:3, Informative)
A huge amount is known about the X-37 [wikipedia.org], seeing as it's a redirected NASA project. It's capable of autonomous landing, and it's widely assumed that it performed its primary reconnaissance mission autonomously, seeing as it's basically a glorified spy satellite capable of a controlled re-entry.
We already have fully autonomous combat aircraft that can be pointed at a target and perform complex maneuvers in order to reach and subsequently destroy it. They're called cruise missiles. You're hopelessly naive if you th
Re: (Score:2)
http://en.wikipedia.org/wiki/Phalanx_CIWS#Operation [wikipedia.org]
All it needs is power and cooling water. No human interaction required.
you want MORE robots, not less (Score:2)
robots killing robots
wars settled in a clash of machinery without any humans for miles around
Re: (Score:2)
And to extend that: as long as a lot of non-combatants are killed, you guarantee a steady supply of replacement "insurgents". A vicious circle, as it is.
Re: (Score:2)
so, what's your point?
if someone is going to fight you, and you can afford to put out a robot to fight them instead of your own flesh and blood, are you saying we shouldn't do this?
Re: (Score:2)
if we did that, we'd be fucking evil
but that's not what we're going to do, and that's not what i said
why do you think you win arguments by grossly changing and misrepresenting the subject matter?
if some insane ideology has no problem throwing young men into the maw of war, why can we not respond with robots instead of our own young men?
it's a fair question
now are you going to answer it, or are you going to change the subject matter to a gross distortion that has absolutely nothing to do with what i said?
Re: (Score:2)
it depends upon the programming
if the robot is out to target only certain behavior, certain combatants, and only them, then we can say two things:
1. this is obviously superior to carpet bombing, and your comparison to that is obviously wrong
2. this is perhaps even superior to human behavior, since human judgments are not always sound, and humans commit atrocities and mass murder themselves
it might be MORE moral to use robots
most importantly (Score:2)
Robots are not alive.
There is no true sacrifice of blood and souls when robots take the place of soldiers in battle. In my opinion, that brings them up to the level of WMD in terms of being able to inflict loads of casualties with little risk to the aggressor.
Re: (Score:2)
Lol (Score:2)
Only humans should be on the battlefield killing each other, robots killing robots is just so inhumane.
Did these people not hear about WWII? (Score:2)
Without "autonomous war machines" we've managed to firebomb cities (with a nice 3 hour gap between bombing runs so that fire fighters and so on would be putting out the first run's fires when the second run hit), mass murder civilians, drop atomic bombs on cities, use chemical weapons, and everything in between. I don't think feelings of mercy and pity and an ability to not follow illegal orders makes much of a difference.
Video Game War (Score:2)
My main problem with using robots (or, more likely, remotely piloted-semi-autonomous war machines) on the battlefield is that it makes war too easy. Right now, drones aside, war is a costly matter. You need to put actual lives at risk and that acts as a check on what generals/politicians would want to use troops for. Want to invade North Korea and Iran to stop them from being a threat once and for all? Well, that's going to wind up costing tons of lives which is going to make it harder to sell to the pu
Daleks (Score:2)
Re: (Score:2)
In that respect they were more like the Mobile Infantry in Starship Troopers; they just didn't bother making the suits as anthropomorphic.
Mercy (Score:2)
There have been many situations where you've had humans on the ground, one gets killed, and the slain soldier's buddies snap and decide to massacre an entire village. I'm not really sure what part of merciful warfare autonomous robots are threatening.
A few flaws in the author's reasoning... (Score:2)
The author gave the following reasons against autonomous war robots:
|| Robots possess neither common sense... ||
Me: This is true, but isn't that the point? Someone behind the curtain has the common sense. For example, the current generation of drones in use aren't intelligent, but the people flying them are making the decisions (or rather, their superiors are). We need to separate the ED-209 vs. drone conversation, as I'm pro-drone, anti-ED-209-style military robot.
|| 'real' reason ||
Me: See above. Our current dron
I'd rather deploy robots... (Score:2)
Here's hoping we eventually get to the point where both sides just deploy robots, and whichever has robots standing at the end wins.
Let's stop wasting young lives.
Friendly Natural Intelligence (Score:2)
similar aguments against tanks & machine guns (Score:2)
The real jump would be machine-decided (A.I.) killing. For the most part there is a man in the decision loop, even with the new Israeli "Iron Dome" missiles, where the operator has seconds to decide whether to launch. (More of a financial decision because
Thoughts (Score:2)
I have put a lot of thought into combat robots, particularly airborne ones. I think they're really an inevitable development.
I don't have a problem with robots maneuvering themselves over a battlefield. I don't have a problem with a robot killing someone. I don't even have a problem with giving it a target, and letting it decide the best way to eliminate it.
The only provision I would require is that we not have it select its own targets. There should be a human operator somewhere telling it what it should b
Re:I want that! (Score:4, Interesting)
Re:I want that! (Score:4, Informative)
His machines weren't "robots" any more than Predator drones are: they were remote controlled by radio.
Yes. That is probably why he stated that they were remote-controlled.
Re: (Score:2)
snip...
We are afraid of body bags coming home and we're afraid of collateral damage.
snip...
Hmm
I don't believe the military is afraid of collateral damage; maybe of the reporting of it and the negative publicity, but certainly not of inflicting it. That's one reason the collateral-damage figures in Iraq (those inflicted by the alliance) were not reported; indeed, they lied at the time and said the figures were not even being recorded.
Re:Obama already leads the way (Score:5, Insightful)
In robot drone murders, and you morons think he will sign something? Obama, a Nobel Peace Prize winner who has killed the most innocent women and children yet!
I believe Yasser Arafat, Henry Kissinger, Yitzhak Rabin, Shimon Peres, Menachem Begin, and Le Duc Tho all currently lead Obama in the "Number of Innocents Killed by a Nobel Peace Prize Winner" race.
Re: (Score:2)
Re: (Score:3)
Obama already leads the way in robot drone murders, and you morons think he will sign something? Obama, a Nobel Peace Prize winner who has killed the most innocent women and children yet!
(1) The drones aren't robots, they are controlled by a live operator and wouldn't even be covered by this proposal.
(2) It's really quite a stretch to believe that drone attacks have killed more innocents than Kissinger [wikipedia.org]'s wholesale bombing of Cambodia or Arafat [wikipedia.org]'s indiscriminate suicide bombing attacks.
But yeah, don't let facts get in the way of a good flame ...
Re: (Score:2)
Re: (Score:2)
Stalin's regime (officially) executed between 3.5 and 5 million people (most of it in the post-civil-war era; check how it went in post-revolution France). Even assuming all of them were innocent, how can you compare that to what Hitler did and conclude that he did less?
Stalin was an asshole, but putting him ahead of Hitler (who was fine with exterminating an entire nation) shocks me. You took the anti-kommie propaganda too seriously, guys.
Re: (Score:2)
It is not the "officially" executed that counts; it is the millions who died of starvation as a result of his policies. By that measure Stalin is way beyond Hitler. That includes the millions who died because Stalin had "purged" the Red Army of all the effective officers who could have stopped Hitler much, much sooner.
Re: (Score:3)
Well, no. In the first place, the USSR tried to negotiate with France and Britain, but the negotiations bogged down and Hitler made a better offer, basically agreeing that the USSR could take back the territories it had lost in 1917 (the Baltic states) and in 1919-1921 (when Poland invaded the USSR, taking a large chunk of Ukraine and Belarus).
Then Germany occupi
Re: (Score:2)
Re: (Score:2)
When did Stalin or Hitler win the Nobel Peace Prize?
Back then, the Nobel Committee still had some shreds of dignity left. Yet I'd count the hundreds of millions of children forced to sing songs in school that call Stalin the "sun of humanity", the "father of peace", and so on.
Re: (Score:3)
Re: (Score:2)
Re: (Score:2)
There's a lawyer standing behind the drone pilot. He's there to make sure no laws are violated. So it isn't the drone, or the ROV pilot, it's the lawyer who makes the kill decision. So if you are complaining about it, ask yourself who makes the laws? More importantly, in other countries that are about to become drone capable, what sorts of laws do they have preventing arbitrary kills?
Re: (Score:2)
So you are saying that both are deeply distasteful and likely to result in casualties among civilians who are 'innocent' by any definition of the word? Or was this one of those 'we have to stoop to their level to stop those animals' arguments?
(Incidentally, unless the extremist was fucking a defense contractor, I bet the kid and not the robot was produced on time and under budget...)
Re: (Score:2)
The evolution of humans means it takes 4 years to get another 4 year old kid to put a bomb on.
The evolution of Ford Motor Company means it takes 4 hours to get another killbot off the factory line.
Re: (Score:2)
Transform and roll out!
Re: (Score:2)
if we could just get the robots to only fight other robots...
Yes. Then, all your enemy would have to do to defeat your robot army, is to send a human army.
Re: (Score:3)
In the Asimov books, the inventor of the Robot Brain pretty much designed the Positronic Brain so that the whole underlying foundation was a large spaghetti of stuff... and the brain wouldn't function without it. And part of the spaghetti was the 3 laws... remove them and it all falls apart like a house of cards.
So it wasn't so much an issue of "Manufacturers installing the 3-laws-patch" but that the 3-laws were built into the brain's foundation. And that there weren't really ways to ma
Re: (Score:2)
Oh shit, open source really is going to destroy the world.
Re: (Score:2)
These days it's everywhere.
That's not a new phenomenon for our species - heck, back in the days of ancient Greece, one couldn't throw a stone without hitting at least one or two "oracles."
Of course, they at least had the excuse of rampant mercury poisoning...