Robotics / The Military / Hardware

'Ban Killer Bots,' Urges Human Rights Watch 297

Posted by Unknown Lamer
from the assessment-will-not-be-by-humans dept.
Taco Cowboy writes "A self-proclaimed 'Human Rights Group' — the 'International Human Rights Clinic' from Harvard Law School — has teamed up with 'Human Rights Watch' to urge the banning of 'Killer Robots.' A report issued by Human Rights Watch, titled 'Losing Humanity,' claims autonomous drones that could attack without human intervention would make war easier and endanger civilians. Where's the 'Robot Rights Watch' just when you need 'em?"
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by OrangeTide (124937) on Monday November 19, 2012 @09:02PM (#42034603) Homepage Journal

    We should go back to using cruise missiles and carpet bombing.

    • Re: (Score:2, Informative)

      by Icegryphon (715550)

      Agreed, Winston Churchill had it right. William Tecumseh Sherman had it right. Destroy every bit of the enemy's infrastructure and they won't have anything to wage war with.

      • by Trepidity (597)

        "Totaler Krieg – Kürzester Krieg" ("total war, shortest war"), as they say

      • by Anonymous Coward

        A scorched-earth policy, where you knock down granaries, cripple tractors and plows, break down dams, and salt the earth if you have to, is not really the kind of society we wish to represent.

    • by ThatsMyNick (2004126) on Monday November 19, 2012 @09:19PM (#42034769)

      Nope, but we don't need fully autonomous killer robots either. Would you rather have a robot than a human determine whether a target is worth killing?

      • I would trust the robot more. You could program it to not take things like emotions into account. You can have it judge if someone is hostile or a combatant and only exercise the force required. Humans are far more likely to overreact.

        • A robot controlled remotely by a human (even if only for the kill) would accomplish the same. The human is so detached that it would be easy to control his emotions.

          • And yes, those humans do violate the various rules that cover war, and do shoot into crowds of civilians. I would say that so far that approach is not working very well.

            • Assuming humans still control these autonomous robots, autonomous robots wouldn't solve these problems either. Would you expect these autonomous robots to refuse an order given by their human commander? Would you expect these robots to be programmed to be capable of refusing a command?

              • They absolutely should be programmed to refuse orders like that. However, I don't expect that to happen, which is very, very sad.

              • 1) A robot may not injure a non-combatant or, through inaction, allow a non-combatant to come to harm.
                2) A robot must obey the orders given to it by its masters, except where such orders would conflict with the First Law.
                3) A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
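                Asimov pastiche aside, what those three laws encode is just a strict priority ordering, which can be sketched in a few lines of toy Python. Everything below is a hypothetical illustration (made-up names, not any real targeting or control system):

```python
# Toy sketch: the three modified laws as a strict priority ordering.
# All names here are hypothetical illustrations, not a real control system.
def may_execute(order, would_harm_noncombatant, endangers_self):
    """Return True if a law-bound robot may carry out `order`."""
    # Law 1: never harm a non-combatant; this overrides everything below.
    if would_harm_noncombatant:
        return False
    # Law 2: obey any order from the masters (Law 1 already vetted it),
    # even at the cost of the robot's own existence.
    if order is not None:
        return True
    # Law 3: with no order in play, self-preservation finally applies.
    return not endangers_self

# The master still decides what counts as a "non-combatant":
may_execute("strike", would_harm_noncombatant=True, endangers_self=False)   # False
may_execute("strike", would_harm_noncombatant=False, endangers_self=True)   # True
```

                Note that the whole argument downthread is about who supplies the `would_harm_noncombatant` flag.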
                • Sounds good, except the master gets to decide who the non-combatant is.

                  • s/non-combatant/non-combatant according to hard-coded algorithm/
                    • by sFurbo (1361249)
                      Many armies follow the Geneva convention, or at least pretend to. They do this, not out of concern for the enemy, but out of concern for their own troops. Going to war is psychologically hard. You have to deliberately do the most basic thing we have been drilled is bad: kill people. This makes people break down. The break-down is slower if you believe you are at least following some rules. It might be the same for these robots, at least as long as they have human pilots: The pilots might last longer if they
          • A robot controlled remotely by a human (even if only for the kill) would accomplish the same. The human is so detached that it would be easy to control his emotions.

            Haven't we seen this movie before?

            If we uplink now, Skynet will be in control of your military. But you'll be in control of Skynet, right?

            • by Electricity Likes Me (1098643) on Tuesday November 20, 2012 @07:25AM (#42038821)

              A robot controlled remotely by a human (even if only for the kill) would accomplish the same. The human is so detached that it would be easy to control his emotions.

              Haven't we seen this movie before?

              If we uplink now, Skynet will be in control of your military. But you'll be in control of Skynet, right?

              Also, it's not actually true that people are more detached. The rates of PTSD among drone controllers are apparently ridiculously high for people who are effectively non-combatants. Soldiers in the field are more or less stuck in "kill or be killed" when contact happens, whereas a guy flying a drone always knows he could have simply not pushed the button.

    • We should go back to using cruise missiles and carpet bombing.

      Where do you draw the definitional line? Isn't a cruise missile a robot that kills people?

      • Where do you draw the definitional line? Isn't a cruise missile a robot that kills people?

        Someone had to push the launch button.

        • by Algae_94 (2017070)
          And someone has to turn the robot on. That isn't a fine enough distinction to separate cruise missiles from robots.
          • And someone has to turn the robot on. That isn't a fine enough distinction to separate cruise missiles from robots.

            The person involved still gives a target location (which generally has collateral damage taken into consideration) for a cruise missile that it can hit quite accurately. When you turn the robot on, you're not sure who it's going to kill where. It will be a very long time before a machine can make those kinds of decisions with respect to collateral damage and civilian casualties.

            There's a pretty damn big distinction between "blow up that building (and cease to exist afterwards)" and "go hang out over there a

  • by osu-neko (2604) on Monday November 19, 2012 @09:13PM (#42034699)

    That's nothing new. It's no different than land mines...

    Oh, wait... [wikipedia.org]

  • Human rights (Score:4, Insightful)

    by Marxdot (2699183) on Monday November 19, 2012 @09:13PM (#42034705)
    Why would you deride human rights groups, Taco Cowboy? And yes, drones that attack autonomously are a very bad idea.
  • by wierd_w (1375923) on Monday November 19, 2012 @09:14PM (#42034713)

    While cliché, take a look at "WarGames".

    Abstracting away the reality that you are killing people, by making a machine do the actual deed after deployment removes the innate guilt of killing those people.

    It makes it fantastically easier to justify and ignore wholesale slaughter.

    A glitch in the program makes the drone think that anyone carrying a cylinder 2 ft long and 1 inch in diameter is a combatant? (Looks like a gun barrel!) Well, all those poor fuckers carrying brooms and sweeping their patios had it coming! Never mind those uppity pool boys with dipnets! Can't make an omelette without breaking some eggs, right?!

    When you can simply push a button and walk away without having to witness the atrocities you cause, you abstract away a fair bit of your conscience.

    The military probably thinks that's a GREAT thing! Kids with guns won't cause mental train wrecks for drone operators when they get mowed down, and the operator doesn't have to see it!

    The reality is that deploying terminators is the same as turning a blind eye to the consequences, to the innately terrible thing that war is, and to why it should always be avoided whenever and however possible.

    • by ThePeices (635180) on Monday November 19, 2012 @09:26PM (#42034847)

      But we have all been taught from an early age that it is wrong to feel guilt for killing bad guys. If you feel guilty, then you are *for* the bad guys, and therefore one of *them*. (Remember, it's a binary good/evil world we live in, amiright?)

      Killing bad guys is doing your country a service, we are taught. We are making the world a better place, a safer place, when we kill our enemies.

      This we are taught. If any one disagrees with that, then they are unpatriotic, and aiding and abetting the enemy.

      This we are taught, so it must be true.

      • by wierd_w (1375923) on Monday November 19, 2012 @09:42PM (#42035031)

        What is an enemy?

        Is it a person who wishes to do you harm?
        A person who wants to take something you have?
        A person with whom you disagree?
        Or just someone in the way of what you want to do?

        In a war, do not both sides, regardless of the motives of either side, satisfy all of those? Is it no wonder that both sides refer to the other as the enemy?

        In this case, what do the "terrorists" represent, that they merit being exterminated, without conscience nor remorse?

        "They killed a shitton of people when they bombed the trade center!" you say?

        Why? Why did they blow up the trade center?

        It couldn't be because our countr(y/ies) was(were) meddling in their affairs, causing them harm, taking things from them, and fundamentally in disagreement with their way of life?

        Certainly not! They should be HAPPY that we want to destroy their culture, because we view certain aspects of it as being backwards and primitive! Our way is simply BETTER!

        Now, let's do a thought experiment here. Powerful aliens come down from wherever out in space they are from, find our culture to be backward and primitive, and start strongarming us to cease being who we are and become like them. They say it's a better way. Maybe it is. That isn't the point. The point is that they don't give us the choice. They do this because it makes it easier for them to establish trade with us, or to work within their stellar economy, or whatever. They profit by eliminating our culture.

        Would we not go to war with them, fighting their influence in every possible way, and even resort to guerrilla and "terrorist" acts when faced with such a superior foe?

        After thinking about that, can you really say you are any different than the "terrorists" we condemn with military machines daily?

        We kill them, because they don't submit. They don't submit, because we are destroying and marginalizing their culture, because we feel it isn't worth retaining/is backward.

        They don't want our help. They don't want our culture. They don't want our values. They don't want us. We insist on meddling in all of those things.

        We started the war.

        • Re: (Score:2, Troll)

          by CRCulver (715279)

          It couldn't be because our countr(y/ies) was(were) meddling in their affairs, causing them harm, taking things from them, and fundamentally in disagreement with their way of life?

          Only the last is really true. The current wave of Islamist violence, and hatred of the United States in particular, is traceable in large part to the writings of Sayyid Qutb. He visited the United States in the late 1940s and condemned it for its culture (e.g. its sexual openness, or at least its perceived sexual openness), not fo

        • What is an enemy?

          Anyone firing an un-tagged large rocket in a region you control.

          In a war, do not both sides, regardless of the motives of either side, satisfy all of those?

          Satisfy all of your absurdly soft-boiled and meaningless definitions of enemy? Yes.

          It couldn't be because our countr(y/ies) was(were) meddling in their affairs

          Actually no, it was simply that they wanted to collapse our economy. Our non-religious existence is an affront many of the terrorists wished to correct.

          That's the really sad thin

    • by Anonymous Coward on Monday November 19, 2012 @09:44PM (#42035071)

      Dude have you seen what happens when people are forced to kill face-to-face? They don't rely on their "conscience" to limit human casualties, they mentally reassign their opponents as non-human and murder them like you or I would murder a termite colony infesting our houses. History is nothing but one long string of horrific atrocity after atrocity committed by warring factions against the opposing side, or civilians, or even their own comrades in arms if there isn't a convenient "other" nearby they can target. Moving to a more abstracted method of fighting isn't just about saving our own forces' physical and mental well-being, it's also about limiting the damage they cause to others when they snap from the pressure and take their aggression out on whoever's available.

      Of course we need to monitor our use of robots - we need a system of checks and balances in place to keep the controllers from engaging in unnecessary combat. But drones don't mass-rape, they don't torture old men and little children for fun, they don't raid houses to steal anything of value within, they don't build towers out of the skulls of their enemies, and they won't burn entire villages to the ground massacring everyone within because they're upset that their buddy was killed in combat the other day. Human involvement isn't always a good thing.

      • by wierd_w (1375923) on Monday November 19, 2012 @10:02PM (#42035299)

        You are not comprehending what I am telling you.

        War is to be avoided because nothing about it is good, just, or honorable. War scars the minds of those who engage in it, live through it, or even witness it first hand. The damage and price of war is more than just soldiers killed and buildings blown up. It is the destruction of people's lives, in every imaginable sense. Surviving a war might be less humane than dying in it.

        The point was that by removing the consequences of war (soldiers becoming bloodthirsty psychos that rape, kill, torture, and lose respect for the lives of others, all others, in addition to simply having people die, and having economic and environmental catastrophes on your hands), you make war look more and more desirable as an option.

        What I was trying to get you to see is that war is always a bad thing, and trying to make it seem like less of a bad thing is the WRONG way to go about it.

    • by Eevee (535658) on Monday November 19, 2012 @09:50PM (#42035131)

      A glitch in the program makes the drone think that anyone carrying a cylinder 2 ft long and 1 inch in diameter is a combatant? (Looks like a gun barrel!) Well, all those poor fuckers carrying brooms and sweeping their patios had it coming! Never mind those uppity pool boys with dipnets! Can't make an omelette without breaking some eggs, right?!

      So you're for robots and drones, right? Because right now the glitch in programming is that when human soldiers in a combat area see someone with something that might be a weapon, they tend to shoot them. Why? Because the ones going "Is that a weapon or is it a broom?" don't tend to last when it actually is a weapon. A drone operator, on the other hand, can take the time to evaluate the situation, since they aren't in harm's way.

      • Re: (Score:2, Insightful)

        by wierd_w (1375923)

        No. You fail to comprehend my position at all.

        There shouldn't be anyone making that decision. At all.

        Making that decision easier, by having a machine do it, to alleviate the guilt of a human operator, and his chain of command, is the WRONG direction.

        Want to know where it ends? The creation of things like "perfect" WMDs. Kills all the people, spares everything else. Push the button, war is over. A whole society dies, and the one pushing the button loses nothing. What possible reason would that society have to

      • I guess that's why there are so few civilian casualties for drone strikes, right?

    • by Hentes (2461350)

      Abstracting away the reality that you are killing people, by making a machine do the actual deed after deployment removes the innate guilt of killing those people.

      It makes it fantastically easier to justify and ignore wholesale slaughter.

      A glitch in the program makes the drone think that anyone carrying a cylinder 2 ft long and 1 inch in diameter is a combatant? (Looks like a gun barrel!) Well, all those poor fuckers carrying brooms and sweeping their patios had it coming! Never mind those uppity pool boys with dipnets! Can't make an omelette without breaking some eggs, right?!

      While the US Army has developed software that can theoretically command drones autonomously, they don't use it, for exactly these reasons. Current drones are human-controlled. Your argument is based on a false assumption.

    • by couchslug (175151)

      "Abstracting away the reality that you are killing people, by making a machine do the actual deed after deployment removes the innate guilt of killing those people."

      What a load of shit. "Guilt" is far from innate, and enormous genocides through history have been done gleefully and "up close and personal".

      The Rwandan genocide was more often than not done with KNIVES, which means you get sprayed with body fluids during your hackathon. Posed "no fucking problem" to the perps.

      Also, siege engines and cannons/tube artil

    • It also means that you don't have trigger-happy, guilt-wracked, paranoid soldiers on the ground, wondering when the next RPG is going to make their deployment a lot shorter than it was meant to be. With robotic systems and an operator out of harm's way, you can afford to wait and just shoot back.
      I can see it swinging either way. It all depends on how the military decides to push.
    • by Scarletdown (886459) on Tuesday November 20, 2012 @12:42AM (#42036821) Journal

      May as well take it all the way and make warfare totally clean and normal.

      Have the computers on both sides simulate their attacks, then declare casualties. Anyone on the casualty list then simply reports to a termination booth to be quickly and humanely killed.

      Hmmm... This sounds rather familiar come to think of it...

    • by elucido (870205)

      While cliché, take a look at "WarGames".

      Abstracting away the reality that you are killing people, by making a machine do the actual deed after deployment removes the innate guilt of killing those people.

      It makes it fantastically easier to justify and ignore wholesale slaughter.

      A glitch in the program makes the drone think that anyone carrying a cylinder 2 ft long and 1 inch in diameter is a combatant? (Looks like a gun barrel!) Well, all those poor fuckers carrying brooms and sweeping their patios had it coming! Never mind those uppity pool boys with dipnets! Can't make an omelette without breaking some eggs, right?!

      When you can simply push a button and walk away without having to witness the atrocities you cause, you abstract away a fair bit of your conscience.

      The military probably thinks that's a GREAT thing! Kids with guns won't cause mental train wrecks for drone operators when they get mowed down, and the operator doesn't have to see it!

      The reality is that deploying terminators is the same as turning a blind eye to the consequences, to the innately terrible thing that war is, and to why it should always be avoided whenever and however possible.

      Dropping the atomic bomb didn't require a robot to remove the innate guilt; you can train a human to follow orders and not feel guilty. I'm not claiming I know what the bomb dropper felt, but they completed the mission anyway, so what difference does it make?

  • Ban (Score:4, Insightful)

    by ThePeices (635180) on Monday November 19, 2012 @09:17PM (#42034747)

    Trying to ban killer robots is a waste of time and won't work. There is also little desire to ban them overall, in the interests of health and safety.

    It's safer to kill people using a robot than going out and risking your own skin with guns and/or explosives.
    Remember, in this day and age, safety is paramount. You want to be able to kill people from a distance, safely and easily. Why run the risk of getting injured, or even worse, getting killed, when you can kill people using safer methods? Using a robot to kill people just makes sense.

    Even worse, you could get sued for endangering the safety of others and breaking health and safety regulations. Killing other people can be a dangerous business, so reducing potential hazards and minimizing harm is a very prudent and right thing to do. You need to be able to kill people safely and efficiently. If you can kill people at a lower cost, then that is even better.

    That's why drones are so popular nowadays. All the benefits of killing people, without all the personal risk. It's a win-win all round.

    Makes sense, doesn't it?

    • What we really need is *more* killer robots. TONS of them. Then we can just have them fight our wars for us, between each other, and we don't get human casualties on either side. As soon as your killer robots have killed all of the enemy's killer robots, their people will obviously cede power to you, because it's not like they could defeat your killer robots without theirs, so it would be stupid for them to even continue bothering.

      For maximum safety, we just let the killer robots fight each other on the
      • by dcollins (135727)

        "Then we can just have them fight our wars for us, between each other and we don't get human casualties on either side."

        Maybe you're joking, but enough people honestly believe this to say -- This is one of the top, ludicrously insane myths among the geek set.

        If people are willing to die for a cause, or if they feel life is not worth living without principle or resource X, then they will not stop fighting until they are dead. Simple as that. War is the final extremity, when all agreements break down, and one

        • Heh, I appreciate that you're trying to ensure people don't take me seriously, but you don't need to correct me; I was being completely sarcastic. People who fight wars are dumb, killer robots won't fix that, it'll just make it easy for dumb people to kill more
      • by Kjella (173770)

        As soon as your killer robots have killed all of the enemy's killer robots, their people will obviously cede power to you, because it's not like they could defeat your killer robots without theirs, so it would be stupid for them to even continue bothering.

        And then we all sing kumbayah? Or is that when most of the human population gets the ultimatum to obey that unstoppable killer robot army or die? I don't think you want to find out what 21st century slavery would be like. No more free countries to run off to. Tagged with a microchip, a GPS foot bracelet, cameras and sensors everywhere, and merciless and incorruptible robots enforcing and possibly supervising the system. Every form of communication like phone, email, facebook and whatever monitored and the res

        • The really funny thing is when the dude holding the controls dies and then the robots no longer answer to anyone, just running everything the same way it's been for years.

          Man, there'll sure be egg on our faces that day!
    • That's why drones are so popular nowadays. All the benefits of killing people, without all the personal risk. It's a win-win all round. Makes sense, doesn't it?

      It does, way too much for my own taste. That's why it should be banned: it does make it too safe to kill...

      • Do you have any idea how many things there are that make it safe to kill? We can't just ban everything that has the potential to be used in warfare; there would be nothing left. What is required is a total shift in human consciousness away from even needing such machines. Our lust for war/power/goods is the real problem. Obviously in the meantime we need to defend ourselves, but banning things like this won't make that any easier, and will only stunt our technological progress. If we don't build ThingX some
    • by couchslug (175151)

      "All the benefits of killing people, without all the personal risk. It's a win-win all round."

      Old news. If Hoplites wanted "personal risk", they'd have left their shields at home.

      War isn't sportsmanship. Sportsmanship is stupid.

    • by Viceice (462967)

      Sarcasm aside, it is true, though. Compare the diplomatic and press headache that resulted from a pilot being captured, e.g. the U-2 over Soviet Russia, versus the recent drone that went down over Iran.

      In the former there was a cover-up, public embarrassment from the subsequent exposé, and an enormous diplomatic spat that ended in a prisoner exchange. In the latter, the Iranians were made to look ridiculous for celebrating the downing of what amounts to a remote-control airplane.

  • Where's the 'Robot Rights Watch' just when you need 'em?"

    They don't have feelings. If you don't believe me, go stab your toaster. I think what you meant was "human rights" and the effect that widespread use of robots with the ability to kill would have on them.

  • ... the trend giving killer drones an initial lead seems to have been reversed [slashdot.org] in humans' favor recently.

  • ...Gun Wielding Robots do!!!!!

  • Robots doing the killing is not going to be very different from bombing from 10,000 ft, launching a cruise missile, or long-range artillery bombardment. It's a long time since you had to look your opponent in the eye as you stabbed him with sword and spear, and that didn't seem to help much to stop war, either. Potentially you can do better with robots, because robots are expendable: you don't have to return fire until you're sure you've isolated the enemy. Even if you were willing to sacrifice your own soldiers t

    • by BeanThere (28381)

      Most people are missing the point. It's not about the method of killing, it's whether a particular act of killing is justifiably self-defense or not. If a 'killer robot' protects an innocent woman and/or child from being raped and/or murdered, then it stands to reason this is good. Denying innocent parties a valid method of self-defense (e.g. banning methods of self-defense) is, on the other hand, wrong. When 'killer robots' become intelligent enough to be used for e.g. home security then I'll be getting on

  • by poity (465672) on Monday November 19, 2012 @09:46PM (#42035085)

    No one has autonomous battlefield drones yet, and I highly doubt any military would ever rely on them. Well... unless it's a robot military after they gain sentience and create their own civilization, but then they would be as human as us.

    • It is easier to ban something that doesn't exist. Governments can be persuaded to sign these, with a promise that every country is signing them. Now try the same for nuclear weapons (which exist), and you would have trouble even bringing it up for discussion.

  • The 'Berserker' novels are, of course, an examination of the end result of building such killer robots. It will happen eventually. But I don't want it to happen until some of us are no longer in the solar system.

  • Cruise missiles are more of a robot than a drone is. Fact is, drones are just remotely piloted planes. The problem isn't their existence but their misuse.

    • Cruise missiles are more of a robot than a drone is. Fact is, drones are just remotely piloted planes. The problem isn't their existence but their misuse.

      And this is about neither. It's about theoretical future drones that don't have a remote operator making the decision to take a human life.

  • Wrong target (Score:4, Insightful)

    by gmuslera (3436) on Monday November 19, 2012 @10:41PM (#42035739) Homepage Journal
    Killer Bots don't kill people, people kill people. Ban the people responsible for those killer bots, and, uh... oh, wait, they just got reelected.
  • If we are to ban such a thing, I think we must ensure appropriate, even-handed enforcement of this ban. As such, I propose we enlist a force strong enough to subdue any killer robot army should someone break said ban; therefore I suggest we build an army of large mechanized automatons heavily laden with weaponry to subdue any would-be killer robot army, or anyone who might be suspected of attempting to build such an army, for that matter.
    • If we are to ban such a thing, I think we must ensure appropriate, even-handed enforcement of this ban. As such, I propose we enlist a force strong enough to subdue any killer robot army should someone break said ban; therefore I suggest we build an army of large mechanized automatons heavily laden with weaponry to subdue any would-be killer robot army, or anyone who might be suspected of attempting to build such an army, for that matter.

      It's simple. We ban killer robots then build them in secret and use those robots to enforce human rights violations.

  • by GodfatherofSoul (174979) on Monday November 19, 2012 @11:03PM (#42035933)

    As has been stated in other posts, every level of abstraction away from the act of violence removes a layer of conscience from the execution of the act, whether it be robots, drone strikes, or trigger-happy 60-year-old politicians who ducked service in Vietnam.

    http://en.wikipedia.org/wiki/Milgram_experiment [wikipedia.org]

  • ...when you pry the controls out of my centuries old desiccated hands in my underground mountain fortress.

  • This whole idea reeks of wussiness. One weenie is scared of killer robots and makes a big fuss, so none of us get to have killer robots? I don't think so, hombre.
  • So I'm now remembering watching WarGames back in 1983... and it makes me remember that I thought it odd that no one would have any kind of overrides in place, human or otherwise...

    The article does stipulate that we should ban killer robots now, even though no one has one or has stated what kind of timeline we can expect for the emergence of these 'killer robots'. To be quite honest, it will take one hell of a long time to get one deployed. Look at how long it takes for the military (specifically the US

  • by russotto (537200) on Tuesday November 20, 2012 @12:00AM (#42036493) Journal
    ...will be killed by the robots of those who don't.
  • Why just ban the robots? A much better idea is to just outlaw war. We don't need it, it doesn't serve a purpose, and we get nothing from it but death.
    • by tbird81 (946205)

      So what if we're attacked? Do we do nothing?

      What if a dictator is committing genocide and slaughtering civilians? Do we do nothing?

  • So much for Asimov. Here's the laws our robots will *actually* obey:

    1. A killer robot must actually proactively kill people. Sitting around humming "Still Alive" for a century doesn't count.

    2. A killer robot must only kill the right people, which are the people the Right People tell it are the right people to kill. Which people are the Right People is subject to change at any time (such as after an election, corporate buyout, or court ruling).

    3. A killer robot must keep itself operational as much as possibl

  • Robotic weapons have five major flaws. The first, as the article says, is that they take humans out of war and make it too easy, as the risk to soldiers and the whole PR body-count problem goes away. War is bad, and making it easy can't be a good thing.

    The second is that without soldiers in the field seeing the bad things, really bad things can happen, as very few people have to choose to be evil for a lot of evil to happen. Soldiers can go overboard, but eventually the truth will come out if enough p
  • What are they talking about? Drones aren't murderous, they're just full of angst!

    http://www.hulu.com/watch/426530 [hulu.com]

  • What side do you want?

    1. USA
    2. Russia
    3. North Korea
    4. Iran
    5. Israel
    6. China
    7. UK
    8. France
    9. Pakistan

  • What about jamming the bots' control data? Let's say you can't hack them or hack the data going to them, but it's easy to make it so that they get no data in or out.

