Robotics / The Military

Ethical Killing Machines 785

ubermiester writes "The New York Times reports on research to develop autonomous battlefield robots that would 'behave more ethically in the battlefield than humans.' The researchers claim that these real-life terminators 'can be designed without an instinct for self-preservation and, as a result, no tendency to lash out in fear. They can be built without anger or recklessness ... and they can be made invulnerable to ... "scenario fulfillment," which causes people to absorb new information more easily if it agrees with their pre-existing ideas.' Based on a recent report stating that 'fewer than half of soldiers and marines serving in Iraq said that noncombatants should be treated with dignity and respect, and 17 percent said all civilians should be treated as insurgents,' this might not be all that dumb an idea."
This discussion has been archived. No new comments can be posted.

  • by pwnies ( 1034518 ) * <j@jjcm.org> on Tuesday November 25, 2008 @02:55PM (#25890291) Homepage Journal
    ...need I say more?
    • Do they run Vista? (Score:5, Insightful)

      by raymansean ( 1115689 ) on Tuesday November 25, 2008 @03:10PM (#25890551)
      It takes a special set of skills to corrupt a single human being; it takes another set of skills, not that special, to corrupt an entire battalion of identical robots. Did I mention sharks with lasers?
      • by blhack ( 921171 ) on Tuesday November 25, 2008 @03:27PM (#25890821)

        It takes a special set of skills to corrupt a single human being, it takes another set of skills, not that special, to corrupt an entire battalion of robots

        Do you live in a society without money?
        Or women?
        Or sports cars?
        Or fancy houses?
        Or gold?
        Or "Change" posters?

        As far as I know, my computers have never accepted a bribe, or made a power-grab.

        • by EricWright ( 16803 ) on Tuesday November 25, 2008 @03:29PM (#25890837) Journal

          Ummm... it's not the computers you bribe, it's their programmers.

          • by blhack ( 921171 ) on Tuesday November 25, 2008 @03:34PM (#25890901)

            Ummm... it's not the computers you bribe, it's their programmers.

            AHA! So! How is this any different than humans?

            Bribe a human to kill a person (or have their army kill a shitload of people).
            Bribe a human to have their robot kill a person (or have their army of robots kill a shitload of people).

            I think that the problem is people having misconceptions about robots. They're not sentient. They don't think. They only do what we tell them to. Sure there are horror stories about robots coming to life, but there are also horror stories about dead people coming to life, or cars coming to life.

            We need to drop the term "robot".

            • by Talderas ( 1212466 ) on Tuesday November 25, 2008 @03:46PM (#25891073)

              The fact that robots do exactly what you tell them to is precisely why they're dangerous. If you have 1 maniacal individual order a platoon of soldiers to slaughter a village, the individual human soldiers may refuse to follow the order. If that same individual has a platoon of robots instead, the villagers are dead as soon as the order is issued.

              • Re: (Score:3, Insightful)

                by blhack ( 921171 )

                If you have 1 maniacal individual order a platoon of soldiers to slaughter a village, the individual human soldiers may refuse to follow the order.

                There is a flip side to that coin. Machines don't think. Machines don't get PTSD and decide to go on a killing rampage. Machines don't "go rogue".

                In addition, soldiers are trained not to think; they're trained to follow orders. If anything, replacing them will save lives. I imagine that leaders will think twice about how badly they want us invading them when we've got an army of machines that we have no emotion or time invested in.

                It is like our fleet of submarines. Simply the threat that we can deploy them keeps us out of wars.

                • by Roger W Moore ( 538166 ) on Tuesday November 25, 2008 @04:20PM (#25891633) Journal

                  There is a flip side to that coin. Machines don't think. Machines don't get PTSD and decide to go on a killing rampage. Machines don't "go rogue".

                  ...but their programmers can. In addition you have to be very careful when programming them - if you make a mistake in the program or forget to cover some situation then the robot may be doing exactly what it is told but may still end up causing an atrocity. In effect all you are doing is replacing one set of known risks with another set of unknown ones.
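
                  To make that concrete with a minimal, entirely hypothetical sketch (the field names, rules, and fall-through default below are invented for illustration, not taken from any real system): every rule can do exactly what it was told, and the case nobody wrote a rule for still becomes policy.

                  def classify(person):
                      """Naive engagement rules; each line looks correct in isolation."""
                      if person["armed"] and person["uniformed"]:
                          return "engage"  # rule 1: armed combatant in uniform
                      if not person["armed"]:
                          return "hold"    # rule 2: unarmed, so a noncombatant
                      # Unconsidered case: armed but not uniformed. Nobody thought
                      # about medics, police, or farmers with rifles, so whatever
                      # default lands here quietly becomes the robot's policy.
                      return "engage"

                  medic = {"armed": True, "uniformed": False}
                  print(classify(medic))  # 'engage': exactly what it was told to do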

                • by ahankinson ( 1249646 ) on Tuesday November 25, 2008 @04:21PM (#25891645)

                  On a smaller level, societies where people own guns are usually more peaceful ones. Why? Because people can see them. Just the threat of being shot is enough to deter people from starting shit.

                  [citation needed]

                  Here's mine - From: http://www.cbc.ca/canada/story/2005/06/28/gun-deaths050628.html [www.cbc.ca]

                  "...In a cross-border comparison for the year 2000, Statistics Canada says the risk of firearms death was more than three times as great for American males as for Canadian males and seven times as great for American females as for Canadian females.

                  Because more of the U.S. deaths were homicides (as opposed to suicides or accidental deaths), the U.S. rate of gun homicide was nearly eight times Canada's, the agency says. Homicides accounted for 38 per cent of deaths involving guns in the United States and 18 per cent in Canada."

                • Re: (Score:3, Insightful)

                  by jon207 ( 1176461 )

                  On a smaller level, societies where people own guns are usually more peaceful ones.

                  In what world are you living where societies where people own guns are peaceful? If people own guns, people will be killed by guns. If people have no guns, they can't kill a family member by mistake when they think they're facing a criminal, or when they're drunk. If people have no guns, children can't take them to school and kill other students. I live in a country where there are few guns, and we have many fewer problems. No Columbine here. No Korean killing others after playing Sonic... And, when a prob

                  • by blhack ( 921171 ) on Tuesday November 25, 2008 @05:03PM (#25892271)

                    I'll admit that most of the results from Google seem to come from websites that are far from unbiased, but here is an example of the least crazy one:
                    Gun ownership vs Crime [ncpa.org]
                    If you google for: Gun Control vs Violent Crime, you'll find quite a few articles that back up what I said.

                    The idea that a gun ban would decrease crime is illogical. Violent criminals don't generally buy their guns at hunting stores; they buy them from illegal gun dealers.

                    • by kent_eh ( 543303 ) on Tuesday November 25, 2008 @05:28PM (#25892603)

                      Violent criminals don't generally buy their guns at hunting stores, they buy them from illegal gun dealers.

                      Or steal them from people who own them legally (but somehow never learned to store them properly).
                      Or smuggle them into the country from a neighboring country that has lax control over weapons.
                      Or buy/steal them from other criminals who have done one of the above.
                      Still, the point stands. Fewer guns in a society means fewer people getting shot.

                    • by Alcoholist ( 160427 ) on Tuesday November 25, 2008 @09:47PM (#25895107) Homepage

                      A reason to ban guns completely in a society is that it makes it harder to kill a lot of people at once. Ever wondered why you don't hear about a punching spree or a knifing spree?

                      If I go mad and walk into a crowded place with the desire to kill people, and I have a knife, I might be able to get one or two victims before the rest all run away, beyond my reach.

                      But with a gun, my reach is greatly extended. I can shoot them as they run or try to dodge around me. I can shoot them through doors and walls if my weapon is good enough. And I can keep shooting targets until I run out of ammo.

                      The reason guns are dangerous is because they are an order of magnitude better at hurting people quickly than blades or fists. An armed society is not a polite society. It is a society where people are often shot.

                  • by Your.Master ( 1088569 ) on Tuesday November 25, 2008 @05:07PM (#25892343)

                    I'm from a society with more private gun ownership than the USA, and yet we have far lower rates of gun violence.

                    Anecdotes rule!

                  • by Mistshadow2k4 ( 748958 ) on Tuesday November 25, 2008 @05:33PM (#25892663) Journal

                    If people own guns, people will be killed by guns. If people have no guns, they can't kill a family member by mistake when they think they're facing a criminal, or when they're drunk. If people have no guns, children can't take them to school and kill other students.

                    "He who lives by the sword will die by the sword." You make it sound like that if we get rid of guns, none of these particular violent scenarios could ever take place, yet we know they took place many times over the millenia before the first gun was invented. In addition, violence still happens in many countries where guns are heavily regulated, such as Britain.

                    I always get fed up with these anti-gun arguments quickly. There are fewer gun-owners in the northern US than in the south and you can see the difference. In the northern cities, the criminals own guns and use them to keep the people in a bondage of terror to protect their profits and deter witnesses from testifying against them, such as with the infamous Stop Snitchin' campaign. I can't imagine that happening in the south, where more regular people have guns. The whole idea is actually laughable. If gangsters tried that in the south -- threatening witnesses with getting shot -- they'd be the ones who got shot. Probably before they even finished making the threat. Hell, I live in Kentucky and I can't imagine anything like that happening here.

                    By the way, most states regulate gun ownership, some quite heavily. It has done very little to keep guns out of the hands of criminals. This is probably not what you thought was going on over here, as so many people outside the US seem to have this mythical vision of the US in which everyone owns a gun and has shot someone at least by the age of 13. The problem is that we live in a violent culture, one that has been getting more violent ever since World War II, it seems. Hey, come to think of it, it was after World War II that the US military became involved in every conflict around the world, often against the wishes of the common people or large portions of them, culminating in outright invasions by the '90s. Maybe there's a correlation there: violent government and military, violent culture.

                  • by moxley ( 895517 ) on Tuesday November 25, 2008 @05:42PM (#25892785)

                    The Gun is Civilization, by Maj. L. Caudill, USMC (Ret)

                    Human beings only have two ways to deal with one another: reason and force. If you want me to do something for you, you have a choice of either convincing me via argument or making me do your bidding under threat of force. Every human interaction falls into one of those two categories, without exception. Reason or force, that's it.

                    In a truly moral and civilized society, people exclusively interact through persuasion. Force has no place as a valid method of social interaction, and the only thing that removes force from the menu is the personal firearm, as paradoxical as it may sound to some.

                    When I carry a gun, you cannot deal with me by force. You have to use reason and try to persuade me, because I have a way to negate your threat or employment of force.

                    The gun is the only personal weapon that puts a 100-pound woman on equal footing with a 220-pound mugger, a 75-year-old retiree on equal footing with a 19-year-old gang banger, and a single guy on equal footing with a carload of drunk guys with baseball bats. The gun removes the disparity in physical strength, size, or numbers between a potential attacker and a defender.

                    There are plenty of people who consider the gun as the source of bad force equations. These are the people who think that we'd be more civilized if all guns were removed from society, because a firearm makes it easier for an [armed] mugger to do his job. That, of course, is only true if the mugger's potential victims are mostly disarmed, either by choice or by legislative fiat--it has no validity when most of a mugger's potential marks are armed.

                    People who argue for the banning of arms ask for automatic rule by the young, the strong, and the many, and that's the exact opposite of a civilized society. A mugger, even an armed one, can only make a successful living in a society where the state has granted him a force monopoly.

                    Then there's the argument that the gun makes confrontations lethal that otherwise would only result in injury. This argument is fallacious in several ways. Without guns involved, confrontations are won by the physically superior party inflicting overwhelming injury on the loser.

                    People who think that fists, bats, sticks, or stones don't constitute lethal force watch too much TV, where people take beatings and come out of it with a bloody lip at worst. The fact that the gun makes lethal force easier works solely in favor of the weaker defender, not the stronger attacker. If both are armed, the field is level.

                    The gun is the only weapon that's as lethal in the hands of an octogenarian as it is in the hands of a weight lifter. It simply wouldn't work as well as a force equalizer if it wasn't both lethal and easily employable.

                    When I carry a gun, I don't do so because I am looking for a fight, but because I'm looking to be left alone. The gun at my side means that I cannot be forced, only persuaded. I don't carry it because I'm afraid, but because it enables me to be unafraid. It doesn't limit the actions of those who would interact with me through reason, only the actions of those who would do so by force. It removes force from the equation...and that's why carrying a gun is a civilized act.

                    So the greatest civilization is one where all citizens are equally armed and can only be persuaded, never forced.

                    • On Plagiarism (Score:5, Informative)

                      by ravenshrike ( 808508 ) on Tuesday November 25, 2008 @08:34PM (#25894531)
                      http://munchkinwrangler.blogspot.com/2007/06/on-plagiarism.html [blogspot.com]

                      A while ago, I posted a little essay called "Why the Gun is Civilization". It was pretty well received, and got me a lot of positive comments from a variety of people. Some folks asked for permission to reprint and publish the essay in various newsletters and webzines, and I gladly granted it every time, only asking for attribution in return.

                      Recently, I have noticed my essay pop up on the Internet a lot in various forums, most of which I do not frequent. This in itself causes me no grief, but the reposts are almost invariably attributed to someone who is not me. Some are attributed to a Major L. Caudill, USMC (Ret.), and some are merely marked as "forwarded" by the same person. Others are not attributed at all, giving the impression that the person who posted the essay is also its author.

                      In school, we call reproduction without attribution "plagiarism". It's usually cause for a failing grade or even expulsion in most college codes of conduct. In the publishing world, we call the same thing "intellectual property theft". Now, my little blog scribblings are hardly published works in the traditional sense, nor do I incur any financial damage from this unattributed copying, but it's still a matter of honor. I did, after all, sit down and type up that little essay. It may not make it into any print anthologies, but it's mine, and seeing it with someone else's name on the byline is a little annoying. Call it ego, call it vanity, but there it is.

                      In the end, I guess I should probably shrug it off and tell myself that I can produce something that's worth stealing.

                • by Moryath ( 553296 ) on Tuesday November 25, 2008 @04:43PM (#25891977)

                  Here's a few things you should be aware of:

                  #1 - War Is Hell - William Tecumseh Sherman

                  #2 - The object of war is not to die for your country but to make the other bastard die for his. - George Smith Patton

                  #3 - Although one of the Powers in conflict may not be a party to the present Convention, the Powers who are parties thereto shall remain bound by it in their mutual relations. They shall furthermore be bound by the Convention in relation to the said Power, if the latter accepts and applies the provisions thereof. - Geneva Conventions. You should be aware that at NO time has any Islamic force, least of all the terrorist forces, ever followed ANY portion of the Geneva Conventions. You should also pay very close attention to this clause, which does NOT require that one party to a conflict fight with both hands tied behind their back (e.g. within the Geneva Conventions) while the other side doesn't.

                  #4 - The presence of a protected person may not be used to render certain points or areas immune from military operations. - GC IV, Section 28

                  #5 - The Party to the conflict in whose hands protected persons may be is responsible for the treatment accorded to them by its agents, irrespective of any individual responsibility which may be incurred. - GC IV, Section 29

                  Why are these two sentences placed here, and in this way? To make it perfectly clear that the blame for problems caused by "armies" that refuse to carry their arms openly, that hide behind civilians and use them as shields, is on the head of the party using the human shields.

                  You want to know why the armed forces see civilians as complicit? Because the Geneva Conventions (IV, Article 35) specifically give civilians the right to vacate, and be protected while vacating, any place where hostilities are occurring. The problem is, there are way too many supposed "civilians" who are actually members of terrorist groups or supporting/housing them in violation of the Geneva Convention prohibitions on doing so (not, again, that any Islamic group has ever been moral enough to follow the Geneva Conventions anyways).

                  What is absurd is that our armed forces are being told today that they are supposed to win wars while both hands are tied behind their backs (ridiculously fucking stupid "rules of engagement" that presume the other side is following the GC when we know damn well they don't) and blindfolded (all sorts of nasty restrictions on intelligence-gathering). And what's even worse is that whether we fight to win or not, we will be falsely accused of breaking the Geneva Conventions even as we stupidly try to follow them and the other side isn't being held accountable for their daily war crimes.

                  In addition, soldiers are trained not to think, they're trained to follow orders.

                  If you have clear, concise orders, that's one thing. The list of "rules of engagement" for Iraq is a fucking NOVEL. It's amazing that so few of our men and women have died, trying to fight while thinking of fucking chapter and verse before pulling the goddamn trigger to return fire on asshats who wear women's clothing and fire from behind small children.

                  Oh, and here's a homework assignment for the left wingnuts who are going to post "waah bush lied people died" or some other fucking nonsense: READ the whole Geneva Conventions, and a good analysis of it, first. Educate yourself before spouting your ignorant nonsense.

                  • Re: (Score:3, Insightful)

                    by Lurker2288 ( 995635 )
                    "Why are these two sentences placed here, and in this way? To make it perfectly clear that the blame for problems caused by "armies" that refuse to carry their arms openly, that hide behind civilians and use them as shields, is on the head of the party using the human shields."

                    I agree with most of what you're saying, but the problem is that most people don't really care if technically we're in the right when we drop a bomb that kills noncombatants. We still look like the bad guys.

                    As for Bush...well, you've
                  • by Daniel Dvorkin ( 106857 ) * on Tuesday November 25, 2008 @08:06PM (#25894309) Homepage Journal

                    People who quote Sherman's "war is hell" to justify wartime atrocities really ought to think about the full quote:

                    I am tired and sick of war. Its glory is all moonshine. It is only those who have neither fired a shot nor heard the shrieks and groans of the wounded who cry aloud for blood, for vengeance, for desolation. War is hell.

                    Uncle Billy had the 101st Fighting Keyboarders crowd pretty much nailed, I'd say.

                • Re: (Score:3, Insightful)

                  by Kelbear ( 870538 )

                  On the flipside, I wonder if our leaders will think twice before sending the robots.

                  It may reduce the deaths on our side, but people will continue to die on the other side. When we're only losing robots we're shielded from the consequences of our actions.

                  The war in Iraq may be pointless but the death of a soldier fighting in Iraq is not. When Americans die, America will have to wonder whether or not it's worth more American deaths. Even in death the soldier contributes to ending the war and possibly prevent

                • Re: (Score:3, Insightful)

                  by D Ninja ( 825055 )

                  Machines don't "go rogue".

                  Uhh...hellloooooo...Terminator! Duh.

                  [/valley_girl]

                • Re: (Score:3, Insightful)

                  by HangingChad ( 677530 )

                  Simply the threat that we can deploy them keeps us out of wars.

                  Think about what you're saying in light of someone like Dick Cheney getting control of an army like that. At that point we wouldn't need Skynet because we'd be Skynet. Turning such an army loose on our own citizens becomes a comfortably distant and self-justifying mental exercise, much like torturing terror suspects. After all, if they weren't acting up, they wouldn't get killed by the robots, now would they?

                  I think we want killing to re

                • by Daniel Dvorkin ( 106857 ) * on Tuesday November 25, 2008 @07:57PM (#25894247) Homepage Journal

                  In addition, soldiers are trained not to think, they're trained to follow orders.

                  In the US Army, at least, this is simply not true.

                  American soldiers are very much trained to think -- mostly about tactical considerations, true ("I've been ordered to do X; what is the best way to accomplish that?") but the Law Of Armed Conflict (LOAC) is part of every soldier's training. To the degree that the LOAC is violated on the battlefield, this represents a failure of the training, not of the doctrine.

                  There are many nations which attempt to train their soldiers to be mindless killing machines. When those soldiers come up against soldiers who have been trained to think, the thinking ones tend to win.

              • by Migraineman ( 632203 ) on Tuesday November 25, 2008 @03:57PM (#25891259)
                Sounds like a lecture we received in high-school metal shop. "The machines aren't inherently good or inherently evil, but they will do exactly what you tell them to. If you place your hand into the bandsaw blade, it will dutifully snip your fingers off without remorse."
              • by Repossessed ( 1117929 ) on Tuesday November 25, 2008 @04:34PM (#25891837)

                "If you have 1 maniacal individual order a platoon of soldiers to slaughter a village, the individual human soldiers may refuse to follow the order."

                Yes, that saved all the people at My Lai in fact.

            • by CannonballHead ( 842625 ) on Tuesday November 25, 2008 @03:57PM (#25891245)

              Exactly. Robots, machines, whatever you want to call them, do not have ANY moral substance. Humans do. Humans may refuse to do certain things (or may not). Machines won't refuse.

              Bottom line is... I'd rather be up against a human maniac I might convince than a robot programmed to ACT like a maniac. One still has a rational (hopefully) thought process somewhere in there, and has a moral element. The other can't think and has no moral element whatsoever (partially, of course, due to not being able to have a rational thought in the first place).

              There's a reason that "mind control" scares so many people. Total "mind control" is what you have over machines, is it not?

            • Re: (Score:3, Insightful)

              by melikamp ( 631205 )

              I think that the problem is people having misconceptions about robots.

              Go on!

              They're not sentient. They don't think. They only do what we tell them to.

              By your own argument, if we were to tell them to "be sentient" and to "think for themselves" and to "make their own decisions", that is what they would have to do. Yet they are not sentient, but their program tells them that they are, but they cannot be, but they have to... [Cue smoking CPU]

              In actuality, however, a CPU won't smoke. If a robot is prog

            • by Evanisincontrol ( 830057 ) on Tuesday November 25, 2008 @04:45PM (#25892019)

              AHA! So! How is this any different than humans?

              Actually, from a psychology angle, it's substantially different. It has been shown many times that humans are psychologically capable of stretching their moral limits further when they can distance themselves from the action (If you want me to get a citation, I'll go get one -- I'm just too lazy to get it right now).

              This is easy to see even without evidence. If you were forced to choose, would you rather push a button that drops a bomb on a village full of children 1000 miles away, or be in a plane and drop the bomb yourself? The two actions have identical results, yet distancing yourself from the action makes it easier to justify the moral consequences.

              This is why military leaders are able to stay sane. It's possible (though not easy) to give orders that will directly result in the death of thousands of people. However, if a war general had to shoot thousands of people himself, I suspect it would start to wear down on his psychological health.

              Now consider that you're a military general who simply has to push a button, and this button tells your robot to take over a village. It's very, very easy to rationalize that any casualties are not your fault, since all you were doing was pushing that button.

            • by ignavus ( 213578 ) on Tuesday November 25, 2008 @08:12PM (#25894361)

              Soldiers rape - without being given orders. Robots won't unless specifically programmed to rape AND built with sexual organs (which would be a giveaway that a nation *planned* sexual assault to be part of its soldiering; currently, nations can just blame the lowly soldiers for acting out of line).

              Half the world's population will consider the decline of rape in war to be a big improvement. The other half may not realise how extensive it is.

        • by Marful ( 861873 ) on Tuesday November 25, 2008 @03:49PM (#25891119)
          The premise of the article is that these robots are incorruptible. However, such a premise is flawed at its very core.

          Because such robots will be designed, programmed, and manufactured by man, who is corruptible.

          The point of what pwnies was saying is that the ability to alter and subvert a piece of computer programming is a skill set that is highly prevalent in today's society.
    • by philspear ( 1142299 ) on Tuesday November 25, 2008 @03:22PM (#25890731)

      ...need I say more?

      Yes! It's ambiguous as is. Which were you going to go with?

      1. Our ethical killer-robot overlords
      2. Our more-benevolent-than-a-human killing machine overlords
      3. The impending terminator/matrix/MD Geist/1000 other sci-fi themed apocalypse
      4. Users who are new to /. who aren't Simpsons fans and don't get this joke
      5. Our new ant overlords, since there is no stopping them even with our new murder-bots

  • Ethical vs Moral (Score:3, Insightful)

    by mcgrew ( 92797 ) * on Tuesday November 25, 2008 @02:55PM (#25890293) Homepage Journal

    "The New York Times reports on research to develop autonomous battlefield robots that would 'behave more ethically in the battlefield than humans.'

    Maybe I'm being a bit pedantic here, but "ethics" is a professional code - for instance, it is completely ethical by military codes of ethics to kill an armed combatant, but not to kill a civilian. It is unethical (and illegal) for a medical doctor to talk about your illness, but it's not unethical for me to.

    The waterboarding and other torture at Gitmo was immoral, shamefully immoral, but it was ethical.

    The advantage to a killing robot is that it has no emotions. The disadvantage to a killing robot is ironically that it has no emotions.

    It can't feel compassion after it's blown its enemy's arm off. But it can't feel vengeance, either. It's a machine, just like any other weapon.

    And like an M-16, its use can either be ethical or unethical, moral or immoral, moral yet unethical or immoral yet ethical.

    • Re:Ethical vs Moral (Score:4, Informative)

      by neuromanc3r ( 1119631 ) on Tuesday November 25, 2008 @03:04PM (#25890433)

      "The New York Times reports on research to develop autonomous battlefield robots that would 'behave more ethically in the battlefield than humans.'

      Maybe I'm being a bit pedantic here, but "ethics" is a professional code

      I'm sorry, but that is simply not true. Look up "ethics" on Wikipedia [wikipedia.org] or, if you prefer, in a dictionary.

      • Re:Ethical vs Moral (Score:4, Informative)

        by mcgrew ( 92797 ) * on Tuesday November 25, 2008 @03:30PM (#25890845) Homepage Journal

        It appears that language has evolved (or rather devolved) once again. I looked it up last year and "ethics" and "morals" were two separate things; ethics was a code of conduct ("It is unethical for a government employee to accept a gift of over $n; it is unethical for a medical doctor to discuss a patient's health with anyone unauthorized").

        The new Merriam-Webster seems to make no distinction.

        As to Wikipedia, it is not an acceptable resource for defining the meanings of words. Looking to Wikipedia when a dictionary [merriam-webster.com] is better suited is a waste of energy.

        Wikipedia's entry on cataract surgery still has no mention of accommodating lenses. Any time someone adds the CrystaLens to Wikipedia, somebody edits it out. Too newfangled for Wikipedia, I guess; they only just came out five years ago (there's one in my eye right now).

        Morality may have gone out of style, but as it's needed they apparently brought it back under a more secular name. So now that "ethical" means what "moral" used to mean, what word can we use for what used to be "ethics", such as the aforementioned doctor breaking HIPAA rules (ethics), which would not be immoral or unethical for me to do?

        Of course uncyclopedia has no ethics, but it does have morals [wikia.com], virtue [wikia.com], and medical malpractice [wikia.com].

        • Re:Ethical vs Moral (Score:5, Interesting)

          by spiffmastercow ( 1001386 ) on Tuesday November 25, 2008 @03:38PM (#25890961)
          Actually (according to every philosophy book I've ever read), morals are codes of conduct, and ethics is the more ethereal "right and wrong" concept. The problem is that 'ethics' has been watered down to mean 'morals' because 'business ethics', etc., roll off the tongue more easily than 'business morals'.
        • Re:Ethical vs Moral (Score:4, Interesting)

          by fluxrad ( 125130 ) on Tuesday November 25, 2008 @03:51PM (#25891161)

          I looked it up last year and "ethics" and "morals" were two separate things.

          You should be looking less at Wikipedia and more at Plato, Aquinas, and Kant. In truth, ethics is a subject that one has to study in order to understand the definition thereof.

      • Re: (Score:3, Insightful)

        by Moryath ( 553296 )

        Are you working with classic ethics, or the "whatever makes more money for $cientology is obviously more ethical" idea of the "world's most ethical" Cult?

    • Re: (Score:3, Insightful)

      Here's where I see the value of a correctly ethical killing machine:

      Enforcing a border between two groups of humans that would otherwise be killing each other, and making that border 100% impenetrable.

      To do this, you need more than just a simple robot that has a GPS unit and shoots everything that moves within a predetermined rectangle. You need a CHEAP simple robot that has a GPS unit and shoots everything that moves in a predetermined rectangle; cheap enough so that you can deploy them thickly enough
    • Re: (Score:3, Insightful)

      by b4upoo ( 166390 )

      From a soldier's point of view it is rather easy to understand why all of the population might appear to be an enemy. Often that is an outright fact. Even if the locals happen to like Americans, those locals still must make a living and also appease the fanatical elements in their neighborhood. So the same people that smile and feed you dinner might also buy their groceries by smuggling munitions.
      This may turn really ugly in the moonscape like land that borders Pakistan

      • Re: (Score:3, Insightful)

        by onkelonkel ( 560274 )
        Quoted from memory, from a Keith Laumer "Retief" story

        Alien - "I propose saturation thermonuclear bombardment from orbit followed by mop-up squads armed with nerve gas and flamethrowers. No population, no popular unrest"

        Human Ambassador - "I must say Retief, there is a certain admirable, um, directness to his methods"
      • Re: (Score:3, Insightful)

        by anagama ( 611277 )
        If nobody wants us there and the only way to win is genocide -- why are we there? I mean, besides the oil.
        • by CorporateSuit ( 1319461 ) on Tuesday November 25, 2008 @04:44PM (#25892011)

          If nobody wants us there and the only way to win is genocide -- why are we there? I mean, besides the oil.

          For the same reason why "nobody" wants Americans to have religious freedom.

          Because your definition of "nobody" implies that the noisy, protesting 1% of Iraqis are everybody. The flagburners that call us "The Great Devil" and fire machine guns into the air must make up the sum total of the entire Iraqi population!

          The Islamic sects that would get holocausted into extinction if the other sects took control are scared of what happens when the soldiers leave. The suicide bombers are, for the most part, not Iraqi. They are immigrant extremists that are in Iraq, killing Iraqis that don't follow the same Islamic code as they do, to scare the rest into submitting to their specific beliefs. You think mosque, pilgrimage, police headquarters, and market bombings are targeted at US troops?

          If you want to know the reason why we're still there, I suggest you read "Leviathan" while you wait. If Iraq doesn't have a stable government or structured military when the troops pull out, the land will go to the meanest, toughest faction -- which is currently not one that's allied with us. We have troops there to make sure the Western-friendly government lasts more than a weekend.

      • Re: (Score:3, Insightful)

        by Deskpoet ( 215561 )

        From a soldier's point of view it is rather easy to understand why all of the population might appear to be an enemy. Often that is an outright fact.

        Yeah, that particularly happens when the soldier is part of an invading occupation force with dubious intention. AND the soldier is conditioned to believe they are (racially, religiously, socially) superior to the locals they are "defending". AND they are products of a militarized culture that glorifies violence as it cynically prattles about honor and respect.

    • by slashnot007 ( 576103 ) on Tuesday November 25, 2008 @03:11PM (#25890559)

      An old cartoon had a series of panels. The first panel had a caveman picking up a rock, saying "safe forever from the fist". The next panel is a man inventing a spear, saying "safe forever from the rock". And so on: swords, bows and arrows, catapults, guns, bombs... well, you get the idea.

      On the other hand, the evolution of those items coincided with the evolution of society. For example, you had to have an organized civil society to gather the resources to make a machine gun. (Who mines the ore for the metal? Who feeds the miners? Who loans the money for the mine?...)

      It's a bit of a chicken-and-egg question which drives which these days, but certainly early on, mutual defense did promote societal organization.

      So "safe forever from the angry soldier" is the next step. It's already happened in some ways with the drone so it's not as big an ethical step to the foor soldier, and given the delberateness with which drones are used compared to the dump and run of WWII bombing one can credibly argue they can be used ethically.

      On the other hand, war has changed a bit. The US no longer tries to "seize lands" militarily to expand the nation (it expands economically instead); Russia and China are perhaps the exceptions. These days it's more a job of fucking up nations we think are screwing with us, e.g. Afghanistan.

      Now imagine the next war where a bunch of these things get dropped into an asymmetrical situation. Maybe even a hostage situation on an oil tanker off Somalia.

      It's really going to change the dynamic, I think, when the "enemy" can't even threaten you. Sure, it could be expensive, but it totally deprives the enemy of the incentive of revenge for perceived injustice.

      On the other hand, it might make the decision to attack easier.

    • Re:Ethical vs Moral (Score:5, Interesting)

      by vishbar ( 862440 ) on Tuesday November 25, 2008 @03:11PM (#25890581)

      Ethics" is such a poorly defined term...hell, different cultures have different definitions of the term. In feudal Japan, it was ethical to give your opponent the chance for suicide...today, many Westerners would in fact argue the opposite: the ethical thing to do is prevent a human from committing suicide as that's seen as a symptom of mental illness.

      I've always defined "morality" as the way one treats oneself and "ethics" as the way one treats others. It's possible to be ethical without being moral--for example, I'd consider a person who spends thousands of dollars on charity just to get laid to be acting ethically but immorally. By that definition, the hullabaloo at Guantanamo would certainly be both immoral and unethical--not only were they treated inhumanely, but it was done against international law and against the so-called "rules of war".

      These robots would have to be programmed with certain specific directives: for example, "Don't take any actions which may harm civilians", "take actions against captured enemy soldiers which would cause the least amount of foreseeable pain", etc. Is this good? Could be...soldiers tend to have things like rage, fear, and paranoia. But it could lead to glitches too...I wouldn't want to be on the battlefield with the 1.0 version. Something like Asimov's 3 Laws would have to be constructed, some guiding principle...the difficulty will be ironing out all the loopholes.
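
      As a toy illustration of why the loophole-ironing is the hard part (the directives, actions, and risk numbers here are all invented), a literal reading of "don't take any actions which may harm civilians" vetoes every action, since nothing carries exactly zero risk, while softening it to a risk budget just makes the budget the new loophole to argue over:

      # Estimated probability of civilian harm per action (made-up numbers).
      ACTIONS = {
          "hold_position": 0.001,
          "warning_shot": 0.020,
          "return_fire": 0.150,
      }

      def permitted(risk, threshold=None):
          if threshold is None:         # literal directive: any risk at all forbids
              return risk == 0.0
          return risk <= threshold      # pragmatic directive: stay within a budget

      for action, risk in ACTIONS.items():
          print(action, permitted(risk), permitted(risk, threshold=0.05))
      # The literal rule forbids even holding position; a 5% budget allows a
      # warning shot, but now the threshold itself is the loophole.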

    • by vertinox ( 846076 ) on Tuesday November 25, 2008 @03:13PM (#25890601)

      The advantage to a killing robot is that it has no emotions. The disadvantage to a killing robot is ironically that it has no emotions.

      More often than not, face-to-face civilian casualties on the battlefield happen due to fatigue, emotional issues (my buddy just died!), or miscommunication.

      Not because the soldiers lacked emotion or humanity.

      The other kind, in which a bomb, mortar, or artillery shell lands on a house full of civilians because someone typed the wrong coordinates into a GPS, happens so far removed from the battlefield that it won't really make a difference whether the guy pushing the button is man or machine.

    • by ThosLives ( 686517 ) on Tuesday November 25, 2008 @03:14PM (#25890627) Journal

      The bigger issue isn't so much the tools and weapons, but the whole "modern" concept of war. You cannot accept the concept of war without the concept of causing destruction, even destruction of humans. To send people into a warzone and tell them not to cause destruction is actually more immoral and unethical, in my mind, than sending them in and allowing them to cause destruction.

    • by Abreu ( 173023 ) on Tuesday November 25, 2008 @03:18PM (#25890671)

      Sorry McGrew, but waterboarding and torture are both unethical and immoral. As far as I know (being an ignorant foreigner), the US Army does not include any torture instructions in its manuals.

      Now, you could make a case that Gitmo's existence might be ethical but immoral, considering that it is technically not a US territory, but legally* under US jurisdiction.

      *The legality of this is disputed by Cuba, of course...

    • Re: (Score:3, Informative)

      by langelgjm ( 860756 )

      Maybe I'm being a bit pedantic here, but "ethics" is a professional code - for instance, it is completely ethical by military codes of ethics to kill an armed combatant, but not to kill a civilian.

      You're not being pedantic, you're being imprecise. Codes of ethics are one thing, but "ethics" is most certainly not limited to a professional code. Look up the word in a dictionary. I also don't know why you got modded to +5 insightful.

      From the OED: ethics: "The science of morals; the department of study concerned with the principles of human duty." That's the primary definition that's listed.

  • Frist Post? (Score:3, Funny)

    by Beyond_GoodandEvil ( 769135 ) on Tuesday November 25, 2008 @02:55PM (#25890303) Homepage
    Ethical Killing Machine? Like military intelligence?
  • Interesting... (Score:5, Insightful)

    by iamwhoiamtoday ( 1177507 ) on Tuesday November 25, 2008 @03:00PM (#25890373)
    I was just watching the intro to the first "Tomb Raider" movie, where Lara destroys "Simon" (the killer robot that she uses for morning warmup). I must say, I don't like the idea behind robots fighting our wars, because that means that "acceptable risks" become a thing of the past, and we are far more likely to "militarily intervene". Aka: "less risk to our troops" can translate into "we go into more wars", which is something I don't support... wars benefit companies and lead to the death of thousands. If the lives lost aren't American lives, does it still matter? In my opinion, YES.
    • Re: (Score:3, Interesting)

      by qoncept ( 599709 )
      That's some pretty flawed logic. Should doctors working to cure lung cancer stop, because a cure to lung cancer would make it safer to smoke?

      "Less risk to our troops" can translate into "we go into more wars" which is something I don't support... wars benefit companies, and lead to the death

      Read that again. You don't like wars because people are killed. You're talking about potentially eliminating human casualties in any war. That means the only remaining "problem" (in your scenario) is that they ben
      • by ThosLives ( 686517 ) on Tuesday November 25, 2008 @03:41PM (#25891007) Journal

        You're talking about potentially eliminating human casualties in any war.

        It's not really war unless people die. If people don't die, why would they stop doing whatever it is for which people are "fighting" against them?

        Robots blowing up robots is (expensive) entertainment, not war.

      • Parent is wrong! (Score:5, Insightful)

        by jonaskoelker ( 922170 ) <(moc.oohay) (ta) (rekleoksanoj)> on Tuesday November 25, 2008 @03:51PM (#25891159)

        "Less risk to our troops" can translate into "we go into more wars"

        You don't like wars because people are killed. You're talking about potentially eliminating human casualties in any war.

        No he's not. He's talking about this:

        1. The USA having robots and Bumfukistan having people.
        2. Because the USA has robots and won't suffer (nearly any) casualties, they enter into more wars.
        3. Because they enter into more wars, more Bumfukistanis will get killed.
        4. The increase in the Bumfukistani body count is greater than the decrease in the USA body count.

        Robot wars (heh...) may lead to more lives lost on the battlefields. That's what parent is worried about.

        If the lives lost aren't American Lives, does it still matter?

        If this question seriously needs to be asked, this world is fucked.

  • by Nerdposeur ( 910128 ) on Tuesday November 25, 2008 @03:04PM (#25890427) Journal
    • If it malfunctions and kills a bunch of civilians or friendly soldiers, was the imperfect design/testing process unethical?
    • What if it has a security flaw that allows it to be taken over by the enemy?

    Just the first couple I can think of...
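
    On the second bullet, "taken over by the enemy" is at least a known engineering problem rather than an inevitability. A minimal sketch using Python's standard library (the key, message format, and function names are invented for illustration): a robot that refuses any order lacking a valid message authentication code cannot be commandeered just by someone reaching its radio link.

    import hashlib
    import hmac

    SHARED_KEY = b"field-key-rotated-often"  # hypothetical pre-shared key

    def sign(order: bytes, key: bytes = SHARED_KEY) -> bytes:
        """Compute an HMAC-SHA256 tag over the order."""
        return hmac.new(key, order, hashlib.sha256).digest()

    def execute_if_authentic(order: bytes, tag: bytes) -> bool:
        """Run the order only if the tag verifies; refuse otherwise."""
        if not hmac.compare_digest(sign(order), tag):
            return False  # forged or tampered order: do nothing
        print("executing:", order.decode())
        return True

    order = b"patrol sector 7"
    tag = sign(order)
    assert execute_if_authentic(order, tag)                 # genuine order runs
    assert not execute_if_authentic(b"fire at will", tag)   # forged order refused

    A real system would also need replay protection, key rotation, and tamper-resistant key storage; the point is only that an unauthenticated command link is a design choice, not a necessity.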

    • by zappepcs ( 820751 ) on Tuesday November 25, 2008 @03:12PM (#25890595) Journal

      Better than that: it will be quite a trick to keep the robots from coming back to camp laden with the robotic equivalent of a suicide bomb. There are just too many ways for this to go wrong; any 'ethical' thinking put into this is outweighed, first, by the unethical basis for war in the first place and, second, by the risks associated with sending machines to fight where a human is still the more complete information processor/weapon. UAVs are one thing, but we do not have robots that are capable of the same decisions as humans. That is both good and bad, and it means that humans will be fighting for quite a while yet.

      That said, there is much to be said for the Star Trek take on war: It should be messy, nasty, and full of foul stinking death and destruction lest we forget how much better peace is.

  • by rsborg ( 111459 ) on Tuesday November 25, 2008 @03:04PM (#25890431) Homepage
    From The Secret War of Lisa Simpson [wikipedia.org]

    The wars of the future will not be fought on the battlefield or at sea. They will be fought in space, or possibly on top of a very tall mountain. In either case, most of the actual fighting will be done by small robots. And as you go forth today remember always your duty is clear: To build and maintain those robots.

  • Humane wars (Score:5, Insightful)

    by digitalhermit ( 113459 ) on Tuesday November 25, 2008 @03:07PM (#25890501) Homepage

    Automated killing machines were banned at the Geneva Convention. This is generally a good thing when we're sending real, live humans (versus the walking undead) to fight our wars. It would be completely inhumane (haha) and tilt the outcome of a war towards those who can afford to develop such technology. That is, if one country can afford killer robots and another can't, then the former has no deterrent to invading the latter.

    But imagine if all wars were fought by proxy. Instead of sending people, we send machines. Let the machines battle it out. To be really civil we should also limit the power and effectiveness of our killer robots, and the number of machines that can enter the battlefield at once. Of course, at some point every country will be able to build to the maximum effective specification. At that point it will be a battle of strategy. The next obvious step is to do away with the machines entirely and just get a chessboard.

    Whoever wins gets declared the winner.

    Makes perfect sense.

    Thanks for reading,
    M B Dyson

    CyberDyne Systems

    • Re: (Score:3, Informative)

      I remember that episode of Star Trek TOS, A Taste of Armageddon [startrek.com]. Or perhaps the future you describe would be more like the movie Robot Jox [imdb.com].
    • by Microlith ( 54737 ) on Tuesday November 25, 2008 @03:17PM (#25890659)

      How about we just go all the way and have computer simulated battles. When the damage and casualty reports come in we can just have people in those areas report for termination and dynamite the areas affected.

      In other news, a ship in orbit was just marked as destroyed. Its representatives will be disposed of and as soon as the rest come down they will be disposed of as well.

    • Re:Humane wars (Score:4, Insightful)

      by blhack ( 921171 ) on Tuesday November 25, 2008 @03:19PM (#25890689)

      It would be completely inhumane (haha) and tilt the outcome of a war towards those who can afford to develop such technology.

      Hmmm...an interesting debate.

      What, then, is your opinion of missiles with guidance? Or active terrain avoidance? Is it the fact that these things are on the ground that bothers you?

      How about UAV bombers?
      At what point does something go from being a "smart bomb" to a "killer robot"?

    • Re: (Score:3, Insightful)

      by nasor ( 690345 )

      It would be completely inhumane (haha) and tilt the outcome of a war towards those who can afford to develop such technology. That is, if one country can afford killer robots and another can't, then the former has no deterrent to invading the latter.

      As opposed to when one side can afford to put its soldiers in tanks, and the other can't?

    • Re:Humane wars (Score:4, Insightful)

      by jahudabudy ( 714731 ) on Tuesday November 25, 2008 @03:23PM (#25890761)
      But imagine if all wars were fought by proxy. Instead of sending people, we send machines. Let the machines battle it out.

      And when the side whose machines lose doesn't accept that decision? Sooner or later, someone will decide that winning is more important than playing by the rules (I'm guessing sooner). They will then continue the war until unable to, not until told they have lost.

      It's a cool idea, but I doubt it will ever be practical. Even if technology progresses to the point where it is simply suicide to send men against the victorious robot army, humans being humans, people still will.
    • Re: (Score:3, Funny)

      by Abreu ( 173023 )

      Automated killing machines were banned at the Geneva convention. This is generally a good thing when we're sending real, live humans (versus the walking undead) to fight our wars.

      I don't care what the Geneva convention says!

      As soon as my ritual circle is completed, the dead will rise from their graves and destroooy yooouu! And then your dead soldiers will rise again and take up arms against their former companions!!! THE WORLD WILL BE MINE!!! MUAHAHAHA!!!

      Sorry... couldn't help myself...

  • by subreality ( 157447 ) on Tuesday November 25, 2008 @03:12PM (#25890599)

    Personally, I think this is a response to the problems of being the established army fighting a guerrilla force. The way guerrillas succeed is by driving the invading army slowly crazy by making them live in constant fear (out of self-preservation), until they start lashing out in fear (killing innocents, and recruiting new guerrillas en masse). The same goes for treating noncombatants with dignity and respect: doing so makes the occupying force less hated, so the noncombatants won't be as willing to support the guerrillas.

    So in short, to me this sounds like trying to win, not ethics.

  • by gelfling ( 6534 ) on Tuesday November 25, 2008 @03:16PM (#25890655) Homepage Journal

    It's the only way to be sure.

  • by hellfire ( 86129 ) <deviladv@gmaRABBITil.com minus herbivore> on Tuesday November 25, 2008 @03:18PM (#25890685) Homepage

    To paraphrase my favorite movie of 1986 [imdb.com]:

    It's a machine, Ronald. It doesn't get pissed off, it doesn't get happy, it doesn't get sad, it doesn't laugh at your jokes... IT JUST RUNS PROGRAMS!

    Ronald's premise makes two key assumptions which are deeply flawed:

    1) It's entirely the human soldier's fault that he's unethical.
    2) The person directly in charge of putting the robot to work is entirely ethical.

    I posit that the soldiers in Iraq haven't been trained to deal with a situation like this properly. The fact that 17 percent of US soldiers in Iraq think all people should be treated as insurgents is more reflective of poor education on the US military's part. The US military prides itself on having its soldiers think as one unit, and 17 percent is a very high discrepancy that they have failed to take care of, mostly because there are plenty in the leadership who think that way themselves. Treating everyone they come across as an insurgent and not treating them in the proper manner is a great way to "lose the war" by not having the trust of the people you are trying to protect.

    It's that same leadership who'd program a robot like this to patrol our borders and think it's perfectly ethical to shoot any human on sight crossing the border illegally, or treat every citizen as an insurgent, all in the name of "security."

    Besides, a robot is completely without compassion. A properly trained human has the ability to appear compassionate and yet treat the situation skeptically until they know for sure the target is or is not a threat.

    This is not a problem that can be solved with technology. The concept is a great project and hopefully will be a wonderful step forward in AI development, but at no point will it solve any "ethical" problem in terms of making war "more ethical."

  • Two words (Score:4, Funny)

    by kalirion ( 728907 ) on Tuesday November 25, 2008 @03:22PM (#25890743)

    Fatal Error.

  • by HW_Hack ( 1031622 ) on Tuesday November 25, 2008 @03:23PM (#25890763)

    Dark alley in a city battlefield

    Robot "You have 5 seconds to drop your weapon"

    The soldier's weapon clatters to the ground

    Robot "You have 4 seconds to drop your weapon"

    Robot "The United States will treat you fairly"

    Robot "You have 3 seconds to drop your weapon"

    Soldier "What do you fucking want !!!"

    Robot "I am authorized to terminate you under the Autonomous Artificial Battlefield Soldier Act of 2011."

    Sound of running footsteps and burst of weapons fire.

    Robot encoded data transmission

  • Clippy? (Score:5, Funny)

    by seven of five ( 578993 ) on Tuesday November 25, 2008 @03:23PM (#25890771)
    I see you're trying to attack an insurgent stronghold.
    Would you like me to:
    1. Call in airstrike
    2. Fire machinegun
    3. Wave white flag
  • Their one weakness (Score:5, Insightful)

    by philspear ( 1142299 ) on Tuesday November 25, 2008 @03:26PM (#25890805)

    They'll be a cinch to defeat. You see, Killbots have a preset kill limit. Knowing their weakness, we can send wave after wave of our own men at them, until they reach their limit and shut down.

    -Zapp Brannigan

  • by Ukab the Great ( 87152 ) on Tuesday November 25, 2008 @03:28PM (#25890829)

    "Every attempt to make war easy and safe will result in humiliation and disaster"--William Tecumseh Sherman

  • Ahem... (Score:5, Insightful)

    by shellster_dude ( 1261444 ) on Tuesday November 25, 2008 @03:36PM (#25890935)
    I take serious issue with the part of the article where they mention that 17 percent of Marines who toured Iraq believe all civilians should be treated as insurgents. Of course you treat everyone like a potential insurgent in an urban combat environment; otherwise you will end up dead. That says nothing about ethical views or the proper treatment of people in general. SWAT teams are taught to consider everyone a potential terrorist when they are attempting a hostage rescue. That means they never take for granted that the apparent "hostage" is indeed a hostage. It keeps people safe.
  • On the contrary (Score:3, Insightful)

    by br00tus ( 528477 ) on Tuesday November 25, 2008 @03:39PM (#25890983)

    On the contrary, during and prior to World War II, many enlisted men wouldn't even fire their weapons at enemy troops. And towards the end of World War I, soldiers in several European armies turned their guns on their own officers en masse (the French Nivelle mutinies, the German naval mutinies, the Russian mutinies and the soldier and worker councils).

    After World War II, army psychologists discovered how many men were not firing their guns at enemy soldiers and worked via various means to increase that percentage, which they did in Korea, and even more so in Vietnam.

    I don't see Russian soldiers, as that old song goes, "shooting the generals on their own side" if they feel a war is wrong. As I said before, the resistance to killing resides in the enlisted men; from the low-level brass on up there is much less concern about it. The US has purposefully and consciously targeted noncombatant civilians in every major war it has ever fought, but stating as much is a danger to the machine of empire, so it becomes something one can't say. When it is done publicly and undeniably, as at Hiroshima and Nagasaki, it gets rationalized; but it has happened before and since.

  • Retarded (Score:3, Insightful)

    by sexconker ( 1179573 ) on Tuesday November 25, 2008 @03:43PM (#25891047)

    No self-preservation?

    Yeah, who cares if our billion-dollar terminator squad is destroyed, or captured and used against us.

    No anger? That's an emotion, so sure. No recklessness? You're gonna lose the war if you aren't willing to charge ahead blindly, pull a crazy Ivan, or, in general, break a few eggs for your delicious victory omelet.

    Scenario fulfillment?
    So our robots will evaluate the situation based on what they observe and know. They won't be acting out the battle plan as described because they don't have the whole picture and have seen some things that don't logically fit. Awesome! No more gambits, pincer attacks, bluffs, etc. Those things were too complicated anyway.

    Why should noncombatants be treated with dignity and respect by default (and hence, as a whole)?

    They typically don't treat our soldiers with dignity or respect, they serve as a political road block for troops and make their jobs harder and more dangerous, they house and support the combatants, and they often become combatants.

    Why should ANY group be treated with dignity and respect by default? Seems to me, you used to have to EARN respect, and dignity was a character trait.

    But go ahead, build your pussybot army.

    • Re:Retarded (Score:4, Insightful)

      by Chirs ( 87576 ) on Tuesday November 25, 2008 @04:36PM (#25891873)

      "Why should noncombatants be treated with dignity and respect by default (and hence, as a whole)?"

      Because they're _non_ combatants. By definition, they're not fighting.

      If you treat the population as a whole as though they're combatants, what incentive do they have to remain noncombatant? If you treat them like human beings, maybe they'll decide that you're better than the combatants and side with you...

      "Why should ANY group be treated with dignity and respect by default? Seems to me, you used to have to EARN respect, and dignity was a character trait."

      Wow. I'm stunned, really. Think that through for a minute--if everyone was disrespectful towards anyone they didn't already respect, how would anyone earn additional respect?

      Game theory bears this out: in Axelrod's iterated prisoner's dilemma tournaments, the most successful strategy was tit-for-tat, which assumes the best of the other party at first and only defects after the other party has defected.
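
      As a toy illustration (a minimal sketch assuming the standard Axelrod payoff values; every name in it is mine, not from any library), here is tit-for-tat against always-defect in an iterated prisoner's dilemma:

      PAYOFF = {
          ("C", "C"): (3, 3),  # mutual cooperation
          ("C", "D"): (0, 5),  # sucker's payoff vs. temptation
          ("D", "C"): (5, 0),
          ("D", "D"): (1, 1),  # mutual defection
      }

      def tit_for_tat(my_history, their_history):
          # Cooperate first; afterwards mirror the opponent's last move.
          return "C" if not their_history else their_history[-1]

      def always_defect(my_history, their_history):
          return "D"

      def play(strategy_a, strategy_b, rounds=20):
          hist_a, hist_b = [], []
          score_a = score_b = 0
          for _ in range(rounds):
              move_a = strategy_a(hist_a, hist_b)
              move_b = strategy_b(hist_b, hist_a)
              pay_a, pay_b = PAYOFF[(move_a, move_b)]
              score_a, score_b = score_a + pay_a, score_b + pay_b
              hist_a.append(move_a)
              hist_b.append(move_b)
          return score_a, score_b

      print(play(tit_for_tat, tit_for_tat))    # (60, 60): cooperation pays
      print(play(tit_for_tat, always_defect))  # (19, 24): burned once, then retaliates

      Tit-for-tat gets burned exactly once against the pure defector and then stops feeding it, while against another cooperator it collects the full cooperative payoff: default respect, cheap retaliation.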

  • Truth (Score:4, Insightful)

    by kenp2002 ( 545495 ) on Tuesday November 25, 2008 @03:53PM (#25891193) Homepage Journal

    War is hell.
    War is ugly.
    War is dirty.
    War is painful for the victor.
    War is devastating for the loser.
    War is an act of hate.
    War is an act of desperation.
    War is that which results from a lack of options.
    War is fought for land, resources, women, gods, and pride.
    War is the last desperate act when all other options fail and there is no time to think of any new options.
    No one desires war, but many choose to profit from it.
    War is inevitable so long as we want for things.
    When you take away the horrors of war you no longer have war; you have a professional sport.

    Now I ask you: if machines are sent to war against men, or against other robots, is it still a war?

    "Inspired by Ender's Game"

  • by Geoffrey.landis ( 926948 ) on Tuesday November 25, 2008 @03:57PM (#25891251) Homepage

    "and they can be made invulnerable to... "scenario fulfillment," which causes people to absorb new information more easily if it agrees with their pre-existing ideas."

    Bullsh!t.

    For a robotic soldier, ignoring information that conflicts with its worldview would most likely be built right into the system.
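
    A minimal sketch of what that could look like (purely hypothetical; the names and thresholds are invented, not from any real targeting system): a fusion gate that absorbs confirming sensor reports at any confidence, but demands near-certainty before a contradicting report can relabel a track, has "scenario fulfillment" hard-coded:

    from dataclasses import dataclass

    @dataclass
    class Report:
        label: str         # e.g. "hostile" or "civilian"
        confidence: float  # classifier confidence in [0, 1]

    def update_track(current_label, report, override_threshold=0.95):
        # Confirming data is absorbed at any confidence; contradicting
        # data is discarded unless it is nearly certain. That asymmetry
        # *is* the bias the researchers claim robots won't have.
        if report.label == current_label:
            return current_label
        if report.confidence >= override_threshold:
            return report.label
        return current_label

    track = "hostile"
    for r in [Report("civilian", 0.90), Report("civilian", 0.90),
              Report("hostile", 0.20)]:
        track = update_track(track, r)
    print(track)  # still "hostile": two strong civilian reports were ignored

    Two strong "civilian" reports get thrown away while one weak "hostile" report sails through. The asymmetry is the bias.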

  • by bagsc ( 254194 ) on Wednesday November 26, 2008 @03:16AM (#25897057) Journal

    And we know that we still haven't got it all figured out yet. But you think you can write an algorithm to figure it out?

    I was blocking a highway in Baghdad, waiting for the bomb squad to dispose of a bomb on the highway, and we were preventing anyone from getting close to it. It took the bomb squad forever, and it got dark. A vehicle drove straight at us, at maybe 90 miles per hour. That is exactly what suicide car bombers do, and they are the biggest danger to American personnel. You have to shoot the driver, or they will ram you, and there's a 95% chance that you and everyone around you will die.

    Having about two seconds to either stop the vehicle, shoot the driver, or die, I had my buddy turn on the lights. The driver slammed on the brakes, skidded to a stop maybe 200 meters from us, threw it in reverse, and got the hell out of there.

    I knew he just saw a wide open highway, and wanted to see how fast he could go. At that speed, he couldn't have seen us in the twilight. The algorithm would have said to shoot him. He's alive because I'm a human.
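
    To make that concrete, here is a purely hypothetical sketch (invented thresholds and names, not any real ROE): an engagement rule keyed only on time-to-contact fires on the joyrider, while a rule that spends part of the same window on a warning and then re-checks gives him the chance to brake:

    def time_to_contact(distance_m, closing_speed_ms):
        # Seconds until the vehicle reaches the checkpoint.
        return distance_m / closing_speed_ms if closing_speed_ms > 0 else float("inf")

    def naive_rule(distance_m, closing_speed_ms):
        # ~90 mph is ~40 m/s; at 300 m that's 7.5 s to contact.
        return "shoot" if time_to_contact(distance_m, closing_speed_ms) < 10 else "hold"

    def warn_first_rule(distance_m, closing_speed_ms, braked_after_warning):
        if time_to_contact(distance_m, closing_speed_ms) >= 10:
            return "hold"
        # Burn a second of the window on lights/siren, then re-observe:
        # a car bomber keeps accelerating, a joyrider slams on the brakes.
        return "hold" if braked_after_warning else "shoot"

    print(naive_rule(300, 40))                                   # shoot
    print(warn_first_rule(300, 40, braked_after_warning=True))   # hold

    Both rules see exactly the same physics; the difference is one extra observation the naive rule was never written to ask for.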

  • Some sales pitch (Score:4, Insightful)

    by tinkerton ( 199273 ) on Wednesday November 26, 2008 @04:45AM (#25897439)

    What do the opinions of soldiers have to do with this? When policy is translated into the indoctrination that it's better to kill 50 random 'other' people than to run the risk that one of your own might be harmed, then there is no respect. And the article serves the myth that the problems are caused by soldiers not adhering to army policies.

    Intelligent robots could indeed shift the balance, because you can sacrifice them more easily, and it's even good business to do so. But on the other hand, killing by remote is easier than killing in person (well, for most), and it also becomes easier to keep people at home completely oblivious to what's happening in the war.

    So there will be interest: good business, more control over information, and fewer killed in your own camp. That sums up the morality.
