Robotics / The Military

Ethical Killing Machines 785

Posted by kdawson
from the i-for-one-welcome dept.
ubermiester writes "The New York Times reports on research to develop autonomous battlefield robots that would 'behave more ethically in the battlefield than humans.' The researchers claim that these real-life terminators 'can be designed without an instinct for self-preservation and, as a result, no tendency to lash out in fear. They can be built without anger or recklessness ... and they can be made invulnerable to ... "scenario fulfillment," which causes people to absorb new information more easily if it agrees with their pre-existing ideas.' Based on a recent report stating that 'fewer than half of soldiers and marines serving in Iraq said that noncombatants should be treated with dignity and respect, and 17 percent said all civilians should be treated as insurgents,' this might not be all that dumb an idea."
This discussion has been archived. No new comments can be posted.

  • by slashnot007 (576103) on Tuesday November 25, 2008 @04:11PM (#25890559)

An old cartoon had a series of panels. The first panel showed a caveman picking up a rock, saying "safe forever from the fist". The next panel is a man inventing a spear, saying "safe forever from the rock". And so on: swords, bows and arrows, catapults, guns, bombs... well, you get the idea.

On the other hand, the evolution of those weapons coincided with the evolution of society. For example, you had to have an organized civil society to gather the resources to make a machine gun. (Who mines the ore for the metal? Who feeds the miners? Who loans the money for the mine?)

It's a bit of a chicken-and-egg question which drives which these days, but certainly early on, mutual defense did promote societal organization.

So "safe forever from the angry soldier" is the next step. It's already happened in some ways with the drone, so it's not as big an ethical step to replace the foot soldier, and given the deliberateness with which drones are used compared to the dump-and-run of WWII bombing, one can credibly argue they can be used ethically.

On the other hand, war has changed a bit. The US no longer tries to "seize lands" militarily to expand the nation (it expands economically instead). (Russia and China are perhaps the exceptions.) These days it's more a job of fucking up nations we think are screwing with us. E.g. Afghanistan.

Now imagine the next war where a bunch of these things get dropped into an asymmetrical situation. Maybe even a hostage situation on an oil tanker off Somalia.

It's really going to change the dynamic, I think, when the "enemy" can't even threaten you. Sure, it could be expensive, but it totally deprives the enemy of the incentive of revenge for perceived injustice.

    On the other hand it might make the decision to attack easier.

  • Re:Ethical vs Moral (Score:5, Interesting)

    by vishbar (862440) on Tuesday November 25, 2008 @04:11PM (#25890581)

"Ethics" is such a poorly defined term... hell, different cultures have different definitions of the term. In feudal Japan, it was ethical to give your opponent the chance for suicide... today, many Westerners would in fact argue the opposite: the ethical thing to do is prevent a human from committing suicide, as that's seen as a symptom of mental illness.

    I've always defined "morality" as the way one treats oneself and "ethics" as the way one treats others. It's possible to be ethical without being moral--for example, I'd consider a person who spends thousands of dollars on charity just to get laid to be acting ethically but immorally. By that definition, the hullabaloo at Guantanamo would certainly be both immoral and unethical--not only were they treated inhumanely, but it was done against international law and against the so-called "rules of war".

    These robots would have to be programmed with certain specific directives: for example, "Don't take any actions which may harm civilians", "take actions against captured enemy soldiers which would cause the least amount of foreseeable pain", etc. Is this good? Could be... soldiers tend to have things like rage, fear, and paranoia. But it could lead to glitches too... I wouldn't want to be on the battlefield with the 1.0 version. Something like Asimov's 3 Laws would have to be constructed, some guiding principle... the difficulty will be ironing out all the loopholes.

  • by zappepcs (820751) on Tuesday November 25, 2008 @04:12PM (#25890595) Journal

    Better than that. It will be quite a trick to keep the robots from coming back to camp laden with the robotic equivalent of a suicide bomb. There are just so many ways for this to go wrong that any 'ethical' thinking put into this is outweighed first by the unethical basis for war in the first place, and second by the risks of sending machines to fight where a human is still the more complete information processor/weapon. UAVs are one thing, but we do not have robots that are capable of making the same decisions as humans. That is both good and bad, and it means that humans will be fighting for quite a while yet.

    That said, there is much to be said for the Star Trek take on war: It should be messy, nasty, and full of foul stinking death and destruction lest we forget how much better peace is.

  • Missed A Step?? (Score:2, Interesting)

    by tripdizzle (1386273) on Tuesday November 25, 2008 @04:14PM (#25890621)
    If we can create things like this, why haven't we previously had robots on the battlefield controlled by soldiers, working like an FPS? I can understand that there would be a lag issue (haha), but you'd think they would have tried this before going to a completely automated system. As for the fear aspect, I think a soldier would have less of an itchy trigger finger if it was a replaceable robot on the line rather than his own life.
  • by hellfire (86129) <deviladv @ g m a i l . c om> on Tuesday November 25, 2008 @04:18PM (#25890685) Homepage

    To paraphrase my favorite movie of 1986 [imdb.com]:

    It's a machine, Ronald. It doesn't get pissed off, it doesn't get happy, it doesn't get sad, it doesn't laugh at your jokes... IT JUST RUNS PROGRAMS!

    Ronald's premise makes two key assumptions which are deeply flawed:

    1) It's entirely the human soldier's fault that he's unethical.
    2) The person directly in charge of putting the robot to work is entirely ethical.

    I posit that the soldiers in Iraq haven't been trained to deal with a situation like this properly. The fact that 17 percent of US soldiers in Iraq think all people should be treated as insurgents is more reflective of poor education on the US military's part. The US military prides itself on having its soldiers think as one unit, and 17 percent is a very high discrepancy that they have failed to take care of, mostly because there are plenty in the leadership who think that way themselves. Treating everyone they come across as an insurgent, and not treating them in the proper manner, is a great way to "lose the war" by not having the trust of the people you are trying to protect.

    It's that same leadership who'd program a robot like this to patrol our borders and think it's perfectly ethical to shoot any human on sight crossing the border illegally, or treat every citizen as an insurgent, all in the name of "security."

    Besides, a robot is completely without compassion. A properly trained human has the ability to appear compassionate and yet treat the situation skeptically until they know for sure whether the target is a threat.

    This is not a problem that can be solved with technology. The concept is a great project and hopefully will be a wonderful step forward in AI development, but at no point will it solve any "ethical" problem in terms of making war "more ethical."

  • Re:Interesting... (Score:3, Interesting)

    by qoncept (599709) on Tuesday November 25, 2008 @04:20PM (#25890701) Homepage
    That's some pretty flawed logic. Should doctors working to cure lung cancer stop because a cure for lung cancer would make it safer to smoke?

    "Less risk to our troops" can translate into "we go into more wars" which is something I don't support... wars benefit companies, and lead to the death

    Read that again. You don't like wars because people are killed. You're talking about potentially eliminating human casualties in any war. That means the only remaining "problem" (in your scenario) is that they benefit companies.
  • Re:Ethical vs Moral (Score:5, Interesting)

    by spiffmastercow (1001386) on Tuesday November 25, 2008 @04:38PM (#25890961)
    Actually (according to every philosophy book I've ever read), morals are codes of conduct, and ethics is the more ethereal "right and wrong" concept. The problem is that 'ethics' has been watered down to mean 'morals' because 'business ethics', etc. roll off the tongue more easily than 'business morals'.
  • Re:Ethical vs Moral (Score:4, Interesting)

    by fluxrad (125130) on Tuesday November 25, 2008 @04:51PM (#25891161) Homepage

    I looked it up last year and "ethics" and "morals" were two separate things.

    You should be looking less at Wikipedia and more at Plato, Aquinas, and Kant. In truth, ethics is a subject that one has to study in order to understand the definition thereof.

  • by HolyCrapSCOsux (700114) on Tuesday November 25, 2008 @04:52PM (#25891169)
    Let me help you out

    It's a whole lot easier to shoot those you disagree with than to a) figure out what motivates a group of disparate individuals, and b) exploit those motivations.

    This is why we have wars and need killbots.
  • by CannonballHead (842625) on Tuesday November 25, 2008 @04:57PM (#25891245)

    Exactly. Robots, machines, whatever you want to call them, do not have ANY moral substance. Humans do. Humans may refuse to do certain things (or may not). Machines won't refuse.

    Bottom line is... I'd rather be up against a human maniac I could try to convince than a robot programmed to ACT like a maniac. One still has a rational (hopefully) thought process somewhere in there, and has a moral element. The other can't think and has no moral element whatsoever (partially, of course, due to not being able to have a rational thought in the first place).

    There's a reason that "mind control" scares so many people. Total "mind control" is what you have over machines, is it not?

  • by KDR_11k (778916) on Tuesday November 25, 2008 @05:18PM (#25891617)

    The robots do what all the people involved in their development and deployment told them, not just what the last order was. If the robot was programmed to avoid bad orders like that (soldiers are told to avoid them, after all) it could very well refuse an illegal order without violating the "robots do what they are told" rule.

  • by wilder_card (774631) on Tuesday November 25, 2008 @05:29PM (#25891759)

    If you think human police or soldiers are bad, just wait until a whole army of robots malfunctions. Hasn't anybody read "With Folded Hands", "Fondly Fahrenheit", or any of Fred Saberhagen's Berserker novels? This is a bad bad bad bad idea.

    The more sophisticated you make a robot or computer, the more its failure modes resemble organic dysfunction -- until they're indistinguishable from insanity.

  • Re:Ethical vs Moral (Score:2, Interesting)

    by ENIGMAwastaken (932558) on Tuesday November 25, 2008 @05:40PM (#25891937)
    This is correct. Really, people generally use the terms interchangeably but "ethics" is the more philosophical term, "morals" the more practical one. They have slightly different connotations too. Moral means essentially 'good', whereas 'ethical' seems to just mean 'not wrong.'
  • by Evanisincontrol (830057) on Tuesday November 25, 2008 @05:45PM (#25892019)

    AHA! So! How is this any different than humans?

    Actually, from a psychology angle, it's substantially different. It has been shown many times that humans are psychologically capable of stretching their moral limits further when they can distance themselves from the action (If you want me to get a citation, I'll go get one -- I'm just too lazy to get it right now).

    This is easy to see even without evidence. If you were forced to choose, would you rather push a button that drops a bomb on a village full of children 1000 miles away, or be in a plane and drop the bomb yourself? The two actions have identical results, yet distancing yourself from the action makes it easier to justify the moral consequences.

    This is why military leaders are able to stay sane. It's possible (though not easy) to give orders that will directly result in the death of thousands of people. However, if a war general had to shoot thousands of people himself, I suspect it would start to wear down on his psychological health.

    Now consider that you're a military general who simply has to push a button, and this button tells your robot to take over a village. It's very, very easy to rationalize that any casualties are not your fault, since all you were doing was pushing that button.

  • Re:Ethical vs Moral (Score:1, Interesting)

    by Anonymous Coward on Tuesday November 25, 2008 @05:57PM (#25892183)

    Then again, in Habermas' terminology, ethics concerns a person's own life, how to make it good and fulfilling, whereas morality concerns general justice and questions of the "right or wrong" sort that transcend an individual's personal goals. It's all just terminology, and you can define it as you like.

  • by Anonymous Coward on Tuesday November 25, 2008 @06:06PM (#25892327)

    In addition, soldiers are trained not to think, they're trained to follow orders.

    This is a common misconception among those who have never served in the military, although I suppose it could be accurate for the militaries of some nations. In the US military, soldiers are trained to follow lawful orders. This distinction is important within the military culture; blind obedience is not a desired outcome of the US military's training. Whether any given soldier is capable of discerning an unlawful order, or is willing to disobey an unlawful order, is a different matter. In most cases, unlawful orders are reasonably obvious, such as targeting unarmed civilians, torturing prisoners, and so on (yes, there's the nasty debate over the definition of torture, but that's tangential to my point). As I understand it, this view generally holds for North American, European, and many other militaries, and some put it into practice better than others. Of course, that doesn't mean such incidents don't occur. My preference would be that the US military would investigate and prosecute "unlawful order" offenses more aggressively than it actually does.

    As an example, when I was in basic training we were specifically instructed that the .50 caliber machine gun could not be used against personnel; it was to be used only against equipment (Jeeps, APCs, mortars, helicopters), structures (gun emplacements, doors, ammo dumps), and so on. IIRC, this was a Geneva Convention restriction so I assume it's still the case, but I've been out for a long time. So, if an NCO or officer orders the .50 cal. crew to fire on enemy troops, the soldier must respectfully refuse the order. Of course, many drill sergeants introduce their trainees to the Army's rather dark humor, which includes the idea that an enemy soldier's belt, helmet, radio, equipment harness, and rifle are all "equipment", with the obvious consequences of targeting it as such.

    - T

  • by Moryath (553296) on Tuesday November 25, 2008 @08:00PM (#25893711)

    You do not like the quote by H.L. Mencken? This is why I had originally put my sig on hiatus till after the election: too many wingnuts like yourself attacking me for even having one.

  • by mikaere (748605) on Tuesday November 25, 2008 @08:34PM (#25894031)
    Well, actually, it did save some. I read an account where a different platoon came across the atrocity in action and actually defended some of the villagers.
  • Re:Ethical vs Moral (Score:3, Interesting)

    by Binty (1411197) on Tuesday November 25, 2008 @08:42PM (#25894095)

    I've gotta disagree. "Ethics" and "morals" mean basically the same thing, although there can be a distinction depending on the context of discussion. For example, I'm a law student, and we study legal ethics, not legal morals.

    I think the confusion comes from etymology. "Ethic" is Greek in origin and "moral" is Latin. According to this online etymological dictionary [etymonline.com], ethic and moral were actually translated into each other by the ancients.

  • Re:Ethical vs Moral (Score:5, Interesting)

    by a whoabot (706122) on Tuesday November 25, 2008 @09:26PM (#25894473)

    I disagree as well.

    Your theory that morals are culturally-based or a standardized code of conduct is just one theory of ethics -- constructivism or something like that. Ethics as a branch of philosophy is also called moral philosophy. Ethics in this sense is the study of the status of morality, of morals.

    Pretty much in every modern piece of ethics, "moral" and "ethical" are synonyms. In Hume and Hume scholarship, though, "moral" frequently means "absolute," as in "moral certainty," as well as meaning the regular "ethical." For example, go to the end of Mackie's "Subjectivity of Values" (this is a very popular paper in ethics). Here he uses the terms as synonyms in the same sentence: "...the central ethical concepts of Plato or Aristotle are in a broad sense prescriptive...they show that their moral thought is an objectification of the desired and the satisfying." So moral thought is thought dealing with ethical concepts -- just as anyone would say that moral thought is thought dealing with moral concepts. And here he is saying that when Plato or Aristotle use ethical concepts, that is, speak of good things or actions, they actually are referring to things or actions that are connected to their subjective feelings of pleasure, but they objectify these things in language as "good." So Mackie is saying that ethical concepts are not universal at all, but are merely expressions of the desires of certain people -- this is precisely what you denied. Notice also that Mackie says that the ethical concepts are prescriptive -- which was the second feature you ascribed to morals as distinguishing them from the matter of ethics.

    TL;DR:

    If you ask a philosophy professor what Kant's moral theory was, he'll tell you about the Categorical Imperative. If you ask a philosophy professor what Kant's ethical theory was, he'll tell you about the Categorical Imperative. The word is just not regularly distinguished in the topic of ethics. The reason "ethically" was originally used in this story was probably for the simple reason that "more morally" sounds weird in its phonemic repetition.

  • by Opyros (1153335) on Tuesday November 25, 2008 @10:22PM (#25894897) Journal
    Yep. This [cnn.com] is a 10-year-old news story of a ceremony at which they received a medal for it — their names being Hugh Thompson, Lawrence Colburn, and Glenn Andreotta. The last named was killed only a week after his heroic act.
