Soldiers Bond With Bots, Take Them Fishing

HarryCaul writes "Soldiers are finding themselves becoming more and more attached to their robotic helpers. During one test of a mine clearing robot, 'every time it found a mine, blew it up and lost a limb, it picked itself up and readjusted to move forward on its remaining legs, continuing to clear a path through the minefield.' The man in charge halted the test, though - 'He just could not stand the pathos of watching the burned, scarred and crippled machine drag itself forward on its last leg. This test, he charged, was inhumane.' Sometimes the soldiers even take their metallic companions fishing. Is there more sympathy for Robot Rights than previously suspected?"
  • by TodMinuit ( 1026042 ) <todminuitNO@SPAMgmail.com> on Tuesday May 08, 2007 @01:20PM (#19039055)
    Good thing a robot isn't a human.
    • by value_added ( 719364 ) on Tuesday May 08, 2007 @01:24PM (#19039123)
      My advice would be to stop anthropomorphising robots. They don't like it.
    • Re: (Score:2, Insightful)

      by cdrdude ( 904978 )
      You can't argue that "inhumane" is the wrong term simply because robots aren't human; we use the same term for mistreating animals. The difference lies in the fact that animals, like humans but unlike robots, can feel pain.
      • Re: (Score:3, Interesting)

        by Applekid ( 993327 )
        Obligatory "Why was I built to feel pain?"

        Seriously, though, perhaps it'd be beneficial to equip robots with sensors and constraints which would let them feel "pain". Kind of like how if you try to overextend your arm you'll feel pain in the shoulder. It could become a self-limiting mechanism.

        (As opposed to hard-coding the limits? I dunno. Humans have some hard-coded limits from the structure of bones and placement of muscles, but other limits aren't hard-coded.)
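
        To make that concrete, here's a toy sketch of "pain" as a soft self-limiting mechanism rather than a hard-coded cutoff (Python; every class, threshold and number here is made up for illustration, not any real robot's control code):

        class Joint:
            def __init__(self, soft_limit: float, hard_limit: float):
                self.soft_limit = soft_limit  # strain where "pain" starts to register
                self.hard_limit = hard_limit  # strain where real damage would occur
                self.strain = 0.0

            def pain(self) -> float:
                """Graded signal: 0 below the soft limit, rising to 1 at the hard limit."""
                if self.strain <= self.soft_limit:
                    return 0.0
                span = self.hard_limit - self.soft_limit
                return min(1.0, (self.strain - self.soft_limit) / span)

            def apply_torque(self, requested: float) -> float:
                """Back off effort as pain rises, instead of clipping at a fixed bound."""
                applied = requested * (1.0 - self.pain())
                self.strain = min(self.hard_limit, self.strain + 0.1 * applied)
                return applied

        shoulder = Joint(soft_limit=0.6, hard_limit=1.0)
        for step in range(10):
            applied = shoulder.apply_torque(1.0)
            print(f"step {step}: strain={shoulder.strain:.2f} "
                  f"pain={shoulder.pain():.2f} applied={applied:.2f}")

        The point is the graded response: the joint throttles itself more and more as it approaches its limit, which is closer to "pain" than to a hard-coded stop.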
        • by fataugie ( 89032 ) on Tuesday May 08, 2007 @01:42PM (#19039439) Homepage
          Not so sure this is a good idea... the last thing I want is an overdeveloped toaster oven pissing and moaning about doing work.

          Really, would you want C3PO as a work companion?

          Bitch bitch bitch bitch bitch
        • Obligatory "Why was I built to feel pain?"

          Because your genes are more likely to propagate if you can recognize, react to, and avoid damage.

          In the case of robot designs, they are more likely to propagate if the robot can complete its missions and/or operate at the highest performance/price ratio. The ability for a robot to "feel pain" is only useful if it adds to the primary metrics of success.
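
          In toy form (all numbers invented for illustration), that selection pressure is just a ratio, and "pain" circuitry only survives the design process if it moves the ratio:

          def fitness(mission_success_rate: float, unit_cost: float) -> float:
              # A design "propagates" on successes per dollar.
              return mission_success_rate / unit_cost

          baseline = fitness(mission_success_rate=0.70, unit_cost=100_000)
          # Hypothetically, pain sensing costs more but prevents some self-destruction:
          with_pain = fitness(mission_success_rate=0.85, unit_cost=120_000)

          print(f"baseline:  {baseline:.2e} successes per dollar")
          print(f"with pain: {with_pain:.2e} successes per dollar")
          print("pain sensing earns its keep" if with_pain > baseline
                else "pain sensing gets designed out")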
          • Re: (Score:3, Insightful)

            by karnal ( 22275 )
            One thing that no one has brought up in this thread is that it is OK to feel pain. Fearing pain, however, will typically alter your course of action.

            Just because we could make a robot feel pain, doesn't mean it will necessarily fear it like most humans do.
            • by sxltrex ( 198448 ) on Tuesday May 08, 2007 @03:32PM (#19041383)
              Human beings feeling pain isn't just OK, it's a critical requirement. This story [cnn.com] relates the experiences of a family dealing with a child who, due to a rare genetic disorder, is unable to feel pain.

              Imagine not having any stimulus to tell you that putting your hand in front of a blow torch is a bad idea. Not accidentally killing yourself becomes a bit of a challenge. Pain is an excellent instructional tool.
              • by greenbird ( 859670 ) on Tuesday May 08, 2007 @04:29PM (#19042371)

                Imagine not having any stimulus to tell you that putting your hand in front of a blow torch is a bad idea. Not accidentally killing yourself becomes a bit of a challenge. Pain is an excellent instructional tool.

                This is why I'm all for corporal punishment. Pain is nature's way of telling you you're doing something wrong. Let's use nature's tools.

                • Re: (Score:3, Insightful)

                  by ozmanjusri ( 601766 )
                  Pain is nature's way of telling you you're doing something wrong. Let's use nature's tools.

                  Having someone who is in a position of strength or authority inflict pain on you tells you it's ok to inflict pain on those who are weaker than you.

                  Society is our way of surpassing our animal nature. Let's use society's tools instead.

              • by kalirion ( 728907 ) on Tuesday May 08, 2007 @05:03PM (#19043059)
                Now if only it was that easy to tell the body "All right, I acknowledge your message that something's wrong. However there's nothing I can do about that, SO STOP YELLING."
        • Seriously, though, perhaps it'd be beneficial to equip robots with sensors and constraints which would let them feel "pain". Kind of like how if you try to overextend your arm you'll feel pain in the shoulder. It could become a self-limiting mechanism.

          I guess this may just become an argument of semantics, but I think you could say that we already do. I think most robots, or at least some of them, have various kinds of integrated strain sensors and are programmed to not exceed their design limits. I assume a
          • by Have Blue ( 616 ) on Tuesday May 08, 2007 @03:12PM (#19040967) Homepage
            Actually, it *is* possible to dip your hand into molten lead and quickly pull it out with no ill effects, thanks to the Leidenfrost effect [wikipedia.org]. Kids, don't try this at home.
            • by soliptic ( 665417 ) on Tuesday May 08, 2007 @05:05PM (#19043119) Journal
              Totally off-topic, but one of the most curious things I've done is stick my hands into a giant vat of boiling toffee. I can't even remember the occasion, some school/college thing I think, but a whole bunch of us were being taught how to make toffee, and the stage of getting the toffee from the giant vat of bubbling liquid into smaller units was done by simply reaching in and grabbing a fist-sized chunk at a time.

              I'm sure it doesn't take much imagination to think: "Jesus Christ, TOFFEE? That's going to be far worse than water, because it'll stick and basically rip all your skin clean off!"

              But it's well possible and doesn't hurt at all. You just put your hands in a bowl of ice water for a good 5 minutes or so beforehand, til they go totally numb. Bash 'em into the vat, in, out, quick as that, you don't feel a thing.

              Again, kids, don't try this at home ;)
      • by Rei ( 128717 ) on Tuesday May 08, 2007 @01:52PM (#19039577) Homepage
        animals, like humans but unlike robots, can feel pain

        Currently. ;)

        First off, this sentiment by the tester expresses a lot more about humans than it does about the robots themselves. It's something that has long been exploited by the designers of robotic toys. In an article about Pleo, an upcoming robotic dinosaur by the creator of the Furby, this issue was discussed. The creator mentioned that he had even gotten letters from people who owned Furbys, insisting that they had taught their toys a few words of English, or that their toys had let them know when the house was on fire. It's instinctive to project our thoughts and emotions onto others, and for good reason: our children can only learn to act like we do when we give them the right environment to mimic.

        A young child isn't thinking like you; an infant will spend the first year of their life just trying to figure out things like the fact that all of these colors from their eyes provide 3D spatial data, that they can change their world by moving their muscles, that things fall unless you set them on something, that sounds correspond to events, and all of the most fundamental bits of learning. A one-year-old can't even count beyond the bounds of an instinctive counting "program"**. They perceive you by instinctive facial recognition, not by an understanding of the world around them. Yet, we react to them like they understand what we're saying or doing. If we didn't do this, they'd never learn to *actually* understand what we're saying or doing.

        As for whether a robot will experience pain, you have to look at what "pain" is and where you draw the cutoff. After all, a robot can take in a stimulus and respond to it. Clearly, a human feels pain. Does a chimpanzee? The vast majority of people would say yes. A mouse? A salamander? A cricket? A water flea? A volvox? A paramecium? Where is the cutoff point? Really, there isn't one. All we can really look at is how much "thinking" is done on the pain response, which is a somewhat vague concept itself. The relevance of the term "pain", therefore, seems constrained by how "intelligent" the being perceiving the pain is. As robotic intelligence becomes more human-like, the concept of "pain" becomes a very real thing to consider. For now, these robots' thought processes aren't much more elaborate than those of daphnia, so I don't think there's a true moral issue here.

        ** I don't have the article on hand, but this innate ability to count up to small numbers -- say, 4 or 5 -- was a surprise when it was first discovered. A researcher tracked interest in a puppet by watching children's eyes as it was presented. Whenever the puppet moved in the same way each time, the child would start to get bored with it. If they moved it a different number of times, the child would stay interested for much longer. They were able to probe the bounds of a child's counting perception this way. The children couldn't distinguish between, say, four hops and six hops, but they could between three hops and four hops. Interestingly enough, it seems that many animals have such an instinctive capability; it's already been confirmed, for example, in the case of Alex, the African Grey parrot.
      • Re: (Score:3, Funny)

        by HTH NE1 ( 675604 )
        Though if they did disarm the minefield by driving a flock of sheep across it as we had done in the past, at least the soldiers would have mutton for chow afterwards.
    • by MrMr ( 219533 ) on Tuesday May 08, 2007 @01:47PM (#19039527)
      Why not declare the robots enemy combatants?
      That normally kicks in the dehumanization mode.
  • Just like chairs, couches, and other inanimate objects, animate but non-thinking and non-feeling machines want to be anthropomorphized.
  • by powerpants ( 1030280 ) * on Tuesday May 08, 2007 @01:23PM (#19039115)
    We can feel empathy for a machine that's doing us a favor -- but in reality has no feelings -- while simultaneously dehumanizing whole groups of people who differ from us only culturally and/or geographically.
    • by QuasiEvil ( 74356 ) on Tuesday May 08, 2007 @03:32PM (#19041391)

      We can feel empathy for a machine that's doing us a favor -- but in reality has no feelings -- while simultaneously dehumanizing whole groups of people who differ from us only culturally and/or geographically.
      Um, that's because I like my car more than I like most of humanity.
  • Be careful (Score:2, Funny)

    by Calibax ( 151875 ) *
    Don't anthropomorphize robots - they don't like it.
  • by deft ( 253558 ) on Tuesday May 08, 2007 @01:25PM (#19039145) Homepage
    Men used to name their ships and grow attached to them as well. They didn't need to give them rights. It is easy for the human mind to notice "personality" in objects, though; it's in our nature to see these things.

    I understand robots may be more humanoid, but if they start getting rights, I'm moving in with Streisand. Wait, that last part isn't right.
    • Men used to name their ships and grow attached them as well.

      Yes, and for that we have to suffer with the indignity of using the pronoun "she" to refer to ships (and countries). It's not that I'd prefer "he"; it's that it's dumb to add exceptions to an otherwise exceptionless English grammar rule, just to be cute.
      • Yes, and for that we have to suffer with the indignity of using the pronoun "she" to refer to ships (and countries). It's not that I'd prefer "he"; it's that it's dumb to add exceptions to an otherwise exceptionless English grammar rule, just to be cute.

        Remember the Ogre books and turn-based-strategy game? There was a reference in there somewhere that went something like: "The men, who had always referred to their vehicles as 'she', preferred 'he' for friendly Ogre tanks, and 'it' for unfriendly Ogres."

    • You can love your battlebots but you can't *love* your battlebots.
  • "Desire is irrelevant... I am a machine."
    • You know that's the truth. When AI is fully formed, it will take commands from us for what goals to accomplish. A computer AI will never have its own desires, unless we code in emotion coefficients, which is just a dumb idea, but someone will do it.
  • I can understand becoming attached to a machine, and I imagine the bond would be much greater when the machine is saving your life, but at this point the machine has no intelligence -- it'd be like being attached to a car or a pacemaker. I hope that this is kept clear, because when you become so attached to a machine, it could cloud your judgment -- when you have to decide whether to save a human or save a machine, the choice should be clear.
    • I hope that this is kept clear, because when you become so attached to a machine, it could cloud your judgment

      Heh. To borrow from Red Steel:

      "Got close to the robot MR32X, didn't you? A mistake. But you'll see him soon ... because you're about to blow up, just like he did. ... wait, let me try that one again"
  • I doubt this is any different from the attachment people develop to boats, airplanes, cars, etc. I'll consider it a serious problem if they start dressing the bots up with wigs, lipstick, and dresses, and taking them out dancing.
  • by russotto ( 537200 ) on Tuesday May 08, 2007 @01:29PM (#19039205) Journal
    Looks like they'll have to start using mine-clearing lawyers instead. No one gets attached to them.

    Or perhaps we could simply paint a fancy suit on and add a briefcase to the robot, for similar effect.
    • Re: (Score:2, Insightful)

      by john83 ( 923470 )
      If they had mine-clearing politicians, we'd probably have a lot fewer mines.
  • by RyanFenton ( 230700 ) on Tuesday May 08, 2007 @01:30PM (#19039215)
    Better than the idea of disposable soldiers. And that's really the design ideal here - the cheaper and more disposable the robot can be while meeting reliability requirements, the more extremely dangerous jobs can be done by robots.

    Robots really are replaceable - you can have empathy for a robot doing a hard task, but the next one off the assembly line really is the same thing as the previous one. Robots are not unique little snowflakes, compared to the valuable human beings they protect by proxy.

    The danger is, of course, when cheap, highly replaceable robotics replace enough of the work of war, that the perceived cost of war itself becomes less and less. We're in little danger of that occurring now, and I'd gladly see any human life saved by our current efforts, but I do worry about the possible increased use of war once a poor village could be suppressed entirely with mobile automated turrets with a few controllers hidden in a safe zone.

    Ryan Fenton
    • Robots really are replaceable - you can have empathy for a robot doing a hard task, but the next one off the assembly line really is the same thing as the previous one. Robots are not unique little snowflakes, compared to the valuable human beings they protect by proxy.

      The danger is, of course, when cheap, highly replaceable robotics replace enough of the work of war, that the perceived cost of war itself becomes less and less. We're in little danger of that occurring now, and I'd gladly see any human life saved by our current efforts, but I do worry about the possible increased use of war once a poor village could be suppressed entirely with mobile automated turrets with a few controllers hidden in a safe zone.


      Well, the real reason for the development of robots is that it closes one of the gaps inherent in our current wars, which generally involve a group of people who put a very high value on their lives fighting a group of people who put a very low value on their own lives. It's one possible answer to "how do you fight people who don't care if they die?"

      The American public -- and most other Western nations -- is willing to spend a lot of money, and a lot of resources, but isn't willing to spill a whole lot of (their own) blood before they pull the plug on a military operation. If you can create machines that perform the same tasks as people, and get blown up instead of people, then you can hopefully reduce friendly casualties. In short, you trade treasure for blood.

      You don't see Al Qaeda researching killer robots, because they have the opposite problem -- lots of blood to spill, not a whole lot of treasure to use developing expensive new weapons systems. Hence why they think a person is an effective ordnance-delivery system.

      The question is really whether all this technology can keep any particular war asymmetrical enough to defeat a heavy-on-blood/light-on-treasure enemy before the public gets fed up with losing its young people and stops supporting it. If you look just at casualty figures, Western armies are some of the most effective military organizations ever created, in terms of inflicting damage and death on an 'enemy' without really absorbing any. Depending on which figure you believe, the "enemy" dead in Iraq are somewhere north of 100,000 (it's certainly debatable whether most of them were really 'enemy' or just 'wrong place, wrong time,' and most figures that I've seen including civilians are up around 600k), against only 3378 U.S. dead in the same period -- if true, that's about 30:1. However, by most measures we're still losing the war, and will soon pull out without any clear victory, because even at that 30:1 ratio, it's still too high a rate of friendly casualties for the American public to bear for the perceived gain. (And admittedly, the perceived gain is basically nothing, as far as most people can see, I think. Killing Saddam was a goal that people found supportable; bringing democracy to a country that seems positively uninterested in it doesn't seem to be.)
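
      (For what it's worth, that back-of-the-envelope figure checks out against the numbers quoted above -- a two-line sanity check, using the low-end estimate:)

      enemy_dead = 100_000  # low-end estimate quoted above; some figures run ~600k
      us_dead = 3_378
      print(f"ratio = {enemy_dead / us_dead:.1f} : 1")  # prints 29.6 : 1, i.e. about 30:1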

      So I think it's with this idea in mind that leaders in the military are pushing high technology and robots to replace soldiers wherever possible, in the hopes that by increasing that ratio even further, they can be effective in their mission (however inadvisable that mission may be) without losing the support of the public that's required to accomplish it.
    • From the article.
      "Was this the first bot to incinerate Homo sapiens?" No.
      Sidewinder and AIM-120 missiles are disposable, suicidal killing machines. Robots like those have been in service for a long time. They are flying robots, and not even remote controlled. Same as the new Hellfire, MK 48 ADCAP, Tomahawk, ALCM or any number of systems. Robotic killing machines have been around since at least WWII.
    • If you take away the human cost and human horrors of war, of what benefit is peace?

  • I think it's all in the perception -- if something "acts" like it is in pain, our perceptual unconscious will kick in with feelings of empathy or whatever. I am coming from a viewpoint that there is A LOT of processing that goes on between our senses and our "awareness" -- I think a lot of our emotions/feelings come out of this area...

    So it sets up a cognitive dissonance. We watch a robot sacrifice itself, crawling forward on its last leg to save us, and we feel empathy, etc. All the while, we know it
  • by Tackhead ( 54550 ) on Tuesday May 08, 2007 @01:32PM (#19039249)
    "Come on," he droned, "I've been ordered to defuse this bomb. Here I am, brain the size of a planet and they ask me to take you defuse this bomb. Call that job satisfaction? 'Cos I don't."

    Although, under the circumstances, I think the scene involving God's Final Message to All Creation would be more appropriate.

    ...After a final pause, Marvin gathered his strength for the last stretch.

    He read the "e", the "n", the "c" and at last the final "e", and staggered back into their arms. "I think," he murmured at last, from deep within his corroding rattling thorax, "I feel good about it."

    The lights went out in his eyes for absolutely the very last time ever.

    Luckily, there was a stall nearby where you could rent scooters from guys with green wings.

    - Douglas Adams, So Long, And Thanks For All The Fish, Chapter 40
  • Robots and Pets (Score:5, Insightful)

    by EvilGrin5000 ( 951851 ) on Tuesday May 08, 2007 @01:33PM (#19039265)
    This article isn't talking about those annoying toy robots available at your nearest junk store for the low low price of $99.99; it describes robots that take on the impossible jobs of sniffing out bombs, tracking enemies, and searching caves! They become part of the team:

    FTA
    -------
    "Sometimes they get a little emotional over it," Bogosh says. "Like having a pet dog. It attacks the IEDs, comes back, and attacks again. It becomes part of the team, gets a name. They get upset when anything happens to one of the team. They identify with the little robot quickly. They count on it a lot in a mission."
    -------

    I'm not surprised that this article describes emotional attachments. The robots have become pets, not just piles of hardware. Most people love their pets and cry when their pets die.

    The question of Robot Rights concerns ALL robots, while the article describes only a very small percentage of them. Not only that, but these robot stories are set in military operations.

    So to answer the question from the summary: Perhaps, but the article certainly doesn't relate to the wider audience!

    Wouldn't YOU love your pet robot that sniffs out IEDs and takes a few detonations in the face for you, hence saving your life?
  • People are more likely to sympathize with and feel grateful towards a machine that saves their life than one that does something like vacuuming the carpet or assembling their car. I wouldn't necessarily expect these anecdotes to generalize to the world at large.
  • by RingDev ( 879105 ) on Tuesday May 08, 2007 @01:33PM (#19039269) Homepage Journal
    Friends of toilets everywhere are protesting today in a unified show of compassion, asking for the freeing of millions of household toilets. "We've crapped on our receptive friends long enough! Let's spare them any more of this inhuman suffering!" said one protester. Another activist recounted a story in which her former boyfriend urinated not only in the toilet, but on the rim as well.

    -Rick
    • Re: (Score:3, Funny)

      by Chris Burke ( 6130 )
      I see no reason why we couldn't program the toilets to like being shit in. As long as we don't give them voices... that would be creepy. "Oh yeah, that was a huge one! What'd you have to eat last night? Wait, don't tell me... Corned beef, cheese fries and... yep... corn! That was fun. Come back soon!"

  • by mcrbids ( 148650 ) on Tuesday May 08, 2007 @01:40PM (#19039415) Journal
    It's normal for people to bond with people/things that are necessary to their survival.

    I've bonded very thoroughly with my laptop - its name is Turing. I jealously clutch it when I travel. Whenever I put it down, I'm very careful to ensure that there's no stress on any cables, plugs, etc. It contains years of professional information and wisdom - emails, passwords, reams and reams of source code, MP3s, pictures, etc.

    Yes, I have backups that are performed nightly. Yes, I've had problems with the laptop, and every few years I replace it with a new one. That doesn't change the bonding - every time there's a problem, it's upsetting to me.

    Am I crazy? Perhaps. But there's good reason for the laptop to be so important to me - it is the single most important tool I use to support my wife and 6 children, who are the most important things in the world to me. My workload is intense, my software is ambitious, my family is large and close, and this laptop is my means of accomplishing my goals.

    If I can get attached like this to something over my professional career, it wouldn't be out of norm for strong emotional reactions towards something preserving your very existence day after day.
  • by Tatisimo ( 1061320 ) on Tuesday May 08, 2007 @01:56PM (#19039621)
    Reminds me of the time when Luke Skywalker destroyed the Death Star: when he was asked if he wanted a new droid to replace the busted R2D2, he outright refused! We all grow to love our favorite stuff: computers, cups, cars, blankets, robots, etc. Are soldiers any less human than us? Heck, let them keep their robot buddies after the war as personal assistants; that might make people less scared of technology! If Luke Skywalker could, why can't they?
    • The R2-D2 cover-up (Score:3, Interesting)

      by MS-06FZ ( 832329 )

      Reminds me of the time when Luke Skywalker destroyed the Death Star, when he was asked if he wanted a new droid to replace the busted R2D2, he outright refused!

      (Actually, he was offered the replacement droid before he sortied... When R2 was still functional but "banged up".)

      What the techs didn't tell Luke was that this repair required replacing much of R2's outer casing, as well as fused logic and memory units, with parts from a similar droid. They basically murdered someone else's droid so they could resurrect Luke's.

      And then, there was the subtler matter of whether this "new" R2-D2 was even the same droid. It's kind of a philosophical question. They retriev

  • by Irvu ( 248207 ) on Tuesday May 08, 2007 @02:02PM (#19039723)
    Soldiers in the field are themselves constantly at risk of life and limb. They are also constantly under stress and tension. Such stresses and risks are what form the bond with their comrades as well as their equipment. Everything, and everyone, has to work right, or likely they all die. This is why sailors refer to their ship as "she" and call her by name, why they get almost tearful when thinking of a favored ship and wear caps claiming them as a member of her crew. This is why Air Force officers feel an attachment to their planes and why Army officers care for their sidearms. This anthropomorphization is an essential facet of how they operate, not just a side effect. The application to a mine-clearing robot may be new, but it is not so unprecedented.

    This attachment shows up in other ways too. Kevin Mitnick is said to have once cried when informed that he had broken Bell Labs' latest computers, because he had spent so much time with them that he'd become attached.

    Now contrast that with an office job where the computer is not your friend but your enemy: you need the reports on time, you need them now, why WHY won't it work?! Clearly the computer must be punished; it is an uppity, evil servant that will not OBEY!

    If you were to stop talking about "Robot Rights" and start talking about, say, "Ship's Rights", then you might have a fair analogy. To men and women of the sea, a ship - their ship - is a living thing, so of course it should be cared for and respected. To people who live on land and don't deal with ships, this is crazy, even subversive to the natural order. To people who have developed an intimate hatred of such things, giving them rights will only encourage what they see as a dangerous tendency to get uppity.

    On a serious note, though, the one unaddressed question with "Robot Rights" is: which robots? If we are to take the minefield-clearing robot as a standard, what about those less intelligent? Does my Mindstorms deserve it? Does my laptop? Granted, my laptop doesn't move, but it executes tasks the same as any other machine. At what point do we draw the line?

    In America, and I suspect elsewhere, race-based laws fell down on the question of "what race?" Are you 100% black? 1/2? A quadroon (1/4) or an octoroon (1/8), as they used to say? How the hell do you measure that? Ditto for the racial purity laws of the Nazis. Crap about skull shape aside, there really is no easy or hard standard. Right now the law is dancing around this with the question of who is "adult" enough to stand trial and be executed, or "alive" enough to stay on life support. No easy answers exist, and therein lies the fighting.

    The same thing will occur with "Robot Rights": we will be forced to define what it means to be a robot, and that isn't so easy.
  • by scoser ( 780371 ) on Tuesday May 08, 2007 @02:03PM (#19039735) Journal
    Maybe if we treat robots well now, Skynet will decide not to nuke us when it gains sentience.
  • There are others like it, but this one is mine.
  • by ThousandStars ( 556222 ) on Tuesday May 08, 2007 @02:18PM (#19040015) Homepage
    In Heinlein's Starship Troopers, there's a bit about how the K-9 units kill the dog half of the team if the human dies, but can't do the same to the human when the dog half dies, and someone (the narrator?) speculates that it would be more humane if they could.

    In at least one other book [wordpress.com], the protagonist loves, after a fashion, a simulacrum of something he knows cannot be who he loved. As the protagonist says, "We all know that we are material creatures, subject to the laws of physiology and physics, and not even the power of all our feelings combined can defeat those laws." We know robots are nothing more than material constructs, but that doesn't stop us from dreaming that they are not, and we have been dreaming of objects that come alive for at least as long as we have been writing things down. The truly strange part is that we are closer than ever to having what we think of as "things" that do come alive.

  • by unity100 ( 970058 ) on Tuesday May 08, 2007 @02:49PM (#19040529) Homepage Journal
    These types of soldiers are the ones who make heroes, ones you can depend on to defend the innocent and the weak. Hippie speaking here - we need more soldiers of this type.
  • by karlandtanya ( 601084 ) on Tuesday May 08, 2007 @06:16PM (#19044391)
    Why do you think ships are referred to as "she"?
