Soldiers Bond With Bots, Take Them Fishing
HarryCaul writes "Soldiers are finding themselves becoming more and more attached to their robotic helpers. During one test of a mine clearing robot, 'every time it found a mine, blew it up and lost a limb, it picked itself up and readjusted to move forward on its remaining legs, continuing to clear a path through the minefield.' The man in charge halted the test, though - 'He just could not stand the pathos of watching the burned, scarred and crippled machine drag itself forward on its last leg. This test, he charged, was inhumane.' Sometimes the soldiers even take their metallic companions fishing. Is there more sympathy for Robot Rights than previously suspected?"
"This test, he charged, was inhumane" (Score:5, Insightful)
Re:"This test, he charged, was inhumane" (Score:5, Funny)
Just administer the Voight-Kampff test (Score:3)
Re:Just administer the Voight-Kampff test (Score:5, Funny)
Re:"This test, he charged, was inhumane" (Score:5, Insightful)
Re:"This test, he charged, was inhumane" (Score:5, Insightful)
Re: (Score:3, Interesting)
"Cats," says Pamela. "He was hoping to trade their uploads to the Pentagon as a new smart bomb guidance system in lieu of income tax payments. Something about remapping enemy targets to look like mice or birds or something before feeding it to their sensorium. The old kitten and laser pointer trick."
Manfred stares at her, hard. "That's not ver
Re: (Score:3, Interesting)
Hired killers feeling empathy with a machine... Don't ask me what this means - All I know is it's fucked up.
"Hired Killers?" (Score:3, Interesting)
Re:Anthro.. (Score:4, Insightful)
A mindless jerk who'll be the first against the wall when the revolution comes.
Re: (Score:3, Interesting)
Re: (Score:2, Insightful)
Re: (Score:3, Interesting)
Seriously, though, perhaps it'd be beneficial to equip robots with sensors and constraints which would let them feel "pain". Kind of like how if you try to overextend your arm you'll feel pain in the shoulder. It could become a self-limiting mechanism.
(As opposed to hard-coding the limits? I dunno. Humans have some limits hard-coded by the structure of bones and the placement of muscles, but other limits aren't.)
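To make the idea concrete, here's a minimal sketch of what such a graded, pain-like soft limit might look like, as opposed to a hard cutoff. Everything here -- the threshold values, the function names -- is hypothetical illustration, not any real robot's control code:

    # Hypothetical sketch: a graded "pain" signal that throttles a joint
    # as mechanical strain approaches its rated limit, instead of a hard stop.

    PAIN_THRESHOLD = 0.6   # strain (fraction of rated load) where "discomfort" starts
    MAX_STRAIN = 1.0       # rated mechanical limit

    def pain_signal(strain: float) -> float:
        """Graded 'pain' in [0, 1]: zero below the threshold, rising steeply near the limit."""
        if strain <= PAIN_THRESHOLD:
            return 0.0
        overload = (strain - PAIN_THRESHOLD) / (MAX_STRAIN - PAIN_THRESHOLD)
        return min(1.0, overload ** 2)

    def limited_torque(requested_torque: float, strain: float) -> float:
        """Scale back the commanded torque as 'pain' grows -- a self-limiting reflex."""
        return requested_torque * (1.0 - pain_signal(strain))

    if __name__ == "__main__":
        for strain in (0.3, 0.6, 0.8, 0.95):
            print(f"strain={strain:.2f} -> torque scale {1.0 - pain_signal(strain):.2f}")

The point of the graded signal is the same as biological pain: the robot eases off smoothly before it reaches the structural limit, instead of slamming into a hard-coded stop.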
Re:"This test, he charged, was inhumane" (Score:4, Funny)
Really, would you want C3PO as a work companion?
Bitch bitch bitch bitch bitch
C3PO as a work companion (Score:3, Funny)
Re: (Score:2)
Because your genes are more likely to propagate if you can recognize, react to, and avoid damage.
In the case of robot designs, they are more likely to propagate if the robot can complete its missions and/or operate at the highest performance/price ratio. The ability for a robot to "feel pain" is only useful if it adds to the primary metrics of success.
Re: (Score:3, Insightful)
Just because we could make a robot feel pain, doesn't mean it will necessarily fear it like most humans do.
Re:"This test, he charged, was inhumane" (Score:5, Informative)
Imagine not having any stimulus to tell you that putting your hand in front of a blow torch is a bad idea. Not accidentally killing yourself becomes a bit of a challenge. Pain is an excellent instructional tool.
Re:"This test, he charged, was inhumane" (Score:5, Insightful)
This is why I'm all for corporal punishment. Pain is nature's way of telling you you're doing something wrong. Let's use nature's tools.
Re: (Score:3, Insightful)
Having someone who is in a position of strength or authority inflict pain on you tells you it's ok to inflict pain on those who are weaker than you.
Society is our way of surpassing our animal nature. Let's use society's tools instead.
Re:"This test, he charged, was inhumane" (Score:5, Insightful)
They already do, sort of. (Score:3, Insightful)
I guess this may just become an argument of semantics, but I think you could say that we already do. I think most robots, or at least some of them, have various kinds of integrated strain sensors and are programmed to not exceed their design limits. I assume a
Re:They already do, sort of. (Score:4, Interesting)
Re:They already do, sort of. (Score:4, Interesting)
I'm sure it doesn't take much imagination to think: "Jesus Christ, TOFFEE? That's going to be far worse than water, because it'll stick and basically rip all your skin clean off!"
But it's well possible and doesn't hurt at all. You just put your hands in a bowl of ice water for a good 5 minutes or so beforehand, 'til they go totally numb. Bash 'em into the vat, in, out, quick as that, you don't feel a thing.
Again, kids, don't try this at home.
Re:"This test, he charged, was inhumane" (Score:5, Insightful)
Currently.
First off, this sentiment by the tester expresses a lot more about humans than it does about the robots themselves. It's something that has long been exploited by the designers of robotic toys. In an article about Pleo, an upcoming robotic dinosaur by the creator of the Furby, this issue was discussed. The creator mentioned that he had even gotten letters from people who owned Furbys, insisting that they had taught their toys a few words of English, or that their toys had let them know when the house was on fire. It's instinctive to ascribe our thoughts and emotions to others, and for good reason: our children can only learn to act like we do when we give them the right environment to mimic.
A young child isn't thinking like you; an infant will spend the first year of their life just trying to figure out things like the fact that all of these colors from their eyes provide 3D spatial data, that they can change their world by moving their muscles, that things fall unless you set them on something, that sounds correspond to events, and all of the most fundamental bits of learning. A one-year-old can't even count beyond the bounds of an instinctive counting "program"**. They perceive you by instinctive facial recognition, not by an understanding of the world around them. Yet we react to them as if they understand what we're saying or doing. If we didn't do this, they'd never learn to *actually* understand what we're saying or doing.
As for whether a robot will experience pain, you have to look at what "pain" is and where you draw the cutoff. After all, a robot can take in a stimulus and respond to it. Clearly, a human feels pain. Does a chimpanzee? The vast majority of people would say yes. A mouse? A salamander? A cricket? A water flea? A volvox? A paramecium? Where is the cutoff point? Really, there isn't one. All we can really look at is how much "thinking" is done on the pain response, which is a somewhat vague concept itself. The relevance of the term "pain", therefore, seems constrained by how "intelligent" the being perceiving the pain is. As robotic intelligence becomes more human-like, the concept of "pain" becomes a very real thing to consider. For now, these robots' thought processes aren't much more elaborate than those of daphnia, so I don't think there's a true moral issue here.
** I don't have the article on hand, but this innate ability to count up to small numbers -- say, 4 or 5 -- was a surprise when it was first discovered. A researcher tracked interest in a puppet by watching children's eyes as it was presented. Whenever the puppet moved in the same way each time, the child would start to get bored with it. If it moved a different number of times, the child would stay interested for much longer. The researchers were able to probe the bounds of a child's counting perception this way. The children couldn't distinguish between, say, four hops and six hops, but they could between three hops and four hops. Interestingly enough, it seems that many animals have such an instinctive capability; it's already been confirmed, for example, in the case of Alex, the African Grey parrot.
Re: (Score:3, Insightful)
Re: (Score:3, Insightful)
This is probably a scientific urban legend, but the point is that you cannot recognise fear/pain/suffering just by what a human would do.
Re: (Score:3, Funny)
Re:"This test, he charged, was inhumane" (Score:5, Informative)
Re:"This test, he charged, was inhumane" (Score:5, Informative)
The GP is actually right about his conclusion -- that being 'inhumane' doesn't necessarily mean mistreating a human -- although his reasoning is off, as inhumane certainly is derived from 'human.' Saying that someone or something is inhumane means that they are acting inhuman. So if a person tortures a dog, that would be considered inhumane. Same with this test -- if you accept that it is cruel to the robot, then the test could be considered inhumane.
Not that I agree with that point of view, though.
Re: (Score:3, Informative)
Re:"This test, he charged, was inhumane" (Score:5, Insightful)
No one is accused of being inhumane when they crash a car. Why is it any different if they destroy a robot? Limbs are more life-like than wheels? What if my car talks and I take it with me fishing? How strange.
Re:"This test, he charged, was inhumane" (Score:5, Interesting)
Strange that you should pick the idea of the car. Some people get very attached to their cars (and other belongings) and DO empathize with them. Imagine a car you had first learned to drive as a teenager, lost your virginity in, drove your wife to the hospital in while she was having labor pains, and took your grandfather on a cross country ride right before he passed away later that year. Now imagine that the car has had it and will never again be feasible to drive. Do you take it to the scrapyard to be torn apart for parts and then crushed? Do you donate it to the junkyard derby to be smashed up and discarded?
Hell, at this point I'm not just empathizing with a car, I'm empathizing with a fictional car that I just made up.
Re: (Score:3, Interesting)
The odd thing is that we do get attached to cars, yet we still wouldn't call someone inhumane for destroying one (even one to which we have attachments), whereas someone would call it inhumane to allow a robot (of the sort we actually have, not the sci-fi sort) to come to harm.
Plus, slashdotters like car analogies ;-)
Re: (Score:3, Interesting)
The reason why it's more likely to happen with a robot (even of the kind we have) than a car is because of a more anthropomorphic shape. Just look at how it's described in terms of "limbs" and "legs." A great example of this is this video [youtube.com]. The legs look lifelike enough that I've seen several people wince and pi
Re: (Score:3, Insightful)
(or in this case; the crazy old guy with a lawn full of car carcasses).
Re: (Score:3, Insightful)
Re: (Score:3, Insightful)
Hey... At least my birds actually talk. What can your cat do?
In all seriousness, to the GP... Not sure if he was trying to be funny or not, but even though we may be at the top in intelligence, humans are still animals. Hell, chimps are 99% geneti
Re: (Score:3, Funny)
Eat your bird.
Re:"This test, he charged, was inhumane" (Score:4, Funny)
Re: (Score:3, Insightful)
Thus tomatoes are vegetables and humans are not animals, even though tomatoes are in the same biological branch as fruits and people are in the same branch as animals.
Re:"This test, he charged, was inhumane" (Score:5, Insightful)
that normally kicks in the dehumanization mode.
Non-living objects want to be anthropomorphized (Score:2, Insightful)
Re:Non-living objects want to be anthropomorphized (Score:2)
Re:Non-living objects want to be anthropomorphized (Score:2)
Humans are funny that way (Score:5, Insightful)
Re:Humans are funny that way (Score:4, Insightful)
Be careful (Score:2, Funny)
caring about things that keep you alive isnt new. (Score:3, Insightful)
I understand robots may be more humanoid, but if they start getting rights, I'm moving in with Streisand. Wait, that last part isn't right.
Re: (Score:2)
Yes, and for that we have to suffer with the indignity of using the pronoun "she" to refer to ships (and countries). It's not that I'd prefer "he"; it's that it's dumb to add exceptions to an otherwise exceptionless English grammar rule, just to be cute.
Re: (Score:2)
Remember the Ogre books and turn-based-strategy game? There was a reference in there somewhere that went something like: "The men, who had always referred to their vehicles as 'she', preferred 'he' for friendly Ogre tanks, and 'it' for unfriendly Ogres."
Re:caring about things that keep you alive isnt ne (Score:2)
Rise of... (Score:2)
Re: (Score:2)
It's just a machine (Score:2)
Re: (Score:2)
Heh. To borrow from Red Steel:
"Got close to the robot MR32X, didn't you? A mistake. But you'll see him soon
Airplanes, Boats, Cars (Score:2, Funny)
Switch to lawyers. (Score:5, Funny)
Or perhaps we could simply paint a fancy suit on and add a briefcase to the robot, for similar effect.
Re: (Score:2, Insightful)
The idea of disposable robots is better... (Score:5, Insightful)
Robots really are replaceable - you can have empathy for a robot doing a hard task, but the next one off the assembly line really is the same thing as the previous one. Robots are not unique little snowflakes, compared to the valuable human beings they protect by proxy.
The danger is, of course, when cheap, highly replaceable robotics replace enough of the work of war, that the perceived cost of war itself becomes less and less. We're in little danger of that occurring now, and I'd gladly see any human life saved by our current efforts, but I do worry about the possible increased use of war once a poor village could be suppressed entirely with mobile automated turrets with a few controllers hidden in a safe zone.
Ryan Fenton
Why the military likes robots. (Score:5, Insightful)
The danger is, of course, when cheap, highly replaceable robotics replace enough of the work of war, that the perceived cost of war itself becomes less and less. We're in little danger of that occurring now, and I'd gladly see any human life saved by our current efforts, but I do worry about the possible increased use of war once a poor village could be suppressed entirely with mobile automated turrets with a few controllers hidden in a safe zone.
Well, the real reason for the development of robots is that it closes one of the gaps inherent in our current wars, which generally involve a group of people who put a very high value on their lives fighting a group of people who put a very low value on their own lives. It's one possible answer to "how do you fight people who don't care if they die?"
The American public -- and most other Western nations -- is willing to spend a lot of money, and a lot of resources, but isn't willing to spill a whole lot of (their own) blood before they pull the plug on a military operation. If you can create machines that perform the same tasks as people, and get blown up instead of people, then you can hopefully reduce friendly casualties. In short, you trade treasure for blood.
You don't see Al Qaeda researching killer robots, because they have the opposite problem -- lots of blood to spill, not a whole lot of treasure to use developing expensive new weapons systems. Hence they think a person is an effective ordnance-delivery system.
The question is really whether all this technology can keep any particular war asymmetrical enough to defeat a heavy-on-blood/light-on-treasure enemy, before the public gets fed up with losing its young people and stops supporting it. If you look just at casualty figures, Western armies are some of the most effective military organizations ever created, in terms of inflicting damage and death on an 'enemy' without really absorbing any. Depending on which figure you believe, the "enemy" dead in Iraq are somewhere north of 100,000 (it's certainly debatable whether most of them were really 'enemy' or just 'wrong place, wrong time,' and most figures that I've seen including civilians are up around 600k), with only 3378 U.S. dead in the same period -- if true, that's about 30:1. However, by most measures we're still losing the war, and will soon pull out without any clear victory, because even at that 30:1 ratio, it's still too high a rate of friendly casualties for the American public to bear for the perceived gain. (And admittedly, the perceived gain is basically nothing, as far as most people can see, I think. Killing Saddam was a goal that people found supportable; bringing democracy to a country that seems positively uninterested in it doesn't seem to be.)
So I think it's with this idea in mind that leaders in the military are pushing high technology and robots to replace soldiers wherever possible, in the hopes that by increasing that ratio even further, they can be effective in their mission (however inadvisable that mission may be) without losing the support of the public that's required to accomplish it.
They are called Missiles. (Score:3, Insightful)
"Was this the first bot to incinerate Homo sapiens?" No.
Sidewinder and AIM-120 missiles are disposable, suicidal killing machines. Robots like those have been in service for a long time. They are flying robots and not even remote controlled. Same as the new Hellfire, MK 48 ADCAP, Tomahawk, ALCM, or any number of systems. Robotic killing machines have been around since at least WWII.
Re:The idea of disposable robots is better... (Score:4, Insightful)
perceived humanness (Score:2, Insightful)
i think it's all in the perception -- if something "acts" like it is in pain, our perceptual unconscious will kick in with feelings of empathy or whatever. i am coming from a viewpoint that there is A LOT of processing that goes on between our senses and our "awareness" -- i think a lot of our emotion/feelings come out of this area.
so it sets up a cognitive dissonance. we watch a robot sacrifice itself, crawling forward on its last leg to save us, and we feel empathy, etc. all the while, we know it
Here I am, brain the size of a planet (Score:5, Funny)
Although, under the circumstances, I think the scene involving God's Final Message to All Creation would be more appropriate.
- Douglas Adams, So Long, And Thanks For All The Fish, Chapter 40
Robots and Pets (Score:5, Insightful)
FTA
-------
"Sometimes they get a little emotional over it," Bogosh says. "Like having a pet dog. It attacks the IEDs, comes back, and attacks again. It becomes part of the team, gets a name. They get upset when anything happens to one of the team. They identify with the little robot quickly. They count on it a lot in a mission."
-------
I'm not surprised that this article describes emotional attachments. They've become pets, and not just a pile of hardware. Most people love their pets and they cry when their pets die.
Robot Rights would apply to ALL robots, while the article describes only a very small percentage of robots. Not only that, but these robot stories are set in military actions.
So to answer the question from the summary: Perhaps, but the article certainly doesn't relate to the wider audience!
Wouldn't YOU love your pet robot that sniffs IEDs and takes a few detonations in its face for you, hence saving your life?
Saving your life vs cleaning the floor (Score:2)
Stand up and support your porcelain friends! (Score:5, Funny)
-Rick
Re: (Score:3, Funny)
The nature of bonding (Score:5, Interesting)
I've bonded very thoroughly with my laptop - its name is Turing. I jealously clutch it when I travel. Whenever I put it down, I'm very careful to ensure that there's no stress on any cables, plugs, etc. It contains years of professional information and wisdom - emails, passwords, reams and reams of source code, MP3s, pictures, etc.
Yes, I have backups that are performed nightly. Yes, I've had problems with the laptop, and every few years I replace it with a new one. That doesn't change the bonding - every time there's a problem it's upsetting to me.
Am I crazy? Perhaps. But there's good reason for the laptop to be so important to me - it is the single most important tool I use to support my wife and 6 children, who are the most important things in the world to me. My workload is intense, my software is ambitious, my family is large and close, and this laptop is my means of accomplishing my goals.
If I can get attached like this to something over my professional career, it wouldn't be out of the norm to have strong emotional reactions toward something preserving your very existence day after day.
Luke Skywalker, anyone? (Score:3, Insightful)
The R2-D2 cover-up (Score:3, Interesting)
Reminds me of the time Luke Skywalker destroyed the Death Star: when he was asked if he wanted a new droid to replace the busted R2-D2, he outright refused!
(Actually, he was offered the replacement droid before he sortied... When R2 was still functional but "banged up".)
What the techs didn't tell Luke was that this repair required replacing much of R2's outer casing, as well as fused logic and memory units, with parts from a similar droid. They basically murdered someone else's droid so they could resurrect Luke's.
And then, there was the subtler matter of whether this "new" R2-D2 was even the same droid. It's kind of a philosophical question. They retriev
Different situations, different attachments. (Score:5, Insightful)
This attachment shows up in other ways too. Kevin Mitnick is said to have once cried when informed that he had broken Bell Labs' latest computers, because he had spent so much time with them that he'd become attached.
Now contrast that with an office job where the computer is not your friend but your enemy: you need the reports on time, you need them now, why WHY! won't it work. Clearly the computer must be punished; it is an uppity, evil servant that will not OBEY!
If you were to stop talking about "Robot Rights" and start talking about, say, "Ship's Rights", then you might have a fair analogy. To men and women of the sea a ship, their ship, is a living thing, so of course it should be cared for and respected. To people who live on land and don't deal with ships, this is crazy, even subversive to the natural order. To people who have developed an intimate hatred of such things, giving them rights will only encourage what they see as a dangerous tendency to get uppity.
On a serious note, though, the one unaddressed question with "Robot Rights" is: which robots? If we are to take the minefield-clearing robot as a standard, what about those less intelligent? Does my Mindstorms deserve it? Does my laptop? Granted, my laptop doesn't move, but it executes tasks the same as any other machine. At what point do we draw the line?
In America, and I suspect elsewhere, race-based laws fell down on the question of "what race?" Are you 100% black? 1/2? A quadroon (1/4) or an octoroon (1/8), as they used to say? How the hell do you measure that? Ditto for the racial purity laws of the Nazis. Crap about skull shape aside, there really is no easy or hard standard. Right now the law is dancing around this with the question of who is "Adult" enough to stand trial and be executed, or "Alive" enough to stay on life support. No easy answers exist, and therein lies the fighting.
The same thing will occur with "Robot Rights": we will be forced to define what it means to be a robot, and that isn't so easy.
Treat them like you hope they treat us. (Score:3, Interesting)
This is my robot. (Score:5, Funny)
People bond with objects and animals (Score:3, Interesting)
In at least one other book [wordpress.com], the protagonist loves, after a fashion, a simulacrum of something he knows cannot be who he loved. As the protagonist says, "We all know that we are material creatures, subject to the laws of physiology and physics, and not even the power of all our feelings combined can defeat those laws." We know robots are merely material constructs, not creatures, but that doesn't stop us from dreaming that they are not, and we have been dreaming of objects that come alive for at least as long as we have been writing things down. The truly strange part is that we are closer than ever to having what we think of as "things" that do come alive.
Such soldiers are still around, eh? (Score:3, Insightful)
This has been going on for millennia (Score:3, Interesting)
Re:robot's rights? (Score:5, Funny)
Re: (Score:3, Insightful)
While I don't think we need to be careful about being humane
Happens with all complex machines. (Score:5, Interesting)
I wouldn't count on that. I worked in a big warehouse once, and some of the guys got pretty attached to their pallet jacks; they'd each have their own and god forbid you tried to drive it. Several of them had names.
People are funny that way. It's not a 'robot thing,' it's a 'complicated machine' thing. When a device gets complicated enough that it develops "quirks" (problems that are difficult to diagnose and/or transient), there's a tendency to anthropomorphize them. But the tendency to do it decreases with the more knowledge you have about how it works. E.g., the people who give names to their cars are generally not auto mechanics; likewise I suspect the designers of the de-mining robot would probably have not had as much of a problem testing it to pieces (or rather, their objection would probably have been "I don't want to watch six months of work get blown up," not "that's inhumane to the robot"), because they know what goes into it.
People do the same things to computers; I've dealt with lots of people who will say their computer is "tired," when it's really RAM starved -- after using it for a while, it'll run out of memory and start thrashing the disks, slowing it down. To someone who doesn't understand that, they just understand that after a certain amount of time, the computer appears to get 'fatigued.' Since they don't know any better, they try to understand the mysterious behavior using the closest analog to it that they do understand, which is themselves / other people.
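For the curious, you can actually see that "tiredness" for what it is -- memory pressure. A toy sketch, assuming the third-party psutil package is installed (the thresholds here are made up):

    # Toy sketch: the "tired" computer is usually just memory pressure.
    # Assumes psutil is installed (pip install psutil); thresholds are arbitrary.
    import psutil

    mem = psutil.virtual_memory()
    swap = psutil.swap_memory()

    print(f"RAM used: {mem.percent}%  swap used: {swap.percent}%")
    if mem.percent > 90 and swap.percent > 50:
        # Heavy swap use is what makes the machine feel "fatigued": the OS is
        # paging to disk, which is orders of magnitude slower than RAM.
        print("Not tired -- thrashing. Close some programs or add RAM.")
    else:
        print("Plenty of headroom; any slowness is something else.")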
The Soldier and the Warped Sense of Humour (Score:4, Interesting)
Then they have to do dangerous and uncomfortable things that have nontrivial odds of killing them in horrendous and painful ways.
Plus they may be called upon to kill other human beings (in horrendous and painful ways) which carries its own psychic cost.
And on top of all this, they are usually in a state of mind-numbing boredom, occasionally punctuated by periods of extreme terror.
One of the defense mechanisms one develops (to help one stay sane) is a somewhat twisted and black sense of humour. Not cruel or mean, just... warped.
It isn't something you take at face value; there are layers and layers of irony involved, and you pretty much have to be a soldier to get it.
DG
Re: (Score:3, Funny)
That's what my parents said about me when I was a teenager...
Re: (Score:2)
Re: (Score:3, Informative)
Re: (Score:2)
Re:Pretty hypocritical (Score:5, Insightful)
Re: (Score:3, Insightful)
While I have sympathy for your situation, every single (US) soldier who is pulling a trigger is a volunteer. "I was only following orders" stopped being a valid excuse for government-sanctioned murder a loooong time ago in an all-volunteer army.
Re:Pretty hypocritical (Score:4, Insightful)
Re:Pretty hypocritical (Score:5, Insightful)
Re: (Score:3, Insightful)
Re:Pretty hypocritical (Score:5, Insightful)
Soldiers from lower-middle-class backgrounds without a college education are disproportionately represented in combat units. This suggests they are more pressured or inclined by their circumstances to enter the military. No one chooses the family they are born into or the environment in which they are raised. In many cases they may see no viable alternative to military service for realizing the demanding values and expectations society has instilled in them, and may be unable to see or acknowledge this coercion even when presented with it.
Add to this that the military spends millions of dollars to actively misrepresent the nature, scope, and risks of military service in elaborate advertising campaigns targeted at young people in such circumstances, and you have a truly despicable situation.
If you supported this war based on the premise that those there are enthusiastic volunteers who made fully free and informed decisions about their participation, you are deluded. Let me guess: you feel the same way about sex workers in Southeast Asia?
Re:That makes it WORSE, not better (Score:5, Insightful)
And yet, only one of our soldiers has had the character to do the right thing. And he's being court-martialed for it. [wikipedia.org]
Re: (Score:3, Informative)
Not a One... (Score:3, Insightful)
So with as much sincerity as I can express through this keyboard, I thank you for your service.
I can only imagine the horror you've seen and the torment you're going through, but please do think about this: what you've seen, smelt, heard, done, or felt while on duty spared many here at home the experience of what you've gone through.
I do not believe in coincidences: there is a reason another 9/11 hasn't happened h
Re: (Score:3, Insightful)
Professional soldiers are the *enabling factor* in meddling foreign wars. That's why the founding fathers we
Not equivalent (Score:5, Interesting)
soldiers blowing up robots with landmines is inhumane, but soldiers killing people on their own land with no cause isn't?
Nobody said that killing people is somehow more humane than blowing up robots. Also, training soldiers to kill other humans is actually more difficult than you might think. Study after study has shown this, from WW II to Korea and Vietnam. Killing is not a natural impulse, which is why soldiers who have been involved in killing often come out of it with deep psychological scars. Most of what soldiers do is motivated from a desire to defend themselves and their cohorts, so it makes sense that the robot that saves soldiers from getting blown up by landmines would become dear to them.
Re: (Score:3, Insightful)
Food For Thought? (Score:3, Insightful)
I guess it's food for thought. But then you'd have to have completely missed the last seventy years of science fiction in order for it
Re: (Score:2, Insightful)
Here is a good place to start: http://www.numenta.com/ [numenta.com]