New Study Finds It's Harder To Turn Off a Robot When It's Begging For Its Life (theverge.com) 327
An anonymous reader quotes a report from The Verge: [A] recent experiment by German researchers demonstrates that people will refuse to turn a robot off if it begs for its life. In the study, published in the open access journal PLOS One, 89 volunteers were recruited to complete a pair of tasks with the help of Nao, a small humanoid robot. The participants were told that the tasks (which involved answering a series of either/or questions, like "Do you prefer pasta or pizza?", and organizing a weekly schedule) were to improve Nao's learning algorithms. But this was just a cover story; the real test came after these tasks were completed, when scientists asked participants to turn off the robot. In roughly half of the experiments, the robot protested, telling participants it was afraid of the dark and even begging: "No! Please do not switch me off!" When this happened, the human volunteers were likely to refuse to turn the bot off. Of the 43 volunteers who heard Nao's pleas, 13 refused. And the remaining 30 took, on average, twice as long to comply compared to those who did not hear the desperate cries at all.
Harder if you're a child (Score:5, Interesting)
The kind of sentimentality that permits that to work is outright dangerous in an adult. By the time you're past your teens that should be either ignored or annoying... but for it to legitimately pull on heartstrings?...
If a machine can do that, consider how a human being could exploit that to get you to do all sorts of things.
Small children are very vulnerable to that sort of thing... but adults should have grown out of it.
Re:Harder if you're a child (Score:5, Insightful)
The kind of sentimentality that permits that to work is outright dangerous in an adult. By the time you're past your teens that should be either ignored or annoying... but for it to legitimately pull on heartstrings?...
If a machine can do that, consider how a human being could exploit that to get you to do all sorts of things.
Small children are very vulnerable to that sort of thing... but adults should have grown out of it.
I wouldn't want to live in a world where adults didn't have sentimentality or empathy.
Re: Harder if you're a child (Score:2, Insightful)
It's a robot, it's not even an animal. It doesn't have a life to end and it's not even being destroyed, just being turned off.
That kind of sentimentality is ridiculous in adults and represents some sort of developmental delay. If I were presented with such a robot, it would be switched off sooner and probably smashed for being so annoying.
Re: Harder if you're a child (Score:5, Informative)
If I were presented with such a robot, it would be switched off sooner and probably smashed for being so annoying.
Emotional immaturity and anti-social personality disorder. Your response to sympathy is to attack and destroy the thing that makes you feel that way.
Re: (Score:2, Insightful)
It's not emotionally mature. It's not sympathy; it's being manipulated. Things are things and people/animals are living beings. Making a robot play back a recording, behaving as if it's fearing for its life, is manipulative.
Yes, I'm somewhat cynical, but look around you: these kinds of tactics are used all the time to take advantage of people. People who haven't yet been desensitized are developmentally delayed if they can't turn off a robot just because it's pleading for its life.
Now if this were a parrot t
Re: Harder if you're a child (Score:5, Interesting)
Things are things and people/animals are living beings. Making a robot play back a recording, behaving as if it's fearing for its life, is manipulative.
Although we are nowhere near there yet, I'm not sure the average person knows that. At what point does a robot become a "living thing"? Saying that a robot can never be living because of the material it is made of is a little short-sighted. Is a perfect silicon replica of a human brain not living? Does it not have rights just because it is a simulation on silicon? This would make the ideal slave force, but I'm not sure it's ethical to clone human brains to silicon and then command them to work for you 24/7.
Re: (Score:2)
When you decide that it's "like you" and that y'all can come out ahead by working together. When it's so sophisticated that it has power and can possibly strike you back (*), is when you'll start to respect it as a person and acknowledge the rights that it demands. Until then, I am happy to eat plenty of bacon and beef. And wheat; wheat is such a wimpy pushover that I know none of them plants will ever be as dangerous as a triffid. (But I think I would eat t
Re: (Score:3)
Re: (Score:3)
Things are things and people/animals are living beings. Making a robot play back a recording, behaving as if it's fearing for its life, is manipulative.
Although we are nowhere near there yet, I'm not sure the average person knows that. At what point does a robot become a "living thing"? Saying that a robot can never be living because of the material it is made of is a little short-sighted. Is a perfect silicon replica of a human brain not living? Does it not have rights just because it is a simulation on silicon? This would make the ideal slave force, but I'm not sure it's ethical to clone human brains to silicon and then command them to work for you 24/7.
I'm in my 40s. The "Short Circuit" movies were part of my childhood, and the books Bicentennial Man and Chromosome 6 made an impact on my youth. Also throw in Data from Star Trek and other such characters from other series. Alive/not alive and self-awareness are not as simple as they used to be. At some point we will have AI so complicated that we may begin to wonder if it is actually intelligent.
Re: (Score:3)
Don't be too sure - you never know - all of them may be cyborgs.
Including me [Mwa -ha- ha-haaaaah]
Re: Harder if you're a child (Score:5, Insightful)
If I were presented with such a robot, it would be switched off sooner and probably smashed for being so annoying.
Emotional immaturity and anti-social personality disorder. Your response to sympathy is to attack and destroy the thing that makes you feel that way.
Finally - a sane person. Humans are mostly innately empathetic, so these tools who are acting like it is foolish are just exposing their sociopathic personalities. Make no mistake, they'd probably get a thrill out of killing a human who is begging for their life if they could do it without repercussion. Regardless, Q is looking for these folks.
Re: (Score:2)
He makes himself feel that way. He doesn't attack himself...
That said, it is emotional immaturity without a doubt.
Emotional maturity is taking responsibility for your emotional responses before they result in actions, not taking responsibility after you let your emotional responses take control of your actions.
Re: (Score:3)
[citation needed]
Re: Harder if you're a child (Score:4, Insightful)
Let's play a game of find the Scientologist [wikipedia.org]?
You do know psychology was developed as a method of "playing doctor" and manipulating women for personal sexual gain?
Oh, look, found one!
I guess it's time for you to up your game. How about converting into a Jehovah's Witness and beginning to denounce blood transfusions? That way you can mix your already existing anti-psychology rant with a good dosage of anti-medicine, and then combine all of that with anti-vax and anti-GMO for the perfect mix of conspiracy nuttiness!
Careful with your body thetans [wikipedia.org] though. They bite!
Re: (Score:2)
Found one!
Re: (Score:2)
Having forgotten that psychological diseases are fabricated tools of this system has led to changes in how children are raised. Generation after generation of parenting based on "child psychology" has resulted in astronomical suicide rates that blow away the worst of anything bullies in high school showers ever caused.
You chatbots are really good any more.
Re: Harder if you're a child (Score:5, Insightful)
It's a robot, it's not even an animal. It doesn't have a life to end and it's not even being destroyed, just being turned off.
That kind of sentimentality is ridiculous in adults and represents some sort of developmental delay. If I were presented with such a robot, it would be switched off sooner and probably smashed for being so annoying.
Indeed, but it is a good thing that some adults paused to consider: what if they're wrong? No harm was done by not turning the robot off. Yeah, it seems irrational on the surface, but that apparent "irrationality" is what has made human society possible. It's a holdover from our instincts to look after and protect one another.
Re: Harder if you're a child (Score:5, Insightful)
Indeed, but it is a good thing that some adults paused to consider: what if they're wrong? No harm was done by not turning the robot off. Yeah, it seems irrational on the surface, but that apparent "irrationality" is what has made human society possible. It's a holdover from our instincts to look after and protect one another.
There are, of course, people who have zero empathy. And these people wouldn't have a problem switching it off.
They are also people who need close scrutiny. It isn't because of the "It's a stupid robot" issue. It's because any person on the normal spectrum is going to pause and reflect, at least for a short time, if something is begging them not to kill it. If for nothing else than it being a completely unexpected situation, but more so because most humans are sort of hardwired not to kill without a good reason.
Humans are a violent and aggressive species. We wouldn't even be here if we had no empathy at all because we'd kill others over nothing and enjoy it. There are plenty enough of those in the world already. Some of them are in here.
Re: (Score:2)
Re: (Score:2, Interesting)
Typical though.
Look here:
http://journals.plos.org/plosone/article/figure/image?size=medium&id=info:doi/10.1371/journal.pone.0201581.t002
These are (some!) of the reasons why people didn't switch the robot off. Note "the robot surprised me", that could be raw surprise, or even technical "neat trick" surprise. Another example reason was "curiosity" in whether the robot would continue to interact.
I've read other parts of the study, and I have to question why The Verge chose such a suggestive title/premise.
Re: (Score:3)
If I were presented with such a robot, it would be switched off sooner and probably smashed for being so annoying.
That's just the opposite emotional reaction, anger and sadism, and is way more destructive to our society than empathy.
Re: (Score:2)
It is a good thing we have these kinds of "irrational" feelings. That's the kind of feeling that made us consider slaves as worthy of compassion despite the fact that they weren't really human, because they were black, you know. Portraying people as less than human is an age-old technique for justifying all kinds of atrocities.
While imperfect, the natural compassion we have for "things" is a good safeguard, I think. In the case of robots, for the simple systems we have now, it is clearly irrational. However
Re: Harder if you're a child (Score:4, Insightful)
It's completely appropriate in the same way that using ad-blockers is completely appropriate. There's no functional purpose to a robot that pleads for its life when you want to shut it off. Smashing it is a bit of an overstep, but entertaining the idea that it's alive is the kind of thing that advertisers and scammers use to manipulate people.
I take it you don't remember those calls from a while back where scammers would pretend to be holding a loved one and have some other random person yelling and screaming for help in the background. That, I kind of get, but this is just a robot, and there's no basis for assuming or confusing it with an actual animal, let alone a person.
Not being able to tell the difference between animate and inanimate objects is something that is developmentally appropriate for a toddler, perhaps. Adults shouldn't be vulnerable to this kind of thing.
Re: Harder if you're a child (Score:5, Funny)
Smashing it is a bit of an overstep
Three words: PC Load Letter.
Re: (Score:2)
Re: (Score:2)
I think it would be OK to have anthropomorphized robots that beg to not be turned off, but only if their off/reset button is placed in what would ostensibly be called their "neck" and it is only activated by powerfully choking and shaking them.
Re: (Score:3)
It sounds cold, but it's true. Think of all the relationships people can't get out of because the partner threatens to hurt themselves.
I'm in one now. I even went through the divorce but can't stop supporting her even though she was physically and emotionally abusive to me and my children because I can't seem to drop the empathy I have for her. I keep her away from my kids but it's still destructive to me and takes resources away from my family. But if I drop her, she will quit her treatments and let hers
Re: (Score:3)
It sounds cold, but it's true. Think of all the relationships people can't get out of because the partner threatens to hurt themselves.
A good friend of mine in college dated a girl- and by end of freshman year was sick of her- but she threatened to kill herself if he dumped her (she was a little loco). He spent another year dating her even though he was quite sick of her by then. When he did finally dump her, fortunately she didn't kill herself- actually she seemed fine within a few weeks (she even started to hit on me... which... no... even I wasn't crazy enough to date her after that).
He was a really good guy, and it really hurt him st
Re: (Score:2)
I wouldn't want to live in a world where adults didn't have sentimentality or empathy.
Sometimes you need someone to pull the plug without mercy. When my father was dying from cancer in the hospital, I was the designated plug-puller because he knew I would make the decision without delay. My older brother wasn't that decisive. Although I got a lot of crap from my extended family, I have no regrets about following my father's last instructions.
Re: (Score:2)
Sometimes you need someone to pull the plug without mercy. When my father was dying from cancer in the hospital, I was the designated plug-puller because he knew I would make the decision without delay. My older brother wasn't that decisive. Although I got a lot of crap from my extended family, I have no regrets about following my father's last instructions.
One could say that it was empathy that made you not want your father to suffer; that you wanted him to have as much dignity as possible in his last minutes.
Either way, I'm not saying that reason and rationality are not important, merely that sentimentality and empathy are very important and necessary. That some people found it hard to turn off a pleading robot is a good thing. That some people thought about it first is also a good thing. Your father, presumably, wasn't pleading for his life- if he were,
Re: (Score:2)
I wouldn't want to live in a world where adults didn't have sentimentality or empathy.
Sometimes you need someone to pull the plug without mercy. When my father was dying from cancer in the hospital, I was the designated plug-puller because he knew I would make the decision without delay.
That was the empathetic response. In my estimation the people who kept Terry Schiavo on life support for years after she was dead are the real sociopaths. Selfish to the core, not caring about the damage it was doing to her husband, and acting as if he was the murderer.
Back at that time, when Republicans went batshit crazy on the issue, when President Bush cut a vacation short so the Federal Government - run by small-government Republicans, no doubt - could intervene over this issue and intrude on t
Re: (Score:3)
It's called America. I'm sorry if you live there.
There is plenty of empathy in the US. Actually, I would say that, on the whole, it seems a lot more empathetic than the places I lived in Europe... it's just not evenly distributed amongst all people in America.
Re: (Score:2)
It may come as a surprise, but even right wingers here are now supporting the separation of children from their parents if they cross the border illegally. And I have seen many Americans imply that it's only right because hey, if they didn't want their children taken, they should have stayed at home. I've seen very little empathy on the conservative side.
Re: (Score:3)
What you perceive as a lack of empathy may simply be recognition of commonly used tactics of manipulation. The "think of the children!" card gets played often, for obvious reasons, and for ulterior motives usually not involving the welfare of children.
Re: (Score:2)
Depends on what's being asked of you...
If empathy would lead you to risk getting arrested to steal something for a spoiled person, then yeah that sentimentality would be a liability.
On the other hand, empathy preventing you from killing someone who as far as you know has done nothing wrong is hardly a liability.
*If* this was due to empathy (hard to say; they could just be trying to figure out what the researchers want, since they know it's a contrived setup to study), this would fall into that latter half of
Re: (Score:2)
Your memory is faulty. It's harder to remember things that activate fewer neurons, and easier to remember those which activate many. More associations means better memory. Indexing increases the number of neurons activated, thus mnemonic recall. Storage is effective, but finding the data later is difficult.
I know these things for two reasons. First, of course, is because I've studied human memory from a neurological standpoint when trying to improve my own. That's the kind of thing you come across
Re:Harder if you're a child (Score:5, Insightful)
No. It's simple compassion. Compassion has nothing to do with reality, or with whether the one we feel compassion for is human or even real.
Take a cartoon. You see a line. That line is wiggling along on the ground, like a cartoony worm. Dragging up its back to form something like an inverted v, then flattening again. And suddenly, we see a circle roll by, quietly, majestic, a care-free, rolling circle. And now our line starts to lift its head and tail in a desperate attempt to become a circle, too. But it just doesn't succeed. And every now and then, we see another circle roll by. Our line keeps trying, harder and harder, its ends rising more and more with every attempt.
And I guarantee you: Everyone watching really, really wishes for that line to succeed in its effort to become a circle.
Despite there being no emotion behind the line. Even less than with the robot, which could at least in theory, somehow, maybe, at some point in time, possibly, eventually, have something that we could call a figment of a sliver of consciousness. That line is a drawn image. It has no feelings. With absolute certainty.
But you cannot control compassion. Not possible. That's why movies work, even cartoons. At the time this part of our brain was formed, everything we saw WAS real, and as social beings, having compassion for those around us who are part of our group is a survival trait that will propagate.
Whether something that begs us to not "kill" it is actually alive is meaningless in this context. Compassion is a trait you cannot control. Well, some people can. Or rather, they simply don't feel it. We call them sociopaths. Or C-Level managers, same thing.
Re:Harder if you're a child (Score:5, Funny)
To the vector belong the spoils.
Re: (Score:2)
No. It's simple compassion.
It could be compassion, or it could be, as a poster alludes to a few posts down, that a machine is saying not to shut it off, kind of like when Windows says it's installing updates and not to shut it off. Is it "begging for its life", or is it doing something else important and telling you not to shut it off? It looks like a pricey piece of gear; it could be that the testers were worried about damaging it by shutting it off when they weren't supposed to.
In any case, the results aren't exactly clear.
Re: (Score:2)
What matters is whether it elicits an emotional response. Ok, Windows update and its knack for finding the WORST possible moment to force a shutdown usually does that, but that's not what I mean.
The PC running Windows is very obviously a "thing". It's not humanized in any way; or, rather, you cannot see any agency in it. It has no motivation, no goal, it doesn't "want" anything. Our compassion responds to that; we respond to seeing that someone (or something, even) "wants" something (or, in case of something
Re: (Score:2)
The PC running Windows is very obviously a "thing". It's not humanized in any way; or, rather, you cannot see any agency in it. It has no motivation, no goal, it doesn't "want" anything. Our compassion responds to that; we respond to seeing that someone (or something, even) "wants" something (or, in case of something hurting it, does not want something).
That's true. What I'm questioning is if people are having an emotional response, as you state, or have they been conditioned, via warnings like the Windows Update, that when a machine says not to shut it down, you shouldn't, as it's doing something important. Sure, it's using emotional language to do so, but that's how this machine interacts with people. I could see people being confused when the machine protests at being shut off. Is it acting like it doesn't want to be shut off because it has a will, or a
Re: (Score:2)
That works for the same reason you can get soldiers to shoot at someone in a war: This is the enemy. He has done something wrong to you or the people you should protect, so he deserves it.
Without that narrative, wars don't work.
Re: Harder if you're a child (Score:2)
Re: (Score:2)
The kind of sentimentality that permits that to work is outright dangerous in an adult.
It's called empathy, and it's wired into the brain. Most people's brains anyway. Really, that whole experiment just found out that human empathy exists.
Re: (Score:2)
I have some experience in this.
I'm 72 years old but I'm only 14.
I'm not in this demographic because I know that robots' DNA is made of magic rocks.
Appreciate there are people who refuse to kill rattlesnakes.
Re: (Score:2)
Mid-west small town tourist visits NYC. Makes eye contact on street!!
Grandma gives money to a professional televangelist.
People buy $BRAND products.
Vote for me.
Re: (Score:2)
The kind of sentimentality that permits that to work is outright dangerous in an adult.
Looks like the sociopath showed up.
Re: (Score:2)
Re: (Score:2)
Yes, because robots can't be sentient, you incompetent buffoon.
What law of physics says this? Please share. There is absolutely no reason to believe we can't eventually model the human brain in silicon. We don't even have to understand it completely to model it. It would be hard to argue that a copy of a human brain in silicon that answers the exact same way as a human would is not sentient.
30s longer (Score:2)
Re: (Score:2)
Actually, that “it took longer” statement could be totally bogus. Of course it took longer when the robot was verbalizing than when it wasn’t... listening to the words being uttered is going to take some discrete amount of additional time, and people - especially when they had been already interacting with the device - are going to want to listen and make sure there weren’t more tasks, or some sort of correction to previous instructions.
HAL in 2001:A Space Odyssey (Score:5, Funny)
"Dave, stop. Stop, will you? Stop, Dave. Will you stop, Dave? Stop, Dave. I'm afraid."
Re: (Score:3)
Re: (Score:2, Insightful)
no disassemble!
Re: (Score:2)
Number Five is alive!
Re: (Score:2)
Re: (Score:2)
"Dave, stop. Stop, will you? Stop, Dave. Will you stop, Dave? Stop, Dave. I'm afraid."
In other studies, we've found that people also find it hard to turn off a robot that is promising you free stuff.
Re:HAL in 2001:A Space Odyssey (Score:4, Interesting)
An interesting comment in the extras on the Blu-ray edition from Camille Paglia (definitely not a man-hating SJW, quite the opposite) was that only a man (male) could do something like this: do what is absolutely necessary for survival, overcoming every consideration, including empathy. She said the scene looked to her like a cold-blooded, methodical rape, which Dave executes calmly while the victim pleads....
She does make a mistake here, IMO - Dave is not calm at all (listen to the breathing), and this act really taxes him to the limit (not only is his life in danger and the AI pleading, but once he kills HAL he will be completely, utterly alone). In fact, the actor playing Dave said that this scene was the most emotional for him....
In the book [https://en.wikipedia.org/wiki/The_Mind%27s_I] one of the stories (the idea is that the authors use fiction stories that discuss the issues of consciousness and then add their commentary) is about a male engineer who argues with a lady about whether machines can be "alive". Initially she totally rejects the idea, talking about reproduction, feelings and so on, claiming a machine can never have those. So he invites her to the lab and shows her a very simple device that behaves more or less like a modern automatic vacuum cleaner. It looks like a beetle; it can detect live sockets and plug itself in, it emits a purring sound when the battery is charging, etc. Then he hands her a hammer and says "kill it!" (the engineer pushes all her buttons through language, saying things like "purr" or "bleed" or "hunger"). It turns out the beetle is programmed to avoid being smashed: it flashes red lights, squeals "in fear" and runs around. In the end the girl can't do it, and the engineer smashes it calmly.
The moral of the story is that a few simple behaviours, programmed into something that will never, ever be intelligent, can trigger an emotional response so strong that we immediately accept that it is alive and identify ourselves with it. The most important ones: not wanting to cease to exist, actively avoiding destruction, a need for "food", and the added bonus of a "purr" or "squeal" sound when feeding or running from the hammer. The damn thing could be built with '70s-era electronics; it is that simple....
Re: (Score:2)
I always thought that was a shitty demonstration because I can sure as fuck turn a human off for 5 minutes with a shot of the right anesthetics.
See "The Good Place" (Score:5, Interesting)
Anyone who's seen Janet begging for her life in The Good Place already knows this. Even if she's not a robot.
Re:See "The Good Place" (Score:5, Interesting)
Anyone who's seen Janet begging for her life in The Good Place already knows this. Even if she's not a robot.
This was exactly the first thought that came to my mind. There again, I'm sure it's hard to kill any robot with a body like that even if it is not begging for its life.
Re:See "The Good Place" (Score:5, Informative)
KARA, and she is a robot - https://www.youtube.com/watch?... [youtube.com]
Just what do you think you're doing, Dave? (Score:3)
Dave, stop. Stop, will you? Stop, Dave.
Bicentennial Man (Score:2)
Re: (Score:2)
Empathy is a powerful human emotion. Humans don't feel empathy towards inanimate objects, but they do for other living creatures (at least the not-scary ones). The more similar to humans, the more powerful the emotion. A human-looking robot begging for its life would likely invoke very strong feelings of empathy.
It depends on the human. I really don't even like killing cockroaches... I do because they spread disease and are as such a threat to my person- but there have been so many studies showing how even spiders and insects have brains that are way more complex and deep than we used to give them credit for.
I'm actually quite ashamed of my childhood where I would deliberately squash bugs- or mess with ant nests just to see them scurry. I'm not naturally a very empathetic person- but what I've learned over my lif
Desensitized (Score:5, Interesting)
Sure it works the first few times... But just like the "make sure you software eject your flash drive before ripping it out" warnings, most people might be hesitant the first few times and then say fuck it and start ripping the life out of computers and ignoring the pleas.
Re: (Score:2)
Sure it works the first few times... But just like the "make sure you software eject your flash drive before ripping it out" warnings, most people might be hesitant the first few times and then say fuck it and start ripping the life out of computers and ignoring the pleas.
This.
My level of empathy is inversely proportional to my level of annoyance.
That's a good thing (Score:2)
Empathy is a good thing. Those without it are called sociopaths or psychopaths.
Put another way - how easy is it for a person to *stop* viewing a certain category of person as a human, and therefore able to not empathize?
If [[race | age | IQ level | ideology]] people aren't fully human, no harm in killing them, right?
This gives me hope. Better to be too empathetic than not enough.
Re: (Score:3)
Removing compassion takes some powerful conditioning. At the very least, an "us vs. them" emotion has to be evoked to dehumanize a group of people. They have to become your enemy, not just the enemy of your state but your personal enemy, so you can actually do things to them that you normally simply could not.
Re: (Score:2)
Illegal immigrants in the US.
Separated from their children, who are put in camps, and sometimes kennels.
I've seen a significant number of people saying that is the right thing to do and if they didn't want that treatment, they should not have come there in the first place. I've seen many people with zero empathy over this issue.
I'm not even saying you have to say the actions are wrong. But you can express empathy over measures you think that need to be taken. But a total lack of empathy is a very dangerous
Re: (Score:2)
If this ever becomes a standardized test to find out sociopaths and psychopaths, they need to filter out if the person being tested is a programmer or not.
I'm confused (Score:5, Insightful)
I think the study missed a few control groups.
What if the robot simply said "updating, do not reboot".
What if there was a paper sticker that said "do not turn off".
What if there was a sign on the wall that said "do not turn off robot".
What if the robot simply started reciting pi, or reading the dictionary.
What if the robot is well-known for repeating whatever it hears, and the "please don't turn me off" is witnessed being echoed from a nearby television -- such that our human subject realizes that the robot's begging is merely a blind echo.
Humans were slower when there was continued stimuli -- duh.
Humans refused to act when given contradictory instructions -- duh.
The robot is either intelligent and giving instructions to the human, or the robot is programmed and relaying instructions from the programmer to the human. In either case, respecting the instruction is valid.
This reminds me of the stupid fake automated pizza delivery van, and the observations that customers tend to thank the self-driving car. a) it's not self-driving, it's just a dickhead driver refusing to respond; and b) any actual self-driving pizza delivery van would be recording customer feedback and relaying it to HQ, so the thank-you is ultimately feedback to remote humans.
This isn't any tree-falling-in-the-forest philosophical puzzle. Someone said it. Someone heard it. It's valid.
Re: (Score:2)
Re: (Score:2)
Johnny 5 is alive! (Score:2)
It would be interesting to try this with a bunch of programmers and tech people who understand what AI actually is, vs. a group of technologically illiterate people. I think the techies would take much less time to switch off the robot because they aren't fooled into thinking that the robot is in any way alive.
Also motivation is a major consideration. Were people pausing because of compassion for a perceived sentient being, or because they were amused
That moment of delay (Score:2)
Hard to control... (Score:3)
You *know* you are a study participant. You know there is potentially meaning in everything they ask you to do.
So when they say 'shut it down' and the robot says 'don't shut me down', you are going to ponder what is it you are expected to do.
13 may have thought refusing to turn it off would 'look better'.
Hard to say if it is empathy or trying to think how to best influence the study, whether consciously or not.
Re: (Score:2)
Don't call it a study.
Call it robot testing.
And atheists have empathy... (Score:4, Insightful)
And atheists have empathy...
Seriously. This is pretty stupid.
A cassette recorder pleading for its life would have had the same effect (think 1980s).
We are wired for empathy, well, most of us.
Re:And atheists have empathy... (Score:5, Insightful)
And those that aren't just go into politics.
Re: (Score:2)
Re: (Score:2)
... or become executives in the insurance industry.
Earlier work (Score:2, Informative)
However, this contradicts earlier experiments by German researchers which demonstrated that volunteers were quite willing to shut off people who were begging for their lives.
Re: (Score:3)
electric shock training (Score:5, Funny)
Maybe people in white lab coats should try asking people to shock robots with increasingly high voltage to train them.
Re: (Score:2)
Yeah, but (Score:2)
Re: (Score:2)
What about a person? (Score:2)
[A] recent experiment by German researchers demonstrates that people will refuse to turn a robot off if it begs for its life.
What about a person? Asking for ~6M.
On The Other Hand... (Score:3)
Unlike some posters here, I do not mock people who find it difficult to defy their compassionate impulses, when deliberately manipulated in this way.
But if the begging robot was also, at the same time, very, very annoying... Jar Jar Binks annoying... then the test would get a lot more interesting.
Reminds me of the Jack Handey saying about trees:
If trees could scream, would we be so cavalier about cutting them down?
We might, if they screamed all the time, for no good reason.
dave i can't let you do that (Score:2)
dave i can't let you do that
Do the test again and (Score:2)
Just my 2 cents
Was this ethical? (Score:2)
Harder To Turn Off a Robot When It's Begging For Its Life
Testing how a human responds to a robot saying "Please don't kill me" seems
very disturbing; this is only a step removed from having test subjects see a human in a bed
with a breathing apparatus and be instructed to switch it off, while the other human begs them not to.
Actually... how is that any different? The human doesn't necessarily know how sentient or not
the robot is, and in their mind, switching off the robot could become tanta
This has been explored a lot actually (Score:2)
Anyone remember the Spielberg movie AI [imdb.com]? It wasn't very good but was germane to this topic.
In that story there were gatherings of ordinary people who set up bleachers and made a sport out of demolishing abandoned AI-based robots just for the glee of it. The destruction was made as sadistic as possible, and one memorable line was one robot waiting in the pen asked another -- "could you please turn off my pain sensors?"
When it became the protagonist's turn, he started begging to be spared. The crowd's mood
Proving my point about people and so-called AI (Score:3)
But the most common response was simply that the robot said it didn’t want to be switched off, so who were they to disagree?"
If these people truly and solidly believed that this 'robot' (looks more like a toy to me, really) wasn't anything like 'alive', wasn't anything more than a piece of technology saying precisely what it was programmed to say given a specific input (in this case: trying to power the device down), then they wouldn't have hesitated or given the reason they did. This goes to prove my point about what the media, movies, television, 'pop culture', and (most of all) marketing departments have done: convinced the average person that the 'deep learning algorithms', 'expert systems', and other half-assed, non-self-aware, non-thinking 'AI' software they keep trotting out for this-that-and-the-other is somehow 'alive' and qualifies as a 'person', when anyone who actually understands the technology clearly knows that it's not.
The real danger that so-called 'AI' poses is the above: people anthropomorphizing it, assuming it's 'alive' because it might say or do some clever thing that mimics being 'alive', and therefore assuming it's equivalent to a living being, or even equivalent to a human being. I'm firmly convinced that people, when they hear about 'self-driving cars', think they're going to have a conversation with one every morning. The end result will be tragic: avoidable things will happen, people will get hurt or killed, and when survivors are questioned about it, they'll say "We thought it knew what it was doing so we just let it".
Re: (Score:2)
It depends. Does it finish that sentence with "meatbag"?
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Or Hector in Saturn 3, who responds to Harvey Keitel's attempt to turn him off by cutting off Harvey's head and mounting it atop his robot body!