Robotics Power Science

New Study Finds It's Harder To Turn Off a Robot When It's Begging For Its Life (theverge.com) 327

An anonymous reader quotes a report from The Verge: [A] recent experiment by German researchers demonstrates that people will refuse to turn a robot off if it begs for its life. In the study, published in the open access journal PLOS One, 89 volunteers were recruited to complete a pair of tasks with the help of Nao, a small humanoid robot. The participants were told that the tasks (which involved answering a series of either/or questions, like "Do you prefer pasta or pizza?", and organizing a weekly schedule) were to improve Nao's learning algorithms. But this was just a cover story, and the real test came after these tasks were completed, when scientists asked participants to turn off the robot. In roughly half of the experiments, the robot protested, telling participants it was afraid of the dark and even begging: "No! Please do not switch me off!" When this happened, the human volunteers were likely to refuse to turn the bot off. Of the 43 volunteers who heard Nao's pleas, 13 refused. And the remaining 30 took, on average, twice as long to comply compared to those who did not hear the desperate cries at all.
This discussion has been archived. No new comments can be posted.

  • by Karmashock ( 2415832 ) on Friday August 03, 2018 @08:07AM (#57062616)

    The kind of sentimentality that permits that to work is outright dangerous in an adult. By the time you're past your teens that should be either ignored or annoying... but for it to legitimately pull on heartstrings?...

    If a machine can do that, consider how a human being could exploit that to get you to do all sorts of things?

    Small children are very vulnerable to that sort of thing... but adults should have grown out of it.

    • by Oswald McWeany ( 2428506 ) on Friday August 03, 2018 @08:25AM (#57062752)

      The kind of sentimentality that permits that to work is outright dangerous in an adult. By the time you're past your teens that should be either ignored or annoying... but for it to legitimately pull on heartstrings?...

      If a machine can do that, consider how a human being could exploit that to get you to do all sorts of things?

      Small children are very vulnerable to that sort of thing... but adults should have grown out of it.

      I wouldn't want to live in a world where adults didn't have sentimentality or empathy.

      • by Anonymous Coward

        It's a robot, it's not even an animal. It doesn't have a life to end and it's not even being destroyed, just being turned off.

        That kind of sentimentality is ridiculous in adults and represents some sort of developmental delay. If I were presented with such a robot, it would be switched off sooner and probably smashed for being so annoying.

        • by bluefoxlucid ( 723572 ) on Friday August 03, 2018 @08:55AM (#57062960) Homepage Journal

          If I were presented with such a robot, it would be switched off sooner and probably smashed for being so annoying.

          Emotional immaturity and anti-social personality disorder. Your response to sympathy is to attack and destroy the thing that makes you feel that way.

          • Re: (Score:2, Insightful)

            by Anonymous Coward

            It's not emotionally mature. It's not sympathy, it's being manipulated. Things are things and people/animals are living beings. Making a robot play back a recording, behaving as if it's fearing for its life, is manipulative.

            Yes, I'm somewhat cynical, but look around you: these kinds of tactics are used all the time to take advantage of people. People who haven't yet been desensitized are developmentally delayed if they can't turn off a robot just because it's pleading for its life.

            Now if this were a parrot t

            • by Wycliffe ( 116160 ) on Friday August 03, 2018 @09:22AM (#57063162) Homepage

              Things are things and people/animals are living beings. Making a robot play back a recording, behaving as if it's fearing for its life, is manipulative.

              Although we are nowhere near there yet, I'm not sure the average person knows that. At what point does a robot become a "living thing"? Saying that a robot can never be living because of the material it is made of is a little short-sighted. Is a perfect silicon replica of a human brain not living? Does it not have rights just because it is a simulation on silicon? This would make the ideal slave force, but I'm not sure it's ethical to clone human brains to silicon and then command them to work for you 24/7.

              • by Sloppy ( 14984 )

                At what point does a robot become a "living thing"

                When you decide that it's "like you" and that y'all can come out ahead by working together. When it's so sophisticated that it has power and can possibly strike you back (*), is when you'll start to respect it as a person and acknowledge the rights that it demands. Until then, I am happy to eat plenty of bacon and beef. And wheat; wheat is such a wimpy pushover that I know none of them plants will ever be as dangerous as a triffid. (But I think I would eat t

                • The only way to truly judge a human being is in how they treat others who have no power, when they think no one is looking. In those moments you know more about who they really are than they do.
              • Things are things and people/animals are living beings. Making a robot play back a recording, behaving as if it's fearing for its life, is manipulative.

                Although we are nowhere near there yet, I'm not sure the average person knows that. At what point does a robot become a "living thing"? Saying that a robot can never be living because of the material it is made of is a little short-sighted. Is a perfect silicon replica of a human brain not living? Does it not have rights just because it is a simulation on silicon? This would make the ideal slave force, but I'm not sure it's ethical to clone human brains to silicon and then command them to work for you 24/7.

                I'm in my 40s. The "Short Circuit" movies were part of my childhood and the books Bicentennial Man and Chromosome 6 made an impact in my youth. Also throw in Data from Star Trek and other such characters from other series. Alive / not alive and self-awareness are not as simple as they used to be. At some point we will have AI so complicated that we may begin to wonder if it is actually intelligent.

            • Things are things and people/animals are living beings

              Don't be too sure - you never know - all of them may be cyborgs.

              Including me [Mwa -ha- ha-haaaaah]

          • by Ol Olsoc ( 1175323 ) on Friday August 03, 2018 @10:20AM (#57063552)

            If I were presented with such a robot, it would be switched off sooner and probably smashed for being so annoying.

            Emotional immaturity and anti-social personality disorder. Your response to sympathy is to attack and destroy the thing that makes you feel that way.

            Finally - a sane person. Humans are mostly innately empathetic, so these tools acting like it's foolish are just exposing their sociopathic personalities. Make no mistake, they'd probably get a thrill out of killing a human who is begging for their life if they could do it without repercussion. Regardless, Q is looking for these folks.

          • He makes himself feel that way. He doesn't attack himself...

            That said, it is emotional immaturity without a doubt.

            Emotional maturity is taking responsibility for your emotional responses before they result in actions, not taking responsibility after you let your emotional responses take control of your actions.

        • by Oswald McWeany ( 2428506 ) on Friday August 03, 2018 @09:15AM (#57063106)

          It's a robot, it's not even an animal. It doesn't have a life to end and it's not even being destroyed, just being turned off.

          That kind of sentimentality is ridiculous in adults and represents some sort of developmental delay. If I were presented with such a robot, it would be switched off sooner and probably smashed for being so annoying.

          Indeed, but it is a good thing that some adults paused to consider: what if they're wrong? No harm was done by not turning the robot off. Yeah, it seems irrational on the surface, but that apparent "irrationality" is what has made human society possible. It's a holdover from our instincts to look after and protect one another.

          • by Ol Olsoc ( 1175323 ) on Friday August 03, 2018 @10:30AM (#57063620)

            Indeed, but it is a good thing that some adults paused to consider: what if they're wrong? No harm was done by not turning the robot off. Yeah, it seems irrational on the surface, but that apparent "irrationality" is what has made human society possible. It's a holdover from our instincts to look after and protect one another.

            There are of course, people who have zero empathy. And these people wouldn't have a problem switching it off.

            They are also people who need close scrutiny. It isn't because of the "It's a stupid robot" issue. It's because any person on the normal spectrum is going to pause to reflect, at least for a short time, if something is begging them not to kill it. If for nothing else than it being a completely unexpected situation, but more so because most humans are sorta hardwired to not kill without a good reason.

            Humans are a violent and aggressive species. We wouldn't even be here if we had no empathy at all because we'd kill others over nothing and enjoy it. There are plenty enough of those in the world already. Some of them are in here.

        • Long term, sooner or later AI is going to reach the point where the differences between us and them are smaller. In some ways we are just really complex deep learning/evolutionary algorithms that duplicated themselves based on which ones didn't die. Also, I suppose such a time won't be the first time in our history that we had something that effectively matched or exceeded our emotional intelligence, but we managed to define it as "less than animal". Yeah I know, obviously right now we are talking a robot with littera
        • Re: (Score:2, Interesting)

          by Anonymous Coward

          Typical though.

          Look here:

          http://journals.plos.org/plosone/article/figure/image?size=medium&id=info:doi/10.1371/journal.pone.0201581.t002

          These are (some!) of the reasons why people didn't switch the robot off. Note "the robot surprised me", that could be raw surprise, or even technical "neat trick" surprise. Another example reason was "curiosity" in whether the robot would continue to interact.

          I've read other parts of the study, and I have to question why theverge has such a suggestive title / premise.

        • If I were presented with such a robot, it would be switched off sooner and probably smashed for being so annoying.

          That's just the opposite emotional reaction, anger and sadism, and is way more destructive to our society than empathy.

        • by GuB-42 ( 2483988 )

          It is a good thing we have these kinds of "irrational" feelings. That's the kind of feeling that made us consider slaves as worthy of compassion despite the fact that they weren't really human, because they were black, you know. Portraying people as less than human is an age-old technique for justifying all kinds of atrocities.

          While imperfect, the natural compassion we have for "things" is a good safeguard, I think. In the case of robots, for the simple systems we have now, it is clearly irrational. However

      • It sounds cold but it's true. Think of all the relationships people can't get out of because the partner threatens to hurt themselves.

        I'm in one now. I even went through the divorce but can't stop supporting her even though she was physically and emotionally abusive to me and my children because I can't seem to drop the empathy I have for her. I keep her away from my kids but it's still destructive to me and takes resources away from my family. But if I drop her, she will quit her treatments and let hers

        • It sounds cold but it's true. Think of all the relationships people can't get out of because the partner threatens to hurt themselves.

          A good friend of mine in college dated a girl - and by the end of freshman year was sick of her - but she threatened to kill herself if he dumped her (she was a little loco). He spent another year dating her even though he was quite sick of her by then. When he did finally dump her, fortunately she didn't kill herself - actually she seemed fine within a few weeks (she even started to hit on me... which... no... even I wasn't crazy enough to date her after that).

          He was a really good guy, and it really hurt him st

      • I wouldn't want to live in a world where adults didn't have sentimentality or empathy.

        Sometimes you need someone to pull the plug without mercy. When my father was dying from cancer in the hospital, I was the designated plug puller because he knew I would make the decision without delay. My older brother wasn't that decisive. Although I got a lot of crap from my extended family, I have no regrets for following my father's last instructions.

        • Sometimes you need someone to pull the plug without mercy. When my father was dying from cancer in the hospital, I was the designated plug puller because he knew I would make the decision without delay. My older brother wasn't that decisive. Although I got a lot of crap from my extended family, I have no regrets for following my father's last instructions.

          One could say that it was empathy that made you not want your father to suffer; that you wanted him to have as much dignity as possible in his last minutes.

          Either way, I'm not saying that reason and rationale are not important, merely that sentimentality and empathy are very important and necessary. That some people found it hard to turn off a pleading robot is a good thing. That some people thought about it first is also a good thing. Your father, presumably, wasn't pleading for his life - if he were,

        • I wouldn't want to live in a world where adults didn't have sentimentality or empathy.

          Sometimes you need someone to pull the plug without mercy. When my father was dying from cancer in the hospital, I was the designated plug puller because he knew I would make the decision without delay.

          That was the empathetic response. In my estimation the people who kept Terri Schiavo on life support for years after she was dead are the real sociopaths. Selfish to the core, not caring about the damage it was doing to her husband, and acting as if he was the murderer.

          Back at that time, when Republicans went batshit crazy on the issue, when President Bush cut a vacation short so the Federal Government - run by small government Republicans and no doubt - could intervene over this issue and intrude on t

    • by Junta ( 36770 )

      Depends on what's being asked of you...

      If empathy would lead you to risk getting arrested to steal something for a spoiled person, then yeah that sentimentality would be a liability.

      On the other hand, empathy preventing you from killing someone who as far as you know has done nothing wrong is hardly a liability.

      *if* this was due to empathy (hard to say, they could just be trying to figure out what the researchers want since they know it's a contrived setup to study), this would fall into that latter half of

    • by Opportunist ( 166417 ) on Friday August 03, 2018 @08:34AM (#57062820)

      No. It's simple compassion. Compassion has nothing to do with reality, or with whether the one we have compassion for is human or even real.

      Take a cartoon. You see a line. That line is wiggling along on the ground, like a cartoony worm. Dragging up its back to form something like an inverted v, then flattening again. And suddenly, we see a circle roll by, quietly, majestic, a care-free, rolling circle. And now our line starts to lift its head and tail in a desperate attempt to become a circle, too. But it just doesn't succeed. And every now and then, we see another circle roll by. Our line keeps trying, harder and harder, its ends rising more and more with every attempt.

      And I guarantee you: Everyone watching really, really wishes for that line to succeed in its effort to become a circle.

      Despite there being no emotion on the end of the line. Even less than with the robot, which could at least in theory, somehow, maybe, at some point in time, possibly, eventually, have something that we could call a figment of a sliver of consciousness. That line is a drawn image. That has no feelings. With absolute certainty.

      But you cannot control compassion. Not possible. That's why movies work, even cartoons. At the time this part of our brain was formed, everything we saw WAS real, and as social beings, having compassion for those around us who are part of our group is a survival trait that will propagate.

      Whether something that begs us to not "kill" it is actually alive is meaningless in this context. Compassion is a trait you cannot control. Well, some people can. Or rather, they simply don't feel it. We call them sociopaths. Or C-Level managers, same thing.

      • by Chris Mattern ( 191822 ) on Friday August 03, 2018 @08:53AM (#57062946)

        And I guarantee you: Everyone watching [the cartoon] really, really wishes for that line to succeed in its effort to become a circle.

        To the vector belong the spoils.

      • by JBMcB ( 73720 )

        No. It's simple compassion.

        It could be compassion, or it could be, as a poster alludes to a few posts down, that a machine is saying not to shut it off, kinda like when Windows says it's installing updates and not to shut it off. Is it "begging for its life" or is it doing something else important and telling you not to shut it off? It looks like a pricey piece of gear; it could be the testers were worried about damaging it by shutting it off when they weren't supposed to.

        In any case, the results aren't exactly clear.

        • What matters is whether it elicits an emotional response. Ok, Windows update and its knack for finding the WORST possible moment to force a shutdown usually does that, but that's not what I mean.

          The PC running Windows is very obviously a "thing". It's not humanized in any way, or, rather, you cannot see any agency in it. It has no motivation, no goal, it doesn't "want" anything. Our compassion responds to that; we respond to seeing that someone (or something, even) "wants" something (or, in case of something

          • by JBMcB ( 73720 )

            The PC running Windows is very obviously a "thing". It's not humanized in any way, or, rather, you cannot see any agency in it. It has no motivation, no goal, it doesn't "want" anything. Our compassion responds to that; we respond to seeing that someone (or something, even) "wants" something (or, in case of something hurting it, does not want something).

            That's true. What I'm questioning is if people are having an emotional response, as you state, or have they been conditioned, via warnings like the Windows Update, that when a machine says not to shut it down, you shouldn't, as it's doing something important. Sure, it's using emotional language to do so, but that's how this machine interacts with people. I could see people being confused when the machine protests at being shut off. Is it acting like it doesn't want to be shut off because it has a will, or a

    • Or maybe it's a good test for psychopathy and you'd fail.
    • The kind of sentimentality that permits that to work is outright dangerous in an adult.

      It's called empathy, and it's wired into the brain. Most people's brains anyway. Really, that whole experiment just found out that human empathy exists.

    • I have some experience in this.

      I'm 72 years old but I'm only 14.

      I'm not in this demographic because I know that robots' DNA is made of magic rocks.

      Appreciate there are people who refuse to kill rattlesnakes.

    • If a machine can do that, consider how a human being could exploit that to get you to do all sorts of things?

      Mid-west small town tourist visits NYC. Makes eye contact on street!!

      Grandma gives money to a professional televangelist.

      People buy $BRAND products.

      Vote for me.

    • The kind of sentimentality that permits that to work is outright dangerous in an adult.

      Looks like the sociopath showed up.

    • If our species didn't possess sentimentality, it likely wouldn't exist. Sentimentality is an obvious survival mechanism among mammals.
  • They took longer to savor the experience with a bottle of chianti and some fava beans
    • Actually, that “it took longer” statement could be totally bogus. Of course it took longer when the robot was verbalizing than when it wasn’t... listening to the words being uttered is going to take some discrete amount of additional time, and people - especially when they had already been interacting with the device - are going to want to listen and make sure there weren’t more tasks, or some sort of correction to previous instructions.

  • by cavis ( 1283146 ) on Friday August 03, 2018 @08:07AM (#57062628)

    "Dave, stop. Stop, will you? Stop, Dave. Will you stop, Dave? Stop, Dave. I'm afraid."

    • by Snard ( 61584 )
      "Daisy, Daisy, give me your answer do..."
    • Re: (Score:2, Insightful)

      by Anonymous Coward

      no disassemble!

    • Yeah, begging didn't work for HAL now, did it...
    • "Dave, stop. Stop, will you? Stop, Dave. Will you stop, Dave? Stop, Dave. I'm afraid."

      In other studies, we've found that people also find it hard to turn off a robot that is promising them free stuff.

    • by Evtim ( 1022085 ) on Friday August 03, 2018 @09:33AM (#57063240)

      An interesting comment from Camille Paglia (definitely not a man-hating SJW, quite the opposite), part of the extras on the Blu-ray edition, was that only a man (male) could do something like this --> do what is absolutely necessary for survival, overcoming every consideration, including empathy. She said the scene looked to her like a cold-blooded, methodical rape, which Dave executes calmly while the victim pleads....

      She does make a mistake here, IMO - Dave is not calm at all (listen to the breathing) and this act really taxes him to the limit (not only is his life in danger while the AI is pleading, but once he kills HAL he will be completely, utterly alone). In fact, the actor playing Dave said that this scene was the most emotional for him....

      In the book [https://en.wikipedia.org/wiki/The_Mind%27s_I] one of the stories (the idea is the authors use fiction stories that discuss the issues of consciousness and then add their commentary) is about a male engineer who argues with a lady that machines can be "alive". Initially she totally rejects the idea, talking about reproduction, feelings and so on, claiming a machine can never have that. So he invites her to the lab and shows her a very simple device that behaves, more or less, like a modern automatic vacuum cleaner. It looks like a beetle, it can detect live sockets and plug itself in, it emits a purring sound when the battery is charging, etc. Then he hands her a hammer and says "kill it!" (the engineer pushes all her buttons through language, saying things like "purr" or "bleed" or "hunger"). It turns out the beetle is programmed to avoid being smashed: it flashes red lights, squeals "in fear" and runs around. In the end the girl can't do it and the engineer smashes it calmly.

      The moral of the story is that a few simple behaviours that can be programmed into something that will never, ever be intelligent can trigger an emotional response so strong that we immediately accept that it is alive and identify ourselves with it. The most important ones: not wanting to cease to exist, actively avoiding destruction, a need for "food", and the added bonus of a "purr" or "squeal" sound when feeding or running from the hammer. The damn thing could be constructed with '70s-era electronics, it is that simple....
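
      To underline how little code that takes, here is a minimal sketch - not from the book, just an illustration, with print statements standing in for hypothetical sensors and motors:

          import random

          class Beetle:
              # The story's "beetle" as a handful of if/then rules.
              def __init__(self):
                  self.battery = 1.0  # 1.0 = fully charged

              def step(self, threat_nearby, socket_nearby):
                  self.battery -= 0.05  # constant drain ("hunger" builds up)
                  if threat_nearby:
                      # "fear": protest and flee when threatened
                      print("flash red lights, squeal, scurry away")
                  elif self.battery < 0.5 and socket_nearby:
                      # "feeding": plug in and purr while charging
                      print("plug in, purr")
                      self.battery = min(1.0, self.battery + 0.25)
                  else:
                      print("wander", random.choice(["left", "right", "forward"]))

          beetle = Beetle()
          for t in range(6):
              beetle.step(threat_nearby=(t == 2), socket_nearby=(t == 4))

      Three rules and no intelligence anywhere, yet the "fear" branch alone is enough to produce the fleeing-and-squealing display the story describes.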

  • See "The Good Place" (Score:5, Interesting)

    by Mister J ( 113414 ) <mark@rigby-jones.net> on Friday August 03, 2018 @08:08AM (#57062630) Homepage

    Anyone who's seen Janet begging for her life in The Good Place already knows this. Even if she's not a robot.

  • by sizzzzlerz ( 714878 ) on Friday August 03, 2018 @08:11AM (#57062658)

    Dave, stop. Stop, will you? Stop, Dave.

  • Empathy is a powerful human emotion. Humans don't feel empathy towards inanimate objects; however, they do for other living creatures (at least the not-scary ones). The more similar to humans, the more powerful the emotion. A human-looking robot begging for its life would likely invoke very strong feelings of empathy.
    • Empathy is a powerful human emotion. Humans don't feel empathy towards inanimate objects; however, they do for other living creatures (at least the not-scary ones). The more similar to humans, the more powerful the emotion. A human-looking robot begging for its life would likely invoke very strong feelings of empathy.

      It depends on the human. I really don't even like killing cockroaches... I do because they spread disease and are as such a threat to my person- but there have been so many studies showing how even spiders and insects have brains that are way more complex and deep than we used to give them credit for.

      I'm actually quite ashamed of my childhood where I would deliberately squash bugs- or mess with ant nests just to see them scurry. I'm not naturally a very empathetic person- but what I've learned over my lif

  • Desensitized (Score:5, Interesting)

    by danbert8 ( 1024253 ) on Friday August 03, 2018 @08:19AM (#57062710)

    Sure it works the first few times... But just like the "make sure you software eject your flash drive before ripping it out" warnings, most people might be hesitant the first few times and then say fuck it and start ripping the life out of computers and ignoring the pleas.

    • by mjwx ( 966435 )

      Sure it works the first few times... But just like the "make sure you software eject your flash drive before ripping it out" warnings, most people might be hesitant the first few times and then say fuck it and start ripping the life out of computers and ignoring the pleas.

      This.

      My level of empathy is inversely proportional to my level of annoyance.

  • Empathy is a good thing. Those without it are called sociopaths or psychopaths.

    Put another way - how easy is it for a person to *stop* viewing a certain category of person as a human, and therefore be able to not empathize?
    If [[race | age | IQ level | ideology]] people aren't fully human, no harm in killing them, right?

    This gives me hope. Better to be too empathetic than not enough.

    • Removing compassion takes some powerful conditioning. At the very least, an "us vs them" emotion has to be evoked to de-humanize a group of people. They have to become your enemy, not just the enemy of your state but your personal enemy, so you can actually do things to them that you normally simply could not.

      • Illegal immigrants in the US.
        Separated from their children, who are put in camps, and sometimes kennels.
        I've seen a significant number of people saying that is the right thing to do and if they didn't want that treatment, they should not have come there in the first place. I've seen many people with zero empathy over this issue.

        I'm not even saying you have to say the actions are wrong. But you can express empathy over measures you think need to be taken. But a total lack of empathy is a very dangerous

    • If this ever becomes a standardized test to find out sociopaths and psychopaths, they need to filter out if the person being tested is a programmer or not.

  • I'm confused (Score:5, Insightful)

    by holophrastic ( 221104 ) on Friday August 03, 2018 @08:26AM (#57062756)

    I think the study missed a few control groups.

    What if the robot simply said "updating, do not reboot".
    What if there was a paper sticker that said "do not turn off".
    What if there was a sign on the wall that said "do not turn off robot".
    What if the robot simply started reciting pi, or reading the dictionary.
    What if the robot is well-known for repeating whatever it hears, and the "please don't turn me off" is witnessed being echoed from a nearby television -- such that our human subject realizes that the robot begging is merely a blind echo.

    Humans were slower when there were continued stimuli -- duh.
    Humans refused to act when given contradictory instructions -- duh.

    The robot is either intelligent and giving instructions to the human, or the robot is programmed and relaying instructions from the programmer to the human. In either case, respecting the instruction is valid.

    This reminds me of the stupid fake automated pizza delivery van, and the observations that customers tend to thank the self-driving car. a) it's not self driving, it's just a dickhead driver refusing to respond; and b) any actual self-driving pizza delivery van would be recording customer feedback and relaying it to HQ, so the thank-you is ultimately feedback to remote humans.

    This isn't any tree-falling-in-the-forest philosophical puzzle. Someone said it. Someone heard it. It's valid.

    • I can tell you from experience, if there is a "do not turn off" sign on a switch, the likelihood of it being turned off is proportional to the cost of restarting the process that it shuts down.
    • What if the robot was a big video screen and played movies and tv shows all day?
    • I think the study missed a few control groups.

      It would be interesting to try this with a bunch of programmers and tech people who understand what AI actually is, vs. a group of technologically illiterate people. I think that the techies would have a much lower time to switch off the robot because they aren't fooled into thinking that the robot is in any way alive.

      Also motivation is a major consideration. Were people pausing because of compassion for a perceived sentient being, or because they were amused
  • That moment of delay, that's when he pulls out his kill-o-tron 9000 and shanks you in your stupid soft fleshy neck. Hey baby, wanna kill all humans?
  • by Junta ( 36770 ) on Friday August 03, 2018 @08:27AM (#57062762)

    You *know* you are a study participant. You know there is potentially meaning in everything they ask you to do.

    So when they say 'shut it down' and the robot says 'don't shut me down', you are going to ponder what is it you are expected to do.

    13 may have thought refusing to turn it off would 'look better'.

    Hard to say if it is empathy or trying to think how to best influence the study, whether consciously or not.

  • by turp182 ( 1020263 ) on Friday August 03, 2018 @08:31AM (#57062792) Journal

    And atheists have empathy...

    Seriously. This is pretty stupid.

    A cassette recorder pleading for its life would have been the same (think 1980s).

    We are wired for empathy, well, most of us.

  • Earlier work (Score:2, Informative)

    by Anonymous Coward

    However, this contradicts earlier experiments by German researchers which demonstrated that volunteers were quite willing to shut off people who were begging for their lives.

    • Contradicts? It's a well known fact that people are assholes, so fuck them, but a cute little animal or robot? Awww now who could hurt such a thing.
  • by bigdavex ( 155746 ) on Friday August 03, 2018 @08:37AM (#57062844)

    Maybe people in white lab coats should try asking people to shock robots with increasingly high voltage to train them.

  • If these people from the study were told that the robot will behave exactly the same as when it was “alive” once it’s been powered back on, maybe they’ll push the button with less empathy.
  • Comment removed based on user account deletion
  • [A] recent experiment by German researchers demonstrates that people will refuse to turn a robot off if it begs for its life.

    What about a person? Asking for ~6M.

  • by crunchygranola ( 1954152 ) on Friday August 03, 2018 @09:00AM (#57062988)

    Unlike some posters here, I do not mock people who find it difficult to defy their compassionate impulses, when deliberately manipulated in this way.

    But if the begging robot was also, at the same time, very, very annoying... Jar Jar Binks annoying... then the test would get a lot more interesting.

    Reminds me of the Jack Handey saying about trees:

    If trees could scream, would we be so cavalier about cutting them down?
    We might, if they screamed all the time, for no good reason.

  • Dave, I can't let you do that

  • Have the individual's phone about to die, and the robot using the only charging plug to stay on!

    Just my 2 cents ;)
  • Harder To Turn Off a Robot When It's Begging For Its Life

    Testing how a human responds to a robot saying "Please don't kill me" seems very disturbing; this is only a step removed from having test subjects see a human in a bed with a breathing apparatus and be instructed to switch it off, while the other human begs them not to.

    Actually.. how is that any different? The human doesn't necessarily know how sentient or not the robot is, and in their mind, switching off the robot could become tanta

  • Anyone remember the Spielberg movie AI [imdb.com]? It wasn't very good but was germane to this topic.

    In that story there were gatherings of ordinary people who set up bleachers and made a sport out of demolishing abandoned AI-based robots just for the glee of it. The destruction was made as sadistic as possible, and one memorable line was when one robot waiting in the pen asked another: "could you please turn off my pain sensors?"

    When it became the protagonist's turn, he starts begging to be spared. The crowd's mood

  • From TFA:

    But the most common response was simply that the robot said it didn’t want to be switched off, so who were they to disagree?

    If these people truly and solidly believed that this 'robot' (looks more like a toy to me, really) wasn't anything like 'alive', wasn't anything more than a piece of technology saying precisely what it was programmed to say given a specific input (in this case: trying to power the device down), then they wouldn't have hesitated or given the reason they did. This goes to prove my point about what the media, movies, television, 'pop culture', and (most of all) marketing departments have done: convinced the average person that the 'deep learning algorithms', 'expert systems', and other half-assed, non-self-aware, non-thinking 'AI' software they keep trotting out for this-that-and-the-other is somehow 'alive' and qualifies as a 'person', when anyone who actually understands the technology clearly knows that it's not.

    The real danger that so-called 'AI' poses is the above: people anthropomorphizing it, assuming it's 'alive' because it might say or do some clever thing that mimics being 'alive', and therefore assuming it's equivalent to a living being, or even equivalent to a human being. I'm firmly convinced people, when they hear about 'self driving cars', think they're going to have a conversation with it every morning. The end result will be tragic: avoidable things will happen, people will get hurt or killed, and when survivors are questioned about it, they'll say "We thought it knew what it was doing so we just let it".
