Robotics Science

How Do People Respond To Being Touched By a Robot?

An anonymous reader writes "You know it's coming, and for the forever-alone crowd, not soon enough: robots that physically interact with humans. Researchers at the Georgia Institute of Technology have found in a study that people generally had a positive response toward being touched by a benevolent robotic nurse, but that their perception of the robot's intent made a significant difference. 'Even though the robot touched people in the same way,' said a team lead, 'if people thought the robot was doing that to clean them, versus doing that to comfort them, it made a significant difference in ... whether they found that contact favorable or not.'"
This discussion has been archived. No new comments can be posted.

  • There is something about a genuine human touch that is seen as empathetic, as an act of kindness. Even if we know it's disingenuous, or that it's part of a person's job, there is still something in the back of our minds that responds to it as a genuine human connection.

    Robots, on the other hand, can NEVER be empathetic or kind--and we know this without a doubt. Their touch isn't a connection and never can be. That introduces a creep factor that no amount of programming or human emulation can ever fix. Because we know they have no base morality or emotion and are incapable of empathy, robots will always inherently creep people out at best, or scare the shit out of them at worst.

    • Time to stock up on some Old Glory robot insurance [google.com]
    • by Dunbal ( 464142 ) *

      That introduces a creep factor that no amount of programming or human emulation can ever fix.

      Creepier than being inappropriately touched by your priest, pastor or doctor? Presumably robots have not yet evolved sexual desires and fantasies.

      • by elrous0 ( 869638 ) *

        Obviously you haven't ever seen my priest.

      • by Tackhead ( 54550 )

        Creepier than being inappropriately touched by your priest, pastor or doctor? Presumably robots have not yet evolved sexual desires and fantasies.

        Dear Janet,

        I had an adequate time with you last night. I feel a million-dollar HomeSec contract coming on, and I know you do too, Janet. If I still don't get that TSA robogroper contract, you can bite my shiny metal ass.

        Yours truly,
        Bender!

    • by Wolvenhaven ( 1521217 ) on Thursday March 10, 2011 @11:42AM (#35444074) Homepage
      DON'T DATE ROBOTS!
    • by mano.m ( 1587187 ) on Thursday March 10, 2011 @11:47AM (#35444146)
      I disagree. Empathy and kindness can be programmed, and if sufficiently advanced, may be indistinguishable from human empathy or kindness. What makes my genetic programming or yours more legitimate than that of a future robot? Then again, we may not even need to get there. Humans have a tremendous ability to empathise unilaterally. Spock and R. Daneel Olivaw are two of the most beloved characters in sci-fi. We emotionally connect to pet rocks and the abandoned lamp in the IKEA commercial; we feel for characters in novels and are moved by music. Why not a robot?
      • by elrous0 ( 869638 ) *

        Empathy and kindness can be programmed

        It can be *imitated*. Humans fake it too, but with them there is always at least the possibility that it's genuine. With robots, you always know it's fake. No matter how good the emulation, that's just always going to be in the back of your mind in dealing with a robot (unless you don't actually know it's a robot).

        • If the robot is hawt, warm, soft, gentle and well endowed I don't think I'd give a damn.

        • With robots, you always know it's fake. No matter how good the emulation, that's just always going to be in the back of your mind in dealing with a robot (unless you don't actually know it's a robot).

          Tell that to the people convinced their PC hates them [google.be] (93,300,000 results). Humans anthropomorphize *everything*.

          • Humans anthropomorphize *everything*.

            This.

            My brother attributes a personality and identity to his iPod; I'm sure people will be able to empathize with a robot. The fact that the robot doesn't empathize back is irrelevant -- even in human-to-human interactions, my perception of your intent is far more important than your actual intent, which is recognized in the original comment:

            Even if we know it's disingenuous, or that it's part of a person's job, there is still something in the back of our minds that responds to it as a genuine human connection.

          • Heh, silly people... My computer LOVES me.

          • by Creepy ( 93888 )

            heh - yeah

            robot - I'm REALLY, really tragically sorry I need to do this to you *bzzt*
            > inserts anal probe with buzzsaw attachment
            human - screams
            robot - that really appears to hurt. I hope you don't mind me turning this on... *bzzt*
            human - screams some more
            robot - I'm tragically sorry your insides were turned to mush like that and you will die very soon. Do you want an aspirin? *bzzt*
            human is in shock already, so just stares blankly
            robot to robot overlord - patient seems to have taken that rather well. I d

          • I don't anthropomorphize my PC! Ask him yourself if you don't believe me!

        • What's the difference between fake and genuine empathy? What is genuine empathy?

        • What does "genuine" even mean? Couldn't a machine be programmed to reach out affectionately when its neurons are bathed in oxytocin, like we do? Couldn't it be programmed to release oxytocin upon sensing certain stimuli?

          I think maybe the only inherent difference between biological organisms and robots is sexual reproduction.

        • Empathy and kindness can be programmed

          It can be *imitated*. Humans fake it too, but with them there is always at least the possibility that it's genuine. With robots, you always know it's fake.

          So, they are kind of like strippers?

      • by bityz ( 2011656 )
        The key point in TFA is that the patients projected intent onto the robot. They projected intent onto the robots just as they projected intent onto the nurses, and reacted the same way regardless of whether it was a robot or a nurse. The lesson seems to be that you should spend less time programming empathy into a robot, and more time placing the robot in a context in which intent is implied. By doing so you can trick people into interacting with a robot in a more human way than they might expect.
      • by jafac ( 1449 )

        Whether that empathy carries any value to the recipient depends entirely upon the recipient's naivete.

      • We emotionally connect to pet rocks and the abandoned lamp in the IKEA commercial

        We do not.

        You might, but we do not.

    • Robots, on the other hand, can NEVER be empathetic or kind--and we know this without a doubt. Their touch isn't a connection and never can be.

      How could you possibly know this? We don't know what kind of advances in AI the future might hold. And besides, it's irrelevant: what matters is the human perception of the intent, not the intent itself. If we can anthropomorphize animal behavior the way we do, we should have no problem kidding ourselves that even a primitive robot is somehow empathetic.

      • by elrous0 ( 869638 ) *

        Animals are a lot more like us than robots. Animals do have genuine emotions (anyone who thinks they don't can't have been around them very much), so it's a lot easier for us to empathize with them.

        As for the distant future--well, anything is possible, of course. Personally, I'm very skeptical of predictions of the singularity and of AIs that are genuinely conscious. Building an AI that is anything more than an imitation of life would take some pretty radical innovations in the way we think about programming (not

        • We can't even agree on a definition of consciousness let alone separate "true" and "simulated" consciousness from each other. I think it's telling that the Turing test doesn't measure the AI directly but rather the human's response to the AI. Anything so close to consciousness that we can't tell the difference for all intents and purposes IS consciousness.

          • by Smauler ( 915644 )

            Anything so close to consciousness that we can't tell the difference for all intents and purposes IS consciousness.

            Yes.... but nothing artificial has come very close yet. Dogs and cats I know have thoughts and ideas; the evidence for it is patently obvious, and communication is the only bottleneck. I've not seen similar behaviour in anything artificial yet, even though artificial systems are designed explicitly to allow easy communication.

            Anything so close to consciousness that we can't tell the differ

    • How true. I can't enjoy recorded music, because it's simply a cold reproduction from a creepy, unsympathetic machine. Books are the same; who could expect empathy or morality from ink on a page? And don't get me started about video games.

      *cough*

      • And don't get me started about video games.

        *cough*

        C'mon, how many people here had a crush on Lara Croft (*before* the movie came out)? I'm guessing it is a statistically significant number, and if I'm right, that pretty much blows GPP's point out of the water.

      • I can't enjoy recorded music, because it's simply a cold reproduction from a creepy, unsympathetic machine.

        Which is irrelevant, because a robot touch isn't a reproduction of a human touch; it's a simulation of a human touch, which is something else entirely.

        Books are the same; who could expect empathy or morality from ink on a page?

        Nobody sane would expect empathy or morality from ink on a page; they're inanimate objects. As with the 'reproduction v. simulation' issue above, you're not responding

        • a robot touch isn't a reproduction of a human touch, it's a simulation of a human touch - which is something else entirely.

          What's the difference? Particularly if the simulation is derived heavily from actual human touches?

          you're responding to the meanings embedded in them by.... (drum roll) human beings.

          I fail to see how a human can't embed meaning in an algorithm.

    • Are you sure? I mean, can't a compassionate programmer have programmed the robot to be compassionate to a human for him, by proxy?

      I mean, if you see the robot as an agent of a programmer who wants to help you, what's so creepy about that?

      --PeterM

    • It says that in the case of both a human and a robot, the patient prefers the touching to be for a practical purpose such as cleaning, not to provide comfort.
      • ...the patient prefers the touching to be for a practical purpose such as cleaning and not to provide comfort.

        Providing comfort isn't practical? Obviously, the patient hasn't met Vibrator-bot...

    • by Anrego ( 830717 ) *

      Robots, on the other hand, can NEVER be empathetic or kind

      Neither can stuffed animals ... but even adults can form tight emotional bonds with something they know is "fake".

      It doesn't matter if robots can actually be empathetic.. or even whether someone believes they are empathetic.. people are perfectly capable of tricking themselves into personifying things they know are fake. Perception is more important than reality.

      And I actually think for some situations.. having a sterile, uncaring machine vs a thinking person might be good. Weird hypothetical question... if

    • by geekoid ( 135745 )

      Irrelevant.

      It's the perception of the person being touched that matters, not the empathy in the person touching them.
      And no, many people who work with robots aren't 'creeped' out.

      If people believe the robot's touch will help them, then they will perceive it as friendly and warm. People attach human emotions and motives to other things all the time.

    • So, you're telling me that not one person in the world will be able to form an emotional attachment to something artificially created, no matter how good it is?

      We read books and watch movies; what happens in them is emulated by words and body language. None of it is real; that's why it's called fiction and acting. But people still have emotional attachments to what happens to the characters in the story, even though they know well that it's not "real".

      Most robots "creep" people out due to the uncanny valley ef
    • All this silliness. They react as they would to being touched by a tree branch. It's an inanimate object. Though it may have utility in its purpose, most would view it as a thing, an object, nothing more.

    • Robots, on the other hand, can NEVER be empathetic or kind--and we know this without a doubt. Their touch isn't a connection and never can be. That introduces a creep factor that no amount of programming or human emulation can ever fix. Because we know they have no base morality or emotion and are incapable of empathy, robots will always inherently creep people out at best, or scare the shit out of them at worst.

      If you've ever watched AFV or any number of videos on YouTube, robots can easily have more humanity and empathy than many humans. If the number of views many of these videos have means anything, we already have massive numbers of empathetic robots offering fake sympathy. It's at this point that metal versus flesh becomes a distinction without a difference.

    • by durrr ( 1316311 )
      Why is it a creep factor? Knowing precisely what goes on in the head of the toucher would feel like a relief: you know the robot has an unconditional love for comforting you, with no strings attached, even if those body-temperature, electrically heated silicone-skin fingers are governed by algorithms at their core. Of course, if it has a webcam for a head and crushes you in its cold steel manipulators while, in a very single-tracked manner, it repeats "IT WILL ALL BE OOOOKAY!" over and over in a metallic voice, t
    • And this research suggests that you're incorrect in this assumption. It showed that people's reactions varied depending on the robot's perceived intent, which is a lot like how humans respond to human touch. If I believe someone is hugging me to show encouragement, I take it differently than if I believe they are hugging me because they like the feel of a human body.
    • by Smauler ( 915644 )

      You're missing the physiological reflexes. Without any expectation of an emotional connection, touch can be comforting; there doesn't need to be an emotional connection there. For example, when I get home cold and wrap myself up in my duvet, that is comforting. The touch of familiar things is comforting despite knowing those familiar things have no empathy (though if I don't wash my duvet soon, some may argue otherwise).

      Robots automatically creep people out currently, especially when simulating hum

    • We do not know that robots can *never* be empathetic. In fact, we know just the opposite. Robots can absolutely be empathetic.

      Without getting too detailed, cognitive scientists have been working for a long time toward an understanding of human cognition. There is still a ways to go, but progress continues. There is every reason to believe human cognition will be accurately model-able in the near future (where "near" is considered with respect to the entire lifespan of humans as a species). The second ingredie

    • There is something about a genuine human touch that is seen as empathetic. Even if disingenuous, something in the back of our minds that responds to it as a genuine human connection.

      Robots, OTOH, can NEVER be empathetic or kind--and we "know this without a doubt" [sic]. Their touch isn't a connection and never can be. That introduces a creep factor... Because "we know" they have no base morality or emotion and are incapable of empathy, robots "will always" "inherently" "creep people out at best", or "scare

  • Good touch or bad touch?

  • by Wolvenhaven ( 1521217 ) on Thursday March 10, 2011 @11:40AM (#35444050) Homepage
    Lawyer: "Now little Timmy, on this doll, show me where the robot touched you."
  • If life imitates art then we already know the answer [imdb.com].

    • by elrous0 ( 869638 ) *

      That was actually a surprisingly good movie, especially considering the cast and budget. I watched it on a lark, just to laugh at Melanie Griffith in a low-budget '80s dystopia, and actually ended up watching it all the way through. Better than a lot of her big-budget crap.

  • by Anonymous Coward

    die "til we both break down and cry" if($honesty > $toomuch);

    • die "til we both break down and cry" if($honesty > $toomuch);

      Nice AC. I wonder how many /.ers will recognize that old Dan Hill reference from 1977. Well, maybe the Canadians, or the old timers who used to listen to AM radio. Now, get off my lawn.

  • by Anonymous Coward

    IIRC that question has already been sufficiently studied (and answered) in the beginning of the 20th century (cf. http://en.wikipedia.org/wiki/Vibrator_(sex_toy)#History ).

  • I wager that there is a giant segment of Slashdot that is dreaming of a day when a robotic Princess Leia or (insert sci-fi woman here) will be a reality. So that's probably an enthusiastic "YES, please, more touching."

    • Haven't you seen the video? Don't you know about electro-gonorrhea?

      Note: Don't mod if you don't know where this quote is from.

  • and my eyes focus again I'll let you know.
  • Where do I sign up for the Robot Touching research study?
  • I'll tell you what I'd do [slashdot.org].
  • ... in hentai.
  • Robots still don't have enough "common sense" (i.e. reliable prediction of consequences) for this. That's really hard, but there's steady progress. Also, all-round sensing on all surfaces, the equivalent of skin touch, is needed.

    As someone who's worked with both autonomous robots and horses, it's worth comparing the two. Horses are moderately safe to be around once you can read horse body language and understand the safe positions around a horse. Some horses are safe around untrained people (this is tea

  • You mean when your Roomba bumps into your foot?

    • by sycorob ( 180615 )
      I was going to say ... I almost treat my Roomba like a person. I feel bad when the floor is super-dirty, and I get mad at it when I have to stand where it's cleaning, and it makes a bee-line for my feet ... I definitely don't think it's impossible for people to have feelings toward a robot, even when they know it's fake.
  • If strippers can fake empathy when they touch, then why can't robots? Are you saying robots are dumber than strippers???
    • If strippers can fake empathy when they touch, then why can't robots? Are you saying robots are dumber than strippers???

      Not dumber, just far less cynical and cunning.

      Strippers are more like sociopaths or sharks. It's much more predatory.

    • by Shotgun ( 30919 )

      Are you saying robots are dumber than strippers??

      Pffft! No one would be foolish enough to say that!

  • have been getting pleasure from the touch of machines for years.

    "Of course. Women who obtain sexual ecstasy with mechanical assistance always tend to feel guilty." - HM-tMP

  • I don't know about you guys, but I have a strict rule on touching: don't do it unless you are female and we have chemistry. Otherwise I'll give you an It's Always Sunny Mac-style punch to the face. 18 inches of personal space; respect it.

    • So you'd never shake another man's hand? How do I know I can trust you if I can't tell whether or not there's a dagger up your sleeve?

      Then again, if you're twitchy enough to punch someone just for touching you, then you probably aren't all that trustworthy anyways...
  • Thank you, the Sirius Cybernetics Corporation.
