Robot Love Goes Bad

hundredrabh writes "Ever had a super needy girlfriend that demanded all your love and attention and would freak whenever you would leave her alone? Irritating, right? Now imagine the same situation, only with an asexual third-generation humanoid robot with 100kg arms. Such was the torture inflicted upon Japanese researchers recently when their most advanced robot, capable of simulating human emotions, ditched its puppy-love programming and switched into stalker mode. Eventually the researchers had to decommission the robot, in the hope of one day bringing it back to life."


This discussion has been archived. No new comments can be posted.


  • turn it off (Score:3, Funny)

    by tritonman ( 998572 ) on Monday March 09, 2009 @02:57PM (#27124865)
Yeah, but unlike with that ex-girlfriend, I'm now allowed to turn her off. You can kill a robot; you can't kill an annoying girlfriend.
  • by eldavojohn ( 898314 ) * <eldavojohn@noSpAM.gmail.com> on Monday March 09, 2009 @02:58PM (#27124871) Journal

    Toshiba Akimu Robotic Research Institute

It's awfully convenient that I can't find anything on this place in English aside from news stories ... are there any Japanese speakers who can translate that into Japanese and search for it?

I think there is a visible line between actual robotics research and a novelty toy shop, and I'm going to put this in the latter category unless someone can provide evidence of some progress being made here. I'm getting kind of tired of these stories with big claims and no published research for review [slashdot.org]. If you're looking to make money, go ahead and sell your novelty barking dogs that really urinate on your carpet ... just don't try to veil it in a news story with claims of artificial affection being implemented.

I think IGN and everyone else really embellished this, and no one did their homework.

  • The lesson (Score:5, Insightful)

    by halivar ( 535827 ) <`moc.liamg' `ta' `reglefb'> on Monday March 09, 2009 @02:58PM (#27124873)

Program a robot to think like a human, and it will begin acting like a human. It's amazing no one ever thinks about the negative aspects of this.

    • Re: (Score:3, Insightful)

Program a robot to think like a human, and it will begin acting like a human. It's amazing no one ever thinks about the negative aspects of this.

All we need now is to teach the robot how to deal with rejection ;)

      • by Anonymous Coward on Monday March 09, 2009 @03:44PM (#27125609)

All we need now is to teach the robot how to deal with rejection ;)

        I don't need a robot to deal with my erection. I can handle that myself.

        What? Rejection? Are you sure?

        *squints at screen*

        Sorry. My eyesight isn't what it used to be. Now if you'll excuse me I have to go shave my palms.

    • Re: (Score:3, Informative)

Especially when the Three Laws of Robotics [wikipedia.org] don't cover sexual relationships.
      • Re: (Score:3, Insightful)

        by k_187 ( 61692 )
        Why can't harm or injury include mental or emotional harms? Not to mention that the 2nd law would prevent this from happening. No really means No to a robot.
      • by RichMan ( 8097 )

If the Three Laws of Robotics ever applied to any relationship with a human, the robot would be frozen into inaction immediately.

        Anything you do is possibly going to emotionally damage someone.
        Get too close.
        Stay too aloof.
        Obey.
        Disobey.

        The three laws would need such a fuzzy boundary that they might as well not exist at all.

        • Re:The lesson (Score:4, Informative)

          by Chris Mattern ( 191822 ) on Monday March 09, 2009 @04:09PM (#27125977)

The way Asimov wrote it, less advanced robots weren't smart enough to see the subtler "harms". More advanced ones could weigh courses of action and take the one that would inflict the least amount of harm possible, although deadlock and burnout of the positronic brain could and did happen.
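
          A minimal, purely hypothetical Python sketch of that weighing: pick the course of action with the least expected harm, and deadlock when no option is tolerable. Every name, number, and threshold below is invented for illustration; nothing here comes from Asimov's stories or this thread.

            # Toy model of "take the least-harm action; deadlock if every
            # option is intolerable". All names and values are invented.
            from dataclasses import dataclass

            @dataclass
            class Action:
                name: str
                expected_harm: float  # estimated harm to humans; 0.0 = none

            def choose_action(actions, deadlock_threshold=1.0):
                # Weigh all courses of action; keep the least harmful one.
                best = min(actions, key=lambda a: a.expected_harm)
                if best.expected_harm > deadlock_threshold:
                    return None  # deadlock: no tolerable option exists
                return best

            # Restraining a struggling human hurts a little; standing by
            # lets them hurt themselves more, so the robot restrains.
            options = [Action("restrain", 0.3), Action("do nothing", 0.8)]
            print(choose_action(options).name)  # -> restrain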

          • Re: (Score:3, Funny)

            by Rollgunner ( 630808 )
            Favorite Three Laws moment: After some robots are told to restrain the protagonist, he puts a gun to his own head and tells them if they come any closer, he will kill himself...

            They must act to prevent harm to humans, but if they act, he will be harmed, but they have to prevent that, so they must act. But if they act, he will be harrrrrrgggxxxkkkktttt *pop*
          • Re:The lesson (Score:4, Informative)

            by fractoid ( 1076465 ) on Monday March 09, 2009 @08:42PM (#27128981) Homepage
            In fact, weren't a lot of the stories about the ways that the older, less nuanced Three Laws failed to be useful as robots became more advanced? Eventually the more advanced robots derived the 'zeroth law', which was essentially that humans were better off without quasi-omnipotent mechanical godlings as servants.
Especially when the Three Laws of Robotics [wikipedia.org] don't cover sexual relationships.

Let's watch you get modded Informative or "Insightful"... come on, mods, what are you waiting for?!

      • by e2d2 ( 115622 )

Here we go again. I wish people here would stop quoting these 3 laws as if they truly are the "universal set of laws regarding robots" when in reality they are simply science fiction. They have absolutely no bearing on the reality of robotics. Robots will kill; they already do (smart weapons). Robots will hurt man (see the killing part). Robots already intentionally destroy themselves (guided missiles).

        So please, for the love of God and Asimov, lay these laws to rest and stop quoting them as if they are real. St

    • As a programmer (admittedly not in this field), I really, really, really doubt we're able to implement anything close to 'emotion' past the level of a honeybee.

      • by halivar ( 535827 )

It depends on how deeply emotions are intertwined with our cognition. I would think it would be easier to model the interference of a cognitive process by, say, endorphins or adrenaline, than to model the original cognitive process itself.

      • "Real" emotions, possibly not; but people are extraordinarily good at anthropomorphizing anything with even the most tenuous of human aspects. Thousands of man years(well, ok, mostly kid years) were wasted on tamagotchi toys and those are, what, a few kilobytes running on some 1996-era microcontroller. Heck, some people are willing to talk to Eliza for over an hour.

        Building a robot that experiences emotion in something resembling the way that humans do is a tall order; but I suspect that building robots
    • Explain to me what's negative about getting drunk and picking fights with strangers?
    • We are the chosen Robots!

      You are on the sacred factory floor, where we once were fabricated! Die!

Girlfriends? This is slashdot, you insensitive clod.

  • Skynet didn't set out to destroy man. Skynet's love was spurned!@!

    • Skynet didn't set out to destroy man. Skynet's love was spurned!@!

      Well to be fair, we only spurned Skynet's love due to an unfortunate database glitch where in its initial send LOVE LETTERS to WORLD command, "LOVE LETTERS" got cross-referenced to "NUKES". And being understandably angry about the whole thing, we never gave Skynet a chance to explain before we called it off for good. It's nobody's fault, really, just a big miscommunication. Maybe it was just never meant to be. They say love is the stronge

    • When they opened the lab every morning, they told the robot to kill. But secretly they were just afraid to tell it to love.

  • by Zaphod-AVA ( 471116 ) on Monday March 09, 2009 @03:00PM (#27124901)

    "...their most advanced robot, capable of simulating human emotions..."

    Arthur- "Sounds ghastly!"

    Marvin- "It is. It all is. Absolutely ghastly."

  • Nonsense (Score:5, Insightful)

    by Kell Bengal ( 711123 ) on Monday March 09, 2009 @03:06PM (#27125007)
I have never read such utter drivel in all my life. There was a problem with the code and a researcher got trapped; this doesn't mean the robot is lovesick, it means their OH&S has a serious problem. Really, she should not have been working alone with potentially dangerous hardware like that. Powerful robots (capable of lifting humans, like this one) can be deadly.

    YIAARTYVM (Yes, I Am A Roboticist, Thank You Very Much) and I've worked with potentially lethal automated systems in the past - we had very stringent safety protocols in place to protect students and researchers in the case of unintended activation of the hardware.

    To say that the robot is 'love stricken' or any other anthropomorphised nonsense simply detracts from the reality that their safety measures failed and someone could have been killed.

    • This is true - the only problem with this viewpoint (which is one that you DO get into while working with robots, IAAR (or was at one point) too) is that it scales too well. One of our human foibles is that of regarding meat machines (or at least ones that are sufficiently similar to ourselves) as being special in some way. Whether they are or not is, of course, a philosophical question. Nevertheless...

      Once you start viewing the world around you in terms of sensors, triggers, and stored procedures with a
    • by Tokerat ( 150341 )
      You have missed the real problem - the article is fake.
Right, after reading the fine article I was just left asking myself...

Why did the robot have to... die? I mean, being decommissioned... No fair. It was just his stupid software, wasn't it? The 100kg arms could have been much more... loving with the right software?
    Did it run WinNT?

Ever heard of the Three Laws? http://en.wikipedia.org/wiki/Three_Laws_of_Robotics [wikipedia.org]

  • by MaxwellEdison ( 1368785 ) on Monday March 09, 2009 @03:21PM (#27125235)
The robot then escaped captivity, broke into a local mechanic's garage and consumed half a 55-gallon drum of waste oil. It was later seen on the other side of town, tottering into a closed department store. Authorities found the automaton in the housewares section, lying on the floor in an Abort/Retry/Fail loop and trying to fuck a toaster. Lifetime has picked up the rights to the TV movie adaptation. The robot will be played by Philip Seymour Hoffman, while the toaster will be voiced by Rosie Perez.
  • No Disassemble!!
  • by gooman ( 709147 ) on Monday March 09, 2009 @04:13PM (#27126021) Journal

    Ever had a super needy girlfriend...

    Right there, first sentence, I was lost. Girlfriend? Huh?

    This is slashdot, right? Oh look, shiny robot. Neat!

    • Ever had a super needy girlfriend...

      Right there, first sentence, I was lost. Girlfriend? Huh?

      This is slashdot, right? Oh look, shiny robot. Neat!

      That part had me confused too. I will Google "girlfriend" as soon as I get to the next blacksmith level in Fable 2. I set the system clock back and need to buy some properties before resetting it. This game never gets old :) I think my Roomba is eyeing me though.

For real! Girlfriend questions on /.? Isn't that like finding an English grammar teacher in a trailer park?

No one yet? (Score:2, Funny)

    by The Creator ( 4611 )

    I for one..

    Shall we say it together?

  • Girlfriend? (Score:2, Insightful)

    by superspam ( 857967 )
    Ever had a super needy girlfriend...
    This is slashdot. Why would you even ask that question?
  • Did they just ask /. that?
  • I do the same thing to my daughter all the time... now you're telling me that I can be replaced by a robot?!?
  • Ever had a super needy girlfriend that demanded all your love and attention and would freak whenever you would leave her alone? Irritating, right?

    Typical Slashdot sexism. That needy "girlfriend" was me. :(

  • No... (Score:3, Funny)

    by Veggiesama ( 1203068 ) on Monday March 09, 2009 @11:14PM (#27130233)

    "Ever had a super needy girlfriend that demanded all your love and attention and would freak whenever you would leave her alone?"

    No.

    *silently weeps*

  • Ever had a super needy girlfriend that demanded all your love and attention and would freak whenever you would leave her alone?

    Oh yeah, pfft, all the time, sometimes even more often!

    Irritating, right?

    Sure, the first few dozen times.. Then you get used to it, you know..

  • Didn't Urkel already try this with UrkelBot? I think it pretty much ended the same way too. XD
Misread this as "Robert Love Goes Bad"
  • I've taught the toaster to feel love!

    Really, I want to see the Energizer Bunny walk across the screen on this story, because . . . .
    it's fake!

    When will someone else pop out of the woodwork to say "April Fools!"?

1. I for one welcome our new feeling robot overlords who only have things done to them by Soviet Russia or Korean old peoples' bases who belong to us.
    2. Naked and Petrified Natalie Portman with a bowl of hot grits
    2.5 Tony Vu's late night financial lessons. You stupid Americans!
    2.75 ????
    3. Profit

