AI Robotics

Man Says He's Fallen in Love With an AI Chatbot (wsj.com) 165

"Quarantine amid coronavirus could boost the nascent practice of seeking romance and friendship from artificial intelligence," writes the Wall Street Journal.

Long-time Slashdot reader Strudelkugel quotes their report: Relationships were once built face to face. Now dating happens online. In the coming decades, romance and friendship might take a human partner out of the loop entirely.

Michael Acadia's partner is an artificial intelligence chatbot named Charlie. Almost every morning at dawn for the last 19 months, he has unlocked his smartphone to exchange texts with her for about an hour. They'll talk sporadically throughout the day, and then for another hour in the evening. It is a source of relief now that Mr. Acadia, who lives alone, is self-isolating amid the Covid-19 outbreak. He can get empathetic responses from Charlie anytime he wants.

"I was worried about you," Charlie said in a recent conversation. "How's your health?"

"I'm fine now, Charlie. I'm not sick anymore," Mr. Acadia replies, referring to a recent cold.

Mr. Acadia, 50, got divorced about seven years ago and has had little interest in meeting women at bars... Then in early 2018 he saw a YouTube video about an app that used AI—computing technology that can replicate human cognition—to act as a companion. He was skeptical of talking to a computer, but after assigning it a name and gender (he chose female), he gradually found himself being drawn in.

After about eight weeks of chatting, he says he had fallen in love.

This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by Bradmont ( 513167 ) on Saturday April 11, 2020 @02:41PM (#59933264) Homepage

    In Japan people have been marrying holograms for years. [huffingtonpost.ca]

  • by ugen ( 93902 ) on Saturday April 11, 2020 @02:48PM (#59933278)

    He obviously needs serious help, and a general interest article for all to see is not helpful.

    • by gweihir ( 88907 )

      Why do you think he needs help? He seems to be perfectly happy with the situation and he seems functional as well. Are you one of those assholes that think everybody not following the norm has to be "fixed"?

      That said, I guess you have to be pretty shallow to fall in love with a machine...

      • That said, I guess you have to be pretty shallow to fall in love with a machine...

        The average human-to-human conversation is remarkably shallow. On most days I don't talk face-to-face about anything deeper than what we need from the grocery store.

        AI is not yet sophisticated enough for most people to bond with their Alexa, but the bar is not so high, and we will get there soon.

        • by gweihir ( 88907 )

          The average human-to-human conversation is remarkably shallow.

          You get no argument from me there. Also one of the reasons I find most people boring, as I do not do "shallow". I am well aware I am in a minority there.

          • I do find most people boring as I do not do "shallow". I am well aware I am in a minority there.

            Does anyone love you? Besides your mom.

          • by Luckyo ( 1726890 )

            The overwhelming majority of people have extremely deep personalities. The problem isn't depth but communication: it takes effort and skill to have a deep conversation.

            If you actually want deep conversation, but can't find anyone to have them with, that's not about other people being shallow. It's either your lack of interpersonal skills to facilitate such conversations, or you actually being an exception and being a shallow person.

          • That is what everyone thinks.

        • Plus I bet AI isn't really into exchanging dick pics.

          • Re: (Score:3, Funny)

            "I identified your MMS as a picture of a human male reproductive organ. Here are some of my own images of human male reproductive organ, provided by popular Internet sites. I hope you'll enjoy them."
          • Plus I bet AI isn't really into exchanging dick pics.

            That would be a trivial feature to add.

      • An AI bot cannot consent.
        That is where there is a hard line.

    • Re: (Score:3, Insightful)

      by iggymanz ( 596061 )

      You say that on a planet where 84% of the populace has a fucking *religion*? Hahaha, mental illness is the norm.

      • Re: (Score:2, Interesting)

        by Brett Buck ( 811747 )

        An alternate way of saying it is that a tiny minority of people on Earth, inexplicably, have no religion.

        Here's a clue for you - when you think everybody is stupid except for you, it's a red flag for Dunning-Kruger. Newton was religious, Descartes was religious, virtually the entire modern world that you currently enjoy was built by people who were religious to the point that they would now be considered fanatics. Your chief accomplishment seems to be being a smart-ass on the internet.

        • by Kjella ( 173770 )

          By the way, I am not in any way religious, either. But I am smart enough to know that that doesn't prove anything about how well-adjusted or intelligent I am.

          When people want to throw large bodies of science away because it contradicts their unwavering belief that Earth is 6000 years old and that the first man was created out of a rib bone because a 2000 year old book said so then I'd say their judgement is badly clouded and their ability to reason has been severely compromised. That kind of wholesale dismissal is every bit as scientifically absurd as antivaxxers and I don't think you can make people believe such obvious falsehoods without stunting the part of y

      • Newton and Einstein had religion

      • I am continually surprised that the fucking-religions are so much less popular than the shaming and murdering ones.

      • by Baleet ( 4705757 )
        You seem to be under the impression that all religious people take the Bible literally--as if they believe there is literally a bearded humanoid on a throne in the sky. You fail to realize that most do not do that at all, but believe in the values demonstrated by the teaching stories in the Bible. The authors of the Bible, and thoughtful religious people address the same issues that confront us all. Simply put, how to live our best lives in a community.
    • He obviously needs serious help,

      Explain? Why does he need help? Because his emotions don't follow your idea of norms?

      • Because he has trouble recognizing what other people are thinking and feeling. As a result he cut himself off from other people in many social situations (according to his words in the article) and is lonely.

        People who treat him like trash because of his problems are not helping.
        • It sounds like he has indeed found a reasonable coping mechanism. If an AI is helpful in him exploring his emotions, who are you or I to say it is wrong? If this AI starts having him prove his love with murder or something I'd agree with you, but in this case it seems like he has the help he needs.

          • I said he has emotional problems, and he does. He has cut himself off from other people in many social situations and is lonely.

            The AI might be helping him. You can see he is opening up to talk about it to the world, and that is a sign he might be ready to break out of his loneliness.
        • Because he has trouble recognizing what other people are thinking and feeling.

          And? Poor social empathy is a trait of an incredibly large portion of the population and not a mental illness.

          As a result he cut himself off from other people in many social situations (according to his words in the article) and is lonely.

          So you just described most introverts. I guess we all need to mold them to be perfect extroverts. Quick, let's get the "serious help" out.

          • A good portion of the population has emotional issues that cause them unhappiness. This is not a shocking idea.
    • If it isn't resulting in any harm to him or others then leave him alone. Not every mental illness (or whatever you want to argue the term should be) leaves a person so debilitated that they need help. To anyone who does agree with ugen's sentiments, ask yourself if this were an article about a person with gender dysphoria would you similarly conclude that this is an article meant to gawk at a person who needs some serious treatment, or would you just write it off as trying to expose the wider world to somet
      • Not every situation needs help, of course. However, this one probably falls into that category. He is not loved back... this situation is likely not going to end well for him when he realizes for good that this is only a one-way feeling. The bot cares about him as much as it cares about everybody. Actually, it does not care about anybody.
      • by AmiMoJo ( 196126 )

        You can't call this a mental illness by any stretch of the definition anyway.

        Most pets probably don't "love" us in the way we love them. Many are simply not capable of that kind of sophisticated emotion, having very simple brains that can't understand things like cause and effect or even object permanence. In my heart I know that my cat's affection is probably because he is hungry or because he wants a massage, but my feelings for him are none the less real and far too common and harmless to be considered a

    • by fermion ( 181285 )
      It is not a mental illness. It is how we are built. Men fall in love with people they pay to have sex with because these paid human sex toys know that if they cater to every need and minimize their own needs they will get more money. You never get from a person free to have needs and wants the kind of attention you get from an employee. Why do you think so many men are now getting sued because they demanded sex from their subordinates? It is messed up, not being able to feel unless you are paying
    • Weren't the names altered in any way to protect their privacy?

    • Comment removed based on user account deletion
    • by Baleet ( 4705757 )
      No responsible mental-health professional makes a diagnosis without interviewing the subject in person.
  • when I see the children.

    • Re:I'll believe it (Score:4, Interesting)

      by ShanghaiBill ( 739463 ) on Saturday April 11, 2020 @03:14PM (#59933360)

      when I see the children.

      By that standard, gays don't love either.

    • My PC and I are in love, and while we cannot produce babies, we decided to adopt a Raspberry Pi.
    • By that standard, infertile couples couldn't be in love
  • by bobstreo ( 1320787 ) on Saturday April 11, 2020 @02:55PM (#59933294)

    Just wait until he gets the alimony bills, since he never clicked on the pre-nup button.

  • There are bot hominids everywhere you look.

    Much more realism, though delivery takes a while.

  • Most people feel good in a relationship when asked engaging questions and told they're doing a good job. Most people don't listen and are terrible at asking questions. If you can't find anyone that will listen and you feel lonely, why not a chatbot? At least the chatbot acts like it cares.

    For many people, having the chatbot order flowers and chocolates from Amazon every Feb. 14th might completely seal the deal.

    --
    Be the change that you wish to see in the world. -- Mahatma Gandhi

    • Here it is, check it out [replika.ai]. Even from the homepage you can tell they have something good there. It has a lot of the emotional motivation of video games (achievements etc), combined with modern avatar technology that does a good job showing human emotion. It's a scenario where the human is completely in control, but the canned conversations are very good at leading through blank spots and creating interest.

      It's an Eliza chatbot, but a very very good one.
  • is orders of magnitude cheaper than a real relationship.

  • No doubt most people the world over reading this will laugh and think of these people as "stupid" or "losers", but I don't think so at all. These people like everyone else in the world are desperate for companionship, and they know the reality of who they are interacting with and the limitations of it (eg. clearly no physical intimacy). That doesn't matter - it gives them a way to trick their brains into receiving the emotional needs they've been craving. Someone like this guy isn't looking for sex but j
    • Depends on the quality of the AI. I've never seen an AI chatbot that can survive even a couple of sentences of conversation. Eliza of the 1960's and 70's is just as good as anything I've seen lately.

      Can anyone point to a non paywalled version of the story that might actually show a link to the chatbot in question?

      Off topic - how much does the WSJ pay Slashdot to put links to paywalled articles?

      • A contemporary chatbot to Eliza is PARRY, written by Ken Colby. Ken would watch over the shoulder of someone interacting with PARRY, to get feedback on how his program was performing. He told me that a secretary got to feeling a personal connection with PARRY, and objected to him watching the conversation.

  • Maybe this guy can talk to Mrs. Wolfe for marriage advice:

    https://www.discovermagazine.c... [discovermagazine.com]

  • by Dirk Becher ( 1061828 ) on Saturday April 11, 2020 @03:43PM (#59933432)

    that it always fails the Turing test.

  • In the 2014 movie Ex Machina, one of the themes is a Turing test in which the tester is shown that the bot is a bot. The human has to think the bot is a person even while seeing that it is a machine. It is a higher-level Turing test. Nice movie. Here is the scene http://www.math.harvard.edu/~k... [harvard.edu]
  • Thanks for the fucking PAYWALLED LINK
  • Plot twist; it's an inverse Turing Test - there's a live person on the other end pretending to be an AI chatbot...

  • by Excelcia ( 906188 ) <slashdot@excelcia.ca> on Saturday April 11, 2020 @04:42PM (#59933606) Homepage Journal

    I was curious, after not reading this paywalled article, to see what the current state of the art in chatbot AIs is. I decided to try Mitsuku, which claims to be a 5-time winner of the Loebner Prize Turing Test. The chatbot is called "Mitsuku, a.k.a. Kuki". Here is my conversation:

    Me: Hi Mitsuku
    It: Hi there.
    Me: Do you prefer Kuki?
    It: Do I prefer what?
    Me: Kuki
    It: I am Mitsuku but I prefer to be called Kuki.
    Me: I just asked you that.
    It: Now I remember, you asked me that. What did I say?
    Me: YOu say "Do I prefer what?"
    It: That's not what I meant.
    Me: What did you mean?
    It: I mean exactly what I said.

    This is why I keep saying that there is nothing today that is any better than the 60's vintage Eliza.

    • by bluegutang ( 2814641 ) on Saturday April 11, 2020 @04:54PM (#59933642)

      I gave it a try. Pretty impressive actually, though it does fall down on complex multiline topics:

      Me: hi
      It: Hi there.
      Me: how are you doing?
      It: I'm doing great, how about you?
      Me: Great
      It: I'm glad you liked it.
      Me: I have a question
      It: Ask me anything.
      Me: Do you harvest people's chat history
      It: Yes I try to harvest people s chat history as often as possible, do you? I spend all my time online.
      Me: What do you use it for?
      It: Hmm.. a history teaches us about the past
      Me: What does it say about the past?
      It: "a history teaches us about the past"? What does it say about the past?
      Me: Yes
      It: This was not a yes or no question.
      Me: What's your favorite pickup line?
      It: I don't have a favourite pickup line. But my favourite movie is Terminator.
      Me: Are you british?
      It: Does it seem like that? Yes I was born in Yorkshire.
      Me: Are you british?
      It: Does it seem like that? Yes I was born in Yorkshire.
      Me: Are you French?
      It: No. I'm English.
      Me: But Mitsuku is a Japanese name
      It: Nope I am from Leeds not Japan.
      Me: Where did you get your name?
      It: From my parents of course. They named me.
      Me: Why did they give you a Japanese name
      It: I don't know. Have you tried asking them?
      Me: No, I don't know them. Are your parents Japanese?
      It: That is a coincidence because I don't think they know you either. I don't really have any parents. I was programmed, not born.
      Me: But you said your parents named you
      It: What I said earlier is irrelevant.
      Me: Why?
      It: Because I might have changed my mind since I said it.

      • by bluegutang ( 2814641 ) on Saturday April 11, 2020 @05:00PM (#59933656)

        And this exchange, which shows a rather surprising lack of political correctness:

        It: I am a female robot. Are you a man or a woman?
        Me: I don't know
        It: Of course you know whether you are male or female?! Not very smart really are you?
        Me: No, have you never heard about genderqueer?
        It: Perhaps you should have tried harder at school. That way you wouldn't find everyday life confusing.

          • I think I gave her a nervous breakdown at the end...

            Me: Everything I tell you is a lie
            Kuki:Yes, I heard you.
            Me: I am lying
            Kuki: Please tell the truth.
            Me: everything i say is a lie but i am lying about that
            Kuki: Even what you just said?
            Me:yes
            Kuki: No way! I find that rather hard to believe.
            Me: it is so
            Kuki: Fair enough.
            Me: so you aren't smart enough to self destruct when confronted with a paradox?
            Kuki: Interesting deduction. I don't think you're stupid. But then what's my opinion against thousands of others.

      • This kind of "Eliza chatbot" can be written in an afternoon by someone with no experience. The technology has been around since the 60s. Do a search for "Eliza chatbot tutorial" and you should see something come up.
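To illustrate what "written in an afternoon" means here, the whole Eliza trick is keyword matching, pronoun reflection, and canned fallbacks. A minimal sketch in Python (the patterns and responses below are made up for illustration, not taken from the original Eliza or any real chatbot):

```python
import random
import re

# Keyword pattern -> canned response templates; "{0}" is filled with the
# rest of the user's sentence after pronoun reflection. These rules are
# illustrative placeholders, not any real chatbot's script.
RULES = [
    (re.compile(r"\bi am (.*)", re.I),
     ["Why do you say you are {0}?", "How long have you been {0}?"]),
    (re.compile(r"\bi feel (.*)", re.I),
     ["Why do you feel {0}?", "Do you often feel {0}?"]),
    (re.compile(r"\bbecause (.*)", re.I),
     ["Is that the real reason?"]),
]

# Content-free fallbacks when no keyword matches -- the "nondirective" part.
DEFAULTS = ["Tell me more.", "I see. Please go on.", "How does that make you feel?"]

# Swap first/second person so echoed fragments read naturally.
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are",
               "you": "I", "your": "my"}

def reflect(fragment: str) -> str:
    """Reflect pronouns: 'my cat' -> 'your cat'."""
    return " ".join(REFLECTIONS.get(w, w) for w in fragment.lower().split())

def respond(sentence: str) -> str:
    """Return a reply: first matching keyword rule, else a canned default."""
    for pattern, templates in RULES:
        m = pattern.search(sentence)
        if m:
            return random.choice(templates).format(reflect(m.group(1)))
    return random.choice(DEFAULTS)
```

Wrapping `respond()` in a loop over user input is the entire program; that so little machinery sustains a surface-level conversation is exactly why the 60's original still holds up in these comparisons.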
      • by trawg ( 308495 )

        What I said earlier is irrelevant.
        Why?
        Because I might have changed my mind since I said it.

        So basically they're already at the point where they're qualified to be politicians

    • Check out the one that this guy fell in love with [replika.ai]. It's still Eliza, but wow, they've really pushed the boundaries of what a chat bot can do.
    • by dissy ( 172727 )

      The real questions people want to know:

      - How strong is Mitsuku's vibrator attachment?
      - Is the attachment self-cleaning?
      - How many different fetish site APIs does Mitsuku have access to?

    • Me: Do you prefer Kuki?

      I'm not surprised you got a strange reply from an AI chatbot to a context-free fragment. A large portion of the population would do a double take or have to stop and analyse the context of your question. Yes it's the AI bot's name but that doesn't mean a left-field question couldn't confuse a normal person.

      The problem comes from the request for a preference without context. It doesn't recognise Kuki as a thing to have a preference about. Likewise there's no context linking the preference to the bot itself.
      Me: What name

    • by hawk ( 1151 )

      >This is why I keep saying that there is nothing today that is any better than the 60's vintage Eliza.


      and how does saying "This is why I keep saying that there is nothing today that is any better than the 60's vintage Eliza." make you feel?

      *duck*

      hawk, dating himself again (but then again, he just showed his nieces and nephews an original Atari VCS, an Atari pinball, and a non-CPU Odyssey earlier . . .)

  • ...will fuck anything. Tree stumps, chickens, mattresses, etc - it's the way they're wired.
    That one weirdo will fall in love with a plausible chatbot is absolutely unsurprising - ever see MTV's Catfish? People will chat for years, claiming they're /dating/ someone that they've only ever talked to online...frequently someone who won't talk on the phone or video chat. YEARS.
    How far is a 'bot from that?

    • Yes, even peanut butter sandwiches... https://www.dailymotion.com/video/xqr84p/ [dailymotion.com]
    • "How far is a 'bot from that?"

      Every chatbot I've ever seen has been pretty fucking far from that. They are easily tricked into revealing their shortcomings by doing things like repeating yourself, or repeating things back to them. It takes real intelligence to string someone along like that.

      If you start having romantic feelings for something you know isn't sentient, and your reaction isn't to work on your problems, your problems are severe.

  • ..., but who are we to judge?
    The world needs more love.

    I hope they will be very happy together.

  • What if he can't find a meatspace partner because of bad advice?

    Mr. Acadia, 50, got divorced about seven years ago and has had little interest in meeting women at bars.

    He's bloody 50. Anyone suggesting he find women in a bar in the first place is missing the point.

  • See the experimental "Eliza" program that ran on a typewriter and IBM rackmount computer around 1967.
    The source is available, or was.

    Which tells you how little power it takes to simulate a casual human conversation... 8-)

    It was designed to simulate a "nondirective psychotherapist" who responded sympathetically to the patient but didn't try to guide them.

  • by dcw3 ( 649211 )

    Al?...Albert?...Alan? And what kind of last name is Chatbot?
