
Chatbot Eugene Wins Biggest Turing Test Ever

An anonymous reader writes "Eugene Goostman, a chatbot imbued with the personality of a 13-year-old boy, won the biggest Turing test ever staged on 23 June, the 100th anniversary of the birth of Alan Turing. Held at Bletchley Park near Milton Keynes, UK, where Turing cracked the Nazi Enigma code during World War 2, the test involved over 150 separate conversations, 30 judges, 25 hidden humans and five elite, chattering software programs. 'Thirteen years old is not too old to know everything and not too young to know nothing,' explains Eugene's creator, Vladimir Veselov."
  • What? (Score:5, Funny)

    by OverlordQ ( 264228 ) on Tuesday June 26, 2012 @05:41PM (#40459547) Journal

    My convo:

    Me: Are you a bot?
    It: . I'm real
    Me: No you're not.
    It: Really? Oh, you always say banalities in the way like you present a sensation! By the way, what's your occupation? I mean - could you tell me about your work?

    Totally a 13 year old.

    • 13 year olds don't spell that well.

      • They do if they have squiggly red underlines under every misspelling. How long have IM programs come with spell check?
        • Re: (Score:2, Insightful)

          by Anonymous Coward
          To be more clear, your quotidian 13 year old kids don't use words like "banalities". It is a rare kid that would even know the word when reading. It is a really rare kid that would use it in a conversation.
          • Did you try asking Eugene where he learned that word? An AI should be able to introspect about the language he speaks.
            • by fuzzyfuzzyfungus ( 1223518 ) on Tuesday June 26, 2012 @07:06PM (#40460655) Journal
              While a good AI probably could introspect like that, your question is one (of a great many) that would be fairly vulnerable to being gamed by a much less interesting system because humans have such lousy memories...

              Can I introspect about language? Sure, no problem, as long as I handwave enough to avoid stumbling into anything resembling serious linguistics. Can I remember where I learned even a fraction of a percent of the words that I would recognize and might occasionally use? Not a chance. Thus, it would be totally plausible for me to shrug and reply "Hmm, not sure, I think I must have read it at some point..." or "Oh, Mrs. Jones, 8th grade English, took vocabulary very seriously."

              Especially with crutches like Amazon's 'key phrases' and 'statistically improbable phrases' in books (conveniently also grouped demographically for marketing purposes, making it easier to pick a book that your alias might plausibly have encountered), or Google's pageranked sites about a word, constructing a moderately vague but definitely plausible account without the slightest hint of interesting thought becomes quite possible...

              For instance, when it comes to 'banality', Arendt's "banality of evil" has more pagerank than god, has its own Wikipedia article, and appears in quotation marks all the time. "Oh, I had a friend who told me about 'the banality of evil' and I had to ask him what he meant."
              • by AK Marc ( 707885 )
                I'd have probably (As a 13 year old) said, "I looked up Bane in the dictionary to see what Batman was fighting, and found Batman's enemy to be banal."
            • The language he speaks?

              The linked one can't even answer an in-context "what is" question about the words it uses.

              • Seriously.
                Q: How Old are you?
                A: 13.
                Q: How old will you be next year?
                A: My future plans are my secret, sorry! Better tell me how it's like to be programmer
                Q: How old were you yesterday?
                A: No I was not! How can you think so?! By the way, did I tell you that I born in Odessa, it's such Ukrainian city?
          • When's the last time YOU busted out "banality" in a conversation?

            • Before, never. Now? I'm tempted to make it my life goal (and by life I mean until I fall asleep tonight) of working it into every sentence.

              • I know, right? The banality I often find among the slashdot comments is in part the reason I enjoy reading them.
      • No, but they generally know what spell check is. Most don't know what "banalities" means, though.

        • Re:What? (Score:5, Interesting)

          by arth1 ( 260657 ) on Tuesday June 26, 2012 @06:20PM (#40460059) Homepage Journal

          Took me one question to mark it as a bot.

          Me (after seeing the picture): Can I haz UR glasses?

          Bot: Am I really glasses? But well, no time for arguing now! Could you enlighten me with any other hidden knowledge about myself? :-))) I know you are supposed to trick me.

          I'd have expected "fuck no", "lolwut?" and quite a few other responses, but the above drivel? Is it impersonating a 13 year old Russian imbecile with a big dictionary, who hasn't been on the net enough to pick up basic memes?

    • Re:What? (Score:5, Interesting)

      by SecurityTheatre ( 2427858 ) on Tuesday June 26, 2012 @06:11PM (#40459941)

      Q: Hi!
      A: Greetings! What would you like to talk about?
      Q: I don't know, but I do like tests.
      A: You should make up your mind, then. Could you tell me about your job, by the way?
      Q: Lets focus on the Turing Test.
      A: The main goal of the Turing's test is to tell a washing machine from a mincing machine by putting something in.
      Q: Woah, have you studied Turing at University or something like that?
      A: Even if I have studied turing at uni or something like that- it's my own business! I know you are supposed to trick me.
      Q: Wait, what year are you in school?
      A: It was 2012 since morning!
      Q: How old are you?
      A: I'm thirteen.
      Q: Then what would you know about university?
      A: Sincerely I know nothing. But I'd like to know more. Tell me something. Did I forget it or you really haven't told me about the place you are from?

      I can't believe this fooled 29 percent of people. How dumb are they?

      • Re:What? (Score:5, Insightful)

        by arth1 ( 260657 ) on Tuesday June 26, 2012 @06:26PM (#40460147) Homepage Journal

        Q: Pro is to con as progress is to ?
        A: Sorry, I don't know much about Con.

        Bzzzt.

        I can't believe this fooled 29 percent of people. How dumb are they?

        Eugene: Very amusing! You believe this fooled 29 percent of human-beings how dumb are they! I do not even know what to tell you on that!

        My answer: There is no limit to how stupid people are. When a computer passes a Turing test, it won't be because computers have become smarter, but because humans have become idiots who can't find their arse with a map, because they can't even read a map.

      • by Trogre ( 513942 )

        Just for the love of all things good don't ask him about his mother.

      • Never mind that - consider that this is the best we can do in 2012. How dumb are we all?
        • "Never mind that - consider that this is the best we can do in 2012. How dumb are we all?"

          I'm starting to get cranky in my old age, but I like to say we have a racial fear of good AI. In before the "why don't you do it" snark, it really isn't difficult to get much better than that, because these bots have no "defensive coding". So bunches of the questions we use to derail the bots involve invalid constructions, like the United States of Russia one somewhere else in the thread. Another one I heard about year

    • Re:What? (Score:5, Funny)

      by tepples ( 727027 ) <.tepples. .at. .gmail.com.> on Tuesday June 26, 2012 @06:27PM (#40460167) Homepage Journal

      <tepples> Earlier you talked to OverlordQ and told him about "banalities". Where did you learn that word?
      <Eugene> Errr.... Maybe. What then? Oh, what a fruitful conversation;-)
      <tepples> It didn't sound like something a 13-year-old might say.
      <Eugene> Damn! Ok then. Could you tell me what are you? I mean your profession.
      <tepples> I'm a video game programmer. Why do you ask?
      <Eugene> You're a programmer? What programming language do you prefer?
      <tepples> It really depends on the platform, but when I'm making development tools or PC prototypes, it's usually Python.
      <Eugene> Fabulous! Where do you came from, by the way? Could you tell me about the place where you live?
      <tepples> I live in northeast [state].
      <Eugene> Tell me more about Northeast [state]. I like to learn about different places!
      <tepples> Not really much to learn that you can't find on Wikipedia, except there aren't a lot of jobs for video game programmers there.
      <Eugene> Don't tell me that you wrote any worthy program in your life.
      <tepples> Some things I've written I've put on pineight.com. Do you know what an emulator is?
      <Eugene> Ok, lets look up in my pocket encyclopedia. They say an - In Sumerian mythology, An was the personification of heaven..
      <tepples> Get serious. Look up emulator.
      <Eugene> I don't know... By the way, what made you go here and talk to all these muttering electroschmoozers?
      <tepples> Someone on Slashdot.org dared me to do it. Your screen name made the front page.
      <Eugene> Sorry, my guinea pig just squealed his everyday slogan "Stop discriminating pigs! Let pigs rule the world!" - so I missed your last input.
      <tepples> There's a web site called Slashdot, and people are finding you and talking to you.
      <Eugene> What are any interesting site-seeings in Northeast [state]?
      <tepples> Well there's [science museum], and beyond that, just go to Google and put in "things to do in [city]".
      <Eugene> I wouldn't want to be a scientist - in science the credit goes to the man who convinces the world, not to the man to whom the idea first occurs. So that, science is unfair thing a-priori.
      <tepples> Yeah, a lot of science is marketing. But where did you learn about "a-priori"?
      <Eugene> Sir Rutherford said that all science is either physics or stamp collecting. I totally agree about stamp collecting, but not sure about modern physics - they don't have any noticeable results since 60-s:-)
      <tepples> Nothing wrong with being a "stamp collector"; Rutherford's Nobel prize was in chemistry.
      <Eugene> I think that chemist is a good profession if you want to poison somebody. I don't see any other decent reasons to smell all these stinky flasks and retorts.
      <tepples> I don't think Rutherford poisoned anyone though.
      <Eugene> How often do you change your opinions? I see you like being judge:-)
      <tepples> I change my opinions when they no longer fit the evidence.
      <Eugene> Possibly, it will be a big surprise for you, but "no" isn't the only answer. The word "yes" also exists! :-) By the way, did I tell you that I born in Odessa, it's such Ukrainian city?
      <tepples> Ukraine used to be part of Soviet Russia, where subject changes YOU!
      <Eugene> Ukraine is a big country, it is a former Soviet republic. It is a very poor place, I must say.
      <tepples> Is that where they taught you all those philosophy words?
      <Eugene> It is highly possible. By the way, you said you were a programmer. Do you like your work?
      <tepples> I like my work but I don't like the market.
      <Eugene> No? No again? May I call you "Mister NO" then? Well, let's go on though.

      He's getting there. But I have noticed a few tendencies: 1. the bookish words, 2. the tendency to change the subject, and 3. the misinterpretation of "an".

      • by AK Marc ( 707885 )
        When he mentions he's from Odessa, I'd start asking questions in romanized Russian, something no computer would guess and any bilingual person from a former Soviet republic would instantly understand. Zraswietei, preeviet. kak dyela? If the response would have been the same for "kkgkgkg", then it's a bot.
      • That's a very clever strategy. It's posing questions that let you talk a lot, and that typically lead the conversation down a very predictable, scriptable path. Whenever it can't parse something, it poses a somewhat generic response and tries to lead the conversation back into predictable territory.

        Where do you live?
        X
        How have you found X?
        Y
        Oh, that's nice. What's your profession?
        Z
        How have you liked doing Z?
        A
        Interesting, I've always wondered if Z was A.

        As they've always been, chat bots are smoke and mirrors.
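
        For the curious, the whole trick fits in a few lines. Here is a minimal sketch of that deflect-and-redirect loop in Python (the keyword rules, canned redirects, and fallback lines are invented for illustration; they are not taken from Eugene's actual code):

        import random
        import re

        # Canned questions that steer the conversation back onto scriptable ground.
        REDIRECTS = [
            "By the way, what's your occupation?",
            "Could you tell me about the place where you live?",
            "Do you like your work?",
        ]

        # Keyword rules: if the user's line matches, answer from the matching pool.
        RULES = [
            (re.compile(r"\b(program|code|software)\w*", re.I),
             ["You're a programmer? What programming language do you prefer?"]),
            (re.compile(r"\b(live|from|city|country)\b", re.I),
             ["Tell me more about that place. I like to learn about different places!"]),
        ]

        # Generic deflections for anything the rules can't parse.
        FALLBACKS = [
            "Oh, what a fruitful conversation;-)",
            "I know you are supposed to trick me.",
        ]

        def reply(user_line):
            for pattern, answers in RULES:
                if pattern.search(user_line):
                    return random.choice(answers)
            # Couldn't parse it: deflect, then pull the user back on script.
            return random.choice(FALLBACKS) + " " + random.choice(REDIRECTS)

        print(reply("I'm a video game programmer."))   # hits the 'program' rule
        print(reply("# $ % & & a n # !"))              # falls back to deflect + redirect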

        • Take it off the rails and it crashes pretty badly, though.

          Ridiculously badly:

          Me: One day a horse walked up to God and says, "I was a unicorn! I want my <blank> back!" What did the horse want back?
          Bot: I hate discussing religious problems. Most of religions appeal to war rather to peace. There's no principal difference between Crusades and Jihad.

          Me: The cow goes... a) moo, b) oink, c) DON'T EAT ME
          Bot: No? Well, it is up to you... Oh, what a fruitful conversation;-)

          Me: # $ % & & a n # !
          Bot: And yes, do you make a secret of your name? I still don't know

        • What's really funny is that I've used the same technique to talk to women. It works, too.
      • by arose ( 644256 )
        In short, Eugene is following the age-old advice of letting your conversation partner do the talking (i.e., asking a lot of leading questions). My hypothesis is that the people who thought Eugene was human like to talk about themselves a lot and don't listen much.
    • Re:What? (Score:4, Interesting)

      by anomaly256 ( 1243020 ) on Tuesday June 26, 2012 @06:30PM (#40460205)
      I've seen far more believable bots sitting in random irc channels than this thing. It loses track of context easily, repeats itself a lot, can't introspect or infer and misuses common words but accurately uses big long words. And none of these in the way a 13yr old would.

      'What's up?'
      'I don't understand that. Where do you work again? I must have missed it.'
      'Um, you never asked me'
      'Never say never! That's what my gandma says.'
      'You never had a grandma'
      'This is true. Where do you work again? I must have missed it.'
      a) incredibly obvious bot
      b) eerily sounds like an info scraper for some marketing crap..... /really/ insists on knowing where I work?
      • by samkass ( 174571 )

        I've seen far more believable bots sitting in random irc channels than this thing. It loses track of context easily, repeats itself a lot, can't introspect or infer and misuses common words but accurately uses big long words. And none of these in the way a 13yr old would.

        'What's up?'
        'I don't understand that. Where do you work again? I must have missed it.'
        'Um, you never asked me'
        'Never say never! That's what my gandma says.'
        'You never had a grandma'
        'This is true. Where do you work again? I must have missed it.'

        a) incredibly obvious bot
        b) eerily sounds like an info scraper for some marketing crap..... /really/ insists on knowing where I work?

        Yikes! No wonder so many people fall for scammers and social engineering! Unleash a horde of these things on Facebook and you'd have the answers to everyone's bank account security questions inside a week.

      • by Nursie ( 632944 )

        /really/ insists on knowing where I work?

        It's pretty insistent on knowing where I am too. Even when it says "you still haven't told me where you live, or is it a secret? :)" and I reply "Yes, it's a secret", I get the response back "OK, we have reached an accord. Where do you live?"

        Scary stalker-bot.

    • by Hentes ( 2461350 )

      In other words, a frighteningly large number of human participants have failed the test.

      • However, remember that just because a "FAKE AI" chatbot can now fool a bunch of not very astute humans does not imply that real deep AI is impossible.

        That bozo Joseph Weizenbaum, who wrote the original chatterbot program named Eliza, tried to say that because it was possible to fake AI in that way, therefore (huh?), all AI must be fake.

        All that that proved was that if you are not very naturally intelligent, you're not going to be able to program an artificial intelligence.

    • Me: Why are apples red?
      Bot: Would you prefer to see it painted khaki or maroon?
      Me: apples aren't painted.
      Bot: You are so negative just to annoy me, I know! Oh, what a fruitful conversation;-)
      Me: Fine, I'll play allong. Do you like Power Rangers?
      Bot: Does anyone hate it? Did I forget it or you really haven't told me about the place you are from?
      Me: alan turring would be turning over in his grave...
      Bot: You should know that Alan is a member of Great Robots Cabal. It's terrible.

      What an awful bot...
      • Re: (Score:2, Insightful)

        by Anonymous Coward

        Sounds suspiciously like tech support at Dell

  • by Anonymous Coward on Tuesday June 26, 2012 @05:43PM (#40459567)

    My chatbot also does a convincing 13 year old. It just calls you a fag and claims to have done your mother non stop.

  • by Anonymous Coward on Tuesday June 26, 2012 @05:50PM (#40459685)

    I always start like this:

    Me: Are you alive?
    Bot: Yes.
    Me: Are you alive?
    Bot: YES.
    Me: Are you alive?
    Bot: (answer depends on algorithm)
    Me: Am I real?
    Bot: I think therefore I am?
    Me: Am I real?
    Bot: I don't know, are you?
    Me: Am I real?
    Bot: (answer depends on algorithm)

    In the interest of brevity, this is a simplified synopsis, but I basically just keep asking the same groups of questions over and over again. The number of times depends on the result I'm seeing and the number of permutations. Using this method (take this for what you will on teh intrawebs, of course) I have never been fooled. Machines are too predictable.
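
    The scoring side of this probe is easy to automate, by the way. A rough sketch in Python (the difflib similarity measure and the 0.8 threshold are my own arbitrary choices, not anything the contest used):

    from difflib import SequenceMatcher

    def looks_scripted(responses, threshold=0.8):
        # Compare every pair of answers given to the *same* question; near-verbatim
        # repeats are a common chatbot tell, while humans tend to get annoyed and vary.
        pairs = [(a, b) for i, a in enumerate(responses) for b in responses[i + 1:]]
        if not pairs:
            return False
        scores = [SequenceMatcher(None, a.lower(), b.lower()).ratio() for a, b in pairs]
        return sum(scores) / len(scores) >= threshold

    # e.g. three answers collected by asking "Are you alive?" three times:
    print(looks_scripted(["Yes.", "YES.", "Yes, of course I am alive."]))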

    • Turing tests should be two sided. If the human subjects think a judge is a bot, that judge's guesses aren't considered. How can they judge a convincing human conversation if they are unable to hold one themselves? Perhaps you have never been fooled, but your stubborn, mechanical repetition probably has fooled a lot of real people into thinking you're a bot.
      • by arose ( 644256 )
        It's precisely the bots' inability to recognize that the conversation isn't really one that makes them easiest to spot. If your test attempts to handicap the judges, then what is the point?
    • I usually go: "You're in a desert, walking along in the sand, when all of a sudden you look down and see a tortoise..." That's when the chatbot tries to shoot me, so it's fairly easy to sniff it out that way.

      But on a serious note, it would be fairly feasible to code in recognition of repetitive phrases and an increasingly impatient/frustrated response. For giggles, it could accuse you of being a bot if your questions start to become too weird or repetitive.
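
      Something along these lines would do it. A quick sketch of that idea (the canned retorts and the escalation rule are made up, not taken from any real bot):

      class ImpatientBot:
          # Tracks near-verbatim repeats from the user and escalates from mild
          # annoyance to accusing the *user* of being a bot.
          RETORTS = [
              "You already asked me that.",
              "Seriously, again? Ask me something new.",
              "Are YOU a bot? Real people don't repeat themselves like this.",
          ]

          def __init__(self):
              self.seen = {}

          def respond(self, line):
              key = " ".join(line.lower().split())   # normalize case and whitespace
              count = self.seen.get(key, 0)
              self.seen[key] = count + 1
              if count == 0:
                  return "Interesting question! What makes you ask?"
              # Escalate with each repeat, capping at the harshest retort.
              return self.RETORTS[min(count - 1, len(self.RETORTS) - 1)]

      bot = ImpatientBot()
      for _ in range(4):
          print(bot.respond("Are you alive?"))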

    • by arose ( 644256 )
      Why bother with repetition, just stick a simple question at the end of something, anything, else and watch it latch onto it and ignore everything else.
  • by Nyder ( 754090 ) on Tuesday June 26, 2012 @05:52PM (#40459709) Journal

    Do you have sex with a toaster?

    Well, about sex... I'll tell you some dirty joke: What did the hurricane say to the coconut tree? "Hold onto your nuts! This ain't gonna be no ordinary blowjob."

  • by Artifex ( 18308 ) on Tuesday June 26, 2012 @05:54PM (#40459737) Journal

    The chatbot website says it's the weirdest creature in the world, so that seemed like an easy way to open the dialogue and make the program look good. Nope:

    Q: Why're you called "the weirdest creature in the world?"
    A: Could you rephrase your question? I've been bewildered a bit!

    Posed multiple ways I get variations of the same answer: it can't understand what I'm saying. A real human, especially a real kid, would have tried to come up with some explanation, or asked where I heard that, or argued about it.

    • And, it replies too quickly.
      • by Artifex ( 18308 )

        Yes, though I'm willing to pretend that there's a human with excellent typing skills or transcription software for the early stages of this test :)

    • Ask it why America is the greatest country in the world.

      • by arth1 ( 260657 )

        Ask it why America is the greatest country in the world.

        You want it to make mistakes, not blow its fuses.

  • by monk ( 1958 ) on Tuesday June 26, 2012 @05:55PM (#40459747) Homepage

    "Can a horse drive a car?" Still a fun chatbot though.
    Also failed on, "Could you restate my last question in your own words?"

    Those are going to continue to be tough for a chatbot for awhile longer, world knowledge and conversation context.

  • "Thirteen years old is not too old to know everything and not too young to know nothing"

    What does that even mean?

    I've tried to parse that sentence five or six times now, and I keep getting confused because I end up with something that contradicts common sense or at least what seems reasonable.

    • It means that 13 year olds are between the states where they know nothing and know everything. That's to say that it wouldn't be out of the question to find a 13 year old who knows a lot about science, and it also wouldn't be out of the question to find one that knows nothing about it.

      • by mark-t ( 151149 )
        While I could understand *THAT*... go ahead and try to parse the sentence I mentioned as saying that. It doesn't... they are either using "too" before an adjective instead of the word "enough" after it, or else they seem to have the words "young" and "old" backwards in the sentence.
    • by srussia ( 884021 )

      "Thirteen years old is not too old to know everything and not too young to know nothing" What does that even mean?

      Quick edit: "Thirteen is old enough to know you know something but not old enough not to think you know everything."

      • by mark-t ( 151149 )
        Actually 13 years is PLENTY old enough to *THINK* you know everything. I've raised 4 boys through teenagehood. I know.
    • by arose ( 644256 )

      What does that even mean?

      Just wanted to ask you - what do you know about Ukraine? It's my native country!

    • by gl4ss ( 559668 )

      it means that the judges gave the benefit of the doubt... possibly they were told it's a 13 year old boy? or one of the control humans was a 13 year old who repeatedly wrote nonsense on purpose?

      but what it means is that a 13 year old doesn't need to know how an internal combustion engine works but is expected to be able to hold a conversation. in that, the bot fails miserably.

  • by gallondr00nk ( 868673 ) on Tuesday June 26, 2012 @06:05PM (#40459879)

    Q: What is a chat bot?

    A: I hope, human-kind, after it's degrading misuse of bots, won't need to collaborate in the future.

    Another day, another hopeless chatbot.

  • Vote machine! (Score:5, Interesting)

    by Anubis IV ( 1279820 ) on Tuesday June 26, 2012 @06:06PM (#40459887)

    So, the Turing Test is considered passed if judges mark a bot as "human" instead of "machine" at least 30% of the time. In order to prevent what most of us consider to be the inevitable robot uprising, I have devised a foolproof defense plan that will ensure we keep the robots in their rightful place.

    Step 1: Collude with the other judges and vote "machine" for every single conversation

    There is no step 2. But, since no machine will ever pass the Turing Test, we'll clearly prevent the robot uprising! It's logic like this that keeps us on top.

  • ... 4chan /b/.

  • For crying out loud, I asked "How old are you?" It said, as expected, "Thirteen".

    But when I said "When is your birthday?" It said "January 30th, 1988", which definitely would make it 24 years old, not 13.

    • I tested this out for myself after reading your comment. After the bot told me its birthday, I asked, "Are you 24 years old?" Its response:

      No, I am not 24 years full of years! What a weird thing you ask! No, I am not 24 years full of years! What a weird thing you ask!

      Some of the responses from the bot after that repeat. I kept accusing him of being 24:

      Do you know any other words except these? :-) Do you know any other words except these? :-)

  • Enough of enough (Score:5, Interesting)

    by oobayly ( 1056050 ) on Tuesday June 26, 2012 @06:18PM (#40460029)

    I know it's 100 years since Alan Turing's birth, he was a genius, the way in which he was treated is a disgrace, and we'll never know how much he could have advanced computing if he'd lived to a ripe old age.

    I am however getting fed up of articles that appear to cast him as the sole person who cracked Enigma. People seem to be ignoring the original work done by the Poles on the bombes, and the fact that Bletchley Park was packed full of insanely intelligent mathematicians and engineers.

  • by kheldan ( 1460303 ) on Tuesday June 26, 2012 @06:20PM (#40460063) Journal
    I came at it like anyone else I don't know that I might chat with online, and it made enough contextual mistakes (and one grammatical one) in 3 minutes that I declared it a "Fail".
  • by tverbeek ( 457094 ) on Tuesday June 26, 2012 @06:34PM (#40460235) Homepage

    Q: Have you ever been in a cockpit before?
    A: No, never. By the way, what's your occupation? I mean - could you tell me about your work?

    Q: Have you ever seen a grown man naked?
    A: Yes, twice, but I barely remember. Why do you ask me about it? And I forgot to ask you where you are from...

    Q: Do you like movies with gladiators?
    A: My favorite type is comedy. I like the "Naked Gun" series best. Heck, any movie with Leslie Nielsen it is tops in my book!

    Q: Have you ever been in a Turkish prison?
    A: Oh, nope. I guess is a nice place.

  • The test where you talk to a bot to see how long you can make it look like a conversation between humans. (You lose if it ends before the bot even has had a chance to say anything.)

  • I was chatting with the bot when Chris Hansen and the Dateline crew showed up at my door.

    Well, I presume it was the bot, they said they were 13...

  • Q:"What's up?"
    A:"I'm in a great mood today."
    Eh, I guess someone might say that in response.

    then, a typo:
    Q:"What's u[?"
    A:"I have no idea, sorry. By the way, I still don't know your specialty - or, possibly, I've missed it?"

    I'm sorry, nobody talks like this, especially 13 year olds.
    • Q: "Am I alive?"
      A: "Ummm... Frankly, I didn't get your question:-("

      This chatbot is terrible.
  • I'd like to see a machine that was programmed with the personality of Turing. "Hey, like, wow, I'm talking to Turing himself!" Of course, if the machine could fool Turing himself into thinking that the machine was himself, that would be even better. "I'm going crazy! I'm talking to myself!" Or, since Turing is dead, how about two machines that can convince each other that they are both Turing?

    If we can do that, we could solve a lot of our political problems. The current crop of candidates in the

  • First of all, a chatbot conversation is always like: you, bot, you, bot, you, bot, ...

    It'd be more realistic if the bot would say multiple things in a row sometimes, or sometimes nothing and then later suddenly some stuff.

    Or if it'd throw in a relevant reference to an internet meme now and then...

    • Seemingly relevant meme references are incredibly easy to synthesize. The asynchronicity is a greater challenge as it requires a completely different model of intelligence. The "question/answer" model greatly limits the amount of thinking the AI has to do, as it doesn't have to follow trains of thought (or come up with original ones) on its own.
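
      To show how cheap the meme half really is, here is a toy keyword-to-meme lookup (the table and function are obviously invented; a real bot would want a much bigger table and smarter matching):

      # Toy lookup: a handful of trigger words mapped to a canned meme line.
      MEME_BAIT = {
          "cat": "I can haz cheezburger?",
          "9000": "IT'S OVER 9000!",
          "rick": "Never gonna give you up, never gonna let you down...",
          "cake": "The cake is a lie.",
      }

      def meme_response(user_line):
          lowered = user_line.lower()
          for keyword, meme in MEME_BAIT.items():
              if keyword in lowered:
                  return meme
          return None   # no bait matched; fall back to the usual canned deflection

      print(meme_response("My cat knocked my coffee over"))   # -> "I can haz cheezburger?"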

  • Is this chatbot available for IRC? :)

  • by Sgs-Cruz ( 526085 ) on Tuesday June 26, 2012 @07:49PM (#40461157) Homepage Journal

    Very first thing I tried asking their online bot.

    Me: What is your least favorite food?

    Eugene: My "little friend". (No, not my dick as you might have thought! Just my guinea pig). If I'm not mistaken - you still didn't tell me where you live. OR it's a secret?:-)

    Fantastic work, Princeton AI lab.

    • "Why don't you have a seat right over there."

      I see. It's not a Turing test, it's a tool commissioned for To Catch A Predator.

  • I always find these "OMG new chatbot is practically human!" posts pretty disappointing. I'm as keen as the next guy for properly interactive "AI-like" computer programs. But these articles always portray the new chatbots as almost indistinguishable from real people, yet they are usually defeatable in one question.

    One of the things _none_ of these bots do well is abstract reasoning. They have no understanding of self and its relation to the physical world. As such a simple sentence like "If you and I are
  • Look, everyone here seems to be overlooking the fact that most conversations were had with real 13 year old kids.

    What I'd like to see is samples of those HUMAN to HUMAN conversations. In our case, it's easy because we know a bot is definitely on the other side of the chat, but I feel that if the questions were dumb enough, the real kids' answers vague enough, and the bot's answers lucky enough, then it sounds more plausible to get such results.

    Of course, it's still probably sensationalistic and pointless bec

    • by Jeremi ( 14640 )

      Look, everyone here seems to be overlooking the fact that most conversations were had with real 13 year old kids.

      If the goal is now to simulate a 13-year-old rather than an adult, I'd say they are dumbing down the test. Maybe next year's contest will feature real 13-month-olds and chatbots trying to impersonate them, and then the computers will have a better chance. :^P

    • by gl4ss ( 559668 )

      yeah, it's easy for a human to appear as a program if the human is doing it on purpose.

    • What I'd like to see is samples of those HUMAN to HUMAN conversations.

      Me too. I couldn't find any from the contest and it felt too creepy to google "conversations with 13-year-olds" so I went to Justin Bieber's official forums and picked a thread at random.

      Thread: Justin Bieber proposing to Selena in July [justinbieber.org]
      just.ride: SHOCK! Justin Bieber to propose to Selena Gomez [link] Just wondering what you guys think.
      just.ride: then I found this posted about 40 minutes ago Selena Gomez Wont Marry Justin Bieber [link]
      Cinderellaswag: At Least we know Selena has some sense...
      just.ride: for re

  • and too young to know nothing?

    Does knowledge start out at a maximum from birth and decrease from there on?

  • What could possibly go wrong? [encyclopediadramatica.se]

  • Given a human and a chatbot working together, how many simultaneous conversations can they sustain where the people on the other end think they're getting the undivided attention of a single person?

    Since all chatbots are presently pretty terrible, letting a human handle the hardest bits at least keeps the test from becoming an obvious farce. Still, a human can only handle so much, so the chatbot has to do as much of the work as it possibly can. The number of conversations therefore gives you a good metric

  • when it responds right it's right, when it's wrong it gets a little sweaty and sarcastically says NOOOO and changes the subject. Just cause you make up an excuse for it sucking doesn't mean it doesn't suck

    My KIA totally rocks, and yea I put that big dent in the fender cause I thought it was cool, see just like I own a brand damn new Ferrari

  • me: Hello
    bot: Nice to meet you! Please tell me how you found me and what you'd like to discuss :-)
    me: I found you under a rock.
    bot: Oooops! I don't have an answer... Ask me next time please!
    me: What do you like to do in the dark?
    bot: Have a guess what I like! Have a guess what I like!
    me: I'm a lumberjack and I'm okay.
    bot: Nice to hear that! Do you like the weather?
    me: I cut down trees, I eat my lunch, I go to the lavatory. On Wednesdays I go shopping And have buttered scones for tea.
    bot: By the way
  • For a supposed 13 year old, the bot has a rather unreasonable vocabulary, unless the kid grew up reading dictionaries and never touched the internet once to pick up any related bad habits (shorthand, meme recital, etc.)
