Education Robotics

Words with Multiple Meanings Pose a Special Challenge To Algorithms (theatlantic.com) 173

Sample this: Me: So that's the marshmallow but you're going to eat it with this graham cracker and chocolate.
[My son looks at me like I am the dumbest person alive.]
Sebastian: No, I'm going to eat it with my MOUTH.
[End of play.]
That's from "S'MORES. A Real-Life One-Act Play", a conversation between Hamilton impresario Lin-Manuel Miranda which his young son Sebastian. In that brief interaction, young Sebastian Miranda inadvertently hit upon a kind of ambiguity that reveals a great deal about how people learn and process language -- and how we might teach computers to do the same.

The misinterpretation on which the s'mores story hinges is hiding in the humble preposition with. Imagine the many ways one could finish this sentence: I'm going to eat this marshmallow with ... If you're in the mood for s'mores, then "graham cracker and chocolate" is an appropriate object of the preposition with. But if you want to split the marshmallow with a friend, you could say you're going to eat it "with my buddy Charlie." The Atlantic elaborates: Somehow speakers of English master these many possible uses of the word with without anyone specifically spelling it out for them. At least that's the case for native speakers -- in a class for English as a foreign language, the teacher likely would tease apart these nuances. But what if you wanted to provide the same linguistic education to a machine?

As it happens, just days after Miranda sent his tweet, computational linguists presented a conference paper exploring exactly why such ambiguous language is challenging for a computer-based system to figure out. The researchers did so using an online game that serves as a handy introduction to some intriguing work currently being done in the field of natural language processing (NLP). The game, called Madly Ambiguous, was developed by the linguist Michael White and his colleagues at Ohio State University. In it, you are given a challenge: to stump a bot named Mr. Computer Head by filling in the blank in the sentence Jane ate spaghetti with ____________. Then the computer tries to determine which kind of with you intended. Playful images drive the point home. [Editor's note: check the article for corresponding images.]

In the sentence Jane ate spaghetti with a fork, Mr. Computer Head should be able to figure out that the fork is a utensil, and not something that is eaten in addition to the spaghetti. Likewise, if the sentence is Jane ate spaghetti with meatballs, it should be obvious that meatballs are part of the dish, not an instrument for eating spaghetti.
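
To make the classification problem concrete, here is a toy version of what Mr. Computer Head has to do, sketched in Python. This is not the Ohio State system; the word lists, labels, and function name are invented purely for illustration.

    # Toy disambiguator for "Jane ate spaghetti with X" (illustrative only).
    UTENSILS = {"fork", "spoon", "chopsticks"}          # instrument reading
    FOODS = {"meatballs", "parmesan", "mushrooms"}      # part-of-the-dish reading
    COMPANION_HINTS = {"her", "his", "my", "friend"}    # companion reading

    def classify_with_phrase(with_object):
        """Guess which sense of 'with' the prepositional object signals."""
        words = set(with_object.lower().split())
        if words & UTENSILS:
            return "instrument"        # Jane used it to eat
        if words & FOODS:
            return "part of the dish"  # Jane ate it along with the spaghetti
        if words & COMPANION_HINTS:
            return "companion"         # Jane ate in someone's company
        return "unknown"               # a real system backs off to a trained model

    for phrase in ("a fork", "meatballs", "her friend Charlie"):
        print("Jane ate spaghetti with", phrase, "->", classify_with_phrase(phrase))

Hand-written lists like these break as soon as a player types something unexpected, which is why real NLP systems rely on statistical models trained on annotated sentences rather than rules like the above.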

  • Want a context-free language, easily parseable, with plenty of computer-driven tooling, without this irritating English ambiguity? Lojban is learnable today: https://mw.lojban.org/papri/la... [lojban.org]

    In all seriousness, it is mind-blowing to me that our tribe of computer scientists continues to expend so much effort on deriving meaning from English utterances. If we only wanted to encode meaning in a computer-manageable way, we could have been doing it decades ago.

    • by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Thursday June 28, 2018 @09:13AM (#56859212) Homepage Journal

      Want a context-free language, easily parseable, with plenty of computer-driven tooling, without this irritating English ambiguity?

      Sure, but I also want to be able to converse with people on the street. Since statistically nobody speaks lojban there is no sense in learning it. If I were going to spend effort learning another language, it would be one people actually use.

    • by javaman235 ( 461502 ) on Thursday June 28, 2018 @09:25AM (#56859282)

      Computer scientists focus on this because it highlights a really interesting difference between how our brains represent information and how computers do. The reality is that human minds have no problem holding onto a word or phrase in a state of semantic superposition. For instance, if someone tells you to "turn left at the bronze rooster", you will keep an eye out for either a business with that name or an actual bronze statue of a rooster. Computers don't have the ability to intuitively declare
      Int x = 54 or 75 or 23;
      and the ability to do so seems to give our minds a lot of unique powers.
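
      One way to make the "semantic superposition" idea concrete in code is to keep several candidate readings alive at once and re-weight them as evidence arrives. A minimal sketch (Python; the readings and weights are invented for the example, not taken from anywhere):

        # Hold several readings of "the bronze rooster" at once and only
        # collapse them when new evidence arrives. Weights are made up.
        candidates = {
            "business named 'The Bronze Rooster'": 0.5,
            "bronze statue of a rooster": 0.5,
        }

        def update(evidence_fit):
            """Re-weight each reading by how well it fits the new evidence."""
            for reading, fit in evidence_fit.items():
                candidates[reading] *= fit
            total = sum(candidates.values())
            for reading in candidates:
                candidates[reading] /= total

        # Spotting a storefront sign strongly favours the "business" reading.
        update({"business named 'The Bronze Rooster'": 0.9,
                "bronze statue of a rooster": 0.1})
        print(candidates)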

      • The problem (as I see it) is one of context. Each answer is possible, but given "context", one answer will be more likely than another. It's this notion of context that's the difficult part to grok, and must be accounted for in any system intended for "intelligent" processing.
      • by sycodon ( 149926 )

        Cyc [wikipedia.org] has been working on this for quite a while and actually has products available.

        Another example of the challenges:

        She saw the bicycle in the window and wanted it.

        What did she want? The window or the bike?

        • The sentence itself may be ambiguous, but there is probably more context in the rest of the text to disambiguate it. Also, if you have a big enough annotated corpus to train on (meaning supervised machine learning), your model may already expect bicycle to be the most probable match.
          Because if you look at it without further context, this is also ambiguous for humans; we just expect it to be the bike, but it could actually be the window she wants.
          Same with the sentence "Batman hits the villain with a wrench".
    • by Anonymous Coward

      There's an XKCD for this https://xkcd.com/191/

    • In Japanese, the "with" meaning "using this tool" would be the particle "de". The "with" meaning "together" would be the particle "to", but without further elaboration there's still ambiguity between "eat two ingredients together" and "two participants eat together", depending on how the sentence is constructed. Who knows an actual language people use that makes this completely unambiguous?

      • I think it's important to note that all context is tied to action. State (nouns) carries no innate sense of context with it. It's only when an action takes place that dependencies and conditionals come into play. The fact that you are awake or asleep is just a state. If I try to talk (action) to you, THEN your state becomes significant (context). Change of state is only accomplished through action; ergo, all context is acquired through action. Verb you, ya bunch of nouns!
  • These two people can be on the same side in WWII:
    "I'm going to fight with the Allies."
    "I'm going to fight with Hitler."

  • by goombah99 ( 560566 ) on Thursday June 28, 2018 @09:12AM (#56859208)

    This really isn't that complicated. It's not that words have two meanings; it's that there are different cases. In many languages there is a dative suffix for words taking a supporting roll. I threw the ball out the window. Case endings can cleanly separate the subject (I), the direct object (ball), and the participating clause object (window). In English we lack a dative suffix on most words, so we have helper words and word order to tell us which words are the dative.

    In the case of all the examples given, "with" here is just saying that the named object participated: "fork, meatball, buddy". It doesn't say how it participated. That is simply not the content of the sentence.

    The point I'm making is that the sentence scans identically. It's not ambiguous. It's exact. It's entirely possible that I ground up my buddy to make meatballs out of him and that I actually like eating small forks. That information is not intended to be present; it would come from external context. The sentence is not ambiguous.

    • by rossdee ( 243626 )

        "supporting roll.'

      lol

    • by Anonymous Coward

      Context comes from our knowledge about those objects. We know a great deal about utensils. We know what's food and what isn't, based both on what we've eaten in the past and on what societal mores allow. It's one of the reasons the first delvings into AI consisted of codifying a lot of human knowledge into an idiot savant that was brittle outside its sphere of knowledge.

    • See this for how hard English is. Funny, but fits the topic.

      https://www.youtube.com/watch?... [youtube.com]

      And don't get me started on the word "FUCK", which is just about every word type in English: Noun, Verb, Adverb, Adjective, Exclamation, point of emphasis ...

      • by g01d4 ( 888748 )
        The comedy in most Burns and Allen skits is based on Gracie's parsing the ambiguity of the language. Passing the Turing test should include the computer replying in those instances with "I see what you did there".
      • by epine ( 68316 )

        And don't get me started on the word "FUCK", which is just about every word type in English: Noun, Verb, Adverb, Adjective, Exclamation, point of emphasis ...

        And being quite so promiscuous, FUCK can be treated as a hairy wart—commanding attention without in any way being instrumental—in just about every sentence more elaborate than "See Jack fuck Jill."

      • The F word is just a synonym for smurf. It is the easiest word to use, in any language.

    • by Anonymous Coward

      The fact that you may have ground up your buddy to make meatballs and put them in your spaghetti, or that you may eat your spaghetti in his presence, is precisely what makes the sentence ambiguous. In other words, it is ambiguous because you cannot tell the meaning of the sentence without external context.

      dom

    • by PPH ( 736903 )

      The point I'm making is that the sentence scans identically.

      Actually, the sentence can be diagrammed three different ways. In a breadth-first search of solution space, all three solutions will be found and considered. Another search layer will then take each possibility and attempt to solve it, given some deeper world knowledge. The probabilities assigned to the diagrams asserting that you served pasta with a helping of fork or your buddy will come out pretty low. Meatballs would result in this particular graph winning.

      Since breadth-first searches can be pretty expensive ...
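
      Concretely, the approach described above amounts to generating every attachment for the with-phrase and then scoring each one against world knowledge. A minimal sketch (Python; the plausibility scores are invented stand-ins for real world knowledge):

        # Layer 1: enumerate all candidate readings; layer 2: score each one.
        READINGS = ("instrument", "co-food", "companion")

        def plausibility(with_object, reading):
            # Invented scores standing in for "deeper world knowledge".
            table = {
                ("a fork", "instrument"): 0.90, ("a fork", "co-food"): 0.02,
                ("meatballs", "co-food"): 0.95, ("meatballs", "instrument"): 0.01,
                ("my buddy", "companion"): 0.90, ("my buddy", "co-food"): 0.03,
            }
            return table.get((with_object, reading), 0.01)

        def best_reading(with_object):
            # Generate all candidates first, then keep the best-scoring graph.
            candidates = [(plausibility(with_object, r), r) for r in READINGS]
            return max(candidates)

        for obj in ("a fork", "meatballs", "my buddy"):
            print(obj, "->", best_reading(obj))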

    • by Kjella ( 173770 )

      That information is not intended to be present. It would come from external context.

      Well, that's what language is all about: I can understand what you wrote because we have a common understanding of English. And that extends to expressions, euphemisms, allegories, slang, and sarcastic and ironic usage that can't be taken literally or deconstructed into simpler terms. If "I had spaghetti and meatballs for dinner" means you ate them, and that's the meaning most everybody agrees on, then you can quibble all you want about "to have" not implying "to eat", but that's just your opinion of how it should be.

  • by Dallas May ( 4891515 ) on Thursday June 28, 2018 @09:14AM (#56859222)

    Me: Your new magazine you got from your grandmother for your birthday came in the mail. Do you want to read it tonight for bedtime?
    4yo: [Confused] Magazines aren't for reading.
    Me: What are they for?
    4yo: [Stated with a tone of obviousness] Magazines are for cutting.

    [End scene]

    • No point to that. I just would like to see more conversations with children misunderstanding some seemingly basic concepts in dialogue form here.

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      I thought magazines were for holding ammunition. How they cut, I'm not sure.

    • 4yo: [Stated with a tone of obviousness] Magazines are for cutting.

      That makes sense to anyone who's done a collage. Getting information from paper is pretty awful; you can't even do a simple substring search without executing it manually. But it can be a nice source of art materials, especially when stuffed with advertisements designed to wrest your attention away from whatever actual content was put in there to attempt to justify it.

    • No, magazines are for storing ammunition on a 19th-century warship.
  • Ambiguity (Score:3, Insightful)

    by Anonymous Coward on Thursday June 28, 2018 @09:15AM (#56859226)

    Time flies like an arrow

    Fruit flies like a banana.

    • "I saw a woman on the beach with binoculars." (Was she holding them? Or was I looking through binoculars?)

      These problems are almost as old as AI itself, nearly half a century old, and we haven't made any real improvement in solving them.
      • There's not enough data to establish context.
        We need a linguistic Enhance button, I guess.

      • by Shotgun ( 30919 )

        These problems are almost as old as AI itself, nearly half a century old, and we haven't made any real improvement in solving them.

        If by that you mean "as old as men trying to talk to women", which is MUCH older than AI, then yes, I'd agree with you.

        Spoiler alert:

        The guy was wrong.

    • We won the Lottery! ... Why are you excited? *WE* won the lottery. You didn't.

    • Time flies like an arrow

      Fruit flies like a banana.

      There is no ambiguity there. Both sentences' meanings are perfectly clear. Their juxtaposition is amusing, but the question of whether an AI can appreciate jokes is a totally different issue.

  • Very smart people have found that out in the 1970s, 1980s, 1990s, 2000s... and in 2018, with decades of funding and experts.
    https://en.wikipedia.org/wiki/... [wikipedia.org]
    Maybe some Seeding Intelligence https://www.wired.com/1997/07/... [wired.com]
    "...program only basic behaviors into the device, give it a way to experience sensory perception, and allow it to learn from experience..."
  • reductio ad absurdum (Score:3, Interesting)

    by turpialito ( 993680 ) on Thursday June 28, 2018 @09:32AM (#56859326)
    Children are highly specialized in reducing an argument to absurdity. This is a very important tool in logic and, in my opinion, it is nowhere better exemplified than in Lewis Carroll's books, which I think is why Miranda's kid interpreted his father's statement as absurd. As a mathematician, Carroll was quite aware of how easily logical errors come about. I like citing the passage where Humpty Dumpty explains what an un-birthday is to Alice: "I mean, what is an un-birthday present?" "A present given when it isn't your birthday, of course." Alice considered a little. "I like birthday presents best," she said at last. "You don't know what you're talking about!" cried Humpty Dumpty. "How many days are there in a year?" "Three hundred and sixty-five," said Alice. "And how many birthdays have you?" "One."

    I suppose Miranda could have been more explicit and said "along with" rather than just using "with" alone. As another example, we say "a glass of water", despite the glass not being made of water, yet we don't say "a glass with water". And we certainly don't drink the glass, but rather its contents. I find all this fascinating. Brits have trouble understanding some American phrases for much the same reason, and the same goes for European Spanish and Latin American Spanish. Maybe feeding an AI several samples of such phrases, as said and interpreted by multiple cultures, would better train it to deal with such "inconsistencies"?
    • "I see nobody on the road," said Alice.

      "I only wish I had such eyes," the King remarked in a fretful tone. "To be able to see Nobody! And at that distance, too! Why, it's as much as I can do to see real people, by this light!"

  • jane ate spaghetti with dick.

    Now .. what does that mean?

    • It isn't really that hard, though; it only needs a big enough database of idioms.

      Almost everything people think computers are bad at, they're bad at because nobody is doing the legwork; instead they're trying to cheat with big datacenters and "AI" algorithms. What they need is a big team of linguists to catalog more.

      Not only that, but a system that attempts to track state should be able to tell from the context that there would be other references to Jane's poverty or money problems. Otherwise, nobody cares if it get

  • This is old news. Terry Winograd (Larry Page's advisor) and Fernando Flores wrote a whole book about it in 1987: Understanding Computers and Cognition: A New Foundation for Design [amazon.com]

    Take a look at it and give yourself a treat.

    • There's also the apocryphal story in which one of the earliest Machine Translation systems translated English to Russian and, as an evaluation, translated the Russian result back into English. Input: "The spirit is willing, but the flesh is weak." (= Matthew 26:41) The supposed output: "The vodka is good, but the meat is rotten." Although this never happened, it does illustrate that ambiguity has been known as a problem for computers for many decades.

  • the fuck did I just read?

  • But context gets complicated, quickly. And when things get too complicated, managers and grad students retreat to their happy places. End of AI story.

    • If they'd just transfer the expert systems out of AI and into one of the programming majors then it would get done really quick.

  • fuck that shit. (Score:2, Insightful)

    by Gravis Zero ( 934156 )

    If that shit can't figure the fuck I mean, then shit's on them, so fuck 'em! Fuck those fucking fuckers because no fucks given for that shit. ;)

    • Parent's sentence is a far better example of something rife with potential for algorithmic ambiguities than that given in the article.
  • by Anonymous Coward

    I've always thought that if you know code and/or other artificial languages, the next real challenge is trying to use them to decipher human natural languages. A person learns really quickly about the huge number of strange quirks and mental leaps we humans have to make just to communicate somewhat rationally. When we actively choose to do so, anyway.

    There's even whole other worlds of artificially structured "semi-natural" languages used to define things further to prevent and/or promote misunderstanding. An incompl

  • Oh Reaaallly??

    Next, you're going to tell me they have trouble with sarcasm.

  • Let's eat, Grandma.
    Let's eat Grandma.

    Hmm...I think this is where things went wrong with Skynet, someone left out the comma!
    • Let's eat, Grandma.
      Let's eat Grandma.

      Hmm...I think this is where things went wrong with Skynet, someone left out the comma!

      A panda walks into a café. He orders a sandwich, eats it, then draws a gun and proceeds to fire it at the other patrons.

      "Why?" asks the confused, surviving waiter amidst the carnage, as the panda makes towards the exit. The panda produces a badly punctuated wildlife manual and tosses it over his shoulder.

      "Well, I'm a panda," he says. "Look it up."

      The waiter turns to the relevant entry in the manual and, sure enough, finds an explanation. "Panda. Large black-and-white bear-like mammal, native to China. Eats, shoots and leaves."

    • So does capitalization.
      I helped my uncle Jack off a horse.
      I helped my uncle jack off a horse.
    • Let's eat, Grandma.

      Let's eat Grandma

      Hmm...I think this is where things went wrong with Skynet, someone left out the comma!

      Could be worse. Have you ever helped your uncle Jack off a horse?

  • by kaur ( 1948056 ) on Thursday June 28, 2018 @10:06AM (#56859512)

    Ambiguity exists in all natural languages, and in many forms.

    • We overlap contexts to resolve ambiguities. One could think of English's object orientation like this (with comments to the right of each line):

      A dog sat in the yard. // instantiate class "dog"
      The dog's hair is brown. // assign the "hair" attribute of the last instantiated instance of the "dog" class
      A dog is running down the road. // instantiate another "dog"
      The dog is fast. // refers to the last dog instantiated
      The dog with brown hair is slow. // refers to the dog with the aforementioned brown-hair attribute
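
      A minimal runnable version of that idea (Python; the helper names and attributes are invented for the example) keeps a list of discourse entities and resolves "the dog" either by recency or by a stated attribute:

        dogs = []  # discourse entities, most recent last

        def instantiate(**attrs):
            dogs.append(dict(attrs))

        def resolve(**attrs):
            """Prefer an entity matching the given attributes, else the most recent one."""
            for dog in reversed(dogs):
                if all(dog.get(k) == v for k, v in attrs.items()):
                    return dog
            return dogs[-1]

        instantiate(location="yard")             # A dog sat in the yard.
        resolve()["hair"] = "brown"              # The dog's hair is brown.
        instantiate(location="road")             # A dog is running down the road.
        resolve()["speed"] = "fast"              # The dog is fast.
        resolve(hair="brown")["speed"] = "slow"  # The dog with brown hair is slow.
        print(dogs)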

    • by mjwx ( 966435 )

      Ambiguity exists in all natural languages, and in many forms.

      However, only English has refined it to weapons grade.

      When most English speakers can't handle words with esoteric or odd meanings, I don't expect AI to.

  • When humans find the contextual reference vague or ambiguous they usually query for more information. Can't machines do this as well?

    • Nobody is trying to program computers to do this stuff, so nobody is there to program them to do it.

      They just use AI algorithms that put code in a blender and test chunks to see if something didn't fail, and eventually they can "train" it to do some simple task. Except they're not "training" anything; they're just establishing the success conditions. So they have a hard time intentionally teaching it a nuance of a trick; they don't even know what tricks it is using!

      Eventually some humans will write some c

  • I couldn't have said it better myself! https://twitter.com/iamdevlope... [twitter.com]

  • This is known as word sense disambiguation; there are a number of ways to do it. Training systems for disambiguation is more resource-intensive, and in many use cases it provides little gain, so most don't bother.

    https://en.wikipedia.org/wiki/... [wikipedia.org]
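
    For anyone who wants to experiment, NLTK ships a baseline implementation of the Lesk algorithm for word sense disambiguation. A minimal sketch, assuming nltk and its wordnet and punkt data are installed (the sentence is made up, and Lesk is only a weak baseline, which is part of the point about limited gain):

      # pip install nltk; then nltk.download('wordnet') and nltk.download('punkt')
      from nltk.tokenize import word_tokenize
      from nltk.wsd import lesk

      tokens = word_tokenize("Jane deposited her paycheck at the bank by the river")

      # Lesk picks the WordNet synset whose gloss overlaps most with the context words.
      sense = lesk(tokens, "bank", pos="n")
      print(sense, "-", sense.definition() if sense else "no sense found")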

  • The main context is called "general", akin to "void main( argc, argv)" (in C/C++). Each context consists of a sequence of recognizers (patterns for recognizing user statements), organized from longest (number of pertinent words) to shortest. For example, in MorgaScript (I think (and hope) any programmer could figure out how this works by looking at it):

    Context "general"
    Synonyms yes: yep, yeah, sure
    Synonyms no: nope, nah, no way
    Group personName: "Jane", "Joe", "

  • In natural language processing, polysemous words in a dictionary are often distinguished with hashes. In the example above, "with" with respect to "with graham crackers" would have one hash, whereas "with" with respect to "with my friend" would have a different hash. With a well-developed NLP algorithm and a lot of training, discerning between the two different contexts is certainly doable.
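
    In code, the "one hash per sense" idea is just a sense-tagged lexicon: each reading of with gets its own key, and the disambiguator's only job is to map a surface token onto one of those keys. A minimal sketch (Python; the sense labels and the hard-coded choice are invented for illustration):

      # Invented sense inventory; a real one would come from annotated data.
      SENSES = {"with": ["with#instrument", "with#co-theme", "with#comitative"]}

      def tag(tokens, choose_sense):
          """Replace each ambiguous token with the sense id a disambiguator picked."""
          return [choose_sense(t) if t in SENSES else t for t in tokens]

      sentence = "Jane ate spaghetti with meatballs".split()
      # Pretend a trained model already chose the co-theme reading here.
      print(tag(sentence, lambda t: "with#co-theme"))
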
  • This was already known in the '60s, but that did not prevent the AI community from hyping things even more extravagantly than it does today. How much longer before the next AI Winter?
  • It's quite ironic that, in a story discussing how the word "with" can imply many different things, the summary contains a sentence that should have used the word "with" but didn't:

    That's from "S'MORES. A Real-Life One-Act Play", a conversation between Hamilton impresario Lin-Manuel Miranda which his young son Sebastian

    Surely the word "with" should have been used instead of "which" in that sentence.

  • When Kiddo was 2 or 3 years old he was using his plastic dinos to make dino tracks in Silly Putty. Eventually he made so many tracks that the Silly Putty was nothing but a bunch of marks. I wanted to make some fresh tracks with a different dino so I said

    "Here, let me make it smooth" and I rolled it between my two palms making it into a smooth ball and I showed it to him

    "Can I have the smooth?" he asked.

    I smiled because he took the word smooth to describe the shape of it, not the texture of it.

  • In the sentence Jane ate spaghetti with a fork, Mr. Computer Head should be able to figure out that the fork is a utensil, and not something that is eaten in addition to the spaghetti. Likewise, if the sentence is Jane ate spaghetti with meatballs, it should be obvious that meatballs are part of the dish, not an instrument for eating spaghetti.

    For instance, if you say Jane ate spaghetti with veramissimo, no one knows what you're talking about until you know whether that's an herb, an adjective, or a utensil.

  • Now I wonder how these algorithms deal with this sentence:
    A ship shipping ship, shipping shipping ships [fleetmon.com].
    • by skids ( 119237 )

      You can try it out on a few natural language sentence diagram applications e.g. http://www.link.cs.cmu.edu/cgi... [cmu.edu] (though that server seems to be unresponsive at the moment.)

      Here's one that hits many of the words listed here [wordnik.com].

      give or take point get set and mark go for good line plays make the dead run a light roll

  • I'll eat spaghetti with hot grits in my pants.

    Why should NLP remain sane?

  • "Use context? Sorry Dave, I don't know how."

  • I will leave you with some chicken https://www.youtube.com/watch?... [youtube.com].

  • Buffalo buffalo buffalo buffalo buffalo buffalo buffalo.

    Similar sentences exist in other languages.
    German: Die Männer, die vor dem Schokoladenladen Laden laden, laden Ladenmädchen zum Tanzen ein, meaning "The men, who loaded chests in front of the chocolate shop, asked shop girls for a dance".
    Or Wenn hinter Fliegen Fliegen fliegen, fliegen Fliegen Fliegen nach, meaning "When flies fly behind flies, flies fly after flies".
    Danish, Swedish, and Norwegian: Bar barbarbarbarbar bar bar barbarbarbarbar,
