
BrainPort Lets the Blind "See" With Their Tongues 131

Hugh Pickens writes "Scientific American reports that a new device called 'BrainPort' aims to restore the experience of vision for the blind and visually impaired by relying on the nerves on the tongue's surface to send light signals to the brain. BrainPort collects visual data through a small digital video camera and converts the signal into electrical pulses sent to the tongue via a 'lollipop' that sits directly on the tongue, where densely packed nerves receive the incoming electrical signals. White pixels yield a strong electrical pulse and the electrodes spatially correlate with the pixels, so that if the camera detects light fixtures in the middle of a dark hallway, electrical stimulations will occur along the center of the tongue. Within 15 minutes of using the device, blind people can begin interpreting spatial information. 'At first, I was amazed at what the device could do,' says research director William Seiple. 'One guy started to cry when he saw his first letter.'" There is some indication that the signals from the tongue are processed by the visual cortex. The company developing the BrainPort will submit it to the FDA for approval later this month, and it could be on sale (for around $10,000) by the end of the year.
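The summary describes a simple spatial mapping: downsample the camera frame to the electrode grid, with brighter pixels producing stronger pulses. A minimal sketch of that mapping (the real grid size and firmware aren't given here; a 20x20 grid is an assumption for illustration):

```python
# Hedged sketch, not BrainPort's actual firmware: map a grayscale camera
# frame onto a small electrode grid, where brighter (whiter) pixels yield
# stronger pulse intensities, as described in the summary.
import numpy as np

GRID = 20  # assumed electrode grid size; the real array size isn't stated


def frame_to_pulses(frame: np.ndarray, grid: int = GRID) -> np.ndarray:
    """Downsample a 2-D grayscale frame (0-255) to a grid of pulse
    intensities in [0, 1] by block-averaging."""
    h, w = frame.shape
    bh, bw = h // grid, w // grid
    # Crop so the frame divides evenly, then average each block.
    blocks = frame[:bh * grid, :bw * grid].reshape(grid, bh, grid, bw)
    return blocks.mean(axis=(1, 3)) / 255.0


# The hallway example: a bright vertical stripe down the middle of a dark
# frame should stimulate the center columns of the grid.
frame = np.zeros((200, 200), dtype=np.uint8)
frame[:, 95:105] = 255
pulses = frame_to_pulses(frame)
```

Here the stripe straddles two grid columns, so the two center columns each receive half-strength pulses while the edges stay at zero.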
This discussion has been archived. No new comments can be posted.

Comments Filter:
  • Hey Sexy! (Score:5, Funny)

    by Profane MuthaFucka ( 574406 ) <busheatskok@gmail.com> on Saturday August 22, 2009 @09:47PM (#29160243) Homepage Journal

    Oooh hello pretty lady, come on over here and let me get a good lick at you.

  • by Scott64 ( 1181495 ) on Saturday August 22, 2009 @09:50PM (#29160255)
    "Taste the rainbow" was the first thing that came to mind when I read this.
  • Does that mean ugly girls "taste like shit"?

  • Nothing new here... (Score:4, Informative)

    by imikedaman ( 1268650 ) on Saturday August 22, 2009 @09:51PM (#29160267)
    I'm pretty sure I read about this exact thing years ago. Weren't there issues with the tongue being "low resolution" and interfering with eating and talking?
  • by curmudgeon99 ( 1040054 ) on Saturday August 22, 2009 @10:03PM (#29160331)
    The human brain is adept at processing pattern streams. These are two-dimensional datasets that change over regular intervals of time. In the specific case of this tongue-sight project, they are taking advantage of the ability of the tongue to transmit many "pixels" of sensory information in a square grid. Which pins were activated against the tongue governs what the brain receives at each instant. So, by reading the changing pattern of the dots, the brain can learn to process that pattern stream in the same way it learns to process the pattern stream that is the million or so "pixels" of information each eye sends, each unit of time. The left brain hemisphere processes linear-sequential information. The right brain hemisphere processes visual-simultaneous information. We know that from the Nobel-prize-winning [1980] research of Dr. Roger Sperry. Current computers process information in a linear, sequential fashion--much like the left hemisphere works. The true breakthroughs in AI will come when we can process and interpret the pattern streams that reach the right hemisphere, the image-oriented streams. The complex interplay between the faster linear-sequential hemisphere and the holistic visual-simultaneous hemisphere is what creates consciousness. This tongue-stream is a great idea.
    • complex interplay...is what creates consciousness Isn't that like saying that a "complex interplay" between a male and a female "creates" life? In other words, "we are still in the stone-age folks, sorry...but hey, we got a complex interplay here! We know that for sure."
    • Re: (Score:3, Insightful)

      by tolan-b ( 230077 )

      You were doing so well for about half a post then it all went to shit :)

      The order in which data is input into a computer makes no difference, processing is typically done frame by frame for a visual dataset, so each cycle (not CPU..) of processing acts on the whole 2D structure in one go.

      Besides, hardware based artificial neural networks for processing images process 2D pixel arrays in parallel.

      Also I think you're miles off when it comes to consciousness. Despite many claims I don't think anyone's really an

    • The complex interplay between the faster linear-sequential hemisphere and the holistic visual-simultaneous hemisphere is what creates consciousness.

      This is a long jump to a conclusion. It's much more plausible that there is simply some "program" somewhere that fools us into believing that we are conscious. A much more complex version of:

      10 PRINT "I AM CONSCIOUS"
      20 GOTO 10

      • I don't yet know what it's going to take but I mean I think it will require science to replicate the functions of both hemispheres: the one that processes language and the other that processes images. I think consciousness is a function of both styles feeding each other and memory.
  • Colours (Score:2, Funny)

    by Anonymous Coward

    So what does blue taste like?

  • Sure, the resolution won't be as fine but it will be a lot less obtrusive to wear a sensor wrapped around your torso than to have something on your tongue with a wire sticking out of your mouth.

    A practical version of that sensor net the blind lady wore on Star Trek back in the '60s will likely be on the market before 2067, assuming technology doesn't leapfrog it entirely.

    • by johncadengo ( 940343 ) on Saturday August 22, 2009 @10:21PM (#29160419) Homepage

      Sure, the resolution won't be as fine but it will be a lot less obtrusive to wear a sensor wrapped around your torso than to have something on your tongue with a wire sticking out of your mouth.

      A practical version of that sensor net the blind lady wore on Star Trek back in the '60s will likely be on the market before 2067, assuming technology doesn't leapfrog it entirely.

      From TFA:

      The key to the device may be its utilization of the tongue, which seems to be an ideal organ for sensing electrical current. Saliva there functions as a good conductor, Seiple said. Also it might help that the tongue's nerve fibers are densely packaged and that these fibers are closer to the tongue's surface relative to other touch organs. (The surfaces of fingers, for example, are covered with a layer of dead cells called stratum corneum.)

      • by HiThere ( 15173 )

        The first version of this I ever heard of was worn on the back. It worked. So the explanation in the article is either incorrect or misleading.

        My suspicion is that the thing that's made this possible is the recent improvements in camera technology.

        • The first version of this I ever heard of was worn on the back. It worked. So the explanation in the article is either incorrect or misleading.

          My suspicion is that the thing that's made this possible is the recent improvements in camera technology.

          Rather than explaining away the article and summary as incorrect or misleading, we could consider the possibility that it simply works better on the tongue. Perhaps that's why this is being considered a breakthrough. Not improvements in camera technology, or the desire for doctors to stick lollipops (read this as you will) on the tongues of patients, but the discovery that the tongue is particularly good as sensory input for what is becoming pseudo-sight.

          • by HiThere ( 15173 )

            That the tongue is better has been known for a long time; for example, sensory endings are more densely clustered there. But the people wearing the back stimulator processed it as visual imagery, so the explanation that the tongue is uniquely suited because of neural mapping is either misleading or incorrect. The brain is flexible enough to adapt multiple parts of the body as visual substitutes. And, for an example of the opposite, there are reports that people can learn to hear the visual printout of a processed mic

    • It's been done. This device appears to stem directly from Paul Bach-y-Rita's experiments as early as 1972 in providing blind people with vision via a video camera connected to a grid (16x16 or 20x20) of tappers on the back or belly of the subject, who quite quickly learned to interpret those signals in a manner that appeared to be similar to vision. I can only assume the tongue method is better, since that's what they've moved on to now.
    • The big problem with the back or the chest is that the voltage required to breach your skin and contact your nerves is also the amount of voltage that fries those nerves. The tongue has all those nerves right there with minimal protection.
  • I don't know if I'm the only one who's absolutely amazed by this.
    • by fuzzyfuzzyfungus ( 1223518 ) on Saturday August 22, 2009 @10:35PM (#29160479) Journal
      The hardware seems like a fairly pedestrian evolution of cheap image sensors and high-density fabrication techniques.

      The fact that the brain will, fairly swiftly, begin interpreting electrical pulses on the tongue as visual input blows my insufficiently capacious mind.
      • by Mal-2 ( 675116 )

        The fact that the brain will, fairly swiftly, begin interpreting electrical pulses on the tongue as visual input blows my insufficiently capacious mind.

        I don't really see why. Put a (clean) marble in your mouth without looking at it, and you will probably visualize its spherical shape even though you have obtained all your high-resolution sensation through your tongue. Stumble around in the dark and grab something off your table. Even though you can't see it, chances are that as soon as you pick it up you

      • by TheLink ( 130905 )
        > The fact that the brain will, fairly swiftly, begin interpreting electrical pulses on the tongue as visual input blows my insufficiently capacious mind.

        It shouldn't if you've played that game where people write letters on your back (or hand or elsewhere) and you are supposed to "read" them by touch alone.

        Anyway, add:
        1) higher resolution
        2) alternate input channels (tongue is rather inconvenient)
        3) output (there's tech that allows humans and other animals to control stuff just by thinking)
        4) wireless/wir
      • by Ant P. ( 974313 )

        How is that amazing? It's just a higher-resolution version of Braille.

    • by tuxicle ( 996538 )
      You, me, and the guy who cried after being able to read again (from TFA)
  • Seeing Sound? (Score:5, Interesting)

    by Tablizer ( 95088 ) on Saturday August 22, 2009 @10:22PM (#29160425) Journal

    About 20 years ago I thought of a device for deaf people to "see sound" after reading that researchers had learned to read spoken words from gray-scale sound spectrograms (frequency plots).

    Now an off-the-shelf PDA or iPhone could probably do the trick of showing a plot with the right software. Some slashdot readers claimed it's too hard to learn if you never heard sound before. But it may be worth a try. Besides, some deaf people used to hear before an injury or illness. It's basically pattern-recognition, something humans are pretty good at given sufficient feedback.

    Perhaps these devices can be combined and the frequency plots could flow through the tongue. However, I suspect there's insufficient resolution that way, and eyeballing it would be better. But, it's worth a try.

    • Re: (Score:1, Interesting)

      by Anonymous Coward

      There are many people with sensory disorders that can see sound. Most of them don't like it but they're usually not deaf.

      • Yeah, but they're not actually observing the world with their cross-sensory perceptions. Just because you pop chocolate in your mouth and taste blue, does not mean the chocolate is blue colored.
    • Re:Seeing Sound? (Score:4, Interesting)

      by ozydingo ( 922211 ) on Sunday August 23, 2009 @12:01AM (#29160951)
      An iPhone or PDA easily has enough computing power to do a real-time spectrogram, but to be nitpicky, it's a time-frequency plot, not just a frequency plot. In my experience it's pretty hard to pick up the ability to read spectrograms of speech accurately and quickly, then again it's not my only access to speech. At the very least it would increase a deaf user's awareness of sound in his or her environment, and there would be at least a minimum level of discrimination between various types of sounds.

      As for alternate modes of sensation (assuming something like a cochlear implant is a no-go), look into some of the work being done in vibro-tactile devices - http://techtv.mit.edu/genres/18-education/videos/3557-speakers-and-signers-ted-moallem-sensible-technologies----sensory-communication-aids-for-the-developing-world [mit.edu]
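The parent's point is that the plot in question is a time-frequency spectrogram, not a single frequency snapshot. A minimal sketch of how such a plot is computed, via a short-time Fourier transform (window and hop sizes are illustrative choices, not from the comment):

```python
# Hedged sketch of the time-frequency plot described above: a magnitude
# spectrogram built from windowed FFT frames, as a PDA or phone could
# plot in real time. Window/hop sizes are illustrative assumptions.
import numpy as np


def spectrogram(signal: np.ndarray, win: int = 256, hop: int = 128) -> np.ndarray:
    """Return a (frames x frequency-bins) magnitude spectrogram."""
    window = np.hanning(win)
    n_frames = 1 + (len(signal) - win) // hop
    frames = np.stack(
        [signal[i * hop : i * hop + win] * window for i in range(n_frames)]
    )
    # One rfft per frame; each row is the spectrum at one time slice.
    return np.abs(np.fft.rfft(frames, axis=1))


# One second of a 440 Hz tone at 8 kHz: every frame should peak near
# bin 440 / 8000 * 256 = 14.
sr = 8000
t = np.arange(sr) / sr
spec = spectrogram(np.sin(2 * np.pi * 440 * t))
peak_bin = int(spec[0].argmax())
```

Rendering `spec` as a gray-scale image, with time on one axis and frequency on the other, gives exactly the kind of display the comment describes reading speech from.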
    • My hunch - which I am unqualified to back up - would be that the brain might link the spectral plot to frequencies if it always occurred in the same area. eg, 440Hz always triggers the same nerve, and a chord always triggers the same set of nerves. The PDA could be anywhere in your field of vision, so the brain would have to interpret the images through visual processing, which is not well suited to handling sound. Perhaps it would be possible by holding the PDA at a fixed distance and fixating on a dot

      • by Tablizer ( 95088 )

        would be that the brain might link the spectral plot to frequencies if it always occurred in the same area. eg, 440Hz always triggers the same nerve, and a chord always triggers the same set of nerves.

        I would note that even relatively-simple simulated neural nets can adjust for relative position of general test images, and thus don't require an absolute position to trigger the same response.

        Perhaps low frequencies all the way down an arm, and then bands of high frequencies along the fingers

        That's a very in

  • Who'll be the first to choke to death on Goatse?

  • ... to dark chocolate.

    Now they must build magnetic orientation belts, infrared patches and smelly radiation detectors.
  • Wonderful! (Score:3, Informative)

    by Zitchas ( 713512 ) on Saturday August 22, 2009 @11:12PM (#29160685) Journal
    I don't have the link ready to hand, but the technology behind this was posted to Slashdot quite a while ago. (At least many months, possibly over a year ago) Anyway, I was wondering when we would hear about this technology again, since it has tremendous potential both for sight-restoration applications, as well as further development towards the integration of machine and brains. If the resolution was high enough, for instance, a pilot could use this to see underneath the plane, or in other directions normally blocked. The potential application for guided search and rescue, and other remote controlled devices is also large. "Being" there is better than simply seeing on a screen, after all, even if virtually. I hope that the various gov't and non-profit groups that support the visually impaired take note of this as a way to help people become active and contributing parts of society again. It's nice to take care of the impaired, but better to help them regain their independence.
    • In 2006 & 2007 I was working at the American company that was working on this. I was working on a different project so I don't know the exact details, but I know they were using the BrainPort to let blind people see things (sending electrical spikes onto the tongue with a resolution of 64 points on the tongue). But it was actually funded by the military to allow hi-tech soldiers to get extra information through their tongue.

      The demo they were working on at the time was to allow the soldier to get an
    • If the resolution was high enough, for instance, a pilot could use this to see underneath the plane, or in other directions normally blocked.

      Do you mean a blind pilot? Wouldn't he just use a normal camera?

  • This is the most promising bit of cybernetics news I have seen in quite a while. I've been hoping that some day within my lifespan artificial senses could be used. Well, now it looks like they can. Maybe they make for low-resolution video, maybe they can be used for information readout. Yeah, it would look weird, but this can give you (for example) a heads-up-display-style readout that doesn't interfere with your vision. Or an interface for processing senses from remotely controlled robots. Imagine the fun bus

    • I'd prefer they don't. Greater bandwidth doesn't mean greater processing capability.

    • Reading your email while driving without hitting anything? Won't someone think of the children?

    • by Zitchas ( 713512 )
      Exactly. The possibilities of this are endless, and it could be a major step down the path towards true cybernetic integration (or the Mind/Machine Interface, as some think of it). Starting out with applying it to the blind and otherwise visually impaired serves two important points:

      1) Public perception. There is bound to eventually be an outcry in some sectors about the sanctity of human beings and how machines shouldn't be wired into people and vice versa, machines reading our minds, etc. If the technolog

  • http://www.eyecandycan.com/ [eyecandycan.com]

    Not actually for sale yet so who knows, but I'd love to give it a try.
  • by smcdow ( 114828 ) on Sunday August 23, 2009 @02:05AM (#29161551) Homepage

    You can read all about the work leading up to this device, why it works, amazing stories of recovery from brain injury, and other cool stuff in a book called The Brain That Changes Itself [amazon.com].

    This is one of the best books I've ever read.

  • Would this still work for them? Seems like it would not.

  • I'll get one of these, mount the camera part on the back of my bike helmet, and be able to "see" both forward and backwards at the same time!

    I can finally get rid of that helmet-mounted mirror, so I won't look like a dork anymore!

    • by bi_boy ( 630968 )

      I like those helmet mounted mirrors, I see them on bicyclists sometimes when they ride by the house. Makes me think of some cool futuristic personal HUD.

      ...Which I guess is still dorky. But it's a good dorky, 'cos it's cool.

  • Danger... Hot Food (Score:4, Interesting)

    by fractalVisionz ( 989785 ) on Sunday August 23, 2009 @02:32AM (#29161683) Homepage
    What happens when you burn your tongue? Does your "sight" degrade or get blurry while your taste buds are being repaired?
  • Memory wipe (Score:3, Funny)

    by tuxicle ( 996538 ) on Sunday August 23, 2009 @03:14AM (#29161891)
    Fry: Did everything just taste purple for a second?
  • The insides of the cheeks are not as sensitive but the available area is larger, binocular vision might be possible, and it might be possible to leave the electrodes in while talking and perhaps even while eating.

    (My wife's idea, not mine.)

  • by Mprx ( 82435 ) on Sunday August 23, 2009 @08:10AM (#29162953)

    Tactile reaction time is faster than visual reaction time. If the resolution is high enough and the switching time fast enough, could this system be advantageous where fast reactions are needed (eg. games, sport, driving, combat, etc.). Could it be combined with normal vision for a kind of minor precognition?

    How about using it for extended vision with more frequency channels, wider or narrower field of vision, faster automatic brightness control, etc? Touch has multiple channels but how many are high enough resolution to be useful?

    As anyone who's used psychedelic drugs will know, the human visual system is bottlenecked by the eyes. The brain can certainly handle more powerful sensors so we should be working on making them.

    • Comment removed based on user account deletion
      • by Mprx ( 82435 )

        With reaction time for triggering a predetermined behavior (clicking a button, starting to run, catching a falling object, etc) in response to a predetermined stimulus, latency will be lowest with tactile stimulus.

        I meant extending vision into UV/IR, or just improving color vision.

        Pupil size adjustment acts like "dynamic contrast" on LCDs, it's not real dynamic range, and it's definitely too slow. Right now imitating the eye with a camera aperture is probably the best option, but maybe in the future someth

        • Comment removed based on user account deletion
          • by Mprx ( 82435 )

            Looking for references, tactile reaction time seems to vary depending on location and nature of the stimulus. It may sometimes be slower than visual reaction time.

            IR vision wouldn't be annoying if it were on a separate channel to normal color vision. A few humans are suspected of having tetrachromatic vision, so the brain probably supports more color channels.

    • I remember reading about this tech a long time ago and it was suggested that this sort of thing would be useful for pilots. In a modern fighter jet, there are several displays in front of the pilot, each one having lots of information behind menus. The pilot's eyes are already saturated with information, and the audio channel is used up by human-to-human communication and critical warnings. So if we want to get more information into the pilot, we could try to make use of other senses. A "tongue interfac
      • by Mprx ( 82435 )
        It could be that it's only processed by the visual cortex in blind people.
  • If you coated a retainer with the visual surface, it would then attach to the teeth, and you could see by pressing the tongue to the roof of the mouth... or drop the tongue to speak/drink (not sure about eating, do you normally take out the retainer?)
  • The inverse (Score:2, Interesting)

    by MathiasRav ( 1210872 )
    I'd love to be able to see tastes - that is, have my sense of taste piped to the brain as vision instead of taste. I wouldn't want my vision permanently replaced, but I'd love to experience the brain visualising what I taste. And hey, instead of the usual 3 dimensions [wikipedia.org] (or more if you're lucky [wikipedia.org]), you'd have 5 factors [wikipedia.org] to go with.
  • This almost sounds like an induced form of synesthesia [wikipedia.org], a condition where someone's senses operate involuntarily as a merged experience. For example, sounds that generate visual feedback or brief changes in taste.

  • What I want to know is, will this work for a sighted person? If the tongue seer is looking at one scene, and your eyes are looking at another, can the brain sort them out and allow a person to see two different things at once. Meaningfully, I mean. If so, there could be a lot of applications. And implications. If the brain can see in two directions at once, we could extend our visual senses in all kinds of ways.
  • What happens when you feed the poor guy alphabits?
