Hardware Science

MIT Creates Chip to Model Synapses

Posted by Unknown Lamer
from the man-is-obsolete dept.
MrSeb writes with this excerpt from an Extreme Tech article: "With 400 transistors and standard CMOS manufacturing techniques, a group of MIT researchers have created the first computer chip that mimics the analog, ion-based communication in a synapse between two neurons. Scientists and engineers have tried to fashion brain-like neural networks before, but transistor-transistor logic is fundamentally digital — and the brain is completely analog. Neurons do not suddenly flip from '0' to '1' — they can occupy an almost-infinite scale of analog, in-between values. You can approximate the analog function of synapses by using fuzzy logic (and by ladling on more processors), but that approach only goes so far. MIT's chip is dedicated to modeling every biological caveat in a single synapse. 'We now have a way to capture each and every ionic process that's going on in a neuron,' says Chi-Sang Poon, an MIT researcher who worked on the project. The next step? Scaling up the number of synapses and building specific parts of the brain, such as our visual processing or motor control systems. The long-term goal would be to provide bionic components that augment or replace parts of the human physiology, perhaps in blind or crippled people — and, of course, artificial intelligence. With current state-of-the-art technology it takes hours or days to simulate a simple brain circuit. With MIT's brain chip, the simulation is faster than the biological system itself."
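The summary's distinction between digital spiking and continuous ion-channel dynamics is usually captured in software with conductance models. A minimal sketch of one common form, the alpha-function synaptic conductance (function name and parameter values are illustrative, not taken from the MIT paper):

```python
import math

def alpha_synapse(t_spike, t, g_max=1.0, tau=5.0):
    """Alpha-function synaptic conductance (arbitrary units).

    After a presynaptic spike at t_spike, the conductance rises
    smoothly, peaks at g_max exactly tau ms later, then decays --
    a continuous, analog waveform rather than a 0/1 flip.
    """
    if t < t_spike:
        return 0.0  # no conductance before the spike arrives
    x = (t - t_spike) / tau
    return g_max * x * math.exp(1.0 - x)
```

For example, `alpha_synapse(0.0, 5.0)` returns the peak value `1.0` (t equals t_spike + tau), while times before the spike return `0.0`. The MIT chip implements this kind of graded, time-continuous behavior directly in analog transistor circuits instead of computing it numerically.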
  • by agrif (960591) on Wednesday November 16, 2011 @07:30AM (#38072166) Homepage

    "Your species is obsolete," the ghost comments smugly. "Inappropriately adapted to artificial realities. Poorly optimized circuitry, excessively complex low-bandwidth sensors, messily global variables..."

    Accelerando [jus.uio.no], by Charles Stross

  • by leptogenesis (1305483) on Wednesday November 16, 2011 @08:44AM (#38072644)
    Mod parent up. The linked article and the MIT press release are misleading. The closest thing I can find to a peer-reviewed publication by Poon is an abstract, here (no, I can't find anything through the official EMBC channels--what a disgustingly closed conference):

    https://embs.papercept.net/conferences/scripts/abstract.pl?ConfID=14&Number=2328 [papercept.net]

    And there's some background on Poon's goals here:

    http://www.frontiersin.org/Journal/FullText.aspx?ART_DOI=10.3389/fnins.2011.00108&name=neuromorphic_engineering [frontiersin.org]

    The goals seem to me to be about studying specific theories about information propagation across synapses as well as studying brain-computer interfaces. They never mention building a model of the entire visual system or any serious artificial intelligence. We have only the vaguest theories about how the visual system works beyond V1, and essentially no idea what properties of the synapse are important to make it happen.

    About two years ago, while I was still doing my undergraduate research in neural modeling, I recall that the particular theory they're talking about--spike-timing dependent plasticity [wikipedia.org]--was quite controversial. It might have been simply an artifact of the way the NMDA receptor worked. Nobody seemed to have any cohesive theory for why it would lead to intelligence or learning, other than vague references to the well-established Hebb rule.

    Nor is it anything new. Remember this [slashdot.org] story from ages ago? Remember how well that returned on its promises of creating a real brain? That was spike-timing dependent plasticity as well, and unsurprisingly it never did anything resembling thought.

    Slashdot, can we please stop posting stories about people trying to make brains on chips and post stories about real AI research?
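The spike-timing dependent plasticity rule the comment above refers to is typically written as a pair of exponentials: a synapse strengthens when the presynaptic spike precedes the postsynaptic one, and weakens otherwise. A minimal sketch (amplitudes and time constants are textbook-style placeholders, not values from any of the linked work):

```python
import math

def stdp_dw(dt_ms, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Weight change for a spike pair, dt_ms = t_post - t_pre.

    Pre-before-post (dt > 0) potentiates; post-before-pre
    depresses. Both effects decay exponentially with the
    interval, so nearly coincident spikes matter most.
    """
    if dt_ms > 0:
        return a_plus * math.exp(-dt_ms / tau_plus)   # potentiation
    return -a_minus * math.exp(dt_ms / tau_minus)     # depression
```

The rule itself is simple; the controversy the comment describes is about whether this curve reflects a genuine learning mechanism or is just a byproduct of NMDA receptor kinetics, and the formula says nothing about why it would produce learning.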
