AI Hardware Technology

MIT's Tiny Artificial Brain Chip Could Bring Supercomputer Smarts To Mobile Devices (techcrunch.com) 15

An anonymous reader quotes a report from TechCrunch: Researchers at MIT have published a paper describing a new type of artificial brain synapse that outperforms existing versions and can be combined by the tens of thousands on a chip physically smaller than a single piece of confetti. The results could help create devices that handle complex AI computing locally, while remaining small and power-efficient, without having to connect to a data center. The research team created what are known as "memristors" -- essentially simulated brain synapses built from silicon, together with alloys of silver and copper. The result was a chip that could effectively "remember" and recall images in very high detail, repeatedly, with much crisper and more detailed "remembered" images than in earlier types of simulated brain circuits. What the team ultimately wants to do is recreate the large, complex artificial neural networks that currently live in software and require significant GPU computing power to run -- but as dedicated hardware, so they can be localized in small devices, including potentially your phone, or a camera.
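
The appeal of moving this into hardware comes down to one operation: a grid ("crossbar") of memristors computes a neural network layer's matrix-vector multiply in a single analog step, with Ohm's law doing the multiplications and Kirchhoff's current law doing the sums. Below is a rough numpy sketch of that idea; the differential weight-mapping scheme, the conductance range, and every number are illustrative assumptions, not details from the MIT paper.

    import numpy as np

    rng = np.random.default_rng(0)

    weights = rng.uniform(-1.0, 1.0, size=(4, 8))   # one layer's weights
    g_max = 1e-4                                    # max conductance (S), assumed

    # Map signed weights onto pairs of positive conductances
    # (a common differential scheme; real designs vary).
    g_pos = g_max * np.clip(weights, 0.0, None)
    g_neg = g_max * np.clip(-weights, 0.0, None)

    v_in = rng.uniform(0.0, 0.2, size=8)            # input voltages (V)

    # Row currents: I[i] = sum_j G[i, j] * V[j] -- the whole
    # multiply-accumulate happens "for free" in the wiring.
    i_out = g_pos @ v_in - g_neg @ v_in

    print(i_out / g_max)        # analog result, in weight units
    print(weights @ v_in)       # digital reference: matches

Because the weights are stored exactly where the computation happens, there is no memory-to-processor traffic, which is where the power savings over a GPU come from.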

Unlike traditional transistors, which can switch between only two states (0 or 1) and which form the basis of modern computers, memristors offer a gradient of values, much more like your brain, the original analog computer. They can also "remember" these states, reproducing the same signal for the same received current many times over. What the researchers did here was borrow a concept from metallurgy: when metallurgists want to change the properties of a metal, they combine it with another metal that has the desired property, creating an alloy. Similarly, the researchers found an element they could combine with the silver used as the memristor's positive electrode, to make it transfer ions consistently and reliably along even a very thin conduction channel. That's what enabled the team to create very small chips containing tens of thousands of memristors that not only reliably recreate images from "memory," but also perform inference tasks on command -- sharpening the detail of the original image, or blurring it -- better than previous memristors created by other scientists.
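
For the curious, here is a minimal simulation of the classic linear ion-drift memristor model (Strukov et al., 2008) -- a textbook abstraction, not the MIT silver-copper device -- showing the two behaviors described above: a continuum of resistance states, and retention of the last state with no power applied. All parameter values are assumed, typical textbook numbers.

    R_ON, R_OFF = 100.0, 16e3   # bounding resistances (ohms), assumed
    D = 10e-9                   # device thickness (m), assumed
    MU_V = 1e-14                # dopant mobility (m^2/(V*s)), assumed
    DT = 1e-5                   # integration step (s)

    w = 0.1 * D                 # internal state: doped-region width

    def resistance(w):
        # Resistance varies continuously between R_ON and R_OFF with w.
        return R_ON * (w / D) + R_OFF * (1 - w / D)

    def apply(v, seconds, w):
        # Ion drift: current through the device moves the doped boundary.
        for _ in range(int(seconds / DT)):
            i = v / resistance(w)
            w = min(max(w + MU_V * (R_ON / D) * i * DT, 0.0), D)
        return w

    print(f"start:             {resistance(w):8.0f} ohms")
    w = apply(+1.0, 0.70, w)
    print(f"after +1 V write:  {resistance(w):8.0f} ohms")
    w = apply(0.0, 0.70, w)
    print(f"after 0 V rest:    {resistance(w):8.0f} ohms (state retained)")
    w = apply(-1.0, 0.03, w)
    print(f"after -1 V pulse:  {resistance(w):8.0f} ohms (an in-between state)")

Shorter or weaker pulses leave the device anywhere along the R_ON..R_OFF continuum; that analog "gradient of values" is what lets a single memristor stand in for a synapse's adjustable weight.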

  • by goombah99 ( 560566 ) on Monday June 08, 2020 @07:36PM (#60161902)

    No point in talking about this rubbish, extrapolative press release. Let's discuss the weather instead.

  • by im_thatoneguy ( 819432 ) on Monday June 08, 2020 @07:37PM (#60161904)

    Essentially this will be like a dedicated ASIC vs. an FPGA. It's always far more efficient to have dedicated circuits than general-purpose programmable chips.

    But these are probably much less useful at this exact moment, since the neural networks we might want to bake into hardware are still very much in development.

    Although certainly we'll reach a point with something like voice recognition where the network is accurate enough and the utility broad enough that it would be fine for a new phone to include a 99%-accurate voice recognition network baked into the device and accessible to the SDK. We already see this sort of thing in image processing units, which use computational photography to remove lens distortion and chromatic aberration -- tasks that all photography apps benefit from and that don't really need updates later.

  • "The images are manifest to man, but the light in them remains concealed in the image of the light of the father. He will become manifest, but his image will remain concealed by his light."

    Application can be a challenge, though. An image interpreting images about images. I'll go... conservative with this buy.

  • I for one welcome Commander Data and his Cylon army.

  • There's been talk about memristors and their potential impact on computing for a very long time (just use the Slashdot search for examples), but we don't seem to have seen much success. HP have been hyping them over many years to no real effect, but from the linked MIT press release and the Nature Nanotechnology paper at https://www.nature.com/article... [nature.com] (as opposed to the fluff piece quoted) it looks like MIT may have a more viable mechanism.

    • I'm pretty sure I heard about memristors back in the 70s and how they were going to change everything. Heard about 'em again in the 80s, and the 90s too. Still ain't seen shit.

      Maybe their time has come but I'll need to see more than a freshly-printed research paper before I get my first memristor tattoo.

    • by timholman ( 71886 ) on Monday June 08, 2020 @10:37PM (#60162338)

      HP have been hyping them over many years to no real effect, but from the linked MIT press release and the Nature Nanotechnology paper at https://www.nature.com/article [nature.com]... (as opposed to the fluff piece quoted) it looks like MIT may have a more viable mechanism.

      I did some consulting with HP in Boise back in September 2001, which I remember quite vividly because I was stranded there for several days after the 9/11 attack. At that time they were working on their memristor technology to create nonvolatile memory, and hired me because of my expertise in noise measurements.

      As it turned out, the memristors were very 1/f-noisy, making the logic "0" or "1" impossible to determine reliably in a compact memory cell design. And flicker noise, unlike white Gaussian noise, cannot be averaged out: lengthening the measurement only pulls in more low-frequency noise power, so the average never settles (the little simulation at the end of this comment illustrates the effect). All I could do was provide some recommendations on reducing 1/f noise in the sense amps, but it wasn't enough to solve the problem. With CMOS memory technology continuing to scale, their memristor memory soon fell by the wayside.

      Has MIT licked the flicker noise problem? If so, their technology might find some applications.
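
      (A quick illustration of the flicker-noise point above, with synthetic 1/f noise made by spectrally shaping white noise -- my toy numbers, not HP's data. Averaging helps white noise roughly as 1/sqrt(N) but barely dents 1/f noise.)

        import numpy as np

        rng = np.random.default_rng(42)
        n, trials = 2**14, 200

        def one_over_f(n, rng):
            # Shape white noise so power ~ 1/f (amplitude ~ f**-0.5).
            spectrum = np.fft.rfft(rng.standard_normal(n))
            f = np.fft.rfftfreq(n)
            spectrum[0] = 0.0          # drop DC
            f[0] = 1.0                 # dummy value; the DC bin is zeroed anyway
            x = np.fft.irfft(spectrum / np.sqrt(f), n)
            return x / x.std()         # normalize to unit variance

        for label, gen in [("white", lambda: rng.standard_normal(n)),
                           ("1/f  ", lambda: one_over_f(n, rng))]:
            for m in (2**6, 2**9, 2**12):
                means = [gen()[:m].mean() for _ in range(trials)]
                print(f"{label} N={m:5d}: std of mean = {np.std(means):.4f}")

        # White noise: the spread of the average shrinks ~1/sqrt(N).
        # 1/f noise: it barely moves -- a longer measurement keeps pulling
        # in more low-frequency power, so you cannot average it away.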

  • by Baron_Yam ( 643147 ) on Monday June 08, 2020 @09:36PM (#60162196)

    AI is the most spectacular application, giving the ability to create artificial synapses in hardware rather than software. You'd think that's the big deal here, but it turns out it isn't.

    You can arrange multiple memristors to behave like a standard transistor... only one that remembers its state without power, and the combined memristors are smaller than a transistor (see the logic sketch at the end of this comment).

    Years ago -- I believe it was IBM -- there was talk of creating computers where short-term volatile storage, long-term stable storage, and the CPU are all the same thing. A computer that turns on and off instantly, because it doesn't need to write state information anywhere to survive the power turning off -- it just goes in and out of standby on command.

    So as much as I love the AI potential, I know we haven't figured out how those systems need to be organized to work at large scale yet. We do know how to make a computer, though, so I think a new generation of otherwise standard computers is the first thing to hold your breath for.
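
    (The logic sketch mentioned above: one published route is "stateful" material-implication logic, IMPLY (Borghetti et al., Nature 2010). This toy model reduces each memristor to its stored bit; it builds NAND -- and NAND is functionally complete -- from an erase plus two IMPLY pulses, with the result persisting as a resistance state, i.e. with the power off.)

      def imply(p: int, q: int) -> int:
          # One IMPLY pulse across two memristors: the q device is
          # switched ON unless p is ON and q is OFF (truth table of p -> q).
          return 1 if (p == 0 or q == 1) else 0

      def nand(p: int, q: int) -> int:
          # Three devices: inputs p, q, and a working memristor s.
          s = 0               # FALSE pulse: erase the working device
          s = imply(p, s)     # s becomes NOT p
          s = imply(q, s)     # s becomes (NOT q) OR (NOT p) = NAND(p, q)
          return s            # the answer is now *stored* in s's resistance

      for p in (0, 1):
          for q in (0, 1):
              print(f"NAND({p}, {q}) = {nand(p, q)}")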

  • Tens of thousands of synapses are not good enough for AI; in the brain, a single neuron has around 10,000 connections.

    • Also, real brain neurons have *vastly* more features. Like actually growing and changing with their input, forever. Like field effects, using chemicals that spread through the brain. Like many different neurotransmitters, signal patterns, and types! Like directionality / direction-dependent sensitivity!

      Not just a ... matrix of weights.

    • by HiThere ( 15173 )

      You have a restricted view of what constitutes AI. Most AI isn't intended to converse about Plato while driving a car. For many purposes tens of neurons are sufficient, let alone tens of thousands. This won't get you an AGI, but it might get you a plug that recognizes what signal coding it's receiving (a toy sketch follows below).
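
      (The toy sketch: a numpy network with eight hidden neurons -- well under "tens of thousands" -- learns to tell NRZ from Manchester line coding. The task and every number here are my illustration, not anything from the article.)

        import numpy as np

        rng = np.random.default_rng(1)
        BITS, SPB = 16, 2                  # bits per example, samples per bit

        def encode(bits, manchester):
            if manchester:                 # 1 -> high,low ; 0 -> low,high
                return np.concatenate([[b, 1 - b] for b in bits]).astype(float)
            return np.repeat(bits, SPB).astype(float)   # NRZ: hold the level

        def batch(n):
            y = rng.integers(0, 2, n)
            x = np.stack([encode(rng.integers(0, 2, BITS), c) for c in y])
            return x, y.astype(float)

        # One hidden layer of 8 tanh neurons, trained by plain gradient descent.
        W1 = rng.normal(0, 0.5, (BITS * SPB, 8)); b1 = np.zeros(8)
        W2 = rng.normal(0, 0.5, (8, 1));          b2 = np.zeros(1)
        sigmoid = lambda z: 1 / (1 + np.exp(-z))

        for _ in range(3000):
            x, y = batch(64)
            h = np.tanh(x @ W1 + b1)
            p = sigmoid(h @ W2 + b2).ravel()
            g = (p - y)[:, None] / len(y)      # d(cross-entropy)/d(logit)
            gh = (g @ W2.T) * (1 - h**2)       # backprop through tanh
            W2 -= 0.5 * (h.T @ g);  b2 -= 0.5 * g.sum(0)
            W1 -= 0.5 * (x.T @ gh); b1 -= 0.5 * gh.sum(0)

        x, y = batch(1000)
        p = sigmoid(np.tanh(x @ W1 + b1) @ W2 + b2).ravel()
        print("held-out accuracy:", ((p > 0.5) == (y > 0.5)).mean())

      Telling the two codings apart only requires noticing whether the two samples within a bit slot agree (NRZ) or disagree (Manchester), which a handful of neurons can learn.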

"It is easier to fight for principles than to live up to them." -- Alfred Adler

Working...