
Microsoft's Analog Optical Computer Shows AI Promise (microsoft.com)

Four years ago a small Microsoft Research team started creating an analog optical computer. They used commercially available parts like sensors from smartphone cameras, optical lenses, and micro-LED lights finer than a human hair. "As the light passes through the sensor at different intensities, the analog optical computer can add and multiply numbers," explains a Microsoft blog post.
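
A minimal numpy sketch of that add-and-multiply principle, as an illustration of the description above rather than Microsoft's code (the encodings are assumptions):

```python
import numpy as np

# Hedged sketch of the principle described above, not Microsoft's code:
# a weight becomes an optical attenuation (0..1), multiplication happens
# as light passes through it, and addition happens when the attenuated
# beams land on the same sensor pixel and their intensities accumulate.

rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 8)           # inputs, encoded as micro-LED intensities
w = rng.uniform(0, 1, 8)           # weights, encoded as attenuation levels

transmitted = x * w                # multiply: light passing through an attenuator
pixel_reading = transmitted.sum()  # add: intensities accumulating on one pixel

print(pixel_reading, np.dot(x, w))  # the pixel reads out the dot product
```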

They envision the technology scaling to a computer that, for certain problems, is 100X faster and 100X more energy efficient — running AI workloads "with a fraction of the energy needed and at much greater speed than the GPUs running today's large language models." The results are described in a paper published in the scientific journal Nature. From the blog post:

At the same time, Microsoft is publicly sharing its "optimization solver" algorithm and the "digital twin" it developed so that researchers from other organizations can investigate this new computing paradigm and propose new problems to solve and new ways to solve them. Francesca Parmigiani, a Microsoft principal research manager who leads the team developing the AOC, explained that the digital twin is a computer-based model that mimics how the real analog optical computer [or "AOC"] behaves; it simulates the same inputs, processes and outputs, but in a digital environment — like a software version of the hardware. This allowed the Microsoft researchers and collaborators to solve optimization problems at a scale that would be useful in real situations. This digital twin will also allow other users to experiment with how problems, either in optimization or in AI, would be mapped and run on the analog optical computer hardware. "To have the kind of success we are dreaming about, we need other researchers to be experimenting and thinking about how this hardware can be used," Parmigiani said.

Hitesh Ballani, who directs research on future AI infrastructure at the Microsoft Research lab in Cambridge, U.K., said he believes the AOC could be a game changer. "We have actually delivered on the hard promise that it can make a big difference in two real-world problems in two domains, banking and healthcare," he said. Further, "we opened up a whole new application domain by showing that exactly the same hardware could serve AI models, too." In the healthcare example described in the Nature paper, the researchers used the digital twin to reconstruct MRI scans with a good degree of accuracy. The research indicates that the device could theoretically cut the time it takes to do those scans from 30 minutes to five. In the banking example, the AOC succeeded in resolving a complex optimization test case with a high degree of accuracy...

As researchers refine the AOC, adding more and more micro-LEDs, it could eventually have millions or even more than a billion weights. At the same time, it should get smaller and smaller as parts are miniaturized, researchers say.
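
For a sense of what a "digital twin" of analog hardware can look like, here is a hedged numpy sketch: a software stand-in for an analog dot-product unit with guessed imperfections. The noise level and ADC bit depth are placeholders, not figures from the paper:

```python
import numpy as np

# Illustrative "digital twin" of an analog dot-product unit, in the sense
# Parmigiani describes: same inputs and outputs as the hardware, modeled in
# software, here with guessed imperfections (noise, readout quantization).
# The noise level and bit depth are placeholders, not Microsoft's figures.

def aoc_twin(x, W, adc_bits=8, noise_std=0.01, seed=1):
    rng = np.random.default_rng(seed)
    y = W @ x                                    # ideal optical multiply-accumulate
    y = y + rng.normal(0.0, noise_std, y.shape)  # analog noise on the light path
    scale = (2 ** adc_bits - 1) / max(np.abs(y).max(), 1e-9)
    return np.round(y * scale) / scale           # quantized camera/ADC readout

rng = np.random.default_rng(2)
W = rng.uniform(0, 1, (3, 4))
x = rng.uniform(0, 1, 4)
print(aoc_twin(x, W))                            # compare with the exact W @ x
```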


Comments Filter:
  • But at least at a first read through TFA, it appears the titular optical computer doesn't actually exist yet - to this point, everything is happening on the "digital twin".

    • It took me a while to find it, but it looks like they have actually built something -> https://news.microsoft.com/sou... [microsoft.com]

      • by larryjoe ( 135075 ) on Sunday September 07, 2025 @10:52PM (#65645624)

        It took me a while to find it, but it looks like they have actually built something -> https://news.microsoft.com/sou... [microsoft.com]

        What they've built is an 8-variable optical computer. They're hoping to scale this up soon, but the amount of scaling isn't mentioned.

        Of course, this completely misses the key challenge of AI computing. The ALU/compute part is the easy part. It's a small part of the chip and it consumes a small part of the power. The key problem is data movement, particularly how to quickly and efficiently grab billions of variables from memory, send them to billions of compute units, then send those outputs to the next set of billions of compute units, and then back to memory. This is one of the reasons that GPU hardware has done so well in the AI space. Microsoft's optical computer, even if it's wildly successful, only addresses a small part of the challenge.
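
A back-of-envelope illustration of that data-movement point, with illustrative sizes rather than measurements of any real chip:

```python
# Arithmetic intensity of a matrix-vector product, the core operation of
# LLM inference at batch size 1. Sizes are illustrative, not measured.

n = 8192                    # one square fp16 weight matrix, n x n
flops = 2 * n * n           # a multiply and an add per weight
bytes_moved = 2 * n * n     # each 2-byte weight read from memory once

print(flops / bytes_moved)  # ~1.0 FLOP per byte fetched; GPUs can sustain
                            # far more, so the ALUs mostly wait on memory
```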

        • The link directly above says: "computation and storage happen at the same place in AIM, unlike binary computers, which need memory in one location and compute in another to function."

          The webpage this article links to says: "we rely on a physical system to embody the computations and step away from several fundamentally limiting aspects of digital computing - avoiding the separation of compute from memory..."

          I won't try to vouch for this technology or its scalability but it does sound like they are attacking data movement.

          • Re: (Score:3, Informative)

            by Anonymous Coward

            "computation and storage happen at the same place in AIM, unlike binary computers, which need memory in one location and compute in another to function."

            This is a limitation of the Von Neumann architecture, not due to it being "binary" per se, and is known as the Von Neumann bottleneck.
            Memory+compute chips certainly can be designed so that arithmetic digital logic is physically adjacent to the data it's operating on. The idea is called compute-in-memory (or in-memory computing) and it's an area of active research: link to a paper on the topic [arxiv.org]
            Of course, computing in memory requires a completely different way of thinking about software than the CPUs or GPUs that we know and love.
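
A toy sketch of the compute-in-memory idea, generic rather than tied to any chip in the linked literature: the operation is sent to the array holding the data, and only the reduced result moves:

```python
import numpy as np

# Toy compute-in-memory model (generic, not any particular design):
# operands stay resident in the array, the operation is issued *to* it,
# and only the small reduced result crosses the "bus".

class CIMArray:
    def __init__(self, data):
        self.data = np.asarray(data)  # values live in the array, never stream out

    def mac(self, vector):
        # multiply-accumulate happens where the data lives;
        # one number per row travels back instead of the whole matrix
        return self.data @ vector

mem = CIMArray(np.random.default_rng(3).uniform(-1, 1, (1024, 1024)))
print(mem.mac(np.ones(1024)).shape)  # ~1M operands stay put, 1024 results move
```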

            • by Rei ( 128717 )

              Of course, computing in memory requires a completely different way of thinking about software than the CPUs or GPUs that we know and love.

              Which is... what they're doing.

              This is an implementation of something I've been talking about for years. The brain is much more efficient than today's AI chips - not because "wetware" is efficient (it's terribly inefficient, with a huge amount of overhead and numerous chained loss steps), but because analog accumulation is efficient compared to vector math for AI tasks...
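
A generic sketch of what analog accumulation buys, using a resistive-crossbar model rather than Microsoft's optical design (the component values are arbitrary):

```python
import numpy as np

# Generic analog-accumulation sketch (resistive-crossbar style, not the
# optical version): Ohm's law does the multiplies (I = G * V) and
# Kirchhoff's current law does the adds, so an entire matrix-vector
# product costs one settling time instead of n^2 sequential MACs.

rng = np.random.default_rng(4)
G = rng.uniform(0, 1e-6, (256, 256))  # conductances encoding the weights
V = rng.uniform(0, 0.2, 256)          # input voltages encoding activations

I_out = G @ V                         # currents on each output line sum "for free"
print(I_out[:4])                      # every output appears in a single step
```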

              • Anyways, if a deep NN ever gets used very much, then even the training costs, large as they are, will be swamped by inference costs over the lifetime of the model. So making an untrainable analog model would be like burning it into an ASIC.

                Maybe you could put a few trainable digital layers sprinkled here and there to make it viable for a long time and still come out way ahead on inference costs over a pure digital model. "How to speak English" and "the appearance of objects in the world" and "history" don't change much.
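
The shape of that training-versus-inference argument in numbers, every one of them a made-up placeholder:

```python
# Back-of-envelope for the claim above; all numbers are hypothetical
# placeholders, not figures for any real model or deployment.

train_flops = 1e24                       # one-time training cost
flops_per_query = 5e13                   # per-request inference cost
queries_per_day = 1e9                    # assumed deployment load
lifetime_days = 730                      # two years in service

lifetime_inference = flops_per_query * queries_per_day * lifetime_days
print(lifetime_inference / train_flops)  # ~36x: lifetime inference swamps training
```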

              • by cstacy ( 534252 )

                You want the laws of physics to do the "math" for you. For the input field of a neuron in a DNN, instead of multiplying two vectors... flow paths... resistance... to the flow on that path (in the case of light, optical... transparency, respectively). You then need a physical nonlinear activation function (with a bias) based on how much flow accumulated... leaves that neuron to the next layer... It might require predictive coding networks... if we ended up switching to PCNs, as they have all sorts of great properties (including realtime learning).

                Blah blah blah blah.

                All I got was: Positronic Brain!

                (And possibly: "Resistance is Flow, not Futile".)

          • The link directly above says: "computation and storage happen at the same place in AIM, unlike binary computers, which need memory in one location and compute in another to function."

            The webpage this article links to says: "we rely on a physical system to embody the computations and step away from several fundamentally limiting aspects of digital computing - avoiding the separation of compute from memory..."

            I won't try to vouch for this technology or its scalability but it does sound like they are attacking data movement.

            This idea of doing compute at storage or memory is an old one. It works fairly well if the memory stream is serial and the serial streams are independent, and it fails completely for AI because the memory streams are basically everything to everything. The "easy" way to address this problem is a crossbar that allows all the data to flow anywhere, but, of course, that doesn't work because a crossbar of that scale is impractical in size, power, and efficiency. How to design this memory subsystem is the key challenge.
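
Quick arithmetic behind the crossbar objection:

```python
# Why a full crossbar doesn't scale: crosspoints grow with the square of
# the port count. Illustrative arithmetic only.

for ports in (1_000, 1_000_000, 1_000_000_000):
    print(f"{ports:>13,} ports -> {ports ** 2:.1e} crosspoints")
# a billion-port crossbar needs ~1e18 switches, which is why an
# everything-to-everything fabric can't simply be built
```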

        • They are exploring light for movement across *different* GPUs:

          https://tech.yahoo.com/ai/arti... [yahoo.com]

          I am actually convinced this optical path will turn out to be right. The status quo is that all the deterministic math you describe happens to produce an output vector, from which a result is randomly chosen. Why not use the soft stochastic qualities of analog light computing to get the same sculpted random result? Plus optical compute has neat features like free Fourier transforms for convnets and vision applications.
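
On the "free Fourier transforms" point: a lens optically Fourier-transforms the light field, and convolution is pointwise multiplication in the Fourier domain. A small numpy check of that identity on a toy 1-D signal (circular convolution):

```python
import numpy as np

# Convolution theorem, checked digitally: multiplying spectra is the
# same as (circularly) convolving the signals, which is what an optical
# Fourier stage gives a vision pipeline almost for free.

rng = np.random.default_rng(5)
signal = rng.uniform(0, 1, 64)
kernel = rng.uniform(0, 1, 64)

via_fft = np.real(np.fft.ifft(np.fft.fft(signal) * np.fft.fft(kernel)))
direct = np.array([sum(signal[j] * kernel[(i - j) % 64] for j in range(64))
                   for i in range(64)])
print(np.allclose(via_fft, direct))  # True: multiply in frequency == convolve
```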

          • by Rei ( 128717 )

            Why not use the soft stochastic qualities of analog light computing

            Yeah... at least during inference (training with this will be hard using traditional methods). Inference is extremely tolerant of noise. They should be able to use vanishingly short timesteps.

            • Noise tolerance in general struck me when I was playing with LLMs on my desktop. They would have these models pruned down with massively reduced bit depths, and they were almost as good as the originals. It hints at some kind of vast reducibility to me, though it's not clear in a discrete sense what it would be. But obviously the brain is analogue and noisy, so it seems promising.
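
A toy version of that observation: quantize a random linear layer's weights to a few bits and watch how the output error grows. This uses a random matrix rather than a real LLM, so it only illustrates the effect:

```python
import numpy as np

# Crush a random linear layer's weights down to a few bits and measure
# how much the output moves. Random weights, not a real model.

rng = np.random.default_rng(6)
W = rng.normal(0, 1, (512, 512))
x = rng.normal(0, 1, 512)

def quantize(w, bits):
    scale = (2 ** (bits - 1) - 1) / np.abs(w).max()
    return np.round(w * scale) / scale

for bits in (8, 6, 4):
    err = np.linalg.norm((quantize(W, bits) - W) @ x) / np.linalg.norm(W @ x)
    print(bits, f"{err:.1%}")  # watch the output error grow as bit depth shrinks
```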

        • by ceoyoyo ( 59147 ) on Monday September 08, 2025 @12:56AM (#65645702)

          This isn't the first optical computer, nor is it the first one to implement a neural network. There isn't an "ALU" or anything like that.

          The easiest way to do it is to take a piece of glass and etch a pattern into it so that when you shine light through it, it implements the neural net. For example: https://opg.optica.org/prj/ful... [optica.org]

          Those approaches generally have the issue that optics are linear. The real magic of neural networks is that they can solve nonlinear problems, but only if they incorporate nonlinearity. You can do that with special optics, but it's not easy or easily controllable.

          You can also make hybrid systems with active optical components. Glancing at Microsoft's paper, that seems to be more like what they're doing. You use microLEDs to emit light, liquid crystal arrays to manipulate it, and a camera or array of photocells to convert it to electrical signals. You can then use some simple electronics to do things like rectify the signal, then feed that to the next layer of microLEDs.
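
One possible reading of that hybrid pipeline as a numpy sketch (an assumption-laden illustration, not Microsoft's actual design):

```python
import numpy as np

# Sketch of the hybrid scheme just described, under my own assumptions:
# micro-LEDs emit nonnegative intensities, the optical stage is purely
# linear, and a cheap electronic rectification between stages supplies
# the nonlinearity that keeps stacked layers from collapsing.

rng = np.random.default_rng(7)
W1 = rng.uniform(0, 1, (16, 64))   # liquid-crystal transmission masks (0..1)
W2 = rng.uniform(0, 1, (10, 16))

def optical_layer(x, W, threshold=4.0):
    intensity = W @ x                            # linear optics: modulate, sum on sensor
    return np.maximum(intensity - threshold, 0)  # electronic rectifier for the next LEDs

x = rng.uniform(0, 1, 64)                        # input pattern on the micro-LED array
print(optical_layer(optical_layer(x, W1), W2))
# drop the rectifier and W2 @ (W1 @ x) == (W2 @ W1) @ x: one linear map,
# which is exactly the linearity problem raised above
```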

    • or, analog twin. What they're building sounds like an analog computer. Electronic analog computers are pretty much nonexistent by that name these days, but they used to be used for, as in this case, near-instantaneous solutions to differential equations and other specific problems. Analog control systems exist, which are similar, but with different applications.
  • by Mr. Dollar Ton ( 5495648 ) on Sunday September 07, 2025 @10:14PM (#65645578)

    Well, we did get the miracle of the opamp out of it once, maybe we'll get something useful out of this as well.

  • or at least it used to be...

  • by Rosco P. Coltrane ( 209368 ) on Sunday September 07, 2025 @11:48PM (#65645646)

    What next? An AI-powered Microsoft mouse pad? An AI-powered Microsoft toilet roll holder?

    Look, we know you sunk five kajillion dollars in AI and you ain't got nothing to show for it. Quit ramming it down everybody's throats already!

    • Tell me about it. AI is the new Crypto. All marketing and no value.

    • by cstacy ( 534252 )

      What next? An AI-powered Microsoft mouse pad? An AI-powered Microsoft toilet roll holder?

      Look, we know you sunk five kajillion dollars in AI and you ain't got nothing to show for it.

      I have made some funny audio tracks using AI generated celebrity/politician voices. Surely that was worth the trillion zillion dollars?

  • The snarkiest computer ever [youtube.com]

  • Like social media, hard to put the cat back into the bag!
    • by cstacy ( 534252 )

      Like social media, hard to put the cat back into the bag!

      I'm still trying to catch up on comprehending the technology of 25 years ago! Self-Driving Virtual Reality Quantum cats that you don't even know if they're alive or dead in that bag that you 3-D printed on the Blockchain IoT Cloud.
