Microsoft Hardware

Microsoft's Light-Based, Transistor-less Computer Solves Complex Optimization Problems at the Speed of Light (techspot.com) 65

"Picture a world where computing is not limited by the binary confines of zeros and ones, but instead, is free to explore the vast possibilities of continuous value data." That's Microsoft's research blog, describing its newly-developed Analog Iterative Machine, an analog optical computer designed for solving difficult optimization problems.

"For a multidisciplinary group of researchers at the Microsoft Research Lab in Cambridge, U.K., the mission was to build a new kind of computer that would transcend the limitations of the binary systems," says a Microsoft blog post.

Neowin describes it as a computer "that uses photons and electrons, rather than transistors, to process data." Light "passes through several layers, making impressions on each part of what's known as a 'modular array'," writes PC Gamer. "It's this process of projecting light through the array that replaces the function of a standard transistor."

Microsoft says it can "solve practical problems at the speed of light." And "it's already shown potential for surpassing state-of-the-art digital (silicon-based) technology," adds TechSpot, "or even the most powerful quantum computers being designed right now." The AIM machine is built using commodity opto-electronic technologies that are low-cost and scalable, Microsoft says, and is based on an "asynchronous data flow architecture" which doesn't require data exchange between storage units and "compute locations."

AIM isn't designed for general-purpose computing tasks, though. The analog optical computer is useful for solving difficult "optimization problems" like the well-known travelling salesman riddle, Microsoft says, which are at the heart of many math-intensive industries including finance, logistics, transportation, energy, healthcare, and manufacturing. When it comes to crunching all the possible combinations of an exponentially growing problem, traditional digital computers struggle to provide a solution in a "timely, energy-efficient and cost-effective manner."
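To make that combinatorial blow-up concrete, here is a tiny brute-force sketch in Python (the city coordinates are made up, and this is not AIM's method, just plain exhaustive search): the number of tours to check grows factorially with the number of cities.

from itertools import permutations
from math import dist, factorial

# Hypothetical city coordinates, purely for illustration.
cities = [(0, 0), (2, 1), (5, 3), (1, 4), (4, 0), (3, 5)]

def tour_length(order):
    # Total length of the closed tour visiting cities in the given order.
    return sum(dist(cities[a], cities[b])
               for a, b in zip(order, order[1:] + order[:1]))

best = min(permutations(range(len(cities))), key=tour_length)
print("best tour:", best, "length:", round(tour_length(best), 2))

# The same exhaustive approach for 20 cities would need ~6e16 tour checks.
print("distinct tours for 20 cities:", factorial(19) // 2)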

AIM was conceived to address two simultaneous trends, Microsoft explains, which are sidestepping the unraveling of Moore's Law and overcoming the limitations of specialized machines designed for solving optimization problems... AIM works at the speed of light, and it seemingly provides a 100x increase in performance compared to the most advanced digital approaches available today. For now, AIM is still a research project with limited access for potential customers. The machine, however, is already being tested by UK financial company Barclays, which is using it to track transactions of money into stock purchases.

Microsoft says it's now releasing its "AIM simulator as a service, allowing selected users to get first-hand experience. The initial users are the team's collaborators at Princeton University and at Cambridge University."
  • speed of light (Score:4, Insightful)

    by test321 ( 8891681 ) on Sunday July 02, 2023 @11:40AM (#63651046)

    Electron-based computers also work at the speed of light (at the speed of EM waves in metal). All current computers work at the speed of light, except if someone builds one based on water and valves, in which case it works at the speed of sound (in water).

    • Re:speed of light (Score:5, Informative)

      by Teresita ( 982888 ) <badinage1@netzero [dot] net> on Sunday July 02, 2023 @11:57AM (#63651076) Homepage
      Computers work at the speed of their clock and their fetch-interpret-execute cycle. Not even the traces between transistors propagate signals at the speed of light. For one thing, there is a little bit of capacitance: if you change voltages on a signal path, you have to charge it up or discharge it. So there are allowances for when the signal path has a valid signal. In fine, this article is written for the kind of people who think the Marvel Multiverse is a thing.
      • by HiThere ( 15173 )

        The bit about "clock cycles" is good, and also applies to many photonic computer designs, so the bit about capacitance is irrelevant. OTOH, this sounds like some sort of advanced analog computer.

        FWIW, there's nothing sacred about bits. At one point there was a 10-way switch that was the primary element of one particular computer. It was still a digital computer, because it didn't work with "a continuous spectrum". It turned out to be uneconomic to design things that way, though.

        Here the PR talks about "

        • Re:speed of light (Score:5, Informative)

          by bobby ( 109046 ) on Sunday July 02, 2023 @08:37PM (#63652102)

          the bit about capacitance is irrelevant.

          As an EE I politely disagree. As you know, binary digital computers are based on base 2 math, so you have 0 or 1. Zeros and ones are represented by defined voltage levels. To transition from a zero to a one takes time, and the more capacitance a circuit has, the more time it takes, which means slower overall speeds.

          Transistors, the basic core parts of digital logic circuits, have inherent capacitance.

          To be clear, "capacitance" means the ability to store electrical charge- to literally contain electrons.

          The math (simple calculus) for a capacitor is:

          i = C dv/dt

          "i" = electrical current.
          "C" = capacitance (measured in farads, or micro-, nano-, picofarads, etc.)
          "dv" is calculus notation for change in voltage.
          "dt" is calculus notation for change in time.

          It all means that the more capacitance you have, the more electric current (amps) is required to charge or discharge the inherent capacitances if you want to achieve shorter (quicker) transitions from 0 to 1 or 1 to 0, and faster overall logic (clock) speeds.

          It's why many CPUs, GPUs, etc., require so many amps, and so much cooling, as they've been getting faster and faster over the past 40 or so years.

          At some point, limitations and compromises in circuit designs, like heat buildup, make it more and more difficult to achieve faster and faster clock speeds.

          Smaller feature sizes (7nm for example) means the transistors and other circuits in a "chip" are smaller so there's less inherent capacitance to slow the circuits down.
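To put rough numbers on the i = C dv/dt point above, here is a quick Python back-of-the-envelope; every value is an assumed, order-of-magnitude figure, not a spec for any real chip.

# i = C * dv/dt for one switching gate, with assumed round numbers.
C = 1e-15        # ~1 fF of gate plus local wiring capacitance
V = 1.0          # 1 V logic swing
dt = 100e-12     # 100 ps transition time

i_peak = C * V / dt
print(f"peak current per gate: {i_peak * 1e6:.0f} uA")   # ~10 uA

# Averaged over a whole chip: N gates, activity factor a, clock f.
N, a, f = 1e9, 0.1, 3e9
P_dynamic = a * N * C * V**2 * f      # classic a*C*V^2*f dynamic power
I_supply = P_dynamic / V
print(f"dynamic power ~{P_dynamic:.0f} W, supply current ~{I_supply:.0f} A")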

          • by HiThere ( 15173 )

            Yes, but it's the clock cycle that determines the speed. True, the limit to the clock cycle is set by other things (including capacitance), but the clock cycle dominates.

            Also, you are only considering one variety of digital computer. I specifically mentioned an early one that used base 10 rather than base 2. And it was still a digital computer. (IIRC one of the reasons it was impractical was that splitting the signal into 10 levels had too high an error rate, but we could probably do it now if there we

            • by Shaitan ( 22585 )

              "The basis of digital is particular discrete values of whatever enumeration."

              Indeed.

            • I thought you were a nut with no idea.

              Then I understood what you were saying.

              I'm the nut with no idea.

            • Whoever wrote the PR report did fine. It's quite common for computer scientists, especially theory-leaning academics, to use the word "binary" loosely to refer to any small-base discrete encoding, since the difference between using a base 2 and a base k encoding, for any fixed non-huge k, almost never has significant theoretical repercussions.
              • by HiThere ( 15173 )

                The PR blurb, as quoted in the summary, talked about "continuous values". This is fine if they meant an analog computer, but not if they meant a digital one, whatever the base.

                And I'd be quite leery of trusting anything claimed by an academic theorist who misused binary to include ternary, quaternary, etc. without LOTS of surrounding explanation of why they were doing that.

      • Re:speed of light (Score:4, Insightful)

        by pz ( 113803 ) on Sunday July 02, 2023 @07:58PM (#63652032) Journal

        Not even the traces between transistors propagate signals at the speed of light.

        That statement contains a common fallacy. You need to include a very, very important qualifier for the statement to be correct: "in a vacuum." The signals in a computer do, very much, travel at the speed of light in that particular medium. The capacitance and inductance of a trace, for example, establish the speed of light in that medium. And, indeed, the signal flows at that speed.

        No, this is not being pedantic. When you design circuits, you very, very much are worried about the speed of propagation. You design traces as waveguides. You worry about reflections. You adjust parameters to set the speed of light in that medium.
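A small sketch of that point, with representative (assumed) numbers for an FR-4 microstrip and a polyethylene coax; the propagation speed is the speed of light in that medium, set by the line's inductance and capacitance per unit length.

import math

c = 299_792_458.0  # speed of light in vacuum, m/s

# PCB trace as a transmission line: v = 1 / sqrt(L' * C').
# Per-unit-length values are assumed, typical-looking FR-4 microstrip numbers.
L_per_m = 330e-9   # H/m
C_per_m = 100e-12  # F/m
v_trace = 1.0 / math.sqrt(L_per_m * C_per_m)
print(f"trace: {v_trace / c:.2f} c")          # roughly 0.6 c

# Coax velocity factor from the dielectric constant: VF = 1 / sqrt(er).
er_pe = 2.25       # typical solid-polyethylene coax dielectric
print(f"coax velocity factor: {1 / math.sqrt(er_pe):.2f} c")   # ~0.67 c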

        • by bobby ( 109046 )

          You probably know, but for others' sake: one of the many electrical characteristic specs for coaxial cable, which is typically used for antennas and other RF applications, is "velocity factor," expressed as a percentage of the speed of light and often in the 60-90% range.

          https://www.protoexpress.com/blog/measuring-pcb-cable-and-interconnect-impedance-dielectric-constant-velocity-factor-and-lengths/#:~:text=Coaxial%20cable%20specifications%20often%20include,permittivity%20of%20the%20transmission%20line. [protoexpress.com]

          P

          • by pz ( 113803 )

            Point is, electricity does not travel at the speed of light in wires.

            Again, this is the fallacy. Electricity does travel at the speed of light, always, in wires, in vacuum, etc. It's just that the speed of light depends on the medium, and in wires, the speed of light is slower than in vacuum because of the cable characteristics. That's always true, no matter the medium.

            For example, the speed of light inside the sun is incredibly slow. It takes a photon 20,000 years to get from the core to the photosphere. It is impossible to travel faster. That *is* the speed of light.

            • by bobby ( 109046 )

              You are correct, of course. As an engineer, I find theory and science fun, but I deal more with the practical. Molecular physics is great, but I care more about the propagation time of a pulse going through a wire.

              Some may find it interesting: older oscilloscopes used a length of coax cable to delay the signal during its travel to the display (CRT tube). This was to allow the horizontal display motion to start before the vertical, so that you could see the beginning of a signal / pulse.

            • by ceoyoyo ( 59147 )

              Photons take so long to get out of the sun because they're continually absorbed and re-emitted. There's a separate effect where photons slow down in transparent media like glass and water, which does not involve absorption. Neither of those effects changes c, the actual speed limit. Solar neutrinos make the trip from the core of the sun to the surface in a couple of seconds, very nearly c. Cherenkov radiation is caused by charged particles exceeding the speed of photons in a medium.
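For what it's worth, the thousands-of-years escape time comes from a random-walk estimate, t ~ R^2 / (l * c); here is a sketch with the photon mean free path l as an assumed round value (real values vary strongly with depth, which is why published figures range from thousands to over a hundred thousand years).

# Back-of-the-envelope random-walk estimate of photon escape time from the core.
c = 3.0e8     # m/s
R = 7.0e8     # solar radius, m
l = 1e-3      # assumed photon mean free path, ~1 mm

t_seconds = R**2 / (l * c)
print(f"~{t_seconds / 3.15e7:.0f} years to random-walk out")  # order 1e4 years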

              • by bobby ( 109046 )

                Very interesting. Do neutrinos pass through stars?

                • by ceoyoyo ( 59147 )

                  Of course. Observations of solar neutrinos are one of the main ways we know the details of what fusion reactions are happening in the sun. The "solar neutrino problem" was also one of our first hints that neutrinos have mass.

                  • by bobby ( 109046 )

                    Thanks. I was trying to ask, do neutrinos that come from somewhere else completely pass through a star?

                    • by ceoyoyo ( 59147 )

                      Sure. The ones generated in the core of the sun pass through half of it with almost no absorption, so similar energy neutrinos from elsewhere will also pass through. The famous statement is that a neutrino can pass through a light year of lead without being absorbed. The sun is much, much smaller than a light year, and is a lot less dense than lead. It's also much lower Z.

                      That's true for typical fusion neutrinos. Very high energy neutrinos won't pass through the sun, or Earth for that matter.

      • this article is written for the kind of people who think the Marvel Multiverse is a thing

        Exchuuse me sir, but I can assure you the marvel franchise owns both.

    • Re:speed of light (Score:5, Informative)

      by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Sunday July 02, 2023 @11:59AM (#63651082) Homepage Journal

      All current computers work at the speed of electrons being transmitted through a wire, which is slower than the speed of light in a vacuum (which is what's usually meant when the speed of light is brought up.) Optical processors could theoretically contain vacuum pipes through which signals were sent. I doubt it could be worth it since the switching speed is still a limitation, but the actual idea of optical processors is that they can/will be able to switch faster than ones made out of transistors.

      • Re:speed of light (Score:5, Insightful)

        by test321 ( 8891681 ) on Sunday July 02, 2023 @01:13PM (#63651218)

        slower than the speed of light in a vacuum (which is what's usually meant when the speed of light is brought up.)

        You're right, but their computer uses light in optical fiber, which is also light in a medium (refraction index 1.52 for optical fiber in the IR range). I think the "speed of light" is marketspeak; it is totally uninvolved in the speed of this computer, not even the speed of the optical switches. What they claim is to have built an analogue computer to solve the travelling salesman problem, which means what they probably do is propagate and interfere optical paths: possibly an infinite number of paths that eliminate the exponential nature of the problem, rather than ones that try them out faster. This could in principle also be implemented with electrons, but the coherence length of electron waves (in known or usable materials) is much shorter than for photons.

        • by Shaitan ( 22585 )

          "I think the "speed of light" is marketspeak"

          Indeed and it is technically the speed of light because it is optical. The light in my office also operates at the speed of light no matter how slow the switch is. Saying it works at the speed of light really tells you nothing.

      • by ceoyoyo ( 59147 )

        Electronic computers work at the speed of their clocks. Their signal delay, which does have an effect on how fast their clocks can run, is determined by the speed of electrical signals in wires. This is essentially the speed of light. The speed of the electrons is essentially zero.

    • The trick is to build a computer that computes everything everywhere all at once.
      • It is very feasible to do an analog optical multiplication of all points on a 2D surface concurrently. How?
        1. Project (or mask) a UV image onto one surface of a cold thin sheet of photochromic glass. It will darken according to the image.
        2. Project (or mask) a different UV image onto the other surface of a cold thin sheet of photochromic glass. It will darken according to the image.
        3. To multiply the two images, shine a collimated beam of visible light through the glass. Alternatively, for a 2D Fourier transform p
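A rough digital analogy of that concurrent, pointwise multiplication (NumPy arrays stand in for the two darkened photochromic surfaces; the patterns are random test data, not real images):

import numpy as np

rng = np.random.default_rng(0)
# Transmittance masks in [0, 1], one per glass surface (made-up test images).
mask_a = rng.random((512, 512))
mask_b = rng.random((512, 512))

# Collimated light passing through both surfaces is attenuated by the product
# of the transmittances at every point "at once"; digitally that is just an
# elementwise multiply over the whole array.
product_image = mask_a * mask_b
print(product_image.shape, float(product_image.max()))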
    • Capacitance and resistance seem to be what 'slows down' current computers. The gate on every transistor can be modeled as a capacitor, and every wire has resistance.
      • In modern high speed PC boards, the traces are laid out as transmission lines, where the distributed capacitance and inductance combine to allow waves to travel at a speed related to the dielectric constant of the base material. This is usually >0.5c, but depends on the material. There is also delay in the devices themselves. For chip-to-chip connections, the propagation delay in the PCB usually dominates; for on-chip, it's usually device delay.
    • Electron-based computers also work at the speed of light

      Quite a bit slower, actually.
      But then again, light going through a fiber goes quite a bit slower than the speed of light (in a vacuum) as well.

      All current computers work at the speed of light

      No.
      In fact none has ever, nor ever will.

      Though this is pretty close.

  • by gweihir ( 88907 ) on Sunday July 02, 2023 @11:46AM (#63651052)

    And no, it does not. All switches, including optical ones, have delays. And if you do away with them, you end up with what is essentially a pre-computed _table_. Still useful, but the concept of "computing speed" becomes meaningless.

    Also, what is it with the stories that hype Microsoft recently? They are still the same soulless evil company that is pushing 2nd- and 3rd-rate products on a lot of people and that grew large on criminal and immoral business practices, but most definitely not on good engineering. In fact, their engineering has gotten worse over the past few years.

    • The bigger nonsense is that all they have right now is a simulated whatever-this-thing-is just like all those super-duper simulated quantum computers that solve world hunger and provide world peace.

      All these stories with big hype. But all they are at with development is where conventional binary computers were long ago when it was discovered a metal wire can be used to move electricity. Ok maybe they are at the 'we figured out the transistor' stage.
      • by gweihir ( 88907 )

        Indeed. And it seems unlikely they will ever really get any further. QCs will die from complexity long before they can do anything useful. Same for this thing here.

        I am beginning to think there is a connection between "big hype" and the "Big Lie" approach.

      • by ceoyoyo ( 59147 )

        It's not simulated. There's a picture in the article of one of the researchers with the device. You can build something similar yourself with an old LCD screen, a lamp and a camera.

    • If you read their paper it will help you understand how their system works and where it has applications. Here are a couple key points:
      1-" First, the data transfer between digital and analog domains is fully eliminated until convergence to a solution is achieved."

      2-" Second, there is no separation between compute and memory since the variables are computed as light traverses through the optical modulation matrix"

      Their system is specifically designed to solve optimization problems like traveling salesp
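To give a flavor of what "no separation between compute and memory, variables computed as the light traverses the modulation matrix" can look like in the abstract, here is a purely digital, hypothetical sketch: a projected-gradient fixed-point loop over a small quadratic objective. It is not Microsoft's actual AIM algorithm, and W and b are random toy data.

import numpy as np

rng = np.random.default_rng(42)
n = 16
A = rng.standard_normal((n, n))
W = A @ A.T                      # symmetric "modulation matrix" (toy data)
b = rng.standard_normal(n)

x = rng.random(n)                # continuous-valued state, like an analog signal
eta = 0.01                       # step size
for _ in range(2000):
    grad = W @ x + b             # one "pass through the matrix"
    x = np.clip(x - eta * grad, 0.0, 1.0)   # saturation / box constraint

# Objective being minimized: f(x) = 0.5 * x'Wx + b'x over the box [0, 1]^n.
print("objective:", 0.5 * x @ W @ x + b @ x)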
  • Microsoft says it can "solve practical problems at the speed of light." Microsoft spews meaningless marketing gibberish.
  • by groobly ( 6155920 ) on Sunday July 02, 2023 @11:58AM (#63651078)

    This is just a press release. Optical computing has had active research for over 25 years that I know about personally, and probably much longer. The primary advantage as I understand it is that crossing light beams do not interact with each other, making interconnections simpler. But, there are so many other problems that it's never made it out of the lab for those 25 years.

  • Would reduce the possibility of bit flips https://radiolab.org/podcast/b... [radiolab.org]
  • A quote from the MS site explaining the engine: "As light travels incredibly fast – 5 nanoseconds per meter – each iteration within the AIM computer is significantly faster and consumes less electricity than running the same algorithm on a digital computer. "

    Light travels a meter in 3 nanoseconds in vacuum, so this is poorly worded at best. However signal speed through copper is 2/3c, or roughly 5 ns to go a meter. So this number is kind of whatever?

    Is there some way to interpret this as an impo
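For reference, the nanoseconds-per-meter numbers being debated above, computed with typical (assumed) refractive-index and velocity-factor values:

c = 299_792_458.0   # m/s

# Time to cover one meter in a few media.
for name, factor in [("vacuum", 1.0),
                     ("optical fiber (n ~ 1.5)", 1 / 1.5),
                     ("copper trace (~0.66 c)", 0.66)]:
    ns_per_m = 1e9 / (c * factor)
    print(f"{name}: {ns_per_m:.1f} ns/m")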

    • by ceoyoyo ( 59147 )

      It's a wrong explanation.

      There's been a little surge in papers on these things lately. The idea is that you can take a beam of light and run it through a spatial light modulator (optical modulation matrix), measure the result, and the effect is doing some computation. It's not a new idea.

      If you set up the modulator just right, you can do useful computations. It's very special purpose though. The modulator is an LCD or some equivalent, so you can't just switch it infinitely fast. If your computation works wi
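A digital sketch of the beam-through-a-modulator idea: each SLM pixel attenuates the input beam by a weight and a detector integrates the result, i.e. it measures a dot product; a bank of masks gives a matrix-vector product. All values below are random toy data, not any real optical setup.

import numpy as np

rng = np.random.default_rng(1)
beam = rng.random(1024)          # input intensity pattern
weights = rng.random(1024)       # SLM transmittance pattern, in [0, 1]

detector_reading = np.sum(beam * weights)   # a dot product, up to a scale factor

# One mask per detector row turns this into a matrix-vector product.
W = rng.random((8, 1024))
y = W @ beam
print(detector_reading, y.shape)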

  • by davidwr ( 791652 ) on Sunday July 02, 2023 @12:46PM (#63651174) Homepage Journal

    "Picture a world where computing is not limited by the binary confines of zeros and ones, but instead, is free to explore the vast possibilities of continuous value data."

    History [wikipedia.org] is calling, they want their analog computers [wikipedia.org] back.

    • Yup. Analogue computers definitely have their place and have been unreasonably ignored in the last 50 years in certain problem spaces. However if you need repeatable accuracy then they're no use.

    • As a Freshman in 1981 I took an intro to Comp Sci class. Among other blunders, the professor claimed that the difference between analog and digital computers was that the former were mechanical, while the latter were electrical. (As a ham radio operator, I was familiar with the Heathkit electronic analog computer available at the time.) I canceled the class, switched majors to archaeology, and never looked back.
    • History is calling, they want their analog computers back.

      Analog computers are amazing and can easily obliterate even our best and most efficient digital computers in terms of performance and power consumption.

      As Moore's law cools off and interest in domains with basically infinite computing requirements expands, analog isn't going anywhere.

    • Exactly. Many of the other responders are so stuck in the digital world they're bringing "switching" into every comment. This processor is massively parallel analog where the functions are expressed with optical components. Such computers calculate at the speed of light through the prisms and other components. The speed limitations are found in injecting the data, interpreting the answer, and the biggest one, adjusting the optical path to solve a new problem. They have been in use for very specialized real-ti
      • by ceoyoyo ( 59147 )

        The speed limitations are found in setting the thing up to do a calculation. For many analog computers that means building one.

        This one is a little better, you can use a computer to program the LCDs. Which means it's limited by... the switching speed of your LCDs and the digital computer running them.

    • by jasnw ( 1913892 )
      I remember building the GE kit mentioned in the wikipedia article in my 5th grade class. My first contact with computing.
  • I always knew they'd make a comeback! Now I gotta go edit all my custom away messages!
  • barkley: this science station just isnt fast enough were gonna lose the array. computer transfer data to holodeck 1. barkley: computer activate neural interface. computer: unable to comply. device not in data banks. barkley: no problem this is how you make one.
  • Optical correlators were a topic with defense applications in target tracking. I heard hints about this in the 80s, but at the time no details. This review paper is interesting Tanone 2015 [routledgehandbooks.com]
    • by ceoyoyo ( 59147 )

      Optical correlators were fairly common. You might still find some in things like special purpose industrial cameras. You can also find lots of YouTubers building them. All you need is a couple lenses, a camera and some thick paper.

  • How about letting us submit instances of Traveling Salesman and judging how (lightning- ?) fast the solution is?
  • High tech analog computing but still the same - analog computers can do some calculations much faster than digital - e.g. The Travelling salesman problem in linear time ... this is nothing new (it's older than digital computers)

    the usual issue with analog computers is they do one calculation (with different parameters), and you need a different computer for each calculation

    A light based computer should be simpler and quicker to reconfigure ... it has been the promise of optical computing for the last
