Hardware Science

Light-Based Quantum Computer Exceeds Fastest Classical Supercomputers (scientificamerican.com) 60

An anonymous reader quotes a report from Scientific American: For the first time, a quantum computer made from photons -- particles of light -- has outperformed even the fastest classical supercomputers. Physicists led by Chao-Yang Lu and Jian-Wei Pan of the University of Science and Technology of China (USTC) in Shanghai performed a technique called Gaussian boson sampling with their quantum computer, named Jiuzhang. The result, reported in the journal Science, was 76 detected photons -- far above and beyond the previous record of five detected photons and the capabilities of classical supercomputers.

Unlike a traditional computer built from silicon processors, Jiuzhang is an elaborate tabletop setup of lasers, mirrors, prisms and photon detectors. It is not a universal computer that could one day send e-mails or store files, but it does demonstrate the potential of quantum computing. Last year, Google captured headlines when its quantum computer Sycamore took roughly three minutes to do what would take a supercomputer three days (or 10,000 years, depending on your estimation method). In their paper, the USTC team estimates that it would take the Sunway TaihuLight, the third most powerful supercomputer in the world, a staggering 2.5 billion years to perform the same calculation as Jiuzhang. [...] This latest demonstration of quantum computing's potential from the USTC group is critical because it differs dramatically from Google's approach. Sycamore uses superconducting loops of metal to form qubits; in Jiuzhang, the photons themselves are the qubits. Independent corroboration that quantum computing principles can lead to primacy even on totally different hardware "gives us confidence that in the long term, eventually, useful quantum simulators and a fault-tolerant quantum computer will become feasible," Lu says.

... [T]he USTC setup is dauntingly complicated. Jiuzhang begins with a laser that is split so it strikes 25 crystals made of potassium titanyl phosphate. After each crystal is hit, it reliably spits out two photons in opposite directions. The photons are then sent through 100 inputs, where they race through a track made of 300 prisms and 75 mirrors. Finally, the photons land in 100 slots where they are detected. Averaging over 200 seconds of runs, the USTC group detected about 43 photons per run. But in one run, they observed 76 photons -- more than enough to justify their quantum primacy claim. It is difficult to estimate just how much time would be needed for a supercomputer to solve a distribution with 76 detected photons -- in large part because it is not exactly feasible to spend 2.5 billion years running a supercomputer to directly check it. Instead, the researchers extrapolate from the time it takes to classically calculate for smaller numbers of detected photons. At best, solving for 50 photons, the researchers claim, would take a supercomputer two days, which is far slower than the 200-second run time of Jiuzhang.
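
To make the "hard for classical computers" part concrete: in boson sampling, the probability of each detection pattern is governed by the permanent of a submatrix of the interferometer's transfer matrix, and permanents take exponential time to compute classically. Below is a toy Python sketch of the standard (non-Gaussian) variant; note that Jiuzhang actually performs Gaussian boson sampling, whose probabilities involve hafnians rather than permanents, so treat this as an illustration of the general idea rather than the USTC method, and all names in it are our own.

    import itertools, math
    import numpy as np

    def permanent(A):
        # Ryser's formula: O(2^n * n^2) time -- exponential, which is the
        # crux of the classical hardness.
        n = A.shape[0]
        total = 0.0
        for subset in range(1, 1 << n):
            cols = [j for j in range(n) if (subset >> j) & 1]
            rowsums = [sum(A[i, j] for j in cols) for i in range(n)]
            total += (-1) ** len(cols) * np.prod(rowsums)
        return (-1) ** n * total

    def boson_sample(U, n_photons, n_samples=5, seed=0):
        # Brute-force sampler: enumerate every output pattern and weight it
        # by |Perm(U_S)|^2 over the occupancy factorials (single photons
        # enter modes 0..n-1). Only feasible for tiny systems.
        m = U.shape[0]
        patterns = list(itertools.combinations_with_replacement(range(m), n_photons))
        probs = []
        for pat in patterns:
            sub = U[np.ix_(list(pat), list(range(n_photons)))]
            repeats = np.prod([math.factorial(c) for c in np.bincount(pat, minlength=m)])
            probs.append(abs(permanent(sub)) ** 2 / repeats)
        probs = np.array(probs)
        probs /= probs.sum()  # unitarity makes this ~1 already; guards rounding
        rng = np.random.default_rng(seed)
        picks = rng.choice(len(patterns), size=n_samples, p=probs)
        return [patterns[i] for i in picks]

    # Random 6-mode interferometer (unitary via QR decomposition), 3 photons:
    g = np.random.default_rng(1)
    U, _ = np.linalg.qr(g.normal(size=(6, 6)) + 1j * g.normal(size=(6, 6)))
    print(boson_sample(U, 3))

Even this brute-force toy blows up quickly: the number of output patterns grows combinatorially with modes and photons, and each permanent costs O(2^n), which is why 76 detected photons across 100 modes is far beyond any classical supercomputer.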

This discussion has been archived. No new comments can be posted.

  • by serviscope_minor ( 664417 ) on Friday December 04, 2020 @08:06AM (#60793372) Journal

    Here's how you know quantum computers run Java: every year there's an article telling you this year they're finally faster than classical computers (or C++).

    [This joke brought to you from 1999]

    • Yeah, but can it run Crysis? [This joke brought to you from 2007]
      • Can it run Doom...you mean...unless you don't but by implication of ... o my.
        Eno&Cale: I can't see the lines I used to think I could read between.
      • by suso ( 153703 ) *

        Yeah, but can you imagine a Beowulf cluster of these? [1999: Hi!]

    • In some ways, back in the 1990s, Java was faster than C++. It wasn't faster in terms of theoretical peak performance, since C/C++ can be extremely optimized for maximum performance. However, Java at the time had a set of default libraries and classes designed around many common tasks. I remember upgrading a C compiler to one that finally included a string library, where before we needed to write our own string routines, as well as sort and hash... Because we had to do it all ourselves, it can be extremely optimized in

      • Because we had to do it all ourselves, it can be extremely optimized in theory. However, in practice, over a full product life cycle, the code became less efficient, because you may have coded it at the end of the day feeling a little sick, or with the boss pressuring you to get it done for the release date, so you may have used a bubble sort instead of a better sorting algorithm since you only had to sort 10 or so items. Then in production, a requirement change caused that routine to process thousands or millions of items.

        Java, it was rationalized to me in 2002, is a Turing complete language. I don't know what you mean by "extremely optimized" other than that you actually wrote a program, rather than cobbling together libraries with a vague understanding of their inputs and outputs. About a year later, I was meeting a friend in San Jose who introduced me to a houseful of Java programmers, and I asked them for their best analogy of an object (OO) -- it's the tofu in the miso, they said.

        Great bunch of guys, but only one woman. I posed to them what I'

        • Okay, I'll bite:
          Are you an AI, programmed to write stuff that could mean something (but doesn't)?

          • Could be.

            Or it could be that what he wrote is above your mental pay grade.

            • Yes. That's it. Definitely.
              Because I can see no reason why an intelligent human being would need someone to "rationalize" to them that a well-known programming language (yes, in 2002) was Turing complete, when that would be bleeding obvious to anyone from reading a simple description of it. "Rationalisation" would not come into it, unless one was being pretentious.

              Yes, this broken AI, sorry, person, is definitely much smarter than me.
              And as for driveling minions like you...

      • C/C++ are only faster because they remove ALL the checks and bounds. That is the main thing to blame for everything and its mother being hacked nowadays, apart from user error.

        Here's a compiler for my new, even faster language, using the same logic of just removing essential properties. This time, precision:

        main() { print "1"; }

        I prefer languages that are designed to be able to do all those checks at compile time, like OCaml/Haskell. All the checks and all the speed at the same time.
        Only problem: They can's

        • by Teckla ( 630646 )

          C/C++ are only faster because they remove ALL the checks and bounds.

          As a software developer who has been programming in both C and Java for decades, I can tell you this is simply not true.

          The lack of bounds checking is one of many reasons C is much faster than Java.

          Stop spreading misinformation.

        • It is surprising how few people know that the kind of checking you are talking about, which does not exist in C and C++, can be found in Ada and SPARK. These are also procedural and object-oriented, versus the functional programming approach of OCaml/Haskell. If you use contracts with Ada/SPARK, it is even possible to remove many of the checks, because the compiler can verify that certain checks are unnecessary.
      • Oh /., you never disappoint!

        Came for a discussion on quantum computers, stayed for a 20-year-old rehash of the merits of C++ vs. Java.

        It's like a fucked up version of Neverland where everyone is perpetually 35, and instead of flying and fighting pirates they attend the weekly IT all-hands to talk about the upcoming Q2 milestone.

        Never change! :)

    • by bradley13 ( 1118935 ) on Friday December 04, 2020 @10:06AM (#60793836) Homepage

      You laugh, but that's pretty much what it feels like. Quantum supremacy on some problem chosen specifically to demonstrate it. So what? One could also take an analog computer and find analog tasks where it outperforms digital computers. One can also take an ASIC and - sure enough - one can mine Bitcoin faster than a general purpose computer.

      So far, quantum computing is pretty underwhelming. It seems like it's in the same place as practical fusion and true AI: always just a few years away.

      • Except we went from one photon, to a few, to several, to many now. And we have found ways to implement transistor-like and resistor-like structures, among other things.

        Newsflash: Basic research takes its own time. It's only business dicks who always want to put deadlines on unplannable things.

      • by Anonymous Coward

        Quantum supremacy on some problem chosen specifically to demonstrate it. So what? One could also take an analog computer and find analog tasks where it outperforms digital computers.

        Exactly. This feels like solving protein folding by making the protein and watching it fold. When a quantum computer solves a non-quantum problem faster than a classical computer, I'll finally be impressed.

        • Quantum supremacy on some problem chosen specifically to demonstrate it. So what? One could also take an analog computer and find analog tasks where it outperforms digital computers.

          Exactly. This feels like solving protein folding by making the protein and watching it fold. When a quantum computer solves a non-quantum problem faster than a classical computer, I'll finally be impressed.

          You forgot the grand claim of protein computing supremacy.

      • We can already do some calculations very quickly by using analog methods. For example, we can already sort numbers trivially in O(n). It requires a long O(n) setup, a quick O(1) "calculation", and an O(n) readout of results. Cut strands of dry spaghetti to the length of the numbers you want to sort, gather the spaghetti into a bundle, rap lightly on the table surface, then pull out the strands one by one based upon which is sticking up the most. Sure, it takes forever to do the setup, but it's still O(n
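        A minimal Python sketch of that spaghetti-sort idea (names are our own): the tabletop does the "find the tallest strand" step in O(1) by rapping the bundle, but simulating that step in software costs O(n) per pull, so this toy version is O(n^2) overall.

        # Simulated "spaghetti sort": one strand per value, then repeatedly
        # pull out the strand sticking up the most.
        def spaghetti_sort(numbers):
            strands = list(numbers)      # cut one strand per value: O(n) setup
            pulled = []
            while strands:
                tallest = max(strands)   # the hand "feels" the tallest strand
                strands.remove(tallest)  # pull it out of the bundle
                pulled.append(tallest)   # read out, tallest first
            return pulled[::-1]          # ascending order

        print(spaghetti_sort([3, 1, 4, 1, 5, 9, 2, 6]))  # [1, 1, 2, 3, 4, 5, 6, 9]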

  • "quantum computing"!
  • by JoshuaZ ( 1134087 ) on Friday December 04, 2020 @08:23AM (#60793400) Homepage
    Scott Aaronson, who invented the boson sampling technique along with Alex Arkhipov https://en.wikipedia.org/wiki/Boson_sampling [wikipedia.org], has a really good writeup of these results https://www.scottaaronson.com/blog/?p=5122 [scottaaronson.com], along with a discussion of the broader implications. The major upshot is that, together with the earlier Google results on quantum supremacy, any argument that quantum computers cannot outperform classical computers on at least some tasks seems dead in the water. Whether real-world quantum computers can in practice perform actually useful tasks better than classical computers remains to be seen, but this strongly suggests that the answer to that question will also be yes.
  • Wait WHUT? (Score:4, Insightful)

    by Fly Swatter ( 30498 ) on Friday December 04, 2020 @08:51AM (#60793484) Homepage
    So they aimed a laser at a bunch of reflective surfaces, and SOME of those photons actually made it to the detection device? What exactly did it calculate?

    This is not potential computing of anything; they are still figuring out how to replace conventional circuit traces with light beams (sorry, a photon).

    I guess they need more funding to continue... The question is how many lifetimes will it take to get anything that actually works as a computer?
    • Quote: "The question is how many lifetimes will it take to get anything that actually works as a computer?"

      How many lifetimes did it take to get anything that actually works as a computer, starting from the Stone Age?

      • Generally speaking, technology advances exponentially. As computers, quantum computers, and AI tech get better, the faster we can solve problems like scaling this technology down to the size of something that can be called a product.

      • I dunno...we went from the Wright Brothers to landing on the Moon in ~70 years. That seems pretty fast when you consider all of the stuff that had to happen in between.

    • Exactly. I would say I am a trillion times faster than a classical computer, because it would take forever for a classical computer to compute me, while I do it all the time, effortlessly.
      • That does get to the issue of what we mean by a computer. (There's an old joke that every computer is a quantum computer because transistors use quantum mechanics, and if you take that to its logical conclusion, a bicycle is a quantum computer.) But in this context, computing what you do is tough, in part because we don't have a mathematically precise notion of what algorithm constitutes you. In this case, we have a very precise mathematical description of the distributions of the photons. In fact, that de
    • It performed boson sampling, as defined in https://www.scottaaronson.com/papers/optics.pdf [scottaaronson.com]. Strictly speaking, they used the Gaussian variant https://en.wikipedia.org/wiki/Boson_sampling#Gaussian_boson_sampling [wikipedia.org]. It works as a computer in the sense that it is computing a well-defined problem. I'm not sure what your objection is.
      • I guess my problem is with using 'compute' instead of 'proof'. The former makes it sound like 'Quantum Computer' development is further along than it is.
  • Meet your new overlord running on a Quantum Light powered Neural Net.

    • Still better than {insert your most hated current leader here}. At least you know for sure this one is not human, vs. just acting as if they are inhuman.
  • It would have been nice if the article would have talked about what the problem was that light-based "quantum computer" was trying to solve. Maybe we could have just asked The View for the answer.
    • That's not the important bit at this stage. If you have a solution to problem X, then it doesn't matter what it's about; there will be other, more practical problems that can be rephrased to utilize the same solution. For example, AlphaGo solved a board game, and the same approach was later generalized to many other things, most recently protein folding. There was nothing special about Go as the first target, other than that it was unsolvable with previous methods; that's the important bit.
      There was nothing special about Go as the first target, other than that it was unsolvable with previous methods; that's the important bit.

        It would have been more impressive if they hadn't "solved" it by merely throwing more compute power at it than other previous methods.

          • Alas, no. While they did use quite a bit of compute power, that was not the key to solving it. For decades, Go was considered the one game in which computers could never achieve supremacy over humans. Without neural networks, that would still be true today: it's an EXPTIME-complete problem, and the board is simply too large; you can't throw enough compute power at it to brute-force through it.
            • For decades, Go was considered the one game in which computers could never achieve supremacy over humans.

              No, what are you talking about? Go was the game in which computers made steady progress against humans as CPU power increased, until one company increased it by more than anyone else. Who else was using 176 GPUs? [ceva-dsp.com] Who else was willing to pay for the electricity to power that?

              • If you took the best Go-playing software from the '90s and threw all the compute power that exists in the modern world behind it, it would still fail to beat humans. Go-playing software doesn't become twice as capable if you throw twice the compute power at it; it plateaus out, and then it doesn't matter how much more you throw at it. That's how EXPTIME problems work. Do you know why they used 176 GPUs and not more or fewer? They did in fact test the first version at 280 GPUs. But there was barely any extra performance
                • OK, but you are completely ignoring the progress made. Actually, you are ignorant of the topic altogether. If you had even read the link I sent you, you would have seen this sentence, "Previously, AI experts thought that this achievement was at least a decade away," which is completely contrary to your statement, "for decades, Go was considered the one game in which computers could never achieve supremacy over humans."

                Seriously, do more research and less talking and you will be smarter, your current comme
    • Well, it was implied that that was clear.

      Quantum computers allow parallelization over ALL possible states of a variable at the same time, requiring just a single circuit. It's like ramping up one factor in the algorithm performance calculations to literally infinity.

      And photonic logic solves the size limitation of electronic logic. Electrons at current "wire" sizes tend to just tunnel through insulators, because their wavefunction's "blurriness" is bigger than the distance to the next "wire". Basically if you want

  • by kackle ( 910159 ) on Friday December 04, 2020 @09:41AM (#60793700)

    Jiuzhang is an elaborate tabletop setup of lasers, mirrors, prisms and photon detectors.

    If it can, this would truly be the year of Linux on the desktop. (ducks)

    • The desktop wishes it would ever approach the power of a real computing environment like Linux.

      It never will. It can go back to the corner and sit with MS Bob.

    • If this works, it will be the year of Linux, Mac OS, iOS, Windows, and BSD running simultaneously on the same desktop. The computer will anticipate every possible click and react to all of them before you press the mouse.

      People will realize how incredibly having every possible solution to every possible quantum computing problem is, and wish for the days of single-core computers with the classical von Neumann architecture. [wikipedia.org]

      • by kackle ( 910159 )
        Incredibly what? I'm hanging here... :)

        And I should have said "Linux on the tabletop."
        • People will realize how incredibly useless having every possible solution to every possible quantum computing problem is, and wish for the days of single-core computers with the classical von Neumann architecture. [wikipedia.org]

          Sorry.

          • by kackle ( 910159 )
            Maybe you expected the quantum computer to fill that in for you, but it failed because of a floating dust particle.
  • For the first time, a quantum computer made from photons -- particles of light -- has outperformed even the fastest classical supercomputers.

    I don't know who wrote this but he/she is a fucking idiot. That's like saying our silicon-based computers are "made from electrons".

    • It may as well be true though, because while it's made of fairy dust, it also doesn't do anything. But it's definitely the fastest at not doing anything!

  • Universality is literally the defining point of computers, in case you iDevice buyers forgot that.

    If it is not universal, it is not a computer, but a piece of hard-wired electronics (or in this case, photonics).
    That is OK for this experiment, of course. But the wording here is not.

    • That's not true. You can absolutely have fixed-function computers. A computer is merely a device that computes something; a person can even be a computer. Before TTL became commonplace, the vast majority of computers in use in the world were fixed-function devices that had a specific use. What you are describing is a general-purpose computer, one that can serve many functions, the ultimate form of which is being Turing complete. But that is not a requirement to be called a computer.

      • Meanings of words change over time; I would say that at this time most people would consider 'computer' to mean a general-purpose (and complex) device. I doubt most would call an abacus a computer in today's terminology.
        • This is a site for nerds. Being technically correct is preferred here. The discussion often travels off of mass market consumer products so only using colloquial definitions is insufficient.

          • What is preferred is making it clear contextually whether "computer" is meant in the common sense or the formal sense of the term. Both usages have their place.
  • So let us guess how many years until they show quantum supremacy on a USEFUL calculation. Some result that can be used in math or physics or any other field. My guess is: 8 years.
  • Is it merely faster, or asymptotically faster?

  • by gweihir ( 88907 )

    It does not. I do not know how this particular "proof" is a lie, because I have stopped reading this nonsense. So far, all similar claims have been shameless direct lies or lies by misdirection.

  • Soon the software industry will make that as slow as today's computers.
    Computers today are about 1000 times faster and have about 1000 times more memory than computers from the late '90s.
    And somehow, for doing the same things we used to do back then, things don't seem a bit faster.
    Like... why do I need a 16 GB computer to be able to browse the internet without the browser taking over all of my RAM? Why is Windows tens of GB in size?
    Sure, lots of things have improved, but... I still have the sensation

"I've finally learned what `upward compatible' means. It means we get to keep all our old mistakes." -- Dennie van Tassel

Working...