
China Makes Quantum Leap In Developing Quantum Computer (scmp.com)

hackingbear writes: Researchers at the University of Science and Technology of China have created a quantum device, called a boson sampling machine, that can carry out calculations involving five photons at a rate 24,000 times faster than previous experiments. Pan Jianwei, the lead scientist on the project, said that although their device is so far only 10 to 11 times faster than the first electronic digital computer, ENIAC, and the first transistor computer, TRADIC, at running the classical algorithm, their machine would eclipse all of the world's supercomputers in a few years. "Our architecture is feasible to be scaled up to a larger number of photons and with a higher rate to race against increasingly advanced classical computers," they said in the research paper published in Nature Photonics. The device is said to be the first quantum computer to beat a real electronic classical computer in practice. Scientists estimate that the current fastest supercomputers would struggle to estimate the behavior of 20 photons.


  • Just curious to read...

    Paul B.

    • by Anonymous Coward on Wednesday May 03, 2017 @05:41PM (#54351335)

      It is both a link to the article and a link not to the article. Being quantum physics, you'll never know which it is until you click it.

      • by Anonymous Coward

        Schrodinger's link

    • by slew ( 2918 ) on Wednesday May 03, 2017 @05:45PM (#54351343)

      Paper preprint [arxiv.org]...
      Wikipage about boson sampling [wikipedia.org]...

      In principle, a large-scale boson-sampling machine would constitute an effective disproof against a foundational tenet in computer science: the Extended Church-Turing Thesis, which postulates that all realistic physical systems can be efficiently simulated with a (classical) probabilistic Turing machine.

      The machine may not have any practical use, but it is still an interesting theoretical advance that might serve to challenge our understanding of computability... Part of the theoretical importance of this area of research is the understanding of #P-complete [wikipedia.org] problems.

      The Wikipedia article notes the theoretical significance of this...

      A polynomial-time algorithm for solving a #P-complete problem, if it existed, would imply P = NP, and thus P = PH. No such algorithm is currently known.
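
      For intuition on why classical machines choke on this: each collision-free output probability of a boson sampler is (up to normalization) the squared permanent of an n x n submatrix of the interferometer's unitary, and the permanent, unlike the determinant, has no known efficient algorithm -- the brute-force definition has n! terms, and Ryser's formula only gets that down to roughly n * 2^n. A small, purely illustrative Python sketch (the random matrix below is just a stand-in, not anything from the paper):

# Illustrative only: two standard ways to compute a permanent, both exponential.
# Boson-sampling probabilities involve |Perm(A)|^2 for n x n submatrices A of
# the interferometer unitary (here A is a random stand-in).
from itertools import permutations
import numpy as np

def permanent_bruteforce(A):
    """Definition: sum over all n! permutations (a determinant with no signs)."""
    n = A.shape[0]
    return sum(np.prod([A[i, p[i]] for i in range(n)])
               for p in permutations(range(n)))

def permanent_ryser(A):
    """Ryser's formula: roughly n * 2^n operations, the best known exact scaling."""
    n = A.shape[0]
    total = 0.0
    for subset in range(1, 2 ** n):
        cols = [j for j in range(n) if subset >> j & 1]
        row_sums = A[:, cols].sum(axis=1)
        total += (-1) ** len(cols) * np.prod(row_sums)
    return (-1) ** n * total

A = np.random.rand(5, 5)           # five "photons", matching the experiment's scale
print(permanent_bruteforce(A))     # 5! = 120 terms
print(permanent_ryser(A))          # ~5 * 2^5 = 160 operations, same value

      Every extra photon roughly doubles the Ryser count (and does far worse to the brute-force one), which is the whole point of racing this device against classical hardware.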

      • by Anonymous Coward

        Researcher in computational complexity here. No one believes this machine (or any quantum machine, for that matter) will be able to solve #P-complete problems in full generality. There's strong evidence that even NP-complete problems are out of reach for quantum algorithms, and #P is (seemingly) way, way, way above NP in terms of computational difficulty.

  • Slashdot makes quantum leap in writing quantum headlines quantum quantum quantum dark side.

    Quantum Shave.

    • by ls671 ( 1122017 )

      Yes, in other news, BlackBerry made a BlackBerry leap in developing BlackBerry computer:

      https://en.wikipedia.org/wiki/... [wikipedia.org]

    • A quantum leap is hardly noteworthy. Literally, it is the smallest possible change, the term physics uses for the smallest jumps within an atom.

      Not sure if we should blame /. editors or the submitter for that abuse of language.

      • Re:Slashdot (Score:4, Interesting)

        by ClickOnThis ( 137803 ) on Wednesday May 03, 2017 @06:16PM (#54351425) Journal

        A quantum leap is hardly noteworthy. Literally, it is the smallest possible change, the term physics uses for the smallest jumps within an atom.

        True, but a quantum leap can occur over an energy-boundary that classical physics would claim can't be overcome. I think that's why the metaphor is applied frequently to an unexpected advance in various fields outside of quantum mechanics.

        • The metaphor only started being applied when the TV series Quantum Leap was shown. In the show, it was a HUGE jump - time travel. The opposite of a quantum. On the other hand, "Quantum of Solace" used the term more or less properly - the killing of the bad guys in the end only provided a small, teeny-tiny iota of solace to the protagonists.
    • Scott Bakula could not be reached for comment.

    • Before you buy
      A quantum leap
      Be sure you get
      The cite complete

      Burma Shave

      AKA, I'll believe it when the paper is confirmed by researchers... somewhere else

  • by Anubis IV ( 1279820 ) on Wednesday May 03, 2017 @05:17PM (#54351215)

    I always struggle with understanding quantum computing concepts, but from the sound of things in the article, this is not some sort of general purpose quantum computer. Rather, it's a purpose-built computer dedicated to estimating the behavior of photons.

    Why that specifically?

    Based on what the article (and summary) said, modern computers struggle to estimate the behavior of 20 or more photons, but it's the sort of problem that quantum computers are theoretically capable of handling quite easily. Researchers are apparently suggesting that in order to disprove skeptics and bring in more support for quantum computing, we should build a quantum computer of this variety and then use it to estimate the behavior of 30 or more photons, because doing so would definitively prove to everyone that quantum computers can provide a massive advantage over traditional computing methods.
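
    To put rough numbers on that claim (purely a back-of-the-envelope, assuming the textbook setup of n photons in about n^2 optical modes and a Ryser's-formula cost of roughly n * 2^n per output probability; none of these figures come from the article):

# Back-of-the-envelope scaling for brute-force classical simulation of boson
# sampling. Assumptions (illustrative only): m = n**2 optical modes,
# collision-free outcomes, n * 2**n operations per output probability.
from math import comb

for n in (5, 10, 20, 30):
    m = n ** 2                       # assumed number of optical modes
    outcomes = comb(m, n)            # collision-free output patterns
    per_outcome = n * 2 ** n         # Ryser's formula cost for one permanent
    print(f"n={n:2d}: ~{outcomes:.2e} outcomes, "
          f"~{outcomes * per_outcome:.2e} ops for the full distribution")

    Smarter classical samplers do better than this brute force, but the exponential trend is why 20 photons already hurts and 30 is pitched as decisive.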

  • LOL, I see what you did there and it is kind of funny.... BUT, does it compute?

    • I doubt the headline writer saw what they did there, though. A quantum leap is literally the smallest possible change to a system, so the headline suggests they have made the smallest possible improvement, which is not very impressive at all.
      • Excellent point.... Hadn't thought of that. So the headline really is a poke in the eye to the Chinese "invention" (assuming it actually exists).

      • In common usage, it's the opposite though.

        Merriam-Webster: quantum leap: a sudden large change, development, or improvement

        • Actually, that is an appallingly bad definition from a dictionary for the "common usage" interpretation, since it misses the important requirement that the change be a huge one. Try a better dictionary like the Cambridge English Dictionary [cambridge.org] if you want a more accurate common-usage definition - they even know how to spell colour correctly too! ;-)
    • since it is a quantum leap, it might compute...
  • by Anonymous Coward

    The Chinese are expending significant resources building conventional supercomputers, which suggests quantum computers still hold far more promise than current reality delivers.

    https://www.top500.org/lists/2016/11/ [top500.org]

    And that is true for all the computing leaders at present. We know how to build very effective supercomputers. We think that quantum computers might be great (the promise is there in theory), but you'd be a fool to ditch your conventional HPC systems.

    And even if quantum computing becomes "a thing", suspicions a

  • Can anyone explain in simple language for stupid people (namely, me) how quantum computing could work? What little I know about particle physics suggests that they can't even detect particles directly (only in "probabilities"), so how can they use them to do computing? I suppose I could follow the links and read the scientific papers, but I struggle even with 'dummies' style books (e.g. Tao of Physics and Dancing Wu Li Masters), so I'm sure the papers would be over my head. (And if anyone has any *readabl
    • by skids ( 119237 )

      You set a register of bits to all possible combinations of the bits at the same time -- all possible values from 0 to 2^N-1 are entangled. Then you run them through some quantum logic operations that eliminate all impossible solutions to a problem from the set of possible combinations. Then you read the register. It collapses to *one* of the possible solutions when you read it. So for example if you are factoring a large number, that solution will be one possible divisor of that number. So you divide th
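
      Purely as an illustration of that state-vector picture (a toy classical simulation, nothing to do with how real hardware is built): a register of N qubits is described by 2^N complex amplitudes, which is exactly why simulating one classically stops scaling.

# Toy state-vector view of an n-qubit register: 2**n complex amplitudes.
# "Reading" the register picks one basis state with probability |amplitude|**2.
import numpy as np

n = 3
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0                                    # start in |000>

# A Hadamard on every qubit puts the register into an equal superposition of
# all 2**n values at once: the "all possible combinations" described above.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
for target in range(n):
    op = np.array([[1.0]])
    for q in range(n):
        op = np.kron(op, H if q == target else np.eye(2))
    state = op @ state

probs = np.abs(state) ** 2                        # here: 1/8 for each of the 8 values
outcome = int(np.random.choice(2 ** n, p=probs))  # measurement collapses to one value
print(f"measured |{outcome:0{n}b}>, probabilities were {probs.round(3)}")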

  • and their memory is swiss cheese. Got it.
  • is there a different kind of leap when developing a Quantum Computer? I mean, it's right there in the name and all...
  • The first rule of quantum computing is that everyone talks about scaling their design up to more qubits, but no one actually does it.

  • by mnemotronic ( 586021 ) <mnemotronic.gmail@com> on Wednesday May 03, 2017 @11:24PM (#54352449) Homepage Journal
    I suggest bigger cuts to the Office of Science budget [top500.org]. Why do we need to spend money developing better, faster supercomputers? We can let the Chinese do all the expensive R&D, then we can buy the finished product from them. No problem. It worked for drywall [nolo.com], why not quantum puters?
    • by Anonymous Coward

      Because the first ones to produce viable quantum computers crack everyone else's encryption, and we are engaged in a lukewarm cyber war with the Chinese?

  • by OneHundredAndTen ( 1523865 ) on Thursday May 04, 2017 @08:06AM (#54353725)
    So their breakthrough is a vanishingly small one?
  • in the TV show by the same name, all of the jumps were into the past. That would mean that they took steps backward...
  • The US had best be well in front in quantum computing. The advantage gained by a foreign power may be too great to overcome if we allow them to get a bit ahead of us.
