Hardware

Record-Low Error Rate For Qubit Processor

An anonymous reader writes "Thanks to advances in experimental design, physicists at the National Institute of Standards and Technology have achieved a record-low probability of error in quantum information processing with a single quantum bit (qubit) — the first published error rate small enough to meet theoretical requirements for building viable quantum computers. 'One error per 10,000 logic operations is a commonly agreed upon target for a low enough error rate to use error correction protocols in a quantum computer,' said Kenton Brown, who led the project as a NIST postdoctoral researcher. 'It is generally accepted that if error rates are above that, you will introduce more errors in your correction operations than you are able to correct. We've been able to show that we have good enough control over our single-qubit operations that our probability of error is 1 per 50,000 logic operations.'"
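
A quick back-of-the-envelope check of what those two rates mean in practice (a Python sketch; the error rates come from the quote above, the run lengths are arbitrary choices for illustration):

    # P(at least one error in N ops) = 1 - (1 - p)^N
    # Compare the 1/10,000 threshold against NIST's reported 1/50,000.
    for p in (1e-4, 2e-5):
        for ops in (10_000, 100_000):
            print(f"p={p:g}, {ops:6d} ops -> P(error) = {1 - (1 - p) ** ops:.3f}")

At one error per 10,000 operations, a 10,000-gate run fails with probability about 0.63; at one per 50,000 that drops to about 0.18, which is what gives error-correction protocols room to work.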
  • Another few orders of magnitude and they might approach vacuum-tube levels of reliability.

    • One error per bit per 50,000 logic operations should be accurate enough for non-technical people.
      • One error in 640,000 ought to be enough for anyone.
        • 19 times out of 20, they'll get 1 error in 50,000....

    • Re: (Score:2, Informative)

      by Anonymous Coward

      Most early computing errors were caused by memory (not RAM, as early technologies weren't random access). The shift from mercury delay lines to magnetic cores saw a several-orders-of-magnitude drop in error rates, and an associated increase in the viability of general-purpose computing.

      • I think a lot of us have forgotten how bad old computers were with hardware errors, and how much the environment could affect them. My old Amstrad CPC1512 used to crash or show odd inputs on the screen whenever I turned on the fan.
         

    • Better than Pentium math?
      • And of course you remember the joke -- why did they call it the Pentium instead of the 586? They added 100 to 486 and got 585.939434521165242233345, which wouldn't fit on the package.
  • Uncertainty (Score:5, Funny)

    by Knave75 ( 894961 ) on Wednesday August 31, 2011 @04:55PM (#37269114)
    The problem is that once you know what the error is, you don't know where the error is.

    I mean, once you know where the error is, you don't know what the error is.

    I mean, err... I'm not sure.
  • by JoshuaZ ( 1134087 ) on Wednesday August 31, 2011 @05:08PM (#37269250) Homepage

    An important thing to recognize is that most of this experiment was done with a single qubit. Practical quantum computing will need this sort of error rate for thousands of qubits. The good news is that the methodology they used looks very promising. They used microwave beams rather than lasers to manipulate the ions. I think this has been suggested before, but this may be the first successful use of the technique. As TFA discusses, it drastically reduces the error rate as well as the rate of stray ions.

    We are starting to move toward the point where quantum computers may be practical, but we're still a long way off. In the first few years of the last decade, a few different groups successfully factored 15 as 3*5 using a quantum computer. (15 is the smallest number that is non-trivial to factor using a quantum computer, since the fast factoring algorithm for quantum computers, Shor's algorithm, requires an odd composite number that is not a perfect power. It is easy to factor a perfect kth power by looking instead at the kth root, and factoring an even number is easily reduced to factoring an odd number. So 15 is the smallest interesting case where the quantum aspects of the process matter.) Those systems used a classical NMR system http://en.wikipedia.org/wiki/Nuclear_magnetic_resonance_(NMR)_quantum_computing [wikipedia.org], which has since been seen as too limited. There are now a lot of different ideas for other approaches that should scale better, but so far they haven't been that successful.
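
    For the curious, here's a rough Python sketch (mine, not from any of the papers) of the factoring recipe just described: the even-number and perfect-power reductions are purely classical, and the order-finding loop, done here by brute force, is the only step Shor's algorithm actually speeds up.

        # Factor a composite N via classical reductions plus order finding.
        from math import gcd
        from random import randrange

        def shor_classical(N):
            if N % 2 == 0:                           # even numbers are trivial
                return 2, N // 2
            for k in range(2, N.bit_length() + 1):   # perfect powers are trivial
                root = round(N ** (1.0 / k))
                if root > 1 and root ** k == N:
                    return root, N // root
            while True:
                a = randrange(2, N)
                if gcd(a, N) > 1:                    # lucky: shared factor
                    return gcd(a, N), N // gcd(a, N)
                r = 1
                while pow(a, r, N) != 1:             # order of a mod N; this is
                    r += 1                           # the quantum-accelerated step
                if r % 2 == 0 and pow(a, r // 2, N) != N - 1:
                    p = gcd(pow(a, r // 2, N) - 1, N)
                    return p, N // p

        print(shor_classical(15))                    # -> (3, 5) or (5, 3)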

    One important thing to realize is that quantum computers will not magically solve everything. They can do a few things quite quickly, such as factoring large numbers. But to the best of our knowledge they cannot, for example, solve NP-complete problems: it is widely believed that NP-complete problems cannot be solved in polynomial time with a quantum computer, that is, that NP is not contained in BQP. (Unfortunately, right now we can't even show that BQP is a subset of NP, so the two classes may well be incomparable.) Factoring big integers is useful mainly to a small number of number theorists and a large number of other people who want to break cryptography. A few other cryptographic systems can also be broken more easily by a quantum computer, but there's not much else. That is changing, though, as people get a better and better understanding of what can be done with quantum computers. A lot of the work has involved clever use of quantum computers to quickly calculate quantities related to Fourier series. Moreover, once we have even the most marginally useful quantum computers, there will be a lot more incentive to figure out what sorts of practical things can be done with them.
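
    To see the Fourier connection concretely, here's a small numpy toy of my own (just linear algebra, no hardware): applying the quantum Fourier transform matrix to a state with hidden period r concentrates all the probability on multiples of N/r, which is exactly how Shor's algorithm reads off the order.

        # Build the QFT on 5 "qubits" and apply it to a period-4 state.
        import numpy as np

        n = 5
        N = 2 ** n
        omega = np.exp(2j * np.pi / N)
        QFT = np.array([[omega ** (j * k) for k in range(N)]
                        for j in range(N)]) / np.sqrt(N)

        r = 4                                  # hidden period
        state = np.zeros(N, dtype=complex)
        state[::r] = 1.0                       # superposition over 0, r, 2r, ...
        state /= np.linalg.norm(state)

        probs = np.abs(QFT @ state) ** 2
        print(np.nonzero(probs > 1e-9)[0])     # -> [ 0  8 16 24], multiples of N/r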

    So the upshot is that these machines are still a long way off, but they are coming. Given the way things looked in the late 1990s and early 2000s, it was reasonable to think the technical difficulties would keep them from ever being practical. They are still far from practical, but right now it doesn't look like there are any fundamental physical barriers, and it looks like the problems that do exist will be solved in the long run.

    • by hweimer ( 709734 )

      An important thing to recognize is that most of this experiment was done with a single qubit. Practical quantum computing will need to have this sort of error rate for thousands of qubits.

      I'd take any quantum computer with 50 qubits and get a Nobel Prize for beating the shit out of all current supercomputers at simulating quantum systems like high-temperature superconductors, quark bound states such as protons and neutrons, or quantum magnets. Also, keep in mind that Rainer Blatt's group recently succeeded in demonstrating entanglement between 14 qubits [arxiv.org] in a similar setup. And for quantum simulations, the error rates probably don't have to be crazily low anyway, it turns out that such errors typ

    • by Anonymous Coward

      You are correct; ion trap systems are in principle scalable, but the task is very challenging (probably much harder than scaling, say, a superconductor-based system). But there is more to it than that. The operations needed to manipulate a single qubit are significantly different from the operations needed to make two qubits interact. (There is no need to directly interact three or more qubits, since such gates can be built up out of two-qubit operations, much like the two-bit AND and one-bit NOT are universal fo
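
      The classical half of that analogy is easy to check for yourself; a tiny sketch of mine (nothing to do with ion traps):

          # AND and NOT are universal for classical logic: OR and XOR
          # fall out of De Morgan-style compositions.
          AND = lambda a, b: a & b
          NOT = lambda a: 1 - a

          OR  = lambda a, b: NOT(AND(NOT(a), NOT(b)))       # De Morgan
          XOR = lambda a, b: AND(OR(a, b), NOT(AND(a, b)))  # (a|b) & ~(a&b)

          for a in (0, 1):
              for b in (0, 1):
                  assert OR(a, b) == (a | b) and XOR(a, b) == (a ^ b)
          print("AND + NOT reproduce OR and XOR on all inputs")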

  • Am I the only one who has difficulty thinking of quantum computers as things that actually exist and do calculations? It's like my brain has placed "quantum computer" firmly into the category "things that are theoretically possible but unable to be built with current technology", and refuses to change it, even to "things that exist in the lab but won't be commercially viable for decades outside classified government work".

    • The problem is that people are not generally aware of what a quantum computer would be useful for. Why should I care if there is a quantum computer sitting under my desk? How do I benefit from quantum algorithms?

      There are indeed tangible benefits to quantum computing, beyond just attacking public key cryptosystems. As an example, quantum computers can speed up certain search algorithms, which is one of the promised commercial applications of a quantum computer.
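
      The search speedup is Grover's algorithm, which finds one marked item among N using roughly sqrt(N) steps instead of N. A minimal numpy simulation of my own (the marked index 42 is an arbitrary choice) shows the quadratic speedup directly:

          # Grover's search on N = 2^8 items: ~(pi/4)*sqrt(N) iterations
          # drive nearly all amplitude onto the marked item.
          import numpy as np

          N = 2 ** 8
          marked = 42                            # item the oracle recognizes

          state = np.full(N, 1 / np.sqrt(N))     # uniform superposition
          steps = int(np.pi / 4 * np.sqrt(N))    # ~12 instead of ~256 lookups
          for _ in range(steps):
              state[marked] *= -1                # oracle: flip marked amplitude
              state = 2 * state.mean() - state   # diffusion: invert about mean
          print(steps, int(np.argmax(state ** 2)))  # -> 12 42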

      Personally, I put quantum computers in
    • That's because as of now it still is in the "theoretically possible but unable to be built" category. Keep in mind this new technique is for one (1) qubit. You need more (at least 8? Or do quantum computers work that differently from normal ones?) to do anything practical. And it only meets the theoretical requirement once you use ECC. Previously, it wasn't accurate enough that you could count on the ECC being performed right. Making a quantum computer, even in the lab, is a few years off yet.

      Or so I think. I don't

      • "Existing but not able to do anything practical" is still a pretty big difference from "Could exist but nobody's built one yet". It's like the early airplanes - they existed, but they weren't exactly useful for anything besides proving that heavier-than-air aircraft could work.

  • show that we have good enough control over our single-qubit operations that our probability of error is 1 per 50,000 logic operations.

    They forgot to add: "we calculated this probability on our quantum computer"

  • What we really need is a "standardized" open-source quantum computing language so that we can develop and exchange quantum algorithms to prepare for the day when quantum computers are real.

    Right now we have the QCL [tuwien.ac.at] language, QCF [sourceforge.net] for Matlab/Octave, and the Cove framework [youtube.com] that could be used with any language, but it looks like there is really only a C# implementation right now.

    None of these have really taken hold as a "standard" though, and probably elements of all of them could be brought together in somet

    • by vux984 ( 928602 )

      What do you get when you have THREE competing standards and you try to take elements of them to make something multi-platform and all inclusive?

      FOUR competing standards

  • They are just answers to questions you haven't asked yet.

    • Clearly we need to make a larger quantum computer to calculate the questions we haven't asked yet, to make sense of all these answers that don't make sense yet.

      Perhaps something with a biological matrix... quick, get the mice!
  • Now if someone could just implement Shor's algorithm, all 1-bit encryption algorithms will be rendered obsolete!

"What man has done, man can aspire to do." -- Jerry Pournelle, about space flight

Working...