Record-Low Error Rate For Qubit Processor 66
An anonymous reader writes "Thanks to advances in experimental design, physicists at the National Institute of Standards and Technology have achieved a record-low probability of error in quantum information processing with a single quantum bit (qubit) — the first published error rate small enough to meet theoretical requirements for building viable quantum computers. 'One error per 10,000 logic operations is a commonly agreed upon target for a low enough error rate to use error correction protocols in a quantum computer,' said Kenton Brown, who led the project as a NIST postdoctoral researcher. 'It is generally accepted that if error rates are above that, you will introduce more errors in your correction operations than you are able to correct. We've been able to show that we have good enough control over our single-qubit operations that our probability of error is 1 per 50,000 logic operations.'"
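A quick back-of-the-envelope check (not from the article, just standard probability) of what these per-operation error rates mean over a long computation, assuming errors are independent:

```python
# Hedged sketch: probability of at least one error over n gates,
# assuming independent errors at rate p per gate (an idealization).

def p_any_error(p, n):
    """Probability of >= 1 error in n operations with per-op error rate p."""
    return 1 - (1 - p) ** n

# The commonly cited threshold rate vs. the reported NIST rate,
# over a 10,000-gate computation:
print(p_any_error(1 / 10_000, 10_000))  # ~0.632 at the threshold rate
print(p_any_error(1 / 50_000, 10_000))  # ~0.181 at the reported rate
```

Either way an uncorrected run of any length is hopeless, which is why the whole point of the threshold is enabling error correction, not avoiding errors outright.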
Progress! (Score:2)
Another few orders of magnitude and they might approach vacuum-tube levels of reliability.
Re: (Score:3)
Re: (Score:1)
Re: (Score:1)
Re: (Score:1)
19 times out of 20, they'll get 1 error in 50,000....
Re: (Score:2, Informative)
Most early computing errors were caused by memory (not RAM, as early technologies weren't random access). The shift from mercury delay lines to magnetic cores saw a several-orders-of-magnitude drop in error rates, and the associated increase in the viability of general-purpose computing.
Re: (Score:2)
I think a lot of us have forgotten how bad old computers were with hardware errors, and how much the environment could affect them. My old Amstrad CPC1512 used to crash programs or show odd input on the screen whenever I turned on the fan.
Re: (Score:1)
Re: (Score:3)
Re: (Score:2)
You'll make fewer errors if you only jump on the disc when Coily is right behind you.
Re:Pffff! (Score:5, Insightful)
Yea, that's the whole point of their efforts.
Re: (Score:2)
Re: (Score:1)
Er, what? 1 error in 9 billion bits works out to around ten errors per second at modern data rates. A computer with such an error rate is worthless unless you build your algorithms specifically to handle it. Only if you prove the errors are completely random can you put three chips next to each other and vote.
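The "three chips and vote" idea is triple modular redundancy. A minimal sketch of the voting step, assuming independent random bit errors (the names here are illustrative, not from any particular hardware design):

```python
# Hedged sketch of triple modular redundancy (TMR): run the same
# computation on three independent units and take a bitwise majority
# vote. This only helps if errors are independent and random, as the
# comment notes -- a systematic error hits all three units identically.

def majority_vote(a: int, b: int, c: int) -> int:
    """Bitwise majority of three words: each bit is 1 iff >= 2 inputs have it."""
    return (a & b) | (a & c) | (b & c)

# One unit flips a bit; the vote recovers the correct value.
correct = 0b1011
print(bin(majority_vote(correct, correct ^ 0b0100, correct)))  # 0b1011
```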
Re: (Score:1)
Only if you prove the errors are completely random can you put three chips next to each other and vote.
That's not how error detection and correction is performed.
Re: (Score:1)
I hate to have to explain jokes, but maybe some slashdotters are too young to remember the Pentium FDIV bug?
Re: (Score:2)
12.00000000001
Re:Pffff! (Score:5, Insightful)
Re: (Score:2)
Uncertainty (Score:5, Funny)
I mean, once you know where the error is, you don't know what the error is.
I mean, err... I'm not sure.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
In fact you correct the error by observing it.
Re: (Score:1)
Re: (Score:2)
I have to try that some time. Is it actually as good as the ad makes it sound?
Is it a pils? I hate pils, I like Helles.
Re: (Score:2)
Preferably Augustiner. Oh how I miss my Augie :(
Re: (Score:3)
Still a long way away (Score:5, Informative)
An important thing to recognize is that most of this experiment was done with a single qubit. Practical quantum computing will need this sort of error rate across thousands of qubits. The good news is that the methodology they used looks very promising: they used microwave beams rather than lasers to manipulate the ions. This has, I think, been suggested before, but it may be the first successful use of the technique. As TFA discusses, it drastically reduces the error rate as well as the rate of stray ions.
We are starting to move towards the point where quantum computers may be practical, but we're still a long way off. In the first few years of the last decade a few different groups successfully factored 15 as 3*5 using a quantum computer. (15 is the smallest number that is non-trivial to factor on a quantum computer, since the fast factoring algorithm for quantum computers, Shor's algorithm, requires an odd composite number that is not a perfect power. It is easy to factor a perfect kth power by looking instead at the kth root, and factoring an even number is easily reduced to factoring an odd number. So 15 is the smallest interesting case where the quantum aspects of the process matter.) Those systems used a classical NMR setup http://en.wikipedia.org/wiki/Nuclear_magnetic_resonance_(NMR)_quantum_computing [wikipedia.org], which has since been seen as too limited. There are now a lot of different ideas for other approaches that should scale better, but so far they haven't been that successful.
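For the curious, the classical half of factoring 15 can be sketched in a few lines. This replaces the quantum order-finding step with brute force (which is exactly the part a quantum computer speeds up), so it demonstrates the arithmetic, not the quantum advantage:

```python
# Hedged illustration of the classical post-processing in Shor's algorithm
# for N = 15. The order r of a mod N is found by brute force here; on a real
# quantum computer this is the step done by quantum phase estimation.
from math import gcd

N, a = 15, 7          # a must be coprime to N; a = 7 is the textbook choice
r = next(r for r in range(1, N) if pow(a, r, N) == 1)  # order of a mod N
assert r % 2 == 0     # for a = 7, r = 4, which is even as required
p = gcd(pow(a, r // 2) - 1, N)
q = gcd(pow(a, r // 2) + 1, N)
print(p, q)           # 3 5
```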
One important thing to realize is that quantum computers will not magically solve everything. They can do a few things quite quickly, such as factoring large numbers. But to the best of our knowledge they can't, for example, solve NP-complete problems, and it is widely believed that NP-complete problems cannot be solved in polynomial time on a quantum computer. That is, it is believed that BQP is a proper subset of NP. Unfortunately, right now we can't even show that BQP is a subset of NP, let alone that it is a proper subset. Factoring big integers is useful mainly to a small number of number theorists and a large number of other people who want to break cryptography. A few other cryptographic systems can also be broken more easily by a quantum computer, but there's not much else. However, that is changing, and people are getting a better and better understanding of what can be done with quantum computers. A lot of the work has involved clever use of quantum computers to quickly calculate quantities related to Fourier transforms. Moreover, once we get even marginally useful quantum computers there will be a lot more incentive to figure out what sorts of practical things can be done with them.
So the upshot is that these are still a long way off, but they are coming. The way things looked in the late 1990s and early 2000s, it was reasonable to think the technical difficulties would keep them from ever being practical. They are still a long way from practical, but right now there don't appear to be any fundamental physical barriers, and it looks like in the long run the remaining problems will be solved.
Re: (Score:2)
Re: (Score:2)
I hope you are joking.
No QC in sight will fit on a PCI card anytime soon.
Re: (Score:2)
No QC in sight will fit on a PCI card anytime soon.
That's okay, I have an ISA slot open.
Mixing QC and GP CPUs at different temps (Score:3)
Chances are pretty good that your Quantum Computer will be running at liquid helium temperatures, maybe 4 Kelvin or so. Your general purpose CPU won't. There have been projects to run CPUs at liquid-nitrogen temperatures, and that already tends to get into mechanical difficulties; you're probably not going to be running your overclocked Xeon down at 4K.
Also, the quantum computer isn't likely to be something you're pumping a lot of data through - you're more likely to set it up, have it magically give you
Re: (Score:3)
An important thing to recognize is that most of this experiment was done with a single qubit. Practical quantum computing will need to have this sort of error rate for thousands of qubits.
I'd take any quantum computer with 50 qubits and get a Nobel prize for beating the shit out of all current supercomputers at simulating quantum systems like high-temperature superconductors, quark bound states such as protons and neutrons, or quantum magnets. Also, keep in mind that Rainer Blatt's group recently succeeded in demonstrating entanglement between 14 qubits [arxiv.org] in a similar setup. And for quantum simulations, the error rates probably don't have to be crazily low anyway; it turns out that such errors typ
Re: (Score:1)
You are correct; ion trap systems are in principle scalable, but the task is very challenging (probably much harder than scaling, say, a superconductor-based system). But it is much more than this. The operations needed to manipulate a single qubit are significantly different from the operations needed to interact two qubits. (There is no need to directly interact three or more qubits, since such gates can be built up out of two-qubit operations, much like the two-bit AND and one-bit NOT are universal fo
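The classical analogy in the comment above (AND plus NOT forming a universal gate set) can be sketched directly; the gate names below are just illustrative helpers:

```python
# Hedged sketch of classical gate universality: any Boolean function can
# be built from AND and NOT alone, analogous to how two-qubit gates plus
# single-qubit gates suffice for quantum circuits.

def NOT(a): return 1 - a
def AND(a, b): return a & b

# OR via De Morgan's law: a OR b = NOT(NOT(a) AND NOT(b))
def OR(a, b): return NOT(AND(NOT(a), NOT(b)))

# XOR built from AND, NOT, and the derived OR
def XOR(a, b): return AND(OR(a, b), NOT(AND(a, b)))

print([OR(a, b) for a in (0, 1) for b in (0, 1)])   # [0, 1, 1, 1]
print([XOR(a, b) for a in (0, 1) for b in (0, 1)])  # [0, 1, 1, 0]
```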
Is it just me, or... (Score:2)
Am I the only one who has difficulty thinking of quantum computers as things that actually exist and do calculations? It's like my brain has placed "quantum computer" firmly into the category "things that are theoretically possible but unable to be built with current technology", and refuses to change it, even to "things that exist in the lab but won't be commercially viable for decades outside classified government work".
Re: (Score:3)
There are indeed tangible benefits to quantum computing, beyond just attacking public key cryptosystems. As an example, quantum computers can speed up certain search algorithms, which is one of the promised commercial applications of a quantum computer.
Personally, I put quantum computers in
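The search speedup mentioned above is Grover's algorithm. A toy classical simulation (which of course gets no speedup, since we track the full state vector) shows how the marked item's amplitude gets boosted over ~sqrt(N) iterations:

```python
# Hedged toy simulation of Grover's search on 3 qubits (8 states), done
# with plain state-vector arithmetic. The real quadratic speedup requires
# quantum hardware; this just illustrates the amplitude amplification.
from math import sqrt, pi, floor

N, marked = 8, 5
state = [1 / sqrt(N)] * N                 # start in the uniform superposition

for _ in range(floor(pi / 4 * sqrt(N))):  # ~sqrt(N) Grover iterations
    state[marked] *= -1                   # oracle: flip the marked amplitude
    mean = sum(state) / N                 # diffusion: inversion about the mean
    state = [2 * mean - x for x in state]

print(max(range(N), key=lambda i: state[i] ** 2))  # 5, with high probability
```

After two iterations the marked state carries over 90% of the probability, versus 1/8 for a blind guess.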
Re: (Score:2)
That's because as of now it still is in the "theoretically possible but unable to be built" category. Keep in mind this new technique is for one (1) qubit. You need more (at least 8? Or do quantum computers work that differently from normal ones?) to do anything practical. And it only meets the theoretical requirement once you use error correction. Previously, it wasn't accurate enough that you could count on the error correction itself being performed right. Making a quantum computer, even in the lab, is still a few years off.
Or so I think. I don't
Re: (Score:2)
"Existing but not able to do anything practical" is still a pretty big difference from "Could exist but nobody's built one yet". It's like the early airplanes - they existed, but they weren't exactly useful for anything besides proving that heavier-than-air aircraft could work.
Sounds good (Score:2)
show that we have good enough control over our single-qubit operations that our probability of error is 1 per 50,000 logic operations.
They forgot to add: "we calculated this probability on our quantum computer"
A standard Open-Source Quantum Computing Language (Score:2)
What we really need is a "standardized" open-source quantum computing language so that we can develop and exchange quantum algorithms to prepare for the day when quantum computers are real.
Right now we have the QCL [tuwien.ac.at] language, QCF [sourceforge.net] for Matlab/Octave, and the Cove framework [youtube.com] that could be used with any language, but it looks like there is really only a C# implementation right now.
None of these have really taken hold as a "standard" though, and probably elements of all of them could be brought together in somet
Re: (Score:2)
What do you get when you have THREE competing standards and you try to take elements of them to make something multi-platform and all inclusive?
FOUR competing standards
Re: (Score:2)
Re: (Score:2)
Yet another person who has no idea how standards actually work, but is apparently literate enough to read XKCD.
Do tell. How do you think they really work?
Re: (Score:1)
Clearly your sense of humor does not fully implement the standard.
Re: (Score:2)
They aren't errors! (Score:2)
They are just answers to questions you haven't asked yet.
Re: (Score:2)
Perhaps something with a biological matrix..quick, get the mice!
Excellent news! (Score:2)
Now if someone could just implement Shor's algorithm, all 1-bit encryption algorithms will be rendered obsolete!
Ob. Imagine a Beowulf cluster of these... (Score:1)
...I'll wait.