Major Breakthrough As Quantum Computing In Silicon Hits 99% Accuracy (scitechdaily.com)

nickwinlund77 shares a report from SciTechDaily: UNSW Sydney-led research paves the way for large silicon-based quantum processors for real-world manufacturing and application. Australian researchers have proven that near error-free quantum computing is possible, paving the way to build silicon-based quantum devices compatible with current semiconductor manufacturing technology. [...] [The researcher's] paper is one of three published today in Nature that independently confirm that robust, reliable quantum computing in silicon is now a reality. This breakthrough is featured on the front cover of the journal.

[Professor Andrea Morello of UNSW, who led the work] and his team achieved 1-qubit operation fidelities of up to 99.95 percent, and a 2-qubit fidelity of 99.37 percent, with a three-qubit system comprising an electron and two phosphorus atoms introduced into silicon via ion implantation. A Delft team in the Netherlands led by Lieven Vandersypen achieved 99.87 percent 1-qubit and 99.65 percent 2-qubit fidelities using electron spins in quantum dots formed in a stack of silicon and silicon-germanium alloy (Si/SiGe). A RIKEN team in Japan led by Seigo Tarucha similarly achieved 99.84 percent 1-qubit and 99.51 percent 2-qubit fidelities in a two-electron system using Si/SiGe quantum dots.

The UNSW and Delft teams certified the performance of their quantum processors using a sophisticated method called gate set tomography, developed at Sandia National Laboratories in the U.S. and made openly available to the research community. Morello had previously demonstrated that he could preserve quantum information in silicon for 35 seconds, due to the extreme isolation of nuclear spins from their environment. But the trade-off was that isolating the qubits made it seemingly impossible for them to interact with each other, as necessary to perform actual computations. Today's paper describes how his team overcame this problem by using an electron encompassing two nuclei of phosphorus atoms.
The three papers from the UNSW team, Delft team and RIKEN group in Tokyo can be found at their respective links.
  • what am I missing (Score:3, Insightful)

    by Anonymous Coward on Thursday January 20, 2022 @09:18PM (#62193535)
    So what am I missing here? Isn't the core problem that those error rates compound as you add more and more qubits, which is what they need to do in order to achieve anything useful? When it comes to computing, even 99.999% is pretty atrocious, let alone anything worse, as it means every calculation has to be done at least three times to ensure accuracy.
    • It is easy when you have that level of accuracy. You can couple systems together and execute in parallel, relying on an odd number of voters to compare and validate results. The space shuttle had five computers, and IBM Z systems still do all operations twice, and a third time when CPUs disagree.
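
      A minimal sketch of that voting scheme (the replica results below are made-up stand-ins, nothing quantum):

          from collections import Counter

          def majority_vote(results):
              # Return the answer most replicas agree on, or None if no majority.
              answer, count = Counter(results).most_common(1)[0]
              return answer if count > len(results) // 2 else None

          # Hypothetical example: five replicas, one of them faulty.
          print(majority_vote([42, 42, 41, 42, 42]))  # 42
          print(majority_vote([1, 2, 3, 4, 5]))       # None -- no majority, retry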
      • by Anonymous Coward
        Except you would need much more than 3 computers in parallel for that approach to be viable, as those error rates would still see wrong answers on more than one machine often enough that you need 5 or 7 to have any degree of confidence, and a quantum computer makes an IBM Z system look like a Raspberry Pi cost-wise. So no, it ain't easy or cheap.
        • by ShanghaiBill ( 739463 ) on Friday January 21, 2022 @12:52AM (#62193825)

          Except you would need much more than 3 computers in parallel for that approach to be viable

          No, you don't. If you are factoring a 256-bit number into two 128-bit prime factors, the chance of two parallel computations coming up with the same wrong answer is infinitesimal.

          You can also verify the answer by multiplying the two prime factors on a conventional computer. Many NP problems are hard to solve but trivial to verify.
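
          That verification is a single multiplication; a sketch with a toy semiprime (the numbers are made up for illustration):

              def verify_factors(n, p, q):
                  # One multiply on a conventional computer settles it.
                  return p > 1 and q > 1 and p * q == n

              n = 3233  # toy example: 61 * 53
              print(verify_factors(n, 61, 53))  # True
              print(verify_factors(n, 61, 52))  # False -- rerun the quantum computation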

          • by hey! ( 33014 )

            In any case, with an NP problem you can verify the answer efficiently on a conventional computer.

          • That's probably true. Unfortunately, the odds that both computers come back with different, but both wrong, factors are infinitesimally close to 1.
            Unfortunately, the odds that both computers come back with different, but both wrong, factors are infinitesimally close to 1.

              If each bit is 99.95% accurate, then the probability of a 256-bit answer being correct is 0.9995^256 = 88%, and the probability of two QCs both being correct is 0.88^2 = 77%.

              If they are both wrong, throw away the result and try again.

              Or verify the answer on a conventional computer. BQP problems [wikipedia.org] are hard to solve but easy to verify. So just keep trying and verifying until you hit the correct solution.
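
              The arithmetic above, plus the expected number of retries it implies (a sketch; the per-bit accuracy is the 99.95% figure from the article):

                  p_bit = 0.9995                # per-bit accuracy (99.95%)
                  p_answer = p_bit ** 256       # probability a 256-bit answer is fully correct
                  print(f"one QC correct:   {p_answer:.0%}")       # ~88%
                  print(f"both QCs correct: {p_answer ** 2:.0%}")  # ~77%
                  # Retries are geometrically distributed, so on average:
                  print(f"expected runs per verified answer: {1 / p_answer:.2f}")  # ~1.14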

              • If the probability of hitting right is infinitesimally close to zero, you are no better off than on a classical computer. Try and do the calculation how many times you would have to throw away the results due to a calculation error before hitting right, based on your calculation.
        • by HiThere ( 15173 )

          Are you sure? From my reading, the reason for the current limit on the number of qubits is that the error rate is too high for error correction to be feasible. This implies that error correction is possible.
          If the systems are not in the same quantum state, then replicated calculations seem a perfectly reasonable way to get error correction. OTOH, this would seem to limit the spread of the calculation, so possibly some other mechanism is needed, but it appears that one is possible.

          I seem to recall a desig

      • by gweihir ( 88907 )

        It is easy when you have that level of accuracy. You can couple systems together and execute in parallel, relying on an odd number of voters to compare and validate results. The space shuttle had five computers, and IBM Z systems still do all operations twice, and a third time when CPUs disagree.

        All approaches that do not work for QCs.

      • It is easy when you have that level of accuracy. You can couple systems together and execute in parallel, relying on an odd number of voters to compare and validate results.

        Or ... you can validate the results on a cheap, boring, conventional computer.

        Repeat the quantum calculation if necessary.

      • by bws111 ( 1216812 )

        I don't know where this idea that IBM Z systems 'do all operations twice' comes from. They do not. What they have is extensive error detection throughout the processor. If an error is detected, the operation can be retried and/or moved to a different core.

    • by ceoyoyo ( 59147 )

      The people working on individual qubits are doing interesting things, particularly the ones that are compatible with conventional chipmaking, but it doesn't really mean much in terms of actual quantum computers until you see how it scales up.

    • So what am I missing here? Isn't the core problem that those error rates compound as you add more and more qubits, which is what they need to do in order to achieve anything useful? When it comes to computing, even 99.999% is pretty atrocious, let alone anything worse, as it means every calculation has to be done at least three times to ensure accuracy.

      Prime factorization is hard to solve, but trivial to verify on a classical computer. So just double-check the answer, and rerun the algorithm in the 0.01% of cases when the result turns out to be wrong.
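
      A sketch of that check-and-rerun protocol; quantum_factor below is a hypothetical stand-in for the quantum subroutine, simulated here as a noisy oracle:

          import random

          def quantum_factor(n):
              # Hypothetical noisy quantum subroutine: right answer most of
              # the time, garbage otherwise (made-up 88% success rate).
              return (61, 53) if random.random() < 0.88 else (59, 57)

          def factor_with_verification(n):
              while True:
                  p, q = quantum_factor(n)
                  if p * q == n:       # cheap classical double-check
                      return p, q
                  # wrong answer: throw it away and rerun

          print(factor_with_verification(3233))  # (61, 53)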

      • But the error rate is much higher once you combine enough qubits as would be required for factorization. Both because of the increase in the number of qubits, but also because the quantum computation itself involves many steps; it's essentially a full modular exponentiation. I saw an estimate that this calculation would take hours, so there's plenty of room for errors.
    • Sigh, yet another quantum breakthrough. If you add up all the announcements over just the past two years, we're already at 12,870% of what's required to build a full quantum computer, but for some reason we don't actually have one yet.

      Must be some sort of quantum maths I don't get the workings of.

    • So what am I missing here? Isn't the core problem that those error rates compound as you add more and more qubits, which is what they need to do in order to achieve anything useful? When it comes to computing, even 99.999% is pretty atrocious, let alone anything worse, as it means every calculation has to be done at least three times to ensure accuracy.

      It's much simpler than that.

      Quantum computers are mostly useless: they're only good for a few problems where a large space needs to be searched, e.g. factoring numbers.

      If a quantum computer spits out a couple of factors for a big number then it's easy to check the factors on a normal computer. If it's wrong? Repeat the computation.

      This result just means you need fewer iterations than before.

      • by HiThere ( 15173 )

        You're being overly simplistic. Quantum computers would be EXTREMELY valuable for computing the chemical bond energies of complex molecules. And that's a lot harder to check. I'm sure there are other similar problems.

        Factoring numbers is just the *really* low hanging fruit.

        • If you can't check the results then how will you know if a quantum computer ever gave you a correct answer?

    • The math and the physics are beyond me (you can't just use the old classical error detection/correction techniques we currently use). But from what I've read, there are ways to remove the errors: quantum error correction is an active field of research. The higher the accuracy you get, the smaller the number of qubits you need to add to your system to handle the errors.
      • The higher the accuracy you get, the smaller the number of qubits you need to add to your system to handle the errors.

        Exactly... but you have to also note that right now error rates and error correction are both bad enough that adding more qubits to correct errors currently just increases the total error rate. The error-correction qubits add more error than they correct. But if the base error rate of qubits drops far enough and the error correction techniques improve enough that adding error-correction qubits reduces the error rate, that means you can build a QC for any QC-accessible problem. It may have to be really, rea

    • So what am I missing here? Isn't the core problem that those error rates compound as you add more and more qubits, which is what they need to do in order to achieve anything useful? When it comes to computing, even 99.999% is pretty atrocious, let alone anything worse, as it means every calculation has to be done at least three times to ensure accuracy.

      As I understand it, what makes this really interesting is the combination of increased accuracy in the quantum computers, and increased effectiveness of quantum error correction schemes. The way it was explained to me is that to use quantum error correction you have to add additional qubits, and right now qubit accuracy is bad enough and error correction ineffective enough that adding more qubits for error correction adds more net error than it removes.

      But increasing accuracy in the basic quantum computi

  • by gillbates ( 106458 ) on Thursday January 20, 2022 @09:21PM (#62193537) Homepage Journal

    With 99.87% accuracy, that's 13 errors per 10k instructions. Put another way: if a modern 4-core CPU were as accurate, it would make 15 million errors per second.

    I do believe even Windows 95 couldn't segfault that fast.
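
    The back-of-envelope behind that figure, assuming roughly 3 GHz and one instruction per cycle per core (those two assumptions are mine, not the parent's):

        error_rate = 1 - 0.9987          # 13 errors per 10,000 operations
        cores, clock_hz = 4, 3e9         # assumed: 4 cores, ~3 GHz, 1 op/cycle
        print(f"{error_rate * cores * clock_hz:,.0f} errors/s")  # ~15,600,000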

    • by WoodstockJeff ( 568111 ) on Thursday January 20, 2022 @09:27PM (#62193553) Homepage

      when there's a chance that the result will be undecryptable by ANYONE, even with the key!

      • Re: (Score:2, Informative)

        by Anonymous Coward
        Quantum encryption, in most current implementations, is a joke to begin with. The only thing quantum about it is the name; it is just bog-standard encryption that happens to use a seed generated by a quantum method.
        • Key distribution is as important as generation. If I can get a key to you without any middlemen, then I have a major advantage.

            Key distribution is as important as generation. If I can get a key to you without any middlemen, then I have a major advantage.

            The whole quantum crypto thing is not worth all that much.

            You can't rely on the quantum channel alone: there are no care-of addresses stamped into photons, and no way to know who you are talking to, without burning through / relying on classical encryption and classically guarded pre-arranged secrets.

            Given that there are an infinite number of ways to exchange keys securely over an untrusted wire, if you are so paranoid that you can't rely on encryption to work as intended, you are in the same boat when it comes to using encrypt

      • when there's a chance that the result will be undecryptable by ANYONE, even with the key!

        Quantum encryption is completely unrelated to quantum computing. Their only connection is that they both rely on quantum effects.

        Quantum encryption isn't really encryption, either. It's a secure key distribution mechanism.
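
        For the curious, the sifting step at the heart of BB84-style key distribution can be sketched classically (a toy simulation with no actual quantum channel, obviously):

            import random

            def bb84_sift(n_bits=16, seed=0):
                # Toy BB84: keep only bits where sender and receiver picked the same basis.
                rng = random.Random(seed)
                bits    = [rng.randint(0, 1) for _ in range(n_bits)]  # Alice's raw bits
                a_basis = [rng.choice("+x") for _ in range(n_bits)]   # Alice's bases
                b_basis = [rng.choice("+x") for _ in range(n_bits)]   # Bob's bases
                # Matching bases -> Bob's measurement agrees with Alice's bit;
                # mismatched bases give random results and are discarded.
                return [b for b, ab, bb in zip(bits, a_basis, b_basis) if ab == bb]

            print(bb84_sift())  # shared key material; ~half the raw bits survive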

    • by Mr0bvious ( 968303 ) on Thursday January 20, 2022 @10:17PM (#62193629)

      Not all problems require accuracy. Sometimes speed over accuracy is what's required.

    • Actually, modern CPUs have built-in error detection and error correction circuitry. When the BER (bit error rate) gets too high, CPUs redo calculations and throttle down clock speed to try to decrease temperature.

      All they need to do now is figure out error detection and correction in quantum computers, and they should be as good as any modern CPU.

      • Some CPUs have that. To my knowledge, no consumer CPU for desktop/laptop has it. They just calculate the result once and assume it's right. There is error correction in caches and some buses, but that's it. Some enterprise CPUs do have error correction in calculations, but it's limited to very specific areas of the CPU. For instance, I think some mainframe CPUs had dual ALUs. But no matter what, any CPU can return a different result at any time. The probability per second is low, but it does happen.
    • by Gaglia ( 4311287 ) on Friday January 21, 2022 @03:11AM (#62193947)
      The breakthrough news is that this level of accuracy for the first time crosses the surface code error threshold, which is BIG. What it means is that there exist techniques (quantum error-correcting codes, of which the most famous is the surface code) that allow one to "bundle together" a bunch of noisy qubits into a logical, self-correcting, error-free one. However, in order for this to work, the underlying noisy qubits need to have a certain minimum precision which is already quite high, and which had never been achieved before. Now that threshold has been crossed and Pandora's box is open.

      This is big news indeed, because once you achieve a single error-free qubit, scaling up the number of qubits suddenly becomes much easier. It's not like it has been until now, where you need a huge effort to scale from, say, 40 to 50 qubits. For the last five years I've been pointing out that this would happen sooner or later and that it would catch everyone by surprise. But I'll refrain from sneering at the quantum hate club until we see real applications, otherwise I would look bad ;)

      Spoiler alert: there is still a long way to go before quantum computers can break RSA and friends. However, I expect applications very soon in industrial areas such as chemistry, materials science, biology, logistics, and optimization (AI I'm not so sure about; it seems like the promise of quantum for AI has been vastly scaled back in the last few years).
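
      To make the threshold point concrete, a commonly quoted surface-code heuristic is p_L ~ A * (p / p_th)^((d+1)/2); the constants below are assumed ballpark values, not numbers from these papers:

          def logical_error_rate(p, d, p_th=0.01, A=0.1):
              # Heuristic scaling of the logical error rate with code distance d.
              return A * (p / p_th) ** ((d + 1) / 2)

          # Physical error rate below threshold (0.5%, i.e. 99.5% fidelity):
          for d in (3, 7, 11, 15):
              print(d, f"{logical_error_rate(0.005, d):.1e}")
          # Below threshold, each step up in distance suppresses logical errors;
          # above threshold, bigger codes only make things worse.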
      • by Myrdos ( 5031049 )

        I completely understand. However, after years of research, my sneering accuracy has improved beyond the surface code error threshold, allowing me to bundle noisy sneers into a massive, error-free scowl.

        >:(

    • by AmiMoJo ( 196126 )

      Say you want to recover an encryption key. You use the quantum processor to do it reasonably quickly, instead of waiting potentially longer than your own lifespan for a normal CPU to find it.

      You can then test the key by using it to decrypt the data and checking that the output is valid. If not, just try again. Worst case you have to repeat the work.

      Similarly, say you want to optimize routing. There is no way to solve the "travelling salesman" problem that guarantees the best solution, except by brute force.
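
      (Strictly, exact methods like Held-Karp dynamic programming and branch-and-bound also guarantee the optimum, but the exponential blow-up stands.) A brute-force sketch on a made-up distance matrix, just to show the factorial scaling:

          from itertools import permutations

          def tsp_brute_force(dist):
              # Exact TSP by enumeration: O(n!) tours -- the scaling that hurts.
              best = None
              for perm in permutations(range(1, len(dist))):  # fix city 0 as start
                  tour = (0, *perm, 0)
                  length = sum(dist[a][b] for a, b in zip(tour, tour[1:]))
                  if best is None or length < best[0]:
                      best = (length, tour)
              return best

          dist = [[0, 2, 9, 10],   # toy symmetric distance matrix (made up)
                  [2, 0, 6, 4],
                  [9, 6, 0, 8],
                  [10, 4, 8, 0]]
          print(tsp_brute_force(dist))  # (23, (0, 1, 3, 2, 0))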

      • Say you want to recover an encryption key. You use the quantum processor to do it reasonably quickly, instead of waiting potentially longer than your own lifespan for a normal CPU to find it.

        What you would really be doing is waiting far longer than the present age of the universe, even if every single atom in the universe were turned into a normal CPU.

        No doubt quantum computers will eventually be very useful, yet this idea that nearly infinite computation could be extracted from the universe at a cost approaching zero is an unsupported fairy tale.

    • by PJ6 ( 1151747 )
      You're forgetting this is a quantum computer. With enough qubits, just a few cycles could perform calculations that would take current hardware thousands of years.

      Nobody's going to be running Windows on it. When it's ready for PC integration, it will be hardware you plug into a motherboard just like a graphics card: a QPU.
    • But Windows 95 can't do 1% quantum anything.
  • by Eunomion ( 8640039 ) on Thursday January 20, 2022 @11:28PM (#62193731)
    Do you feel especially empowered or intelligent because your phone can crunch numbers more massively than the Apollo spacecraft?

    You shouldn't. After all, that fact somehow doesn't let you go to the Moon like that glorified calculator did.

    I highly doubt that any level of "horizontal" improvement...no matter how many orders of magnitude are involved...will be any more impressive in real terms.
    • The Z80 once powered half of the world. The rest were C64 consoles with their 6502s.

      • by robi5 ( 1261542 )

        C64 "consoles"?

        • C64 "consoles"?

          They actually were called consoles because of the sheer number of people throwing out their back trying to lift that 1541 disk drive into place. “Sorry to hear about your back...”

        • C64 "consoles"?

          They were actually called consoles because of the sheer number of people throwing out their back trying to lift that 1541 disk drive into place.

      • Half the world once had calculators too.
      • My first ever chip. I learned to write assembly on the Z80A. Gateway to my career.
    • No, but if you get to hundreds of qubits with little error, you could solve some very large optimization problems that would take classical computers a long, long time to solve.

      • Optimization problems. Great. So the only issues at that point will be the initial assumptions and the interpretation of the results, i.e., the bulk of the effort in science and technology.
    • by Myrdos ( 5031049 )

      Do you feel especially empowered or intelligent because your phone can crunch numbers more massively than the Apollo spacecraft?

      I do! I feel as if I'm living in the future, with opportunities and possibilities previous generations could only dream of.

      You shouldn't.

      Don't take this away from me, it's all I have. That calculator couldn't go to the moon either, it had to hitch a ride on a rocket.

      • That calculator couldn't go to the moon either, it had to hitch a ride on a rocket.

        A rocket constructed and operated with the aid of said glorified calculators. Why are we not there now? Probably says something about the overestimation of the value of computational power.
    • This is silly reasoning. Going to the moon does not require advanced computation, however technologically impressive it may be; there are many very technologically impressive devices that do not require advanced computation.

  • Will be like, compute, compute, compute. Then, one week after your three-month warranty expires, you get the message: All of your qubits have lost entanglement; please send your quantum PCIe card in for servicing. Remember to only have your qubits serviced by Western Digital's approved representatives.

    • I was expecting to read: All your qubits are belong to us.

      -Missed opportunity.
      • I was expecting to read: All your qubits are belong to us.

        -Missed opportunity.

        Sorry, this is quantum computing.

        All your qubits are belong to us, or not, you won’t know till you look.

  • but by the looks of their bulky apparatus they are still cooling it down to very low temperatures, which means the average person won't have one for a long, long time.

    • The average person might not have much use for one anyway. The kinds of problems QC solves wouldn't make games any faster. Maybe an optimization for a spreadsheet, but you could just use a shared cloud-hosted QC.
      • The average person might not have much use for one anyway. The kinds of problems QC solves wouldn't make games any faster. Maybe an optimization for a spreadsheet, but you could just use a shared cloud-hosted QC.

        640 qubits is enough for anyone.

    • by ceoyoyo ( 59147 )

      There's very (very) little chance of ever having a quantum computer that doesn't have to be cooled down to liquid helium temperatures.

      • by HiThere ( 15173 )

        Disagree. But only with certain of the approaches. I think the one with spin states in a crystal with precisely distributed impurities might work at room temperature. Maybe a couple of the approaches based around light.

        Of course, none of the ones that might work are at the most advanced state of development, and (AFAICT) some are just a bit beyond "calculations show...". But confident assertions that room-temperature quantum computing won't happen are far premature.

        • by ceoyoyo ( 59147 )

          Crystals vibrate like mad. Even if you could somehow keep one qubit stable, to do anything interesting you need thousands of them, *and the links between them*.

          Photons don't interact with each other, so to do anything useful you're back to particles that do interact at some point, and they all interact with that pesky vibrating like mad thermal environment as well.

          I didn't say it's impossible, but I'm pretty comfortable with "very little chance."

      • There are research groups that have entangled quantum objects at room temperature.

        • by ceoyoyo ( 59147 )

          The problem isn't getting things entangled, it's preventing them from getting entangled with the environment, i.e. preserving coherence long enough to get something useful done. That gets harder and harder the bigger the system you're trying to keep coherent and the longer you need it to stay that way.

          • Yeah, I know that, but if you can't entangle them in the first place then it is not possible. You can shield noise or figure out ways to get around it.

  • by greytree ( 7124971 ) on Friday January 21, 2022 @03:44AM (#62193989)
    This explains WHY this means "robust, reliable quantum computing in silicon is now a reality":

      “You typically need error rates below 1 percent, to apply quantum error correction protocols. Having now achieved this goal, we can start designing silicon quantum processors that scale up and operate reliably for useful calculations.”

    We want to know WHAT THIS MEANS and WHY IT MEANS THAT. Real editors would put that in a short single-paragraph summary.
  • Maybe in a few decades.

  • Better than Excel?

  • These theoretical computers might be good for certain applications where close enough is OK. Biological brain cells certainly have inaccuracies, and we seem to do just fine (OK, that's debatable).

    That said, I seriously doubt quantum computers will change our world much. Trees falling in a forest make a sound. If God plays with dice, they have as many sides as every single thing in the entire universe and God can predict the outcome of a throw every time. Is it really that hard for people to grasp that quant
    • by Anonymous Coward

      These theoretical computers might be good for certain applications where close enough is OK.

      Can you point to any of these applications? All I'm seeing is theoretical quantum supremacy mumbo jumbo with some obscure algorithms.

  • by OneHundredAndTen ( 1523865 ) on Friday January 21, 2022 @08:56AM (#62194487)

    We get major quantum computing breakthroughs to the tune of a couple per month. How many more major breakthroughs will be needed before quantum computing can do something useful that ordinary computers can't tackle far more easily and efficiently?

    Somebody is indulging in hyperbole here - I don't know whether it is the press or the QC practitioners themselves. What I know is that one should look into the history of AI for reference.

    • by HiThere ( 15173 )

      I'm sure that everyone along the line is putting an optimistic face on the results, but there's another factor to consider:
      different groups are using different approaches to building qubits, and the techniques don't transfer easily (or sometimes at all) between them.

  • To me, one of the most important points here is glossed over, which is the physical makeup of these qubits. So far, the majority of qubit research has gone into "transmon" qubits, mostly because they can be fabricated with standard CMOS techniques. Transmons have achieved very high-fidelity control and readout, but they have abysmal coherence times. Coherence time is the average amount of time over which, having initialized a qubit into some particular quantum state, it will still be there when you loo

  • Quantum Computing will fail in Floating Point Math

    https://0.30000000000000004.co... [30000000000000004.com]
