
Will Quantum Computing Make It Out of the Lab? 129

alphadogg writes "Researchers have been working on quantum systems for more than a decade, in the hopes of developing super-tiny, super-powerful computers. And while there is still plenty of excitement surrounding quantum computing, significant roadblocks are causing some to question whether quantum computing will ever make it out of the lab. 'Artur Ekert, professor of quantum physics at the Mathematical Institute, University of Oxford, says physicists today can only control a handful of quantum bits, which is adequate for quantum communication and quantum cryptography, but nothing more. He notes that it will take a few more domesticated qubits to produce quantum repeaters and quantum memories, and even more to protect and correct quantum data. "Add still a few more qubits, and we should be able to run quantum simulations of some quantum phenomena and so forth. But when this process arrives at 'a practical quantum computer' is very much a question of defining what 'a practical quantum computer' really is. The best outcome of our research in this field would be to discover that we cannot build a quantum computer for some very fundamental reason; then maybe we would learn something new and something profound about the laws of nature," Ekert says.'"
  • by JoshuaZ ( 1134087 ) on Monday September 26, 2011 @02:41PM (#37518490) Homepage

    The current state of the field is advancing. The real problem, as discussed in TFA, is scaling quantum computers in a useful way that can still do error correction. Shor's algorithm, which lets you quickly factor numbers on a quantum computer, requires on the order of n qubits to factor an n-bit number. So if one wants to factor, say, a 300-digit (roughly 1000-bit) number used in some public-key cryptosystems, you would need to control on the order of a thousand qubits. The technology for that is clearly a long way off. There has been recent work using superconducting systems and quantum dots for qubits, both of which look more promising than previous systems. (The first experiments were done with NMR systems, which are clearly not very scalable.)
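    To make the qubit count concrete, here is a minimal classical sketch of Shor's algorithm in Python (the function names are illustrative, not from any library). The only quantum part of the real algorithm is order finding; below it is replaced by exponential brute force, which is exactly the piece the ~n qubits would buy you:

```python
import math
import random

def order(a, n):
    # Multiplicative order r of a mod n, found by brute force here.
    # This is the step a quantum computer does efficiently with O(log n) qubits.
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n):
    # Classical skeleton of Shor's algorithm for an odd composite n
    # (not a prime power); everything except order() runs in polynomial time.
    while True:
        a = random.randrange(2, n)
        g = math.gcd(a, n)
        if g > 1:
            return g                   # lucky hit: a shares a factor with n
        r = order(a, n)                # the quantum subroutine
        if r % 2 == 0:
            y = pow(a, r // 2, n)
            if y != n - 1:
                f = math.gcd(y - 1, n)
                if 1 < f < n:
                    return f           # nontrivial factor of n

print(shor_factor(3 * 5))  # prints 3 or 5
```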

    From a strictly theoretical compsci perspective, the set of things quantum computers seem able to do keeps growing. Recent work by Scott Aaronson and others suggests that BQP (the set of problems which can be efficiently solved by a quantum computer with a low probability of error) may not lie in the polynomial hierarchy at all: http://arxiv.org/abs/0910.4698 [arxiv.org]. This is a much stronger claim than the claim that BQP doesn't lie in NP, and it raises the hope that quantum computers may efficiently solve some problems thought of as extremely difficult, even ones outside NP. However, actually proving any strong results at this point is likely to be really tough: although many suspect that BPP (the classical analog of BQP) is equal to P, at this point we can't even prove that BPP lies in NP. In many ways theoretical comp sci is still very much in its infancy.

  • by TheSync ( 5291 ) on Monday September 26, 2011 @02:41PM (#37518504) Journal

    1) We have built qubits
    2) We have entangled qubits
    3) We have implemented the CNOT gate, which together with single-qubit gates forms a universal gate set for quantum computing (similar to the NAND/NOR universal gates in classical computing)

    The question is scaling up the number of qubits and increasing coherence times (and possibly using error-correcting codes to mitigate decoherence).

    We have a number of quantum algorithms [wikipedia.org] waiting to be implemented, and even quantum programming languages [wikipedia.org] whose simulators you can run at home today. And there is even a LinkedIn Group [linkedin.com] on quantum information science.
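    For a taste of what such a home simulation looks like, here is a minimal state-vector sketch in plain NumPy (an illustration, not one of the linked languages): it applies a Hadamard and then the CNOT from point 3) above, producing an entangled Bell state:

```python
import numpy as np

# Two-qubit state vector over the basis |00>, |01>, |10>, |11>.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I2 = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],                 # control = qubit 0, target = qubit 1
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.array([1, 0, 0, 0], dtype=complex)  # start in |00>
state = np.kron(H, I2) @ state                 # H on qubit 0: (|00> + |10>)/sqrt(2)
state = CNOT @ state                           # entangle: (|00> + |11>)/sqrt(2)

print(np.round(state.real, 3))  # [0.707 0.    0.    0.707]
print(np.abs(state) ** 2)       # measurement probabilities: 50/50 on |00> and |11>
```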

    But I must admit that it could end up like fusion: we have all the basic theoretical knowledge of how to do fusion, and we can do a bit of it in the lab; what we lack is the engineering knowledge to achieve fusion on a large enough scale to make it practical.

  • Let's not forget... (Score:5, Informative)

    by Nethemas the Great ( 909900 ) on Monday September 26, 2011 @02:48PM (#37518594)
    the history of the PC. How many decades did it take for us to get where we are? The first PC was some 50 years in the making, and by today's standards it was downright laughable in its capabilities. The first computers weren't von Neumann machines, either: a team of dedicated operators had to reconfigure patch cables between outputs and inputs for each and every calculation! To be so pessimistic so early in the life of quantum computing is insulting to the progress we've made so far, which considerably outstrips the pace of development of the modern computer.
  • by drolli ( 522659 ) on Monday September 26, 2011 @07:51PM (#37521670) Journal

    Yes, it will. The time frame for QC leaving the lab is somewhere between 15 and 50 years. If it doesn't work within the next 50 years, it means we understand something about quantum mechanics significantly wrong (or we will have figured out that QC is useless for some reason).

    There are several milestones:

    1) Implementing single qubits (done in many systems) and high-fidelity readout (done in a few systems)

    2) High-fidelity operations on single qubits (done in some systems)

    3) Controllable coupling of qubits (done in some systems) with a good on-off ratio (done in a few systems) in a decent architecture (only very few experiments, AFAIU) with a demonstration of simple QIP algorithms (done)

    4) Scalability in production yield for solid-state systems (NOT done, by far not) or in resource usage for other systems (atom chips are promising)

    5) Quantum media conversion between solid state and optics (done) with decent fidelity (far, far away), for using QIP devices as local processors in quantum communication

    6) Error-correcting schemes to lower the threshold in 2) to a doable value for building a scalable computer (that is, a computer which gains computational power when resources are added): theoretical (done) and experimental (far away)

    7) Theoretical understanding of QIP architecture (not done)

    Milestone 6, which implies that 1-4 (and, depending on the scheme, 5) have been solved, is the criterion for being able to build an arbitrarily powerful QC for arbitrary money. The more you exceed the absolute thresholds imposed on 2) and 4), the more power you gain by adding resources (it could be 10 or 10,000 physical qubits needed per logical qubit).

    The question is: when will it be economical to build one? I can't answer that, but the first place it may pay off is protein-folding simulation. We are looking at replacing a classical computer with 100 MW of input power by a quantum computer drawing a few MW (largely for condensing helium); that could mean power-cost savings of $10-100 million per year of QC runtime. Currently, the schemes predicted to scale with current hardware (on the rather optimistic end, i.e., the best experiments ever done) may require roughly a $100 million to $1 billion investment in hardware alone per QC (a hand-waving approximation), which is obviously unacceptable. But if the price goes down by a factor of 10 to 100 (which could happen in the next 20 years if better materials or schemes are found), then it would be economical.
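    A back-of-envelope check on those numbers (a sketch only: the power figures come from the comment above, and the electricity price is an assumption I've added):

```python
# All constants are rough assumptions, not measured data.
CLASSICAL_MW = 100      # input power of the classical machine (from the comment)
QUANTUM_MW = 5          # "some MW" for the QC, incl. helium condensation
USD_PER_MWH = 60        # assumed wholesale electricity price
HOURS_PER_YEAR = 24 * 365

savings = (CLASSICAL_MW - QUANTUM_MW) * USD_PER_MWH * HOURS_PER_YEAR
print(f"power savings: ${savings / 1e6:.0f}M per year of runtime")  # ~$50M/year

# Break-even time for three hardware price points: today's rough estimate,
# then the hoped-for factor-of-10 and factor-of-100 reductions.
for hw_cost in (1e9, 1e8, 1e7):
    print(f"hardware ${hw_cost / 1e6:,.0f}M -> break-even in "
          f"{hw_cost / savings:.1f} years")
```

    With these assumed prices, the break-even drops from roughly 20 years at today's estimated hardware cost to about 2 years after a factor-of-10 reduction, which is the sense in which a 10-100x price drop would make it economical.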
