IBM Touts Quantum Computing Breakthrough

Lucas123 writes "IBM today claimed to have been able to reduce error rates and retain the integrity of quantum mechanical properties in quantum bits, or qubits, long enough to perform a gate operation, opening the door to new microfabrication techniques that allow engineers to begin designing a quantum computer. While still a long way off, the creation of a quantum computer would mean data processing power would be exponentially increased over what is possible with today's silicon-based computing."
  • Exponentially? (Score:1, Informative)

    by drooling-dog ( 189103 ) on Tuesday February 28, 2012 @10:00AM (#39184417)

    data processing power would be exponentially increased over what is possible with today's silicon-based computing.

    Please, please, please stop misusing the word "exponentially". It just means that something is increasing (or declining) at a constant rate, which is practically the opposite of what is meant here.

  • by rgbrenner ( 317308 ) on Tuesday February 28, 2012 @10:07AM (#39184481)

    The Economist had an interesting article a couple of days ago... at least it's interesting if you don't really know the details of quantum computing:

    Quantum computing: An uncertain future [economist.com]

    Each extra qubit in a quantum machine doubles the number of simultaneous operations it can perform. It is this which gives quantum computing its power. Two entangled qubits permit four operations; three permit eight; and so on. A 300-qubit computer could perform more concurrent operations than there are atoms in the visible universe.
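
    To make the doubling concrete, here is a minimal sketch (my own illustration, not from the article, and assuming NumPy is available) of why each extra qubit doubles the bookkeeping: an n-qubit register is described by 2^n complex amplitudes.

    ```python
    import numpy as np

    def uniform_superposition(n_qubits):
        """State vector with equal amplitude on every one of the 2**n basis states."""
        dim = 2 ** n_qubits
        return np.full(dim, 1 / np.sqrt(dim), dtype=complex)

    for n in (1, 2, 3, 10):
        state = uniform_superposition(n)
        print(f"{n} qubits -> {state.size} simultaneous basis states")

    # A 300-qubit register would need 2**300 amplitudes, which is the
    # "more than atoms in the visible universe" comparison quoted above.
    print(f"2**300 is about {float(2 ** 300):.2e}")
    ```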

  • by JoshuaZ ( 1134087 ) on Tuesday February 28, 2012 @10:10AM (#39184515) Homepage
    Actually, this is a correct use. Some algorithms on quantum computers are exponentially faster than the best known classical algorithms. For example, estimating a Gauss sum http://en.wikipedia.org/wiki/Gauss_sum [wikipedia.org] scales exponentially in time on a classical computer, but the most efficient quantum algorithms are bounded by a polynomial. So exponential speedup is a valid use of the term here.
  • Re:Exponentially? (Score:5, Informative)

    by vlm ( 69642 ) on Tuesday February 28, 2012 @10:15AM (#39184557)

    The whole discussion is fubar

    First of all, the derivative of e to the x (the "exponential function") is e to the x. Yeah, that's true: the derivative is the same as the function itself. Welcome to 1st-semester calculus, kids. Not a constant, and I'm not even sure what "constantly increasing" means mathematically, although if the AC meant it's linear, that's a bucket of fail too.

    The next fubar is that quantum computing doesn't provide a magic exponential speedup. There is a page-length summary on Wikipedia, but it should come as No Surprise Whatsoever to anyone in CS that different algorithm designs inherently have different big-O behavior, and magically sprinkling quantum pixie dust doesn't change that: some algos are linear, some poly, some constant, some exponential, and all quantum computing does is move a few of them around. Solving for X where X+1=2 is not gonna change much; factoring into primes is going to change quite a bit. Some of the most interesting problems are polynomial time, not exponential, in quantum computing. http://en.wikipedia.org/wiki/Quantum_computer#Potential [wikipedia.org]

  • by Gary van der Merwe ( 831179 ) on Tuesday February 28, 2012 @10:19AM (#39184599)
    Quote from article:

    A qubit, like today's conventional bit, can have two possible values: a 0 or a 1. The difference is that a bit must be a 0 or 1, and a qubit can be a 0, 1, or a superposition of both. "Suppose you take 2 qubits. You can be in 00, 01, 10, and 11 at the same time. For 3 qubits you can be in 8 states at the same time (000, 001, 111, etc.). For each qubit you double the number of states you can be in at the same time. This is part of the reason why a quantum computer could be much more powerful," Ketchen said.

    I find that to be a terrible explanation. The statement "For each qubit you double the number of states you can be in at the same time" is also true for normal bits. Huh? Here is a better explanation: http://en.wikipedia.org/wiki/Qubit [wikipedia.org]
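
    Here is a small sketch of the distinction the parent is pointing at (my own example, assuming NumPy): n ordinary bits are always in exactly one of the 2^n possible states, while n qubits can carry an amplitude on all 2^n of them at once.

    ```python
    import numpy as np

    ket0 = np.array([1, 0], dtype=complex)   # |0>
    ket1 = np.array([0, 1], dtype=complex)   # |1>
    plus = (ket0 + ket1) / np.sqrt(2)        # equal superposition of |0> and |1>

    def register(*qubits):
        """Combine single-qubit states into one register via the tensor product."""
        state = np.array([1], dtype=complex)
        for q in qubits:
            state = np.kron(state, q)
        return state

    # Three classical-style bits: only one of the 8 basis states has any weight.
    classical = register(ket0, ket1, ket0)   # the definite string "010"
    # Three superposed qubits: all 8 basis states carry amplitude at the same time.
    quantum = register(plus, plus, plus)

    print("classical register:", np.round(classical.real, 3))
    print("quantum register:  ", np.round(quantum.real, 3))
    ```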

  • by vlm ( 69642 ) on Tuesday February 28, 2012 @10:46AM (#39184869)

    There's a nice wiki page with pages and pages of detailed explanation of what this post is talking about.

    http://en.wikipedia.org/wiki/Quantum_decoherence [wikipedia.org]

    Here's a nice analogy for quantum computing... it's a magic old-fashioned analog computer with serious reliability and I/O issues. Imagine at the dawn of the computer era you wanted to simulate the statics of a large railroad bridge. In 8 bits it would take a very long time, in 16 bits much longer... And to prevent rounding-error propagation you have issues. So why not simulate it with a thundering herd of analog op-amps that will "instantly" solve the bridge's static loads? OK, cool, other than that all the op-amps must work perfectly the entire time you take a measurement, which with vacuum tubes is questionable and with qubits maybe impossible. The other problem is that if you want 32-bit accuracy, your proto-computer engineer now needs to build a 32-bit A/D converter to connect to your analog computer... good luck... This is not a perfect quantum computing analogy, but pretty close in many regards.

    There is a bad trend in computer science to assume "all computers and algorithm programming problems are about the same," which they historically have been, but are not in the real world. Given two roughly identical algorithms and problems on two roughly identical computers, the smaller big-O notation wins every time, more or less. It is a huge mistake to apply that thinking across widely different architectures. OK, so factoring computation is exponential on classical computers, and everyone ignores I/O because that's constant with a normal bus design or at worst linear. OK, so factoring computation is poly on quantum computers, hooray for us... whoops, looks like I/O might go exponential, and the constant factor might be years/decades to get the thing working.

    The way to stay secure against a classical computer is to pick an algorithm whose big-O scaling means it can't be broken in this universe. The way to stay secure against a quantum adversary is to pick a key size that makes building the necessary quantum computer look like an engineering impossibility, even if by some miracle a quantum computer could solve the problem in poly time if only it could somehow be built.
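
    As a back-of-the-envelope sketch of that key-size advice (my own numbers, using only the well-known rule that a Grover-style search costs roughly the square root of a classical brute-force search): a k-bit symmetric key leaves about k/2 bits of work for a quantum brute-forcer, which is why doubling the key size is the usual hedge.

    ```python
    def brute_force_bits(key_bits, quantum_adversary=False):
        """Work factor, in bits, to brute-force a key: 2**k classically, ~2**(k/2) with Grover."""
        return key_bits / 2 if quantum_adversary else key_bits

    for key_bits in (128, 256):
        print(f"{key_bits}-bit key: ~{brute_force_bits(key_bits):.0f} bits of work classically, "
              f"~{brute_force_bits(key_bits, quantum_adversary=True):.0f} bits against Grover")

    # Even with a working quantum computer, a 256-bit key still leaves ~2**128
    # operations of work, which is the "engineering impossibility" margin above.
    ```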

  • by mathimus1863 ( 1120437 ) on Tuesday February 28, 2012 @11:11AM (#39185143)
    I took a class on quantum computing and studied many specific QC algorithms, so I know a little bit about them.

    Quantum computers are not super-computers. On a bit-for-bit (or qubit-for-qubit) scale, they're not necessarily faster than regular computers; they just process information differently. Since information is stored in a quantum "superposition" of states, as opposed to a deterministic state like in regular computers, the qubits exhibit quantum interference when mixed with other qubits. Typically, your qubit starts in 50% '0' and 50% '1', and thus when you measure it, you get a 50% chance of it being one or the other (and then it assumes that state). But if you don't measure it, and instead push it through quantum circuits that let it interact with other qubits, you get the quantum phases to interfere and cancel out. If you are damned smart (as I realized you have to be to design QC algorithms), you can figure out creative ways to encode your problem into qubits and use the interference to cancel out the information you don't want and leave the information you do want.

    For instance, some calculations will start with the 50/50 qubit above and end with 99% '0' and 1% '1' at the end of the calculation, or vice versa, depending on the answer. Then you've got a 99% chance of getting the right answer. If you run the calculation twice, you have a 99.99% chance of measuring the correct answer at least once. However, the details of the circuits which perform quantum algorithms are extremely non-intuitive to most people, even those who study it. I found it to require an amazing degree of creativity to figure out how to leverage quantum interference constructively.
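
    A tiny NumPy sketch of that interference (a toy of my own, not one of the real algorithms): one Hadamard gate turns |0> into the 50/50 qubit described above, but a second Hadamard makes the two paths to '1' cancel, so the answer comes back '0' with certainty.

    ```python
    import numpy as np

    H = np.array([[1,  1],
                  [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard gate
    ket0 = np.array([1, 0], dtype=complex)

    def measurement_probs(state):
        """Probability of reading '0' or '1' from a single-qubit state."""
        return np.abs(state) ** 2

    one_hadamard = H @ ket0          # 50/50 superposition
    two_hadamards = H @ (H @ ket0)   # amplitudes for '1' interfere destructively

    print("after one Hadamard :", measurement_probs(one_hadamard))    # ~[0.5 0.5]
    print("after two Hadamards:", measurement_probs(two_hadamards))   # ~[1.0 0.0]
    ```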

    But what does this get us? Well, it turns out that quantum computers can run anything a classical computer can, and such algorithms can be written identically if you really wanted to, but doing so gets the same results as the classical computer (i.e., the same order of growth). But the smart people who have been publishing papers about this for the past 20 years have been finding new ways to combine qubits, to take advantage of the nature of certain problems (usually deep, pure-math concepts), to achieve better orders of growth than are possible on a classical computer. For instance, factoring large numbers is difficult on classical computers, which is why RSA/PGP/GPG/PKI/SSL is secure. Its order of growth is roughly e^(n^(1/3)). It's not quite exponential, but it's still prohibitive. It turns out that Shor figured out how to get it to about n^2 on a quantum computer (which is the same order of growth as decrypting with the private key on a classical computer!). Strangely, trying to guess someone's encryption key, which is normally O(n) on classical computers (where n is the number of possible encryption keys), is only O(sqrt(n)) on QCs using Grover's algorithm. Weird (but sqrt(n) is still usually too big).
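
    To see what those orders of growth mean in practice, here is a rough comparison (my own sketch, taking the parent's simplified exponents at face value): doubling the problem size multiplies the e^(n^(1/3)) cost by a rapidly growing factor, while the n^2 cost only quadruples, and Grover turns an n-step key search into about sqrt(n) steps.

    ```python
    import math

    def classical_factoring(n):
        """Parent's simplified classical factoring scaling: e**(n**(1/3))."""
        return math.exp(n ** (1 / 3))

    def shor_factoring(n):
        """Parent's simplified scaling for Shor's algorithm: n**2."""
        return n ** 2

    # How much more work does doubling the problem size cost under each scaling?
    for n in (1_000, 10_000, 100_000):
        classical_ratio = classical_factoring(2 * n) / classical_factoring(n)
        quantum_ratio = shor_factoring(2 * n) / shor_factoring(n)
        print(f"n = {n:>6}: classical cost x{classical_ratio:,.0f}, quantum cost x{quantum_ratio:.0f}")

    # Grover's algorithm: searching n keys takes ~n steps classically, ~sqrt(n) quantumly.
    n_keys = 2.0 ** 128
    print(f"key search over 2**128 keys: ~{n_keys:.2e} vs ~{math.sqrt(n_keys):.2e} steps")
    ```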

    There's a vast number of other problems for which efficient quantum algorithms have been found. Unfortunately, a lot of these problems aren't particularly useful in real life (except to the curious pure mathematician). A lot of them are better, but not phenomenally so. For example, verifying that two sparse matrices were multiplied correctly has order of growth n^(7/3) on a classical computer and n^(5/3) on a quantum computer. You can find a pretty extensive list by googling "quantum algorithm zoo." But the reality is that "most" problems we face in computer science do not benefit from quantum computers. In those cases, they are no better than a classical computer. But for problems like integer factorization, bringing the compute requirements down to polynomial time isn't just faster: it makes a problem solvable that wasn't before.

    Unfortunately [for humanity], there is no evidence yet that quantum computers will solve NP-complete problems efficiently. Most likely, they won't. So don't get your hopes up about solving the traveling salesman problem any time soon. But there is still a lot of cool stuff we can do with them. In fact, the theory is so far ahead of the technology that we're anxiously waiting for breakthroughs like this, so we can start plugging problems through known algorithms.
  • by Darth Snowshoe ( 1434515 ) on Tuesday February 28, 2012 @12:14PM (#39185839)

    But Apple SHOULD do technological research. Because it provides a long-term competitive edge for them, and because it's the right thing to do. Corporations, like people, live in a larger society, culture (and nation), and they benefit from those things. Apple would not exist were it not embedded in the Silicon Valley culture emanating from Stanford and Berkeley. Apple should give something back. Maybe Steve would not understand this, but surely Woz would.

    Yeah, iPhones are great, but honestly, ten years from now, we'll be on to a newer, better UI (glasses, brain implants, holodecks, or whatever.) It turns out we're still using lasers and transistors and communications satellites, all invented by Bell Labs in the 60s.

    Here, I'm pasting the best bit from the NYTimes/Bell Labs article, written by Jon Gertner:

    "But what should our pursuit of innovation actually accomplish? By one definition, innovation is an important new product or process, deployed on a large scale and having a significant impact on society and the economy, that can do a job (as Mr. Kelly once put it) “better, or cheaper, or both.” Regrettably, we now use the term to describe almost anything. It can describe a smartphone app or a social media tool; or it can describe the transistor or the blueprint for a cellphone system. The differences are immense. One type of innovation creates a handful of jobs and modest revenues; another, the type Mr. Kelly and his colleagues at Bell Labs repeatedly sought, creates millions of jobs and a long-lasting platform for society’s wealth and well-being."

    The whole article is here (paywall yadda-yadda)
    http://www.nytimes.com/2012/02/26/opinion/sunday/innovation-and-the-bell-labs-miracle.html?pagewanted=all [nytimes.com]

  • by JoshuaZ ( 1134087 ) on Tuesday February 28, 2012 @12:28PM (#39186001) Homepage
    Entanglement doesn't allow you to communicate information. The following analogy may help. Imagine two coins that, whenever they are both flipped, end up either both heads or both tails, but you can't control which face comes up. So if you separate the two coins, you can use them as a shared source of randomness, which is useful for some things (like cryptography), but you can't use it to communicate.
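
    A toy simulation of the coin analogy (purely classical code of my own, just to illustrate "shared randomness but no signalling"): both sides always see the same outcome, yet neither side chooses it, so no message gets through.

    ```python
    import secrets

    def flip_entangled_pair():
        """Both 'coins' always agree, but nobody controls which face comes up."""
        outcome = secrets.choice(["heads", "tails"])
        return outcome, outcome   # Alice's result, Bob's result

    shared_bits = []
    for _ in range(8):
        alice, bob = flip_entangled_pair()
        assert alice == bob                          # perfectly correlated...
        shared_bits.append(1 if alice == "heads" else 0)

    # ...so both ends hold identical random bits (handy for crypto keys), but
    # because neither side picked the outcomes, no information was transmitted.
    print("shared random bits:", shared_bits)
    ```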
