D-Wave Large-Scale Quantum Chip Validated, Says USC Team
An anonymous reader writes "A team of scientists says it has verified that quantum effects are indeed at work in the D-Wave processor, the first commercial quantum optimization processor. The team demonstrated that the D-Wave processor behaves in a manner indicating that quantum mechanics plays a functional role in the way it works. The demonstration involved only a small subset of the chip's 128 qubits, but the device does appear to be operating as a quantum processor."
It Still Doesn't Mean Much... (Score:4, Interesting)
Yeah, quantum effects are directly noticeable in the way it operates. Fine, whatever. That isn't the whole deal, though. The real question is whether those quantum effects are actually useful for something, like making it usefully faster than classical computers. I would be very happy even if they had shown "just" a polynomial running-time improvement, say executing an O(N^3) algorithm in O(N^2) time. Even that would be a big deal. Somehow, I'm very skeptical that anything of the sort will ever be shown for this particular architecture. I would love to be wrong about that.
The question is (Score:2, Interesting)
Can it help crack today's cryptosystems, in what way, and how fast?
If it can, then someone is already doing it, and we need to act.
Re:The question is (Score:5, Interesting)
Not too surprisingly, when a large US military contractor became a major purchaser of D-Wave equipment, all the company's claims about being able to factor large integers vanished. D-Wave was going to run a blog series on it. I looked at the architecture carefully, and yes, if the D-Wave machine has low enough noise, a 512-qubit D-Wave machine should be able to factor integers close to 500 bits long. The next bigger machine could tackle 2,000-bit integers. The machine seems almost perfectly suited to this problem.

The trick is dealing with noise. No one at D-Wave claims that their machine is perfectly coherent throughout the annealing process. If one of the 512 qubits suddenly drops out of quantum coherence, it will still act like a normal simulated-annealing element until it re-enters coherence. Is noise like that enough to throw off detection of that one minimum solution? I don't know. My feeling is that quantum effects will have a positive impact up to some temperature, after which the device will just act like a normal annealing machine. I suspect there is a phase change at some temperature: instead of qubits occasionally dropping out of coherence and merely adding some noise to the solution, so many will be out of coherence that the chip cannot function at a chip-wide quantum level, and there will be no chance of finding that minimum-energy solution.
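To make that fallback regime concrete, here is a minimal classical simulated-annealing sketch of the kind of Ising energy minimization involved. Everything here is made up for illustration (the couplings, fields, and cooling schedule are not D-Wave's); it is just the purely classical behavior a fully decoherent chip would be reduced to:

import math
import random

def ising_energy(spins, J, h):
    # E = sum_ij J[i,j]*s_i*s_j + sum_i h[i]*s_i -- the form the chip minimizes.
    e = sum(h[i] * s for i, s in enumerate(spins))
    for (i, j), coupling in J.items():
        e += coupling * spins[i] * spins[j]
    return e

def anneal(J, h, n, steps=20000, t_start=5.0, t_end=0.01):
    spins = [random.choice([-1, 1]) for _ in range(n)]
    energy = ising_energy(spins, J, h)
    for step in range(steps):
        t = t_start * (t_end / t_start) ** (step / steps)  # geometric cooling
        i = random.randrange(n)
        spins[i] = -spins[i]                   # propose a single spin flip
        new_energy = ising_energy(spins, J, h)
        if new_energy > energy and random.random() >= math.exp((energy - new_energy) / t):
            spins[i] = -spins[i]               # reject: revert the flip
        else:
            energy = new_energy                # accept the move
    return spins, energy

# Made-up 4-spin antiferromagnetic ring; the ground state alternates spins.
J = {(0, 1): 1.0, (1, 2): 1.0, (2, 3): 1.0, (0, 3): 1.0}
h = [0.0, 0.0, 0.0, 0.0]
print(anneal(J, h, 4))   # typically ([1, -1, 1, -1], -4.0) or its mirror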
Re:The question is (Score:4, Interesting)
I just went googling for my old posts about how to do integer factorization with D-Wave. Guess what? GONE! I thought I'd posted it in enough hard-to-scrub places... Anyway, all this machine does is minimize an energy equation. I found somebody who had coded integer factorization as an energy equation expressed as a sum of squared terms; with the D-Wave machine, that falls out naturally, and you don't need to square anything. I've got a lot going on at work, my mother is being sued, and I'm doing some genetics stuff. Do I really need to go back and recreate the integer-factoring equation?
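For anyone curious, the sum-of-squared-terms formulation mentioned above is easy to sketch. This toy version is hypothetical (N, the bit width, and all names are made up) and brute-forced rather than annealed; it just shows the energy landscape such an equation defines, with the global minimum sitting at the true factors:

from itertools import product

N = 35      # made-up toy semiprime: 5 * 7
BITS = 3    # bits per candidate factor, enough for factors up to 7 here

def energy(p, q):
    # Squared-residual energy: zero exactly when p * q == N.
    return (N - p * q) ** 2

# Brute-force the same landscape an annealer would search.
candidates = product(range(2, 2 ** BITS + 1), repeat=2)
best = min(candidates, key=lambda pq: energy(*pq))
print(best, energy(*best))   # -> (5, 7) 0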
Re:Was anyone really surprised by this? (Score:5, Interesting)
No, I mean the recent 439-qubit benchmark that absolutely destroyed classical computers: mere seconds compared to over half an hour.
That was a terrible benchmark. They measured performance against possibly the most inefficient algorithm available (using a third-party implementation), one that wasn't even remotely doing the same type of computation. That was where the "3600-fold" improvement came from. Some other computer scientists then spent a bit of time optimizing an algorithm (also annealing, I think) for conventional computers, with the eventual result that their implementation was faster than the D-Wave. Which makes the entire effort sound like $10 million spent to avoid writing better software in the first place.
It vaguely reminds me of all of the GPU benchmarks I've seen where single-precision floating-point performance on the GPU is compared to double-precision performance on the CPU. Except orders of magnitude worse.