D-Wave Large-Scale Quantum Chip Validated, Says USC Team
An anonymous reader writes "A team of scientists says it has verified that quantum effects are indeed at work in the D-Wave processor, the first commercial quantum optimization computer processor. The team demonstrated that the D-Wave processor behaves in a manner that indicates that quantum mechanics has a functional role in the way it works. The demonstration involved a small subset of the chip's 128 qubits, but in short, the device appears to be operating as a quantum processor."
It Still Doesn't Mean Much... (Score:4, Interesting)
Yeah, quantum effects are directly noticeable in the way it operates. Yeah, yeah, whatever. The whole deal isn't about that. It's about whether those quantum effects are actually useful for something. Like, um, making it usefully faster than classical computers. I would be very happy even if they had shown "just" polynomial running time improvements, say executing an O(N^3) algorithm in O(N^2) time. Even that would be a big deal. Somehow, I'm very skeptical that anything of the sort will ever be shown for this particular architecture. I would so like to be wrong on that.
Re: (Score:2)
Yeah, quantum effects are directly noticeable in the way it operates. Yeah, yeah, whatever. The whole deal isn't about that. It's about whether those quantum effects are actually useful for something. Like, um, making it usefully faster than classical computers. I would be very happy even if they had shown "just" polynomial running time improvements, say executing an O(N^3) algorithm in O(N^2) time. Even that would be a big deal. Somehow, I'm very skeptical that anything of the sort will ever be shown for this particular architecture. I would so like to be wrong on that.
That's where i'm at right now too. I wonder if the future may see my point of view in the same way as those who said people could never travel fast on a steam train because the air would be sucked out of the cabin...
Re: (Score:1)
The D-Wave computer has demonstrated its ability to solve optimization problems incredibly fast, and that is incredibly useful for a lot of companies and scientists.
Re:It Still Doesn't Mean Much... (Score:4, Insightful)
Re: (Score:2)
Really? I thought it was 12,000 times slower [archduke.org] than a normal computer when solving the one problem it does best, while costing approximately as many times more than said normal computer. That isn't exactly "incredibly fast" or "incredibly useful", is it? Scientists aren't too happy about it either, because the science, if it exists, is not being published.
Re: (Score:1)
You mean the IBM CPLEX tests [ibm.com] (Part 2 [ibm.com])?
"This said, best solutions CPLEX could find in 30 minutes are still worse than the best ones found using D-Wave black box or Tabu search in 29 problems, equal in 3, and better in only one problem. This is partly explained by the fact that CPLEX not only tries to find good feasible solutions, but it also spends a fair amount of time trying to prove optimality.
In short, we dramatically improved CPLEX results, but it does not really change the fact that heuristic methods
Re: (Score:1)
I see you didn't read my link. CPLEX is discussed there. Yes, D-Wave is faster than CPLEX. But does that matter when other classical implementations exist that are *much* faster than CPLEX? The link discusses two such implementations: One, written in plain Python (including direct for loops in Python, not exactly a recipe for efficiency), that still beats D-Wave by a factor of 120 in speed. It is available on a git repository which is also linked from that article. The other one is a C implementation o
Re: (Score:2)
The D-Wave computer has demonstrated its ability to solve optimization problems incredibly fast, and that is incredibly useful for a lot of companies and scientists.
no it hasn't, even this report says it's just potentially possible for it to solve them faster than traditional cpu.
the article is a little light on details, but it just says that it uses some quantum effect in some way. you know what that means? it means that technically it's not a _total_ fraud (bang for buck it is a fraud still though).
Re: (Score:2, Informative)
Fixed that for you, you left out the first/primary definition as shown below...
incredibly
Adverb
1. To a great degree; extremely: "incredibly brave".
2. Used to introduce a statement that is hard to believe; strangely: "incredibly, he was still alive".
Synonyms
unbelievably
Re: (Score:2)
yes, so to be incredibly brave one would have to be brave to the point where most people would not find it credible. That is, extremely brave.
Re: (Score:1, Funny)
Scientists prove Intel silicon chip conducts electricity!
A team of scientists says it has verified that electrical effects are indeed at work in the Intel processor. The team demonstrated that the Intel processor behaves in a manner that indicates that electrons have a functional role in the way it works. The demonstration involved a small subset of the chip's silicon traces, but in other words, the device appears to be operating as a circuit.
Scientists prove abacus beads are mobile!
A team of scientists say
Re: (Score:1)
From what I understood, a quantum computer would just make the time each "n" takes quite small. But I never thought it would make an O(n^3) algorithm run in O(n^2) time at all.
Re:It Still Doesn't Mean Much... (Score:4, Informative)
You can't fight an exponential or even polynomial complexity merely by reducing constant factors. It doesn't matter what the constant factor is. All it takes is bumping, say, RSA from 4096 to 16384 bits. That's all you need to beat any conceivable reduction in the constant factor. Just think about it.
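The scaling point above can be sketched numerically. This uses a toy cost model of my own (a crude 2^(bits/2) brute-force count, not the actual complexity of factoring RSA moduli), just to show a constant-factor speedup being swamped by a modest key-size bump:

```python
# Toy cost model (assumption: 2**(bits/2) operations, NOT the real
# complexity of factoring) to illustrate why constant-factor speedups
# lose to a modest increase in key size.

def work(bits, speedup=1):
    # operations required under the toy model, after a constant-factor speedup
    return 2 ** (bits // 2) // speedup

# Even a trillion-fold constant-factor speedup against 16384-bit keys
# leaves far more work than attacking 4096-bit keys with no speedup.
assert work(16384, speedup=10**12) > work(4096)
```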
Re:It Still Doesn't Mean Much... (Score:4, Informative)
Why would a quantum computer reduce the O notation?
Because it's running in multiple worlds simultaneously? It's not just using 1's and 0's but superpositions of the two that are effectively in both states at once. Heh... I really don't understand this stuff, but the big deal about quantum computing is that it will make some previously intractable (e.g., non-polynomial) problems accessible to us. All problems in complexity class BQP [wikipedia.org] become, essentially, polynomial on a quantum computer. If you've got enough qubits, among other things.
Re: (Score:1)
Re: (Score:3, Informative)
Pedantic nitpick: Quantum computers cannot break public key (RSA) encryption in O(1) time; for a modulus N the time complexity is O((log N)^3).
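To put that in perspective, here is a rough, unitless comparison of the (log N)^3 quantum scaling against the subexponential cost of the best known classical factoring algorithm (GNFS); the constants below are illustrative guesses of mine, not measured values:

```python
import math

def shor_ops(bits):
    # Shor's algorithm: polynomial in the bit length of the modulus
    return bits ** 3

def gnfs_ops(bits):
    # General number field sieve: subexponential in log N
    n = bits * math.log(2)  # natural log of the modulus
    return math.exp(1.923 * n ** (1 / 3) * math.log(n) ** (2 / 3))

# For a 2048-bit modulus the classical estimate dwarfs the quantum one.
assert gnfs_ops(2048) > shor_ops(2048) * 1e9
```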
Re: (Score:2)
A quantum computer can solve public key encryption in O(1) while it takes classical computers O(N^3). Which is the difference between minutes and billions of years.
this isn't that kind of a quantum computer though.
Re: (Score:1)
Because a quantum computer has access to operations that a classical computer does not have access to. A quantum computer can evaluate a function on all classical inputs at once; the problem is that you cannot read out the complete result (you can do so by repeating the calculation exponentially often, but then, you lose the advantage over classical computers). Therefore quantum algorithms are about bringing the interesting features into a form that can be easily (efficiently) read out.
Re: (Score:1)
One small step dude. Maybe one day it will lead to a standard quantum computer. But like searching for life outside our planet, going to the moon was pretty damn cool and so is this.
Re: (Score:3)
Exactly! I am glad just to know that someone is actually working on projects like this. It's not just another generation of current CPU technology it is something new and in time they will either master the technology or abandon the technology if things don't work out. But either way it is just nice to know there are people skilled and dedicated enough to attempt these advances.
Re: (Score:1)
Re: (Score:2)
Re: (Score:2)
Theoretically, it should be able to find the minimum of a set of numbers in O(N^0.5) instead of O(N). This is faster than a CPU, but likely slower than an equivalently priced GPU cluster.
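That O(N^0.5) figure is the Grover-style (Durr-Hoyer) query bound for minimum finding; an order-of-magnitude sketch, with constants ignored:

```python
import math

def classical_queries(n):
    # classical minimum-finding must examine every item
    return n

def quantum_queries(n):
    # Grover-based search: on the order of sqrt(N) queries (constants ignored)
    return math.isqrt(n)

# At a million items: 1,000,000 classical queries vs ~1,000 quantum ones.
assert quantum_queries(10**6) == 10**3
assert classical_queries(10**6) // quantum_queries(10**6) == 1000
```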
Re: (Score:2)
Good. I won't hold my breath until they actually demonstrate this behavior on a set where it makes a difference. Due to constant factors involved, they may have way more luck with factorization problems, since those can be shown to run fast in the pitiful amounts of "memory" they have available.
Re: (Score:2)
Indeed. That they have not strongly indicates that they cannot, because this thing is not actually useful.
Re: (Score:2)
Re: (Score:2)
Yeah, quantum effects are directly noticeable in the way it operates. Yeah, yeah, whatever.
That's not implausible - quantum effects are directly noticeable in the way your ordinary bipolar junction transistor operates.
Re: (Score:2)
What I said. Yeah, yeah, whatever. :)
Re: It Still Doesn't Mean Much... (Score:2)
Re: (Score:2)
The quantum effects are obviously useful for *something*, or D-Wave wouldn't manage to be selling these things.
Uh-uh, yeah, sure. That's not how real life works, unfortunately.
Re: (Score:2)
No empirical observation will allow you to determine the asymptotic complexity of any implementation of any algorithm.
Said someone who never tried such empirical observations. You're silly.
Re: (Score:2)
He is actually right. By definition, this cannot give you a priori data. And that is what big O is - a priori information.
In practice, with measurements, you can take a pretty good guess as to what the complexity is, but you can never know if it is actually correct. This is a common mistake that people make, and can easily cause them to draw incorrect conclusions.
Re: (Score:2)
Why are we even having this nitpick fest? You and the GP are just silly. We can easily observe the behavior of something as N is increased. If you can't figure out a difference between N^2 and N^3, then you're simply not trying hard enough, that's all. Sure, the measurement may show N^1.8 or N^2.7, with some measure of uncertainty attached to the exponent, like duh, why even argue about it?! Sure as heck you don't get O(N^integer) from the measurements, heck, you don't even get either O() or Omega(), since
Re: (Score:2)
No empirical observation will allow you to determine the asymptotic complexity of any implementation of any algorithm. ...
This is nonsense. Show me an algorithm and I'll tell you its complexity.
The rest of your post makes no sense either. "n" is the size of the input. Complexity is relevant for any input size; otherwise we would not have, e.g., dozens of different sorting algorithms.
Re: (Score:2)
You got it all wrong. Big-O is indeed about the tight upper bound, and the complexity of the input size. And as the number of operations increase, you bet your ass that it will be particularly useful. Oh you bet your ass.
GP is being an ass, and doesn't seem to understand what "asymptotic complexity" means. However, you are incorrect about big-O, which does not need to be a tight bound. You're thinking of big-theta. Wikipedia has a concise summary:
https://en.wikipedia.org/wiki/Big_theta#Family_of_Bachmann.E2.80.93Landau_notations [wikipedia.org]
Re: (Score:2)
Hence, it's just saying "sooner or later", this will always be a bound...
The question is (Score:2, Interesting)
Can it help crack today's cryptosystems, in what way, and how fast?
If it is able to do it then someone is doing it and we need to act.
Re: (Score:1)
No. It cannot. It can't do anything even close to as well as conventional semiconductors.
The point is that this might one day have the potential to be more than electrical circuits, but for now, it's really just an interesting research project.
Re: (Score:2)
Re: (Score:2)
A 10,000x factor is in the range of what specialized chips give you over general-purpose computers. You get it a bit cheaper with classical chips, but nobody is doing it, as it is still not worthwhile.
Re:The question is (Score:5, Informative)
Wrong kind of quantum computer. This does quantum annealing [wikipedia.org].
Re:The question is (Score:5, Interesting)
Not too surprisingly, when a large US military contractor became a major purchaser of D-Wave equipment, all the company claims about being able to factor large integers vanished. D-Wave was going to have a blog series on it. I looked at its architecture carefully, and yes, if the D-Wave machine has low enough noise, then a 512-qbit D-Wave machine should be able to factor integers close to 500 bits long. The next bigger machine could tackle 2,000-bit integers. The machine seems almost perfectly suited to this problem. The trick is dealing with noise. No one at D-Wave claims that their machine is perfectly coherent all the time during the annealing process. If 1 of the 512 bits suddenly drops out of quantum coherence, it will still act like a normal simulated annealing element until it re-enters coherence. Is noise like that enough to throw off detection of that one minimum solution? I don't know. I do feel that quantum effects will have a positive impact up to some temperature, after which it will just act like a normal annealing machine. I think there will be a phase change at some temperature where instead of qbits occasionally dropping out of coherence, just adding some noise to the solution, there will be so many out of quantum coherence that they will not be able to function at a chip-wide quantum level, and there will be no chance of finding that minimum energy solution.
Re:The question is (Score:4, Interesting)
I just went googling for my old posts about how to do integer factorization with D-Wave. Guess what? GONE! I thought I'd posted it in enough hard-to-scrub places... Anyway, all this machine does is minimize an energy equation. I found somebody who had integer factorization coded as an energy equation as the sum of squared terms, but with the D-Wave machine, it does that naturally, and you don't need to square anything. I've got a lot going on at work, my mother is being sued, and I'm doing some genetics stuff. Do I really need to go back and recreate the integer factoring equation?
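The sum-of-squares idea mentioned above can be shown with a toy version (the function name and search bounds are my own illustrative choices, not D-Wave's interface; this brute-forces the energy minimum, where an annealer, classical or quantum, would search the same landscape stochastically):

```python
import itertools

# Factoring n recast as energy minimization: E(p, q) = (n - p*q)**2 is
# zero exactly when p*q == n. A toy brute force finds the minimum here;
# an annealer would explore the same energy landscape stochastically.
def factor_by_energy(n, max_factor=64):
    candidates = itertools.product(range(2, max_factor), repeat=2)
    return min(candidates, key=lambda pq: (n - pq[0] * pq[1]) ** 2)

p, q = factor_by_energy(35)
assert p * q == 35  # the zero-energy state encodes the factors 5 and 7
```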
Re:The question is (Score:5, Funny)
I just went googling for my old posts about how to do integer factorization with D-Wave. Guess what? GONE!
That's what you get for observing them.
Re: (Score:1)
You can either know the exact equation or its exact location on the internet, but the Uncertainty principle clearly says you can't know both at the same time. We obviously know which he chose now.
Re: (Score:1)
Yes. Obviously you do. And you need to post it everywhere again. Duh.
Re: (Score:2)
Google this: dwave integer factorization New Scientist
Do you see all the "New Scientist" links near the top? Click on one of them. Of course you don't see it. These links started to fade to obscurity while I was writing this short response. If you do find one, you'll find the link goes nowhere.
Re: (Score:2)
By the way, the title of the New Scientist article should be "Controversial quantum computer beats factoring record"
Re: (Score:2)
Aw, crud... it only factored 143. I factored 300+ bit numbers with custom algorithms in Python, which only sounds impressive until you find out what others have done. Still... why are links to integer factorization by the D-Wave machine being removed from Google results?
Re: (Score:2)
I'm glad to know that I'm not the only one to think Google search results are getting worse. Not only do I find it more difficult to get the kind of results that are germane to my query but what does come back is also skewed to commercial sites rather than nuts-and-bolts stuff.
I'm also saddened that search is getting worse, of course. From the few times I've used it recently, I haven't found Bing to be much better.
Re: (Score:2)
Please recreate it if you can find the time. I regularly blog about quantum computing [wavewatching.net] and am happy to feature it and make sure it doesn't get lost again.
Re: (Score:2)
No. So far everything points to this device not actually being able to do anything useful faster than classical computers.
Re: (Score:2)
Seems already solved pretty well, see this:
https://github.com/exaexa/codecrypt [github.com]
Way to go AC answer my question before I got to ask it :) (don't have mod points)
Imagine a computer with this Quantum processor (Score:1)
and a Quantum Fireball hard drive... mind boggles
Re: (Score:1)
Quantum CPU + Quantum Fireball HD = HADOUKEN!
Was anyone really surprised by this? (Score:1)
I think everyone pretty much knew this with any even remotely entry level knowledge on the topic.
It was doing things that no classical computer could do in any reasonable time at the size it is.
Those benchmarks not too far back especially proved this fact.
I guess now though it is good that it is 100% confirmed so the morons can shut the hell up about it.
Looking forward to seeing what their new 512-qubit system could do. (other than make encryption useless within a human lifetime)
Re: (Score:2)
Not sure about that. Though qubits are great for prime factorization (the one-way function upon which mainstream cryptography relies, and which breaks if it is no longer one-way in practical terms), I'm not sure that it would help for, say, one-time pads or chained-XOR encryption methods (notably, though trivially simple to implement, IIRC using one immediately disqualified an encryption system from being legally exportable). I think in those ca
Re:Was anyone really surprised by this? (Score:5, Informative)
Re: (Score:2)
Why would a quantum annealer help break encryption? Isn't that a different field of quantum problem (factoring)?
Re: (Score:2)
Indeed it is. A quantum annealer is not a very useful thing, and it is not really faster than classical computers optimized for this.
Re: (Score:1)
You mean the benchmarks where a classical computer was faster? And okay, it's 'quantum'. Shor's algorithm doesn't run on a quantum annealer... the marketing department of the company that sells them is less optimistic than you.
What are you, a quantum fanboy?
Re:Was anyone really surprised by this? (Score:5, Interesting)
No, I mean the 439 benchmark just recently that absolutely destroyed classic computers. Mere seconds, compared to over half an hour.
That was a terrible benchmark. They measured performance against perhaps the most inefficient algorithm available (using a third-party implementation) - not even remotely doing the same type of computations. That was where the "3600-fold" improvement came from. Some other computer scientists spent a bit of time optimizing an algorithm (also annealing, I think) for conventional computers in response, with the eventual result that their implementation was faster than the D-Wave. Which makes the entire effort sound like $10 million to avoid writing better software in the first place.
It vaguely reminds me of all of the GPU benchmarks I've seen where single-precision floating-point performance on the GPU is compared to double-precision performance on the CPU. Except orders of magnitude worse.
Re: (Score:2)
But it doesn't matter what the times were for one specific run of the calculation. The question is how the two algorithms scale.
I saw a blog somewhere where the guy claimed the improved classical algorithm scales at the same rate as the quantum annealing algorithm, meaning no gain for D-Wave. But there wasn't any kind of proof in that post, just a claim.
Re: (Score:2)
For the record, for simulated annealing to guarantee an optimal result, the asymptotic worst-case time is O(infinity).
Worst-case O(infinity) complexity simply isn't interesting in the realm of stochastic algorithms, average case cannot be easily deduced, and best case O(1) is also not interesting.
Sometimes experiment is the only way to ascertain how something scales in t
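For concreteness, a minimal classical simulated-annealing loop looks like this (the quadratic energy and linear cooling schedule are my own toy choices; nothing here is D-Wave-specific):

```python
import math
import random

def anneal(energy, start, steps=5000, t0=10.0):
    # Classical simulated annealing over the integers with +/-1 moves.
    random.seed(0)  # fixed seed so the toy run is reproducible
    x = best = start
    for k in range(steps):
        t = t0 * (1 - k / steps) + 1e-9      # linear cooling schedule
        cand = x + random.choice((-1, 1))    # propose a neighboring state
        d = energy(cand) - energy(x)
        # accept downhill moves always, uphill moves with prob exp(-d/t)
        if d <= 0 or random.random() < math.exp(-d / t):
            x = cand
        if energy(x) < energy(best):
            best = x
    return best

# Toy quadratic energy with its minimum at x = 7.
assert anneal(lambda x: (x - 7) ** 2, start=50) == 7
```

The guarantee is only probabilistic: with an aggressive cooling schedule the loop can freeze in a local minimum, which is why the worst-case bound above is unbounded.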
Re: (Score:2)
Which makes the entire effort sound like $10 million to avoid writing better software in the first place.
How much time/money did they spend optimizing the software?
I always have to remind people of this in my line of work. They'll spend 10 minutes optimizing an equation that only gets run once. Sure it saved an hour of processing time with that 10 minutes of work but that 10 minutes of human resources was worth more than 100 hours of CPU time.
Re: (Score:2)
How much time/money did they spend optimizing the software?
That's a good question - I think it was not insignificant. And I agree that optimization by hand is often a terribly inefficient solution. However, I think this blog post by Scott Aaronson [scottaaronson.com] makes a good counter-argument:
Re: (Score:2)
http://www.youtube.com/watch?v=_Wlsd9mljiU [youtube.com]
Re: (Score:2)
Sorry, but absolutely nothing has been confirmed and the moron is you. If you want to see "quantum effects at work", have a look at any LED. This does not mean anything and the wording is carefully designed to obscure that fact.
Great Scott! (Score:5, Funny)
Great... now the NSA can record everything we do *and* everything we don't do in all possible parallel universes... Welp, the analog world was nice while it lasted I guess.
-- stoops
Re: (Score:1)
1. This isn't a general quantum computer. It's a "quantum annealer".
2. Not all classical encryption is necessarily vulnerable to quantum computing.
Re: (Score:3)
For block ciphers, the effective key bits are halved. For example, AES-256 remains completely secure even with a working general quantum computer. For AES-128, an attacker would need a lot more than 128 qubits, and it would still have to break 64 bits' worth of security. But constant factors do matter, and there is reason to believe general quantum computers (if they ever work) will not be able to do many steps per second.
RSA is a bit different; it could be in trouble. But there is always dlog crypto, and AFAIK, quantum computers do not help against
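The key-halving arithmetic can be made concrete (idealized Grover query counts; the constant factors and per-step costs stressed above are ignored):

```python
# Grover search needs about 2**(k/2) quantum operations against a k-bit
# key, versus 2**k classical brute-force tries (constants ignored).
def classical_tries(key_bits):
    return 2 ** key_bits

def grover_tries(key_bits):
    return 2 ** (key_bits // 2)

# AES-128 drops to ~2**64 quantum operations; AES-256 still costs ~2**128.
assert grover_tries(128) == 2 ** 64
assert grover_tries(256) == 2 ** 128
```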
Re: (Score:2)
The world runs on public key encryption, allowing two machines to set up hard-to-break encryption without the need for prior private channels of communication to pass otherwise vulnerable keys.
Re: (Score:2)
You need to be able to get the full modulus into the quantum computer. At this time, they are still stuck at a few entangled bits (the record in 2012 was a factorization of 21, i.e. 5 bits, up from 4 bits in 2001 for factoring 15), while RSA-2048 is standard today. If they proceed at this speed, RSA-2048 will be in trouble around the year 20,000. Hence, this is not a concern today and may never become one. In fact, real quantum computers may never grow to sizes where they become useful.
This could be huge (Score:2)
If this really works, it could be huge. One of the interesting things about quantum computing is that there has been a fair amount of algorithm development done for quantum computers even though they are barely out of the concept stage.
A bit dated but nice general background article on quantum computers:
The Quantum Computer [rice.edu]
Re: (Score:1)
This is not a "Quantum Computer" in that sense of the word.
The device easily finds "rest states" of qubits. It's basically a specialty ASIC that performs a few steps of a few different algorithms very well. (Imagine taking blobs of clay, shaping them into any shapes you like, and dropping them on the ground, or through a sheet/material with holes in it before they hit the ground.)
That's basically all the thing does. Impose states onto atoms (correct me if I'm wrong, I believe the qubits are molecules in D-
Re: (Score:2)
Thank you for that clarification.
Re: (Score:3)
there has been a fair amount of algorithm development done for quantum computers even though they are barely out of the concept stage
As the AC above me notes, most of those algorithms won't run on this particular computer. Building a more general-purpose quantum computer is vastly more difficult - this is not even remotely my field of expertise, but from what I've read it has something to do with error-correction. D-Wave is essentially taking a huge shortcut to end up with a vastly less powerful (but prob
Re: (Score:2)
I see. That is helpful. Thank you for that clarification.
Re: (Score:2)
That is the basic ingredient of any good scam: "if this really works, it could be huge". Then use enough obfuscation that even "experts" are confused, and you can sell the most pathetic things at a high price.
2 Standard Questions to Evaluate any tech (Score:2)
1) Can I run Linux on it?
2) Can I mine bitcoin with it?
Re:2 Standard Questions to Evaluate any tech (Score:5, Funny)
1) Yes
2) No
--next calculation--
1) No
2) Yes
Re: (Score:2)
What about maintenance? (Score:5, Funny)
Re: (Score:1)
Or I should say
For compiler design you will. Or for assembly as well. Once someone writes a C compiler for it you won't even need to know the name "Albert Einstein".
That's good, because Albert Einstein was not fond of quantum theory ;-).
Actually... (Score:3)
the device appears to be operating as a quantum processor
Maybe it both is and isn't, until you have a look at it.
Re: (Score:2)
the device appears to be operating as a quantum processor
Maybe it both is and isn't, until you have a look at it.
Or, we exist in the universe where it appears to be operating as a quantum processor, and in another universe right next door it's not, and instead you're making this joke about Sigurdur Thordarson existing as a superposition of both a WikiLeaks employee and an FBI informant.
Re: (Score:2)
Quantum Annealing (Score:1)
I think this is the same group I read about in Scott Aaronson's blog post last month: D-Wave: Truth finally starts to emerge [scottaaronson.com]. There is indirect evidence that the D-Wave machine is actually doing quantum annealing rather than classical annealing, which is a great accomplishment, but quantum computing is still a long way from being practical. And the D-Wave machine is no faster than classical simulated annealing running on a much cheaper normal computer.
quantum mechanics has a functional role... (Score:2)
Re: (Score:2)
I don't get it. (Score:2)
I don't want to pay $32 USD for the paper. Am I the only one who can't figure out what they proved and how? The paper's abstract doesn't help much to balance the media's interpretation.
Re:I don't get it. (Score:5, Informative)
I am pretty sure that this 7-month-old arXiv preprint [arxiv.org] corresponds to the Nature Communications [nature.com] paper. The titles and author lists are identical, but the abstracts differ, so who knows what changes it went through in revision (I don't have access to the official paper either, even at the university where I work). But presumably it covers the same ground, and it looks like all of the figures from the official paper are in the preprint.
(Yo, fuck Nature Publishing Group.)
Re: (Score:2)
We've been waiting! (Score:2)
And it does nothing. And everything. It defines what you want it to do; technically it's already done it.
I'll pay $903,845,908,435 for one!
Information in ArXiv (Score:2)
Since i stumbled back then over a related preprint:
http://arxiv.org/abs/1304.4595 [arxiv.org]
Everything which needs to be said is said there.
Quantum... (Score:2)
mine too (Score:2)
the six core AMD in my machine also depends on quantum effects. it can also do any calculation this quantum annealer can do.
The big ticket question (Score:2)
Can it outperform classical computers?
This remains to be seen for the time being, although early benchmarking was enough to convince Google to shell out some cash. [wavewatching.net]
Nevertheless, there is another set of benchmark results to be released soon, and those may paint a different picture. Unfortunately, I am not at all convinced that I can already win my bet on D-Wave with the current chip generation [wavewatching.net].
Of course 'hardliners' like Scott Aaronson maintain that quantum annealing will never get there in the first place [scottaaronson.com].
Make it run the interwebs! (Score:2)
That way there might be a state where there are no cats on the internet. Maybe.
Just make sure no one actually looks at the internet...
Re: (Score:3, Informative)
It won't become the thing for general computing use. There are specific applications where quantum operations can compute faster, but if it's a matter of what computers are normally used for, standard digital computing hardware is the thing.
That said, quantum processor cores may become an accessory you can buy for your computer, complete with the software needed to set up quantum optimization problems, and high end scientific workstations might have them built in some day.