
D-Wave Large-Scale Quantum Chip Validated, Says USC Team

Posted by timothy
from the or-they-don't dept.
An anonymous reader writes "A team of scientists says it has verified that quantum effects are indeed at work in the D-Wave processor, the first commercial quantum optimization computer processor. The team demonstrated that the D-Wave processor behaves in a manner that indicates that quantum mechanics has a functional role in the way it works. The demonstration involved a small subset of the chip's 128 qubits, but in other words, the device appears to be operating as a quantum processor."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by tibit (1762298) on Friday June 28, 2013 @07:24PM (#44138567)

    Yeah, quantum effects are directly noticeable in the way it operates. Yeah, yeah, whatever. The whole deal isn't about that. It's about whether those quantum effects are actually useful for something. Like, um, making it usefully faster than classical computers. I would be very happy even if they had shown "just" polynomial running time improvements, say executing an O(N^3) algorithm in O(N^2) time. Even that would be a big deal. Somehow, I'm very skeptical that anything of the sort will ever be shown for this particular architecture. I would so like to be wrong on that.

    • by jamesh (87723)

      Yeah, quantum effects are directly noticeable in the way it operates. Yeah, yeah, whatever. The whole deal isn't about that. It's about whether those quantum effects are actually useful for something. Like, um, making it usefully faster than classical computers. I would be very happy even if they had shown "just" polynomial running time improvements, say executing an O(N^3) algorithm in O(N^2) time. Even that would be a big deal. Somehow, I'm very skeptical that anything of the sort will ever be shown for this particular architecture. I would so like to be wrong on that.

      That's where i'm at right now too. I wonder if the future may see my point of view in the same way as those who said people could never travel fast on a steam train because the air would be sucked out of the cabin...

    • by Anonymous Coward

      The D-Wave computer has demonstrated its ability to solve optimization problems incredibly fast, and that is incredibly useful for a lot of companies and scientists.

      • by Shavano (2541114) on Friday June 28, 2013 @10:15PM (#44139529)
        No it hasn't.
      • by amaurea (2900163)

        Really? I thought it was 12,000 times slower [archduke.org] than a normal computer when solving the one problem it does best, while costing approximately as many times more than said normal computer. That isn't exactly "incredibly fast" or "incredibly useful", is it? Scientists aren't too happy about it either, because the science, if it exists, is not being published.

        • by Anonymous Coward

          You mean the IBM CPLEX tests [ibm.com] (Part 2 [ibm.com])?

          "This said, best solutions CPLEX could find in 30 minutes are still worse than the best ones found using D-Wave black box or Tabu search in 29 problems, equal in 3, and better in only one problem. This is partly explained by the fact that CPLEX not only tries to find good feasible solutions, but it also spends a fair amount of time trying to prove optimality.
          In short, we dramatically improved CPLEX results, but it does not really change the fact that heuristic methods

          • by amaurea (2900163)

            I see you didn't read my link. CPLEX is discussed there. Yes, D-Wave is faster than CPLEX. But does that matter when other classical implementations exist that are *much* faster than CPLEX? The link discusses two such implementations: one, written in plain Python (including direct for loops in Python, not exactly a recipe for efficiency), that still beats D-Wave by a factor of 120 in speed. It is available on a git repository which is also linked from that article. The other one is a C implementation o

      • by gl4ss (559668)

        The D-Wave computer has demonstrated its ability to solve optimization problems incredibly fast, and that is incredibly useful for a lot of companies and scientists.

        no it hasn't, even this report says it's just potentially possible for it to solve them faster than a traditional CPU.

        the article is a little light on details, but it just says that it uses some quantum effect in some way. you know what that means? it means that technically it's not a _total_ fraud (bang for buck it is a fraud still though).

    • Re: (Score:1, Funny)

      by Anonymous Coward

      Scientists prove Intel silicon chip conducts electricity!

      A team of scientists says it has verified that electrical effects are indeed at work in the Intel processor. The team demonstrated that the Intel processor behaves in a manner that indicates that electrons have a functional role in the way it works. The demonstration involved a small subset of the chip's silicon traces, but in other words, the device appears to be operating as a circuit.

      Scientists prove abacus beads are mobile!

      A team of scientists say

    • by Anonymous Coward
      I always get confused by this O notation. But why would a quantum computer reduce the O notation? If you don't fix the algorithm, the processor would still take n or n^2 or n^3, no?

      From what I thought, a quantum computer would just make the time each "n" takes quite small. But I never thought it would make an O(n^3) algorithm run in O(n^2) time at all.
      • by tibit (1762298) on Friday June 28, 2013 @09:08PM (#44139155)

        You can't fight an exponential or even polynomial complexity merely by reducing constant factors. It doesn't matter what the constant factor is. All it takes is bumping, say, RSA from 4096 to 16384 bits. That's all you need to beat any conceivable reduction in the constant factor. Just think about it.
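        The point about constant factors can be put in rough numbers. This is an illustrative sketch, not anyone's benchmark; the trillion-fold speedup is an invented figure:

        ```python
        import math

        # Hypothetical cost model for a brute-force-style task that scales
        # exponentially in the key size n: cost(n) = 2**n steps.
        def cost(n_bits):
            return 2 ** n_bits

        # Suppose a new machine is a trillion times faster (constant factor 1e12).
        speedup = 10 ** 12

        # How many extra key bits cancel that entire constant-factor speedup?
        extra_bits = math.log2(speedup)

        print(extra_bits)  # ~40 bits erase a 10^12x constant speedup
        ```

        So bumping a key from 4096 to 16384 bits dwarfs any conceivable constant-factor gain, which is the point being made above.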

      • by firewrought (36952) on Friday June 28, 2013 @09:17PM (#44139205)

        Why would a quantum computer reduce the O notation?

        Because it's running in multiple worlds simultaneously? It's not just using 1's and 0's but superpositions of the two that are effectively in both states at once. Heh... I really don't understand this stuff, but the big deal about quantum computing is that it will make some previously intractable (e.g., non-polynomial) problems accessible to us. All problems in complexity class BQP [wikipedia.org] become, essentially, polynomial on a quantum computer. If you've got enough qubits, among other things.

      • by Anonymous Coward
        A quantum computer can solve public key encryption in O(1) while it takes classical computers O(N^3). Which is the difference between minutes and billions of years.
        • Re: (Score:3, Informative)

          by Anonymous Coward

          Pedantic nitpick: Quantum computers cannot break public key (RSA) encryption in O(1) time; for a modulus N the time complexity is O(log(N)^3).
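          That O(log(N)^3) scaling can be made concrete (constants ignored; purely illustrative):

          ```python
          # Shor's algorithm scales roughly as O(log(N)^3) in the modulus N,
          # i.e. cubic in the bit length n = log2(N). Constants are ignored.
          def shor_scale(n_bits):
              return n_bits ** 3

          # Doubling the key size only multiplies the work by ~8.
          print(shor_scale(2048) / shor_scale(1024))  # 8.0
          ```

          That polynomial growth, versus the (sub)exponential cost of the best classical factoring algorithms, is exactly why RSA is considered at risk from a general quantum computer.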

        • by gl4ss (559668)

          A quantum computer can solve public key encryption in O(1) while it takes classical computers O(N^3). Which is the difference between minutes and billions of years.

          this isn't that kind of a quantum computer though.

      • Because a quantum computer has access to operations that a classical computer does not have access to. A quantum computer can evaluate a function on all classical inputs at once; the problem is that you cannot read out the complete result (you can do so by repeating the calculation exponentially often, but then, you lose the advantage over classical computers). Therefore quantum algorithms are about bringing the interesting features into a form that can be easily (efficiently) read out.

    • by Anonymous Coward

      One small step dude. Maybe one day it will lead to a standard quantum computer. But like searching for life outside our planet, going to the moon was pretty damn cool and so is this.

      • Exactly! I am glad just to know that someone is actually working on projects like this. It's not just another generation of current CPU technology; it is something new, and in time they will either master the technology or abandon it if things don't work out. But either way it is just nice to know there are people skilled and dedicated enough to attempt these advances.

      • You just made the only analogy that could make me think this chip is made by J.J. Abrams.
    • They're still working on that small issue of knowing only "where exactly the data is" or "what the data is"... on the plus side all you need do to flip a bit is observe it (but beware of infinite recursion). Personally I'm looking forward to the "Schrodinger Class" of processor... in spite of the strict No Refunds policy for open boxes.
    • by Hentes (2461350)

      Theoretically, it should be able to find the minimum of a set of numbers in O(N^0.5) instead of O(N). This is faster than a CPU, but likely slower than an equivalently priced GPU cluster.
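      The O(N^0.5)-versus-O(N) gap mentioned above can be sketched as query counts. The numbers below drop all constant factors of the quantum minimum-finding algorithm, so they are scaling illustrations only:

      ```python
      import math

      # Classical minimum-finding must inspect every element once: ~N queries.
      def classical_queries(n):
          return n

      # The quantum bound scales as ~sqrt(N); constant factors are dropped here.
      def quantum_queries(n):
          return math.isqrt(n)

      for n in (10**4, 10**6, 10**8):
          print(n, classical_queries(n), quantum_queries(n))
      ```

      The gap grows with N, but as noted above, constant factors and hardware cost decide whether it beats a cheap GPU cluster in practice.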

      • by tibit (1762298)

        Good. I won't hold my breath until they actually demonstrate this behavior on a set where it makes a difference. Due to constant factors involved, they may have way more luck with factorization problems, since those can be shown to run fast in the pitiful amounts of "memory" they have available.

    • by gweihir (88907)

      Indeed. That they have not strongly indicates that they cannot, because this thing is not actually useful.

    • The summary made me laugh. 'Scientists say they have verified that quantum effects are indeed at work in the D-Wave processor.' Exactly the same claim could be made about pretty much any vaguely modern computer: how do they think transistors work?
    • Yeah, quantum effects are directly noticeable in the way it operates. Yeah, yeah, whatever.

      That's not implausible - quantum effects are directly noticeable in the way your ordinary bipolar junction transistor operates.

    • The quantum effects are obviously useful for *something*, or D-Wave wouldn't manage to be selling these things. As far as I know, nobody has made any claims of general-purpose quantum computing.
      • by tibit (1762298)

        The quantum effects are obviously useful for *something*, or D-Wave wouldn't manage to be selling these things.

        Uh-uh, yeah, sure. That's not how real life works, unfortunately.

  • The question is (Score:2, Interesting)

    by Anonymous Coward

    Can it help crack today's cryptosystems, in what way, and how fast.

    If it is able to do it then someone is doing it and we need to act.

    • by Anonymous Coward

      No. It cannot. It can't do anything even close to as well as conventional semiconductors.

      The point is, that this might one day have the potential to be more than electrical circuits, but for now, it's really just an interesting research project.

      • by Bengie (1121981)
        Boeing and NASA are using DWAVE computers to crunch very specific types of data almost 10,000 times faster. A little more than just "research"
        • by gweihir (88907)

          A 10,000x speedup is in the range that specialized chips give you over general-purpose computers. You could get it a bit cheaper with classical chips, but nobody is doing it, as it is still not worthwhile.

    • Re:The question is (Score:5, Informative)

      by MightyYar (622222) on Friday June 28, 2013 @07:50PM (#44138707)

      Wrong kind of quantum computer. This does quantum annealing [wikipedia.org].

    • Re:The question is (Score:5, Interesting)

      by WaywardGeek (1480513) on Friday June 28, 2013 @07:54PM (#44138743) Journal

      Not too surprisingly, when a large US military contractor became a major purchaser of D-Wave equipment, all the company claims about being able to factor large integers vanished. D-Wave was going to have a blog series on it. I looked at its architecture carefully, and yes, if the D-Wave machine has low enough noise, then a 512-qubit D-Wave machine should be able to factor integers close to 500 bits long. The next bigger machine could tackle 2,000-bit integers. The machine seems almost perfectly suited to this problem. The trick is dealing with noise. No one at D-Wave claims that their machine is perfectly coherent all the time during the annealing process. If 1 of the 512 qubits suddenly drops out of quantum coherence, it will still act like a normal simulated-annealing element until it re-enters coherence. Is noise like that enough to throw off detection of that one minimum solution? I don't know. I do feel that quantum effects will have a positive impact up to some temperature, after which it will just act like a normal annealing machine. I think there will be a phase change at some temperature where, instead of qubits occasionally dropping out of coherence and just adding some noise to the solution, there will be so many out of quantum coherence that they will not be able to function at a chip-wide quantum level, and there will be no chance of finding that minimum energy solution.

      • Re:The question is (Score:4, Interesting)

        by WaywardGeek (1480513) on Friday June 28, 2013 @08:14PM (#44138869) Journal

        I just went googling for my old posts about how to do integer factorization with D-Wave. Guess what? GONE! I thought I'd posted it in enough hard-to-scrub places... Anyway, all this machine does is minimize an energy equation. I found somebody who had integer factorization coded as an energy equation as the sum of squared terms, but with the D-Wave machine, it does that naturally, and you don't need to square anything. I've got a lot going on at work, my mother is being sued, and I'm doing some genetics stuff. Do I really need to go back and recreate the integer factoring equation?
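        The squared-residual encoding mentioned above can be sketched as a toy. This is NOT the lost D-Wave formulation; it is a hypothetical illustration where exhaustive search stands in for the annealer, using E(p, q) = (N - p*q)^2, whose global minimum of zero sits exactly at the factor pairs:

        ```python
        # Hypothetical toy: factoring as energy minimization. Brute force
        # plays the role of the annealer searching the energy landscape.
        def energy(N, p, q):
            return (N - p * q) ** 2  # zero exactly when p*q == N

        def factor_by_minimization(N):
            # Exhaustively search the landscape for its global minimum.
            best = min((energy(N, p, q), p, q)
                       for p in range(2, N) for q in range(p, N))
            e, p, q = best
            return (p, q) if e == 0 else None

        print(factor_by_minimization(143))  # (11, 13)
        ```

        An annealer replaces the brute-force scan with a (hopefully faster) descent to that same zero-energy state; whether quantum effects help it get there is the open question in this thread.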

        • by BradleyUffner (103496) on Friday June 28, 2013 @08:42PM (#44139027) Homepage

          I just went googling for my old posts about how to do integer factorization with D-Wave. Guess what? GONE!

          That's what you get for observing them.

          • You can either know the exact equation or its exact location on the internet, but the uncertainty principle clearly says you can't know both at the same time. We obviously know which he chose now.

        • by Anonymous Coward

          Yes. Obviously you do. And you need to post it everywhere again. Duh.

          • Google this: dwave integer factorization New Scientist

            Do you see all the "New Scientist" links near the top? Click on one of them. Of course you don't see it. These links started to fade to obscurity while I was writing this short response. If you do find one, you'll find the link goes nowhere.

            • By the way, the title of the New Scientist article should be "Controversial quantum computer beats factoring record"

              • Aw, crud... it only factored 143. I factored 300+ bit numbers with custom algorithms in Python, which only sounds impressive until you find out what others have done. Still... why are links to integer factorization by the D-Wave machine being removed from Google results?

        • by quax (19371)

          Please recreate it if you can find the time. I regularly blog about quantum computing [wavewatching.net] and would be happy to feature it and make sure it doesn't get lost again.

    • by gweihir (88907)

      No. So far everything points to this device not actually being able to do anything useful faster than classical computers.

  • and a Quantum Fireball hard drive... mind boggles

  • by Anonymous Coward

    I think everyone pretty much knew this with any even remotely entry level knowledge on the topic.

    It was doing things that no classical computer could do in any reasonable time at the size it is.
    Those benchmarks not too far back especially proved this fact.

    I guess now though it is good that it is 100% confirmed so the morons can shut the hell up about it.
    Looking forward to seeing what their new 512-qubit system could do. (other than make encryption useless within a human lifetime)

    • by Empiric (675968)
      (other than make encryption useless within a human lifetime)

      Not sure about that. Though qubits are great for prime factorization (the one-way function upon which mainstream cryptography relies, and which breaks if it is no longer one-way in practical terms), I'm not sure that it would help for, say, one-time pads or chained-XOR encryption methods (notably, though trivially simple to implement, IIRC using it immediately disqualified an encryption system from being legally exportable). I think in those ca
    • by MightyYar (622222)

      Why would a quantum annealer help break encryption? Isn't that a different field of quantum problem (factoring)?

      • by gweihir (88907)

        Indeed it is. A quantum annealer is not a very useful thing, and it is not really faster than classical computers optimized for this.

    • by Anonymous Coward

      You mean the benchmarks where a classical computer was faster? And okay, it's 'quantum'. Shor's algorithm doesn't run on a quantum annealer... the marketing department of the company that sells them is less optimistic than you.

      What are you, a quantum fanboy?

    • I have video of the Vesuvius Chip that Google and Nasa are working with for AI.

      http://www.youtube.com/watch?v=_Wlsd9mljiU [youtube.com]

    • by gweihir (88907)

      Sorry, but absolutely nothing has been confirmed and the moron is you. If you want to see "quantum effects at work", have a look at any LED. This does not mean anything and the wording is carefully designed to obscure that fact.

  • by Anonymous Coward on Friday June 28, 2013 @07:29PM (#44138595)

    Great... now the NSA can record everything we do *and* everything we don't do in all possible parallel universes... Welp, the analog world was nice while it lasted I guess.

    -- stoops

    • by Anonymous Coward

      1. This isn't a general quantum computer. It's a "quantum annealer".
      2. Not all classical encryption is necessarily vulnerable to quantum computing.

      • by gweihir (88907)

        For block ciphers, the effective key bits are halved. For example, AES-256 remains completely secure even with a working general quantum computer. For AES-128, a quantum computer would need a lot more than 128 qubits, and it would still need to break 64 bits of effective security. But constant factors do matter, and there is reason to believe general quantum computers (if they ever work) will not be able to do many steps per second.

        RSA is a bit different; it could be in trouble. But there is always dlog crypto, and AFAIK, quantum computers do not help against
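        The key-halving arithmetic above follows from Grover-style search over the keyspace; a trivial sketch:

        ```python
        # Grover's search over a k-bit keyspace takes on the order of
        # sqrt(2^k) = 2^(k/2) evaluations, so the effective security level
        # is halved when measured in bits.
        def effective_quantum_bits(key_bits):
            return key_bits / 2

        print(effective_quantum_bits(256))  # 128.0 -> still far out of reach
        print(effective_quantum_bits(128))  # 64.0  -> the marginal case above
        ```

        This is why AES-256 is usually quoted as quantum-safe while AES-128 is considered marginal.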

        • by Rockoon (1252108)
          Certainly there are cryptographic methods that can survive a quantum attack, but are any of them public key?

          The world runs on public key encryption, allowing two machines to set up hard-to-break encryption without the need for a priori private channels of communication to pass otherwise vulnerable keys.
          • by gweihir (88907)

            You need to be able to get the full modulus into the quantum computer. At this time, they are still stuck at a few entangled bits (the record in 2012 was a factorization of 21, i.e. 5 bits, up from 4 bits in 2001 for factoring 15), while RSA-2048 is standard today. If they proceed at this speed, RSA-2048 will be in trouble around the year 20,000. Hence, this is not a concern today and may never become one. In fact, real quantum computers may never grow to sizes where they become useful.

  • If this really works, it could be huge. One of the interesting things about quantum computing is that there has been a fair amount of algorithm development done for quantum computers even though they are barely out of the concept stage.

    A bit dated but nice general background article on quantum computers:
    The Quantum Computer [rice.edu]

    • by Anonymous Coward

      This is not a "Quantum Computer" in that sense of the word.

      The device easily finds "rest states" of qubits. It's basically a specialty ASIC that performs a few steps of a few different algorithms very well. (Imagine taking blobs of clay, shaping them into any shapes you like, and dropping them on the ground, or through a sheet of material with holes in it before they hit the ground.)

      That's basically all the thing does. Impose states onto atoms (correct me if I'm wrong, I believe the qubits are molecules in D-

    • by the gnat (153162)

      there has been a fair amount of algorithm development done for quantum computers even though they are barely out of the concept stage

      As the AC above me notes, most of those algorithms won't run on this particular computer. Building a more general-purpose quantum computer is vastly more difficult - this is not even remotely my field of expertise, but from what I've read it has something to do with error-correction. D-Wave is essentially taking a huge shortcut to end up with a vastly less powerful (but prob

    • by gweihir (88907)

      That is the basic ingredient of any good scam: "if this really works, it could be huge". Then use enough obfuscation that even "experts" are confused, and you can sell the most pathetic things at a high price.

  • 1) Can I run Linux on it?

    2) Can I mine bitcoin with it?

  • by jennatalia (2684459) on Friday June 28, 2013 @07:44PM (#44138675)
    Are we going to need quantum mechanics to work on these chips and computers?
  • by Black Parrot (19622) on Friday June 28, 2013 @08:08PM (#44138821)

    the device appears to be operating as a quantum processor

    Maybe it both is and isn't, until you have a look at it.

    • the device appears to be operating as a quantum processor

      Maybe it both is and isn't, until you have a look at it.

      Or, we exist in the universe where it appears to be operating as a quantum processor, and in another universe right next door it's not, and instead you're making this joke about Sigurdur Thordarson existing as a superposition of both a WikiLeaks employee and an FBI informant.

    • by TeknoHog (164938)
      Ob. User Friendly [userfriendly.org]
  • by Anonymous Coward

    I think this is the same group I read about in Scott Aaronson's blog post last month: D-Wave: Truth finally starts to emerge [scottaaronson.com]. There is indirect evidence that the D-Wave machine is actually doing quantum annealing rather than classical annealing, which is a great accomplishment, but quantum computing is still a long way from being practical. And the D-Wave machine is no faster than classical simulated annealing running on a much cheaper normal computer.
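    For comparison, the classical simulated annealing mentioned above can be sketched on a toy Ising energy. The couplings are invented for illustration; both the D-Wave hardware and its classical competitors minimize energies of this general form:

    ```python
    import math, random

    # Toy 4-spin Ising energy E(s) = -sum_ij J[i][j]*s_i*s_j with
    # illustrative couplings (not from any real D-Wave problem).
    J = {(0, 1): 1.0, (1, 2): -1.0, (2, 3): 1.0, (0, 3): 0.5}

    def energy(s):
        return -sum(j * s[a] * s[b] for (a, b), j in J.items())

    def simulated_annealing(steps=5000, t0=2.0, seed=1):
        rng = random.Random(seed)
        s = [rng.choice((-1, 1)) for _ in range(4)]
        for k in range(steps):
            t = t0 * (1 - k / steps) + 1e-3  # linear cooling schedule
            i = rng.randrange(4)
            s[i] = -s[i]                     # propose a single spin flip
            # dE = new energy minus old energy (old = flip spin i back)
            dE = energy(s) - energy([x if n != i else -x
                                     for n, x in enumerate(s)])
            if dE > 0 and rng.random() >= math.exp(-dE / t):
                s[i] = -s[i]                 # reject the uphill move
        return s, energy(s)

    print(simulated_annealing())
    ```

    The quantum version replaces the thermal flips with quantum tunneling during a slow annealing schedule; the paper's claim is only that the D-Wave chip does the latter, not that it does so any faster than code like this on a normal computer.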

  • Is there anything in the universe in which quantum mechanics does not have a functional role?
  • I don't want to pay $32 USD for the paper. Am I the only one who can't figure out what they proved and how? The paper's abstract doesn't help much to balance the media's interpretation.

    • Re:I don't get it. (Score:5, Informative)

      by the gnat (153162) on Friday June 28, 2013 @10:27PM (#44139595)

      I am pretty sure that this 7-month-old arXiv preprint [arxiv.org] corresponds to the Nature Communications [nature.com] paper. The titles and author lists are identical, but the abstracts differ, so who knows what changes it went through in revision (I don't have access to the official paper either, even at the university where I work). But presumably it covers the same ground, and it looks like all of the figures from the official paper are in the preprint.

      (Yo, fuck Nature Publishing Group.)

      • by lachlan76 (770870)
        They are quite similar, though the Nature paper has been substantially edited (it is 30% shorter).
  • And it does nothing. And everything. It defines what you want it to do; technically it's already done it.

    I'll pay $903,845,908,435 for one!

  • Since I stumbled back then over a related preprint:
    http://arxiv.org/abs/1304.4595 [arxiv.org]

    Everything which needs to be said is said there.

  • Sounds like we are on the road to a good Quantum Leap! I can't wait to meet Al!
  • the six core AMD in my machine also depends on quantum effects. it can also do any calculation this quantum annealer can do.

  • Can it outperform classical computers?

    This remains to be seen for the time being, although early benchmarking was enough to convince Google to shell out some cash. [wavewatching.net]

    Nevertheless, there is another set of benchmark results to be released soon, and those may paint a different picture. Unfortunately, I am not at all convinced that I can already win my bet on D-Wave with the current chip generation [wavewatching.net].

    Of course 'hardliners' like Scott Aaronson maintain that quantum annealing will never get there in the first place [scottaaronson.com].

  • That way there might be a state where there are no cats on the internet. Maybe.

    Just make sure no one actually looks at the internet...
