Lockheed Martin Purchases First Commercial Quantum Computer 189

Panaflex writes "D-Wave Systems announced general availability for its 128-qubit adiabatic quantum machine just two weeks ago, and reports of its first sale to Lockheed Martin have come out." The D-Wave Systems site has a rather informative collection of quantum computing papers.
  • by Anonymous Coward on Friday May 27, 2011 @03:46PM (#36267166)

    ...but I'm uncertain if I'll buy one. Maybe I should check with my cat.

    • by Anonymous Coward on Friday May 27, 2011 @04:24PM (#36267692)

      ...but I'm uncertain if I'll buy one. Maybe I should check with my cat.

      Oh NOW you remember to check the cat. It's been locked in that box for a week now. It's dead.
      or is it?

    • by cshark ( 673578 )

      I don't know about your cat, but the voices in my head say it's a keeper.

    • I sell a low-cost replacement for this 128-qubit computer: it is 128 boxes for cats. Cats not included; check your local newspaper for "free to a good home" listings in the Pets section. The only worry is beating the python owners to their free food source.
  • by pestie ( 141370 ) on Friday May 27, 2011 @03:49PM (#36267216)

    So, can this thing crack all non-quantum encryption, then? I seem to remember reading about how that would only require 32 qubits or so. And whether it can or can't, if commercial offerings have come this far, how long has the NSA had a version that can crack all encryption?

    • by jd ( 1658 )

      No, since you can't crack non-quantum one-time pad encryption without the encryption pad.
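
      A minimal sketch of why that holds (illustrative only, not part of the original comment; plain Python): for any ciphertext, every plaintext of the same length is consistent with some pad, so no amount of computing power, quantum or otherwise, can single out the real message.

      ```python
      import os

      def xor(a: bytes, b: bytes) -> bytes:
          return bytes(x ^ y for x, y in zip(a, b))

      message = b"ATTACK AT DAWN"
      pad = os.urandom(len(message))      # truly random, never reused
      ciphertext = xor(message, pad)

      # Without the pad, the ciphertext fits ANY same-length plaintext:
      # here is a pad that "decrypts" it to a completely different message.
      decoy = b"RETREAT AT TEA"
      fake_pad = xor(ciphertext, decoy)

      assert xor(ciphertext, pad) == message
      assert xor(ciphertext, fake_pad) == decoy
      ```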

      • by pestie ( 141370 )

        OK, fair enough - I had forgotten about one-time pads. I really should have specified "all encryption based on multiplying two large primes," since that's the vast majority of commercially-significant encryption. I'm not even sure if there's a theoretical quantum attack on elliptic-curve algorithms or not.

        • Re: (Score:3, Informative)

          by Anonymous Coward

          Adiabatic quantum computing != "classic" quantum computing.

          It does NOT run Shor's algorithm.

          You can use SSL to download your porn safely tonight.

          • by jd ( 1658 )

            It would be better to use TLS, though, as it overcomes some of the problems with SSL.

        • AES isn't commercially-significant? What about RC4?
        • by MaskedSlacker ( 911878 ) on Friday May 27, 2011 @05:09PM (#36268144)

          I really should have specified "all encryption based on multiplying two large primes," since that's the vast majority of commercially-significant encryption

          No it isn't. It's public/private key encryption. Symmetric key ciphers (which are far more significant) rely on a variety of algorithms. The main use of public/private key is for exchanging symmetric keys.

          In short, RSA (and similar) would be useless, but AES (and similar) would remain secure. The real problem would become one of securely exchanging symmetric keys.
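
          A sketch of the hybrid pattern being described (my illustration, not part of the original comment; assumes the Python cryptography package): the asymmetric step only wraps the symmetric session key, so a machine that breaks RSA exposes the key exchange while the AES layer itself is untouched.

          ```python
          import os
          from cryptography.hazmat.primitives import hashes
          from cryptography.hazmat.primitives.asymmetric import rsa, padding
          from cryptography.hazmat.primitives.ciphers.aead import AESGCM

          # Recipient's key pair (the public/private-key part).
          private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
          public_key = private_key.public_key()

          # Symmetric session key (the part that would stay secure).
          session_key = AESGCM.generate_key(bit_length=256)

          # RSA is used only to ship the session key...
          oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                              algorithm=hashes.SHA256(), label=None)
          wrapped_key = public_key.encrypt(session_key, oaep)

          # ...while the bulk data is protected with AES-GCM.
          nonce = os.urandom(12)
          ciphertext = AESGCM(session_key).encrypt(nonce, b"the actual message", None)

          # Breaking RSA recovers session_key from wrapped_key; AES itself is not
          # weakened, which is why key exchange becomes the real problem.
          assert private_key.decrypt(wrapped_key, oaep) == session_key
          ```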

          • by jd ( 1658 )

            One way you -might- be able to securely exchange keys using an RSA-style algorithm would be to chop the keys up. What I'm thinking is this:

            N proxy servers, of which M receive part of the key using RSA or something similar. The rest get something random, but it's also encrypted using the same method. Each server then transmits what they get to the intended recipient the same way. You'd need some algorithm to generate what combination is the correct one (which then becomes vulnerable to attack itself) and som

            • So, I don't know if I follow you, but... you're proposing security by obscurity? But then you say that is open to attacks itself.

              Also, wouldn't a man in the middle at the recipient's door bypass your proxy configuration?

              • by jd ( 1658 )

                No, I'm proposing that the network topology can essentially add bits to an encryption key when you're already at the upper limit of what the algorithm itself can take without harming security. If the problem of decryption can be made exponentially harder for each node added, then one node equals some fixed number of bits (probably 1). If you can make the problem increase in severity faster than that, then every node added adds more bits to the key than the last.

                And that's the key question. If you can only d

              • by jd ( 1658 )

                Oh, and yes, to answer the other question, it does. Every component in a crypto system can be attacked, so having a sifter/sieve algorithm plus N proxies adds N+2 components that are potential targets for attack. Since the proxies decrypt and re-encrypt, if the proxies are broken then the plaintext is readable. If the sieve or sifter are broken, the combinatorial explosion is of no significance since the attacker will look only at the channels of interest.
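
                A rough sketch of the key-splitting step described upthread, stripped down to plain XOR secret sharing (my illustration only, not jd's full proxy/sieve scheme): every share is needed to rebuild the key, so an eavesdropper has to compromise all of the channels.

                ```python
                import os
                from functools import reduce

                def xor(a: bytes, b: bytes) -> bytes:
                    return bytes(x ^ y for x, y in zip(a, b))

                def split_key(key: bytes, n: int) -> list:
                    """Split key into n shares; all n are needed to rebuild it."""
                    shares = [os.urandom(len(key)) for _ in range(n - 1)]
                    shares.append(reduce(xor, shares, key))
                    return shares

                def join_key(shares: list) -> bytes:
                    return reduce(xor, shares)

                key = os.urandom(32)            # e.g. an AES-256 session key
                shares = split_key(key, 5)      # ship each share via a different proxy
                assert join_key(shares) == key  # the recipient recombines them

                # Any subset of fewer than all shares is statistically independent
                # of the key, so tapping some (but not all) channels reveals nothing.
                ```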

            • The idea is interesting but the only thing that makes this hard to crack is the fact that multiple users interact simultaneously. The proxy has to envelop the packages and encrypt them again, though.

              That means 2 packets coming from A go via the proxy to D, and 2 packets from B go via the proxy to D. An attacker does not really know which of the 4 packets came from A and which from B.

              However, the attacker knows each "valid" packet has a "sequence" number somewhere; that opens a possible attack vector.

              an

    • by stevelinton ( 4044 ) <sal@dcs.st-and.ac.uk> on Friday May 27, 2011 @03:56PM (#36267336) Homepage

      An Adiabatic Quantum Computer is quite a different beast from a quantum computer in the usual sense, and even if it can solve the same class of problems in polynomial time (not at all obvious at this stage) it isn't at all clear that 1 qubit in this machine does the same work as 1 traditional qubit.

      They are, to be honest, being a little bit naughty calling this a quantum computer at all, although it does compute and has quanta, but so does my phone.

      • So, what you're saying is that them calling this computer a "quantum computer" is a bit like calling my vintage LED wrist-watch an "atomic clock" -- since, technically, it does tell time using atoms, but not in the way you would normally think of a "real" atomic clock.
    • Re: (Score:2, Informative)

      by Anonymous Coward

      Meh. Quantum computing, even at its *full* potential, is no threat to symmetric encryption. The recommended minimum key size will jump a moderate amount and you'll be all set again. The effect on asymmetric encryption depends on the type. Some could be severely compromised. BUT, seeing as operations are currently exceptionally fast for end users AND that asymmetric encryption is generally only used to *establish* symmetrically-protected channels over insecure networks, they could probably be jumped up by se

    • The real question is "Does the D-Wave 'quantum computer' do anything useful at all?"

      See Scott Aaronson's opinions on the topic: http://blogs.forbes.com/alexknapp/2011/05/24/q-and-a-with-prof-scott-aaronson-on-d-waves-quantum-computer/ [forbes.com]

      Aaronson is a brilliant quantum computational complexity professor at MIT. You can read his blog at http://www.scottaaronson.com/ [scottaaronson.com]

    • So, can this thing crack all non-quantum encryption, then? I seem to remember reading about how that would only require 32 qubits or so. And whether it can or can't, if commercial offerings have come this far, how long has the NSA had a version that can crack all encryption?

      I don't know the implications of these computers for modern cryptography, but assuming they can trivially break encryption it means the public will never get their hands on them. Governments and big corporations would have them (it's jus

      • by Hadlock ( 143607 )

        The tin foil hat store is that way -->

        Quantum add-on cards will be commercially available in 5-8 years for your desktop. Quantum desktops are probably 10-12 years out.

        • Do you honestly believe the government would ever allow the public to purchase computers that easily break public-key encryption? Supercomputers are already classified as munitions by the US government. Tighter restrictions on quantum computers are not at all unreasonable, especially if it turns out they may be used to break public-key encryption schemes.

    • Forget ciphering, imagine all the bitcoins you could mine with this thing!
    • And whether it can or can't, if commercial offerings have come this far, how long has the NSA had a version that can crack all encryption?

      Why do you presume they don't already have such a thing? (Other than the OTP, of course.)

    • by drolli ( 522659 )

      Disclaimer: I worked on QC, but not for D-Wave (although one of my former employers had dealings with them).

      D-Wave does not aim to build a machine to crack codes. They build a machine which can do what can be done using the technology available now.

      The normal ideas about how to make a QC work, even for, let's say, factoring 128-bit numbers, require many more logical qubits than are available. The logical qubits would be composed from physical qubits, and for all coding schemes besides some quite new ones the minimum

    • As "quantum computeres" are not programmed in the normal way with floating point or fixed point variables and for loops and if/else constructs you will need an new algorithm and way to model it and a way to represent the data for every encryption method you want to crack.
      In other words: if we two exchange some PGP encrypted messages, you need to figure how to represent them in the cubits and how to tackle the encryption. There still lots of work to do ;D

      angel'o'sphere

  • Sounds like hell to program. You start by finding a complex Hamiltonian with a ground state describing the solution to your problem, and it gets more math-filled from there. If you want to solve a problem with a quantum computer, you're going to need a quantum physicist to tell it what to do.
    • by jd ( 1658 )

      So it's like fuzzy logic, only they got tired of having muppets run the IT department?

    • I guess I'll just have to wait for the Apple Quantum Computer User Experience.

      • Re:iQubit (Score:5, Funny)

        by Jeremi ( 14640 ) on Friday May 27, 2011 @05:51PM (#36268496) Homepage

        I guess I'll just have to wait for the Apple Quantum Computer User Experience

        Me too -- in particular I'm looking forward to the quantum MWI version of FaceTime, which connects you to various alternate-universe versions of yourself, so you can compare notes and see who made the better decisions.

    • by stevelinton ( 4044 ) <sal@dcs.st-and.ac.uk> on Friday May 27, 2011 @03:58PM (#36267362) Homepage

      A traditional digital computer is pretty hellish to program too if you take away all the props -- you have to find a set of bit values for the memory such that this immense construction of hundreds of millions of gates, clocks, latches, etc. will evolve to give your answer in a reasonable time.

    • You start by finding a complex Hamiltonian with a ground state describing the solution to your problem

      I'm not a math whiz, but to me, this says: "You already know the answer to the problem"...
      How can this device help you, if you already have the solution? Is it used for proving the validity, or similar?

      • by pushing-robot ( 1037830 ) on Friday May 27, 2011 @04:11PM (#36267548)

        The basic idea is to enter "42" and see what happens.

        • by youn ( 1516637 )

          and make sure you watch for highway signs at the local intergalactic precinct headquarters :)

      • by retchdog ( 1319261 ) on Friday May 27, 2011 @04:35PM (#36267782) Journal

        Think of it instead like this: solving the Hamiltonian is equivalent to (or potentially "harder than") solving the original problem, so you can translate the original problem into a Hamiltonian problem. It doesn't mean that you know the answer to either, but you do know that the solution of the Hamiltonian will match up to a solution of the original problem. This is the spirit of it: http://en.wikipedia.org/wiki/Reduction_(complexity) [wikipedia.org]

        Very, very roughly, think of it like rewriting Java, for example, as C. You may not know what the particular code actually DOES in an overall sense, or what it will output, but you can nevertheless rewrite it sort of mechanically (like a compiler would) if you know both languages. Furthermore, it's feasible that translating the code is easier than devising the algorithm from scratch. This is basically a reduction. If you can "easily" rewrite any Java code as C code, that means Java is "reducible" to C. The theory of computation essentially deals with reductions, not of code, but of entire problem classes, which is where P, NP and all that come from.
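
        To make the "translate the problem into a Hamiltonian" step concrete, here is a tiny classical illustration (mine, not part of the original comment): a 3-node max-cut instance written as an Ising energy function whose lowest-energy spin assignment is the answer. An adiabatic machine is supposed to settle into that ground state physically; here it is simply brute-forced.

        ```python
        from itertools import product

        # Max-cut on a triangle: split the nodes into two groups so that
        # as many edges as possible have endpoints in different groups.
        edges = [(0, 1), (1, 2), (0, 2)]

        def energy(spins):
            # Ising form H = sum over edges of s_i * s_j. A cut edge
            # (opposite spins) contributes -1, so lower energy = bigger cut.
            return sum(spins[i] * spins[j] for i, j in edges)

        # Brute-force the ground state over all +1/-1 spin assignments.
        ground = min(product((+1, -1), repeat=3), key=energy)
        cut = sum(1 for i, j in edges if ground[i] != ground[j])
        print(ground, "cuts", cut, "edges")   # e.g. (1, 1, -1) cuts 2 edges
        ```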

      • You start by finding a complex Hamiltonian with a ground state describing the solution to your problem

        I'm not a math whiz, but to me, this says: "You already know the answer to the problem"...

        No. You can think of this as posing the question in a very specific way -- a little like constructing a wire frame so that soap films on it naturally form the shape of the best surface for some purpose.

        The open question is: which questions can in fact be posed this way?

      • I'm not a math whiz, but to me, this says: "You already know the answer to the problem"...

        Only because you've never taken a physics class past the 100 level.

        Lots of applied problems involve you already knowing the Hamiltonian and needing to find the ground-state solution. The process would be awkward for solving arbitrary problems, but for physical simulations it's pretty streamlined (much more so than writing a DE solver in C to solve the same problem).
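
        As a small illustration of that workflow (my sketch, assuming numpy; not part of the original comment): given the Hamiltonian as a matrix, the ground state is just the eigenvector belonging to the smallest eigenvalue. Tiny systems can be diagonalized directly; the catch, and the motivation for adiabatic hardware, is that the matrix dimension grows as 2^n with the number of qubits n.

        ```python
        import numpy as np

        # Two-qubit example: an Ising coupling plus a transverse field,
        # H = -Z⊗Z - 0.5 * (X⊗I + I⊗X), written out as a 4x4 matrix.
        Z = np.array([[1.0, 0.0], [0.0, -1.0]])
        X = np.array([[0.0, 1.0], [1.0, 0.0]])
        I = np.eye(2)

        H = -np.kron(Z, Z) - 0.5 * (np.kron(X, I) + np.kron(I, X))

        # eigh returns eigenvalues in ascending order, so index 0 gives the
        # ground-state energy and states[:, 0] the ground state itself.
        energies, states = np.linalg.eigh(H)
        print("ground energy:", energies[0])
        print("ground state :", states[:, 0])
        ```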

      • You know one answer to your problem. There are others. The state you put the system into describes the solution to your problem. You let the system evolve in time, and, assuming you don't add any energy (which would destroy your computation), the system will always describe *a* solution to your problem, but not necessarily the solution that you started out with.
      • by sjames ( 1099 )

        If I understand correctly (I may not), you know what you want the end result to look like, and what you start with, but the problem you're solving is what values take you from the starting condition to that desired result.

        In a grossly simplified analogy, I know I want 5x+2 to equal 12. Now I need to know what x causes that to happen.

  • Wiki (Score:3, Insightful)

    by squidflakes ( 905524 ) on Friday May 27, 2011 @03:50PM (#36267240) Homepage

    I attempted to get a basic understanding of quantum computing from Wikipedia, and maybe find out how a qubit measured up to a traditional bit, and what adiabatic meant.

    Whelp...

    I will never make fun of another old person who is unable to grasp the concepts of computing and computer interface that I use every day.

    • by dougmc ( 70836 )

      Absolutely right. This stuff is totally alien. ... it's so totally alien that my BS alarms go off when I hear people talking about it. I've read lots of stuff talking about how quantum computers will work, how they'll change everything, etc. -- but they sound like science fiction. And yet here's a commercial version for sale. It just doesn't ring true with me.

      • You do realize that a failure to understand something is not, rationally, a reason to outright reject it.

        • I sympathise with the guy. I just read the article about adiabatic quantum computation and didn't understand what the hell it was on about. I've been an SD for ten years. If these things take off, I'm very definitely out of a job.
          • I have a feeling it's going to be a long time before we even have to worry about it. They will be specialized computational devices for a long time, then maybe some sort of new supercomputers. But it will be like IPv4 and IPv6: yes, the new quantum computers will be there, but you'll still be using dull ol' ordinary ones to talk to them.

            That is until the quantum computers develop consciousness and wipe out all humans in favor of androids with Arnold Schwarzenegger's c.1984 physique.

        • Tell that to Texas wrt evolution.

          • There are those who see their ignorance as a failing, and then there are those who see it as virtue. The religionist nuts in Texas are in the latter category.

        • by dougmc ( 70836 )

          You do realize that a failure to understand something is not, rationally, a reason to outright reject it.

          I didn't outright reject it.

          But it still sounds like science fiction.

          • by lennier ( 44736 )

            I didn't outright reject it.

            But it still sounds like science fiction.

            That's basically what Einstein and Schrödinger said too when they heard about quantum mechanics. So you're in good company.

        • by sjames ( 1099 )

          It is not necessarily a reason to reject it, but you'll have to consider the source. Nine times out of ten, if you don't understand a sales and marketing guy's explanation of why something is better, it's because it's a crock.

  • Hold the freaking phone. Last I heard, quantum computing was still in its infancy and people had a hard time reading even 8 qubits or whatever. I don't remember reading about any fully functional quantum computers until just now. Is this just a well-kept secret, or have we finally entered the era of the quantum computer (at least for large organizations, a la the mainframes of old)?
    • by blueg3 ( 192743 ) on Friday May 27, 2011 @04:00PM (#36267388)

      Adiabatic quantum computing is somewhat different from "regular" quantum computing. Also, places like Slashdot don't get every minor update to the state of the art. Might have something to do with all the people who say, "wake me up when there's a commercially-available version of this." Well, here's your commercially-available version of this.

      • by sconeu ( 64226 )

        Also, places like Slashdot don't get every minor update to the state of the art

        Yeah, but a jump from 4 or so qubits to 128 is a quantum leap (pardon the pun), not a minor update.

        • a jump from 4 or so qubits to 128 is a quantum leap (pardon the pun)

          Great, first people forget what irony is and now puns? Using a word to mean what it means is NOT a pun.

          • by lennier ( 44736 )

            Great, first people forget what irony is and now puns? Using a word to mean what it means is NOT a pun.

            Well, it's only a literal quantum leap if you can't measure either the position or momentum of the intervening development prototypes.

            Or if Doctor Sam Beckett is involved.

        • by blueg3 ( 192743 )

          Well, it wasn't a jump. They've built quantum computers of different types in progressively larger sizes. Just, none of the intervening numbers rated as terribly interesting, apparently. (The four-qubit case was interesting because it was the first quantum device that could in any sense be considered a "computer".)

    • by Guspaz ( 556486 )

      My understanding is that D-Wave's product isn't a general-purpose quantum Turing machine, and that the applications are rather specific (optimization problems). It's not a general-purpose quantum computer.

  • I'm simultaneously for and against this.

    A proper science lab should be receiving the first one, not a weapons development company, and not because of Skynet, but on grounds of basic research principles.
    On the other hand, at least we have one...

    • by wagonlips ( 306377 ) on Friday May 27, 2011 @04:03PM (#36267436) Journal

      I'm simultaneously for and against this.

      Schrödinger? Is that you?

    • Money = force

      I would also like to add that it is the first COMMERCIAL model. Researchers do have research-grade designs and models.
    • by Dahamma ( 304068 )

      Yeah, except throughout the history of the supercomputer the primary use has been calculating nuclear bomb yields...

      • The whole history of computer development in general has been weapons development (well, to the extent that we date computer development as starting with the code-breaking/nuclear bomb efforts of WWII, rather than with Babbage--although Ada Lovelace's program to compute Bernoulli numbers certainly would have had weapons engineering applications).

        • by Dahamma ( 304068 )

          As has much of the history of radar/EM broadcasting, aviation, metallurgy, and who knows, probably the wheel :) Nothing like a good war to cause a leap in technology...

  • by sprior ( 249994 ) on Friday May 27, 2011 @03:57PM (#36267356) Homepage

    I found the D-Wave white papers very hard to understand, but I'm sure it's because of a poor translation from the original Vulcan to (sort of) English.

    • It was actually translated from Yiddish, but Google messed it up due to D-Wave's abuse of their API.
  • I feel like a second-grader learning calculus. When I learned calculus, I was in 9th grade. My 7th-grade son is already learning statistics and discrete functions. I was born 30 years too soon. I took AP Physics! Where was my Ising model, Hamiltonian operator, or eigenvalues? Why must I suffer for being born too soon?
  • Anyone know what Lockheed's plans are for this system? Complex fluid dynamics? Something else?

    The press release [dwavesys.com] only says ".. applied to some of Lockheed Martin's most challenging computation problems."

    -molo

    • by Jeremi ( 14640 )

      Anyone know what Lockheed's plans are for this system? Complex fluid dynamics? Something else?

      It will be used for solving difficult budget problems: in particular, it will optimize the padding-out of this year's expenditures to match the funds allocated, so that next year's budget doesn't get reduced. (/cynic)

  • Like the IEEE says, it's bullshit [ieee.org] in the sense that it's not quantum as usually understood, and it's no more effective than a traditional computer. What is more, as with all snake oil, it has not been subjected to peer review.

    It would be interesting to see how the money flows from the citizen-taxpayer via the government through Lockheed into D-Wave and finally back to the people in government who set up the purchase.

  • Who knows a little about quantum computers and is quite interested in them. He says that from all he can find out about D-Wave on the internet, they seem like a scam (i.e., they do not actually have any computers, nor is there any evidence they are linked to any of the experts in the field). Will be interested to see what he thinks of this article.

  • It sure seems to be a Quantum Computer to me.

    It either works or it doesn't.

    Nobody seems to know for sure one way or the other, not the CEO who is still running tests to see, and not their detractors who can only speak in percentage certainties.

    Prediction: When the question collapses into one state or the other, it will either turn out to be just an exotic classical computer, or it won't work at all. Because if it turned out to work as intended, then it would effectively prove that particles are both waves

    • by thsths ( 31372 )

      > It sure seems to be a Quantum Computer to me.

      > It either works or it doesn't.

      Those are two different questions, hm? As far as I can see nobody seems to seriously propose that this thing is actually useful. The discussion is about whether it is just a very bad half digital half analog computer with a lot of noise, or whether quantum effects have to be used to explain its behaviour. That behaviour would be a correlation of the noise beyond what classical theory predicts.

      So even if you can, after lo

  • D-Wave has been beating this drum for years -- and/or the press has been conveying the message incorrectly. What they have produced is *not* a quantum computer. They have only proved polynomial running time for symmetric satisfiability problems, not for the general case -- and I would be interested to see one real-world problem it can actually solve (I doubt they have actually built what is described in the original "Quantum Computation by Adiabatic Evolution" paper from 2000).
  • But does it both run and not run linux?

    • by lennier ( 44736 )

      But does it both run and not run linux?

      I can do that right now by installing Ubuntu Unity.
