Lockheed Martin Purchases First Commercial Quantum Computer 189
Panaflex writes "D-Wave Systems announced general availability for its 128-qubit adiabatic quantum machine just two weeks ago, and reports of its first sale to Lockheed Martin have come out."
The D-Wave Systems site has a rather informative collection of quantum computing papers.
I want one... (Score:3, Funny)
...but I'm uncertain if I'll buy one. Maybe I should check with my cat.
Re:I want one... (Score:4, Funny)
...but I'm uncertain if I'll buy one. Maybe I should check with my cat.
Oh NOW you remember to check the cat. It's been locked in that box for a week now. It's dead.
or is it?
Re: (Score:2)
I don't know about your cat, but the voices in my head say it's a keeper.
Re: (Score:2)
Re: (Score:3)
So, how long has the NSA had one? (Score:4, Interesting)
So, can this thing crack all non-quantum encryption, then? I seem to remember reading about how that would only require 32 qubits or so. And whether it can or can't, if commercial offerings have come this far, how long has the NSA had a version that can crack all encryption?
Re: (Score:3)
No, since you can't crack non-quantum one-time pad encryption without the encryption pad.
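A minimal sketch of why (toy code, my own illustration): a one-time pad is just XOR with a truly random, single-use pad, and without the pad every same-length plaintext is equally consistent with the ciphertext, so no amount of computing power, quantum or otherwise, helps.

```python
import secrets

def otp(data: bytes, pad: bytes) -> bytes:
    # XOR each byte with the pad; decryption is the same operation.
    assert len(pad) >= len(data)
    return bytes(d ^ k for d, k in zip(data, pad))

message = b"attack at dawn"
pad = secrets.token_bytes(len(message))   # truly random, used exactly once
ciphertext = otp(message, pad)
assert otp(ciphertext, pad) == message    # XOR is its own inverse

# Without the pad, ANY candidate plaintext of the same length is possible:
# there always exists a pad mapping the ciphertext onto it.
fake_pad = bytes(c ^ m for c, m in zip(ciphertext, b"defend at dusk"))
assert otp(ciphertext, fake_pad) == b"defend at dusk"
```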
Re: (Score:2)
OK, fair enough - I had forgotten about one-time pads. I really should have specified "all encryption based on multiplying two large primes," since that's the vast majority of commercially-significant encryption. I'm not even sure if there's a theoretical quantum attack on elliptic-curve algorithms or not.
Re: (Score:3, Informative)
Adiabatic quantum computing != "classic" quantum computing.
It does NOT run Shor's algorithm.
You can use SSL to download your porn safely tonight.
Re: (Score:2)
It would be better if you use TLS, though, as it overcomes some of the problems with SSL.
Re: (Score:2)
Re:So, how long has the NSA had one? (Score:4, Interesting)
I really should have specified "all encryption based on multiplying two large primes," since that's the vast majority of commercially-significant encryption
No it isn't. It's public/private key encryption. Symmetric key ciphers (which are far more significant) rely on a variety of algorithms. The main use of public/private key is for exchanging symmetric keys.
In short, RSA (and similar) would be useless, but AES (and similar) would remain secure. The real problem would become one of securely exchanging symmetric keys.
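A toy sketch of the split described above, using textbook-sized RSA numbers (purely illustrative, and sizes like this are never secure): factoring the public modulus, which Shor's algorithm would do efficiently on a large gate-model quantum computer, hands the attacker the private key. A symmetric cipher like AES has no such trapdoor structure to exploit.

```python
# Toy RSA with tiny primes -- illustrative only.
p, q = 61, 53
n = p * q                   # public modulus
phi = (p - 1) * (q - 1)     # computable only if you can factor n
e = 17                      # public exponent
d = pow(e, -1, phi)         # private exponent (Python 3.8+ modular inverse)

m = 65                      # "message"
c = pow(m, e, n)            # encrypt with the public key
assert pow(c, d, n) == m    # decrypt with the private key

# An attacker who can factor n recovers phi, and with it the private key:
def factor(n):
    for f in range(2, int(n ** 0.5) + 1):
        if n % f == 0:
            return f, n // f

pf, qf = factor(n)
d_cracked = pow(e, -1, (pf - 1) * (qf - 1))
assert pow(c, d_cracked, n) == m   # broken without ever seeing d
```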
Re: (Score:2)
One way you -might- be able to securely exchange keys using an RSA-style algorithm would be to chop the keys up. What I'm thinking is this:
N proxy servers, of which M receive part of the key using RSA or something similar. The rest get something random, but it's also encrypted using the same method. Each server then transmits what they get to the intended recipient the same way. You'd need some algorithm to generate what combination is the correct one (which then becomes vulnerable to attack itself) and som
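The "chop the keys up" idea can be sketched (this is my own reading of the proposal, with hypothetical helper names) as XOR secret splitting: each proxy route carries one share, and the key is recoverable only by someone who intercepts every share.

```python
import secrets
from functools import reduce

def split_key(key: bytes, n_shares: int) -> list[bytes]:
    # n-1 random shares, plus one share that XORs the rest back to the key.
    shares = [secrets.token_bytes(len(key)) for _ in range(n_shares - 1)]
    last = bytes(reduce(lambda a, b: a ^ b, bs) for bs in zip(key, *shares))
    return shares + [last]

def recombine(shares: list[bytes]) -> bytes:
    return bytes(reduce(lambda a, b: a ^ b, bs) for bs in zip(*shares))

key = secrets.token_bytes(16)        # a 128-bit symmetric key
shares = split_key(key, 5)           # one share per proxy route
assert recombine(shares) == key      # all 5 shares recover the key
# Any 4 shares reveal nothing: the missing share makes every key possible.
```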
Re: (Score:2)
So, I don't know if I follow you, but... you're proposing security by obscurity? But then you say that is open to attacks itself.
Also, wouldn't a man in the middle at the recipient's door bypass your proxy configuration?
Re: (Score:2)
No, I'm proposing that the network topology can essentially add bits to an encryption key when you're already at the upper limit of what the algorithm itself can take without harming security. If the problem of decryption can be made exponentially harder for each node added, then one node equals some fixed number of bits (probably 1). If you can make the problem increase in severity faster than that, then every node added adds more bits to the key than the last.
And that's the key question. If you can only d
Re: (Score:2)
Oh, and yes, to answer the other question, it does. Every component in a crypto system can be attacked, so having a sifter/sieve algorithm plus N proxies adds N+2 components that are potential targets for attack. Since the proxies decrypt and re-encrypt, if the proxies are broken then the plaintext is readable. If the sieve or sifter are broken, the combinatorial explosion is of no significance since the attacker will look only at the channels of interest.
Re: (Score:2)
The idea is interesting, but the only thing that makes this hard to crack is the fact that multiple users interact simultaneously. The proxy has to envelop the packets and encrypt them again, though.
That means 2 packets coming from A go via the proxy to D, and 2 packets from B go via the proxy to D. An attacker does not really know which of the 4 packets came from A and which from B.
However, the attacker knows each "valid" packet has a "sequence" number somewhere; that opens a possible attack vector.
an
Re: (Score:2)
Re:So, how long has the NSA had one? (Score:5, Informative)
An Adiabatic Quantum Computer is quite a different beast from a quantum computer in the usual sense, and even if it can solve the same class of problems in polynomial time (not at all obvious at this stage) it isn't at all clear that 1 qubit in this machine does the same work as 1 traditional qubit.
They are, to be honest, being a little bit naughty calling this a quantum computer at all, although it does compute and has quanta, but so does my phone.
Re: (Score:2)
Re: (Score:2, Informative)
Meh. Quantum computing, even at its *full* potential, is no threat to symmetric encryption. The recommended minimum key size will jump a moderate amount and you'll be all set again. The effect on asymmetric encryption depends on the type. Some could be severely compromised. BUT, seeing as operations are currently exceptionally fast for end users AND that asymmetric encryption is generally only used to *establish* symmetrically-protected channels over insecure networks, they could probably be jumped up by se
Wrong question (Score:2)
The real question is "Does the D-Wave 'quantum computer' do anything useful at all?"
See Scott Aaronson's opinions on the topic: http://blogs.forbes.com/alexknapp/2011/05/24/q-and-a-with-prof-scott-aaronson-on-d-waves-quantum-computer/ [forbes.com]
Aaronson is a brilliant quantum complexity theory professor at MIT. You can read his blog at http://www.scottaaronson.com/ [scottaaronson.com]
Re: (Score:2)
I don't know the implications of these computers for modern cryptography, but assuming they can trivially break encryption it means the public will never get their hands on them. Governments and big corporations would have them (it's jus
Re: (Score:2)
The tin foil hat store is that way -->
Quantum add-on cards will be commercially available in 5-8 years for your desktop. Quantum desktops are probably 10-12 years out.
Re: (Score:2)
Do you honestly believe the government would ever allow the public to purchase computers that easily break public-key encryption? Supercomputers are already classified as munitions by the US government. Tighter restrictions on quantum computers are not at all unreasonable, especially if it turns out they may be used to break public-key encryption schemes.
Re: (Score:2)
Re: (Score:2)
Why do you presume they don't already have such a thing? (Other than the OTP, of course.)
Re: (Score:3)
Disclaimer: I worked on QC, but not for D-Wave (although one of my former employers had dealings with them):
D-Wave does not aim to build a machine to crack codes. They build a machine which can do what can be done using the technology available now.
The normal ideas about how to make a QC work, even for, let's say, factoring 128-bit numbers, require many more logical qubits than are available. The logical qubits would be composed from physical qubits, and for all coding schemes besides some quite new ones the minimum
Re: (Score:2)
As "quantum computers" are not programmed in the normal way with floating-point or fixed-point variables and for loops and if/else constructs, you will need a new algorithm, a way to model it, and a way to represent the data for every encryption method you want to crack. ;D
In other words: if the two of us exchange some PGP-encrypted messages, you need to figure out how to represent them in the qubits and how to tackle the encryption. There's still lots of work to do
angel'o'sphere
Re: (Score:3)
That's incorrect. They can magically crack encryption based on integer factorization or discrete logarithms. There are potential speedups for other types of encryption. Symmetric ciphers like AES are believed to be safe.
Re: (Score:2)
Actually, symmetric ciphers would be reduced in security roughly equivalent to halving the number of bits. So AES128 would be reduced to roughly 64 bits of security. That would still take a long time to break, and I am not sure a quantum computer could remain coherent for long enough to do it. And you don't get the same speedup from buying multiple quantum computers to let them work in parallel on the problem as you would with classical algorithms. If you w
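The halving the parent describes follows from Grover's quadratic search speedup; a back-of-the-envelope sketch (function names are mine):

```python
import math

def effective_bits(key_bits: int, quantum: bool) -> float:
    # Classical brute force: ~2^k trials. Grover's search: ~2^(k/2)
    # queries, so the effective security level is halved in bits.
    return key_bits / 2 if quantum else key_bits

assert effective_bits(128, quantum=False) == 128
assert effective_bits(128, quantum=True) == 64    # AES-128 under Grover
assert effective_bits(256, quantum=True) == 128   # why AES-256 is advised

# Grover also parallelizes poorly: splitting the search over m machines
# gives only a sqrt(m) speedup, not the classical factor of m.
def grover_queries(key_bits: int, machines: int = 1) -> float:
    return 2 ** (key_bits / 2) / math.sqrt(machines)

assert grover_queries(128, machines=4) == 2 ** 64 / 2
```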
Re: (Score:2)
light speedup...heh heh.
Did some wiki-browsing... (Score:2)
Re: (Score:3)
So it's like fuzzy logic, only they got tired of having muppets run the IT department?
iQubit (Score:1)
I guess I'll just have to wait for the Apple Quantum Computer User Experience.
Re:iQubit (Score:5, Funny)
I guess I'll just have to wait for the Apple Quantum Computer User Experience
Me too -- in particular I'm looking forward to the quantum MWI version of FaceTime, which connects you to various alternate-universe versions of yourself, so you can compare notes and see who made the better decisions.
Re:Did some wiki-browsing... (Score:4, Insightful)
A traditional digital computer is pretty hellish to program too if you take away all the props -- you have to find a set of bit values for the memory such that this immense construction of hundreds of millions of gates, clocks, latches, etc. will evolve to give your answer in a reasonable time.
Re: (Score:2)
You start by finding a complex hamiltonian with a ground state describing the solution to your problem
I'm not a math whiz, but to me, this says: "You already know the answer to the problem"...
How can this device help you, if you already have the solution? Is it used for proving the validity, or similar?
Re:Did some wiki-browsing... (Score:4, Funny)
The basic idea is to enter "42" and see what happens.
Re: (Score:2)
and make sure you watch for highway signs at the local intergalactic precinct headquarters :)
Re:Did some wiki-browsing... (Score:5, Informative)
think instead, that solving the hamiltonian is equivalent to (or potentially "harder than") solving the original problem, so that you can translate the original problem into a hamiltonian problem. it doesn't mean that you know the answer of either, but you do know that the solution of the hamiltonian will match up to a solution of the original problem. this is the spirit of it: http://en.wikipedia.org/wiki/Reduction_(complexity) [wikipedia.org]
very, very roughly, think of it like rewriting java, for example, as c. you may not know what the particular code actually DOES in an overall sense, or what it will output, but you can nevertheless rewrite it sort of mechanically (like a compiler would) if you know both languages. furthermore, it's feasible that translating the code is easier than devising the algorithm from scratch. this is basically a reduction. if you can "easily" rewrite any java code as c code, that means java is "reducible" to c. the theory of computation essentially deals with reductions, not of code, but of entire problem classes, which is where P, NP and all that come from.
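The reduction described above, shrunk to toy size (my own illustration): MAX-CUT on a 4-vertex graph rewritten as an Ising ground-state problem, then solved by brute force -- the search step an adiabatic machine is supposed to perform physically by relaxing into its lowest-energy state.

```python
from itertools import product

# A toy 4-vertex graph: edges we would like to "cut".
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]

def ising_energy(spins):
    # MAX-CUT maps to the Ising Hamiltonian H = sum over edges of s_i*s_j:
    # a cut edge (opposite spins) contributes -1, lowering the energy.
    return sum(spins[i] * spins[j] for i, j in edges)

# Brute force over all 2^4 spin configurations -- the part the adiabatic
# hardware replaces with physics.
ground = min(product([-1, +1], repeat=4), key=ising_energy)
cut_size = sum(1 for i, j in edges if ground[i] != ground[j])
assert cut_size == 4   # the triangle 0-1-2 forbids cutting all 5 edges
```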
Re: (Score:2)
Me too, and I notice two confusingly similar things:
While you can blame the same guy [wikipedia.org] for both of them, I don't think they're otherwise related.
Re: (Score:2)
well, there are quantum algorithms for TSP. :-P
apart from that, no, they are not directly related.
Re: (Score:2)
You start by finding a complex hamiltonian with a ground state describing the solution to your problem
I'm not a math whiz, but to me, this says: "You already know the answer to the problem"...
No. You can think of this as posing the question in a very specific way -- a little like constructing a wire frame so that soap films on it naturally form the shape of the best surface for some purpose.
The open question is what questions can in fact be posed this way?
Re: (Score:2)
I'm not a math whiz, but to me, this says: "You already know the answer to the problem"...
Only because you've never taken a physics class past the 100 level.
Lots of applied problems involve you already knowing the Hamiltonian and needing to find the ground-state solution. The process would be awkward for solving arbitrary problems, but for physical simulations it's pretty streamlined (much more so than writing a DE solver in C to solve the same problem).
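For the simplest possible case of that workflow -- known Hamiltonian, unknown ground state -- a two-level system can even be done by hand (toy numbers, my own illustration):

```python
import math

# A two-level Hamiltonian H = [[a, b], [b, c]] (real symmetric), e.g. a
# spin in a field. We know H; the task is to find its ground-state energy.
a, b, c = 1.0, 0.5, -1.0

# Eigenvalues of a 2x2 symmetric matrix, from the characteristic polynomial:
#   E = (a + c)/2 +- sqrt(((a - c)/2)^2 + b^2)
mean = (a + c) / 2
gap = math.sqrt(((a - c) / 2) ** 2 + b ** 2)
ground_energy, excited_energy = mean - gap, mean + gap

assert ground_energy < excited_energy
assert math.isclose(ground_energy, -math.sqrt(1.25))
```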
Re: (Score:2)
Re: (Score:2)
If I understand correctly (I may not), you know what you want the end result to look like, and what you start with, but the problem you're solving is what values take you from the starting condition to that desired result.
In a grossly simplified analogy, I know I want 5x+2 to equal 12. Now I need to know what x causes that to happen.
Wiki (Score:3, Insightful)
I attempted to get a basic understanding of quantum computing from Wikipedia, and maybe find out how a qubit measured up to a traditional bit, and what adiabatic meant.
Whelp...
I will never make fun of another old person who is unable to grasp the concepts of computing and computer interface that I use every day.
Re: (Score:2)
Absolutely right. This stuff is totally alien. ... it's so totally alien that my BS alarms go off when I hear people talking about it. I've read lots of stuff talking about how quantum computers will work, how they'll change everything, etc. -- but they sound like science fiction. And yet here's a commercial version for sale. It just doesn't ring true with me.
Re: (Score:2)
You do realize that a failure to understand something is not, rationally, reason to outright reject it.
Re: (Score:2)
Re: (Score:2)
I have a feeling it's going to be long time before we even have to worry about it. It will be specialized computational devices for a long time, then maybe some sort of new supercomputers. But it will be like IPv4 and IPv6, yes, the new quantum computers will be there, but you'll still be using dull ol' ordinary ones to talk to them.
That is until the quantum computers develop consciousness and wipe out all humans in favor of androids with Arnold Schwarzenegger's c.1984 physique.
Re: (Score:2)
Tell that to Texas wrt evolution.
Re: (Score:2)
There are those who see their ignorance as a failing, and then there are those who see it as virtue. The religionist nuts in Texas are in the latter category.
Re: (Score:2)
You do realize that a failure to understand something is not, rationally, reason to outright reject it.
I didn't outright reject it.
But it still sounds like science fiction.
Re: (Score:2)
I didn't outright reject it.
But it still sounds like science fiction.
That's basically what Einstein and Schrödinger said too when they heard about quantum mechanics. So you're in good company.
Re: (Score:2)
It is not necessarily a reason to reject it, but you'll have to consider the source. Nine times out of ten, if you don't understand a sales and marketing guy's explanation of why something is better, it's because it's a crock.
Re: (Score:2)
Re:Wiki (Score:4, Insightful)
the knowledge will be modularized and commercialized fairly quickly. in the 50s and 60s linear algebra was really hard because it hadn't been parsed out into an easy form - the useful stuff was all tied up with operator theory and the sort of understanding that geniuses have. fast-forward to now, and computing a matrix svd is a fairly standard task (even if you don't really have what a mathematician would call 'understanding').
similarly, quantum programming will most likely condense into a hierarchy of professional modules and life will go on. the structure of IT and computer engineering is almost totally a socioeconomic phenomenon and not a technical one...
Hold the freaking phone (Score:2)
Re:Hold the freaking phone (Score:4, Interesting)
Adiabatic quantum computing is somewhat different from "regular" quantum computing. Also, places like Slashdot don't get every minor update to the state of the art. Might have something to do with all the people who say, "wake me up when there's a commercially-available version of this." Well, here's your commercially-available version of this.
Re: (Score:2)
Also, places like Slashdot don't get every minor update to the state of the art
Yeah, but a jump from 4 or so qubits to 128 is a quantum leap (pardon the pun), not a minor update.
Re: (Score:2)
a jump from 4 or so qubits to 128 is a quantum leap (pardon the pun)
Great, first people forget what irony is and now puns? Using a word to mean what it means is NOT a pun.
Re: (Score:2)
Great, first people forget what irony is and now puns? Using a word to mean what it means is NOT a pun.
Well, it's only a literal quantum leap if you can't measure either the position or momentum of the intervening development prototypes.
Or if Doctor Sam Beckett is involved.
Re: (Score:2)
Well, it wasn't a jump. They've built quantum computers of different types in progressively larger sizes. Just, none of the intervening numbers rated as terribly interesting, apparently. (The four-qubit case was interesting because it was the first quantum device that could in any sense be considered a "computer".)
Re: (Score:2)
Re: (Score:2)
Sorry, but a jump from 4 or so qubits to 128 is a very large leap, not an incredibly tiny one like you just said
Sorry, but it is actually an incredibly tiny leap. If you read their processor architecture document you would see there are only 8 entangled qubits.
It is a parallel architecture with 16 cells, each with 8 qubits = 128 qubits. You don't get anywhere near the exponential 2^n scaling out of a crapological quantum computer. The performance, I assume, is something like 2^8 * 16... where the multiplier, for any practical value, is totally insignificant compared to the exponential term. If they had 128 entangled qubits this would be man
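Taking the parent's own numbers at face value (their estimate, not a D-Wave published figure), the gap between 16 independent 8-qubit cells and 128 fully entangled qubits:

```python
# State-space sizes implied by the parent's architecture estimate.
cells, qubits_per_cell = 16, 8

per_cell_states = 2 ** qubits_per_cell            # 256 per independent cell
parallel_scale = cells * per_cell_states          # grows linearly with cells
fully_entangled = 2 ** (cells * qubits_per_cell)  # 2^128

assert parallel_scale == 4096
# The fully entangled state space dwarfs the parallel one astronomically:
assert fully_entangled // parallel_scale > 10 ** 34
```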
Re: (Score:2)
Honestly though, the term "quantum leap" when used metaphorically should not be concerned with the size of the jump, but rather the discontinuity in the transition. Going from 4 or so qubits directly to 128 without having 32 or 64 bit machines would qualify.
Re: (Score:2)
Honestly though, the term "quantum leap" when used metaphorically should not be concerned with the size of the jump, but rather the discontinuity in the transition. Going from 4 or so qubits directly to 128 without having 32 or 64 bit machines would qualify
Unlike with normal computers, it matters significantly how "bits" are arranged in a quantum system... The exponential speedup unique to quantum computation applies, in their configuration, only to 8 qubits. They could scale to 10 billion qubits and their machine would be a billion times faster... terrific, until you consider that some quantity times a billion is inconsequential compared to some number to the power of a billion. If the former kind of scaling is all you need, then normal computers are cheaper and do not requir
Re: (Score:2)
My understanding is that D-Wave's product isn't a general-purpose quantum Turing machine, and that the applications are rather specific (optimization problems). It's not a general-purpose quantum computer.
My Feelings (Score:1)
I'm simultaneously for and against this.
A proper science lab should be receiving the first one, not a weapons development company, and not because of Skynet, but on grounds of basic research principles.
On the other hand, at least we have one...
Re:My Feelings (Score:4, Funny)
I'm simultaneously for and against this.
Schrödinger? Is that you?
Re: (Score:3)
meow
Re: (Score:2)
I'm simultaneously for and against this.
Schrödinger? Is that you?
I'd rather not know.
Re: (Score:2)
I would also like to add that it is the first COMMERCIAL model. Researchers do have research-grade designs and models.
Re: (Score:2)
Yeah, except throughout the history of the supercomputer the primary use has been calculating nuclear bomb yields...
Re: (Score:2)
The whole history of computer development in general has been weapons development (well, to the extent that we date computer development as starting in the code-breaking/nuclear bomb efforts of WWII, rather than with Babbage--although Ada Lovelace's program to compute Bernoulli numbers certainly would have had weapons engineering applications).
Re: (Score:2)
As has much of the history of radar/EM broadcasting, aviation, metallurgy, and who knows, probably the wheel :) Nothing like a good war to cause a leap in technology...
Bad Translation (Score:5, Funny)
I found the D-Wave white papers very hard to understand, but I'm sure it's because of a poor translation from the original Vulcan to (sort of) English.
Re: (Score:2)
Star Trek??!! (Score:1)
What will lockheed do with it? (Score:2)
Anyone know what Lockheed's plans are for this system? Complex fluid dynamics? Something else?
The press release [dwavesys.com] only says ".. applied to some of Lockheed Martin's most challenging computation problems."
-molo
Re: (Score:2)
Anyone know what Lockheed's plans are for this system? Complex fluid dynamics? Something else?
It will be used for solving difficult budget problems: in particular, it will optimize the padding-out of this year's expenditures to match the funds allocated, so that next year's budget doesn't get reduced. (/cynic)
lame (Score:2)
Like the IEEE says, it's bullshit [ieee.org] in the sense that it's not quantum in the sense usually understood and it's no more effective than a traditional computer. What is more, as with all snake-oil, it has not allowed peer review.
It would be interesting to see how the money flows from the citizen-taxpayer via the government through Lockheed into D-Wave and finally back to the people in government who set up the purchase.
Re: (Score:2)
The other thing to keep in mind is that I'm sure the first commercially available digital computers weren't particularly more useful, but it's an important step.
IIRC, ENIAC was used to compute the trajectory of artillery shells. The following ones were heavily involved in the design of nuclear weapons.
Re: (Score:2)
Yeah, but ENIAC wasn't misleadingly described as being more than it was, and it was clear what its benefits were over traditional methods of computation.
Re: (Score:2)
The better comparison would be to Babbage's analytical engines.
I don't think that would be a fair comparison, either. When Babbage was building his computers, there was no theory of computation, and most of the criticism was from people who doubted that machines could "think". What we today think of as "simply" building a machine that makes decisions according to pre-determined instructions probably looked like magic to most people in his time. Babbage even wrote in his autobiography:
On two occasions I have been asked,—"Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?" ... I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question.
The current objections to D-Wave are very different. Everyone has a pretty good idea
I have a friend ... (Score:2)
Who knows a little about quantum computers and is quite interested in them. He says that from all he can find out about D-Wave on the internet they seem like a scam (i.e. they do not actually have any computers, nor is there any evidence they are linked to any of the experts in the field). Will be interested to see what he thinks of this article.
Obviously (Score:2)
It sure seems to be a Quantum Computer to me.
It either works or it doesn't.
Nobody seems to know for sure one way or the other, not the CEO who is still running tests to see, and not their detractors who can only speak in percentage certainties.
Prediction: When the question collapses into one state or the other, it will either turn out to be just an exotic classical computer, or it won't work at all. Because if it turned out to work as intended, then it would effectively prove that particles are both waves
Re: (Score:2)
> It sure seems to be a Quantum Computer to me.
> It either works or it doesn't.
Those are two different questions, hm? As far as I can see nobody seems to seriously propose that this thing is actually useful. The discussion is about whether it is just a very bad half digital half analog computer with a lot of noise, or whether quantum effects have to be used to explain its behaviour. That behaviour would be a correlation of the noise beyond what classical theory predicts.
So even if you can, after lo
This is not a quantum computer (Score:2)
nt (Score:2)
But does it both run and not run linux?
Re: (Score:2)
But does it both run and not run linux?
I can do that right now by installing Ubuntu Unity.
Re:Grammar (Score:5, Informative)
Spellcheckers don't usually help with grammar.
Re: (Score:2)
Punctuation isn't grammar.
Re: (Score:2)
reports of it's first sale to Lockheed Martin
Does it have spellcheck?
Either way is perfectly correct, in both spelling and grammar:
reports of it's first sale to Lockheed Martin
In this case it means the reports say it is the first sale to Lockheed Martin
reports of its first sale to Lockheed Martin
Here we have the possessive "its" meaning the first sale of that computer was to Lockheed Martin
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Well, except that the first link is almost a year and a half old; and the second one is not a peer-reviewed paper published in Science, but an opinion piece; maybe with a bit of sour grapes flavour, given that D-Wave's actual peer-reviewed paper [nature.com] was published in Nature!
But, hey, they credited me on the PovRay rendering of the actual chip, so it's all cool! And yes, I do have a "conflict of interest" statement to make ("designed the chip", from the link above); but I also do get tired of people waving old IEEE Spec
Re: (Score:2)
See, we can have a reasonable and polite conversation supported by references, right?
Agreed, that is the just the first step, and stay tuned for more! But also, look at the dates of submission and publication of the Nature paper:
Received 30 June 2010; Accepted 15 March 2011; Published online 11 May 2011
The paper that you cite is cool, agreed -- but note the "a spin qubit" in the title. While a great physics experiment, it was not designed to be part of a structure even in principle capable of performing any