Opening Quantum Computing To the Public 191
director_mr writes "Tom's Hardware is running a story with an interesting description of a 28-qubit quantum computer that was developed by D-Wave Systems. They intend to open up use of their quantum computer to the public. It is particularly good at pattern recognition, it operates at 10 milliKelvin, and it is shielded to limit electromagnetic interference to one nanotesla in three dimensions across the whole chip. Could this be the first successful commercial quantum computer?"
28 Qubits ought to be enough for everybody (Score:5, Funny)
There's only a market for at most 10 of these computers, and only big companies will need one.
Was I the only one? (Score:5, Funny)
"operates at 10 milliKelvin"?
"...electromagnetic interference to one nanotesla in three dimensions..."?
Throw in a few universal phase detractors and you've got one heck of a retroencabulator! [youtube.com]
Re: (Score:2, Informative)
Re:28 Qubits ought to be enough for everybody (Score:5, Funny)
Re: (Score:3, Funny)
You have just received the ancient reference of the day award!
Well played! :-)
Re:28 Qubits ought to be enough for everybody (Score:5, Insightful)
Your statement is ironically close to the truth. Quantum computers actually perform comparably to conventional devices at the simple tasks they both handle, such as rendering intricate scenes or estimating series values. What quantum computers are better at is taking advantage of quantum effects to exponentially outperform conventional computers at things such as factoring immense integers. They will most likely be used for decryption and quantum simulations, or other mathematically novel applications. In other words, they benefit businesses and scientists the most. They will most likely have commercial value in the future, but that is when they develop more uses for it, such as emulating the human mind to make ultra-realistic (if not realistic) AI. At the moment, however, they are still at the computing equivalent of the useless-behemoth stage. Someone in some field will most likely make a huge discovery similar to the silicon transistors of the past, win a Nobel prize, and set the stage for a new revolution. It feels like a long way off, but I'll probably be proved wrong.
Re:28 Qubits ought to be enough for everybody (Score:4, Insightful)
What quantum computers are better at is taking advantage of quantum effects to exponentially outperform conventional computers at things such as factoring immense integers.
That's a little misleading; it's unknown how fast classical factoring is, so it's impossible to say that quantum factoring "exponentially outperforms" it.
but that is when they develop more uses for it, such as emulating the human mind to make ultra-realistic (if not realistic) AI.
It's unlikely that quantum computers are needed for AI; the problem with AI is not that we don't have enough computer power, but that we don't know what to do.
Someone in some field will most likely make a huge discovery similar to the silicon transistors of the past
Or it will turn out that quantum computing just isn't feasible for some physical reason.
Re:28 Qubits ought to be enough for everybody (Score:4, Informative)
Actually, it doesn't matter how fast a classical computer operates, a quantum computer WILL go exponentially faster regardless. I researched quantum computers in a laureate report I did a few years back. Quantum computers are able to achieve a dual state as a result of calculations. Also, quantum computers operate on the mathematical principles of a unitary matrix. One of the properties of a unitary matrix is its reversibility, so that any operation that can be performed can be "unperformed". So take the ability to reverse calculations and achieve more than one answer at once, and you can "unperform" at an exponential rate. For example, you have a matrix with an AND gate. If you were to reverse the AND gate on the value '0', superposition will allow you to get the answers '10', '01', and '00' all at the same time. This means that a 64-qubit computer can theoretically "unrun" 2^64 times faster than a 64-bit computer. Now that is just a simplified gist of things. I don't want any physicists saying "you forgot the Hademard gate etc." The process is much more elaborate, and much more prone to other factors.
Now as for your other comments: 1) quantum computers will be a better candidate for simulating AI on a common commercial scale, and 2) quantum computing already is possible. The discovery that will most likely be made is the ability to create a room-temperature equivalent of a Bose-Einstein condensate so that topological quantum computers (the most reliable model so far) can fit onto something the size of a thumb.
Hadamard gate (Score:5, Informative)
I don't want any physicists saying "you forgot the Hademard gate etc."
I think you meant "Hadamard gate".
-- Any Physicist
(Much easier to google for the wikipedia article with that spelling.)
Re: (Score:2)
I don't like replying to my own comment, but I felt I should have been a bit more thorough (beyond just my bad grammar and typos) in explaining that mathematics also has to evolve to take advantage of the quantum principles. As you can imagine, the field of "uncalculating" isn't very big, just as Boolean algebra wasn't so big before transistors. That is another factor in making quantum computers part of everyday use.
Re:28 Qubits ought to be enough for everybody (Score:4, Informative)
This has so far not been proven. What is proven is that a quantum computer can outperform a classical computer polynomially (in algorithms based on unstructured search) and that it can outperform the best currently known classical algorithms for some problems (factoring, quantum simulation) exponentially. Moreover, exponential separation has been proven in terms of "query complexity" for "oracle problems" (in which a quantum black box is assumed to be available and only the number of accesses to the black box is counted as cost) and in terms of "communication complexity" in quantum communication (where the number of (qu)bits that need to be exchanged between two locations is counted as cost).
Quantum computers are able to achieve a dual state as a result of calculations. Also, quantum computers operate on the mathematical principles of a unitary matrix. One of the properties of a unitary matrix is its reversibility, so that any operation that can be performed can be "unperformed". So take the ability to reverse calculations and achieve more than one answer at once, and you can "unperform" at an exponential rate.
That does not follow. Having a superposition of 2^100 answers doesn't help you get out a single one, since when you perform a measurement (and a quantum computer is supposed to give us a conventional, "classical" answer to our problem) each answer occurs only with exponentially small probability. The hard part is to make all these many "answers" interfere such that the right answer comes out with high probability (one that decreases only polynomially in the number of bits used as input). Also, reversibility is not needed for a quantum speed-up.
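The measurement point above can be sketched numerically. This is a minimal illustration of my own, not anything from the comment: a uniform superposition over n qubits holds 2^n "answers" at once, but any single one comes out of a measurement with probability only 1/2^n.

```python
import numpy as np

# Sketch (my own illustration): a uniform superposition over n qubits
# assigns every one of the 2^n outcomes an exponentially small probability.
n = 20                              # qubits; small enough to hold in memory
dim = 2 ** n

# |psi> = H applied to every qubit of |0...0>: all amplitudes equal 1/sqrt(2^n)
psi = np.full(dim, 1.0 / np.sqrt(dim))

# Born rule: the probability of one particular outcome is |amplitude|^2
p_single = abs(psi[0]) ** 2
print(p_single)                     # about 9.5e-07, i.e. 1/2^20
```

Doubling n squares the problem: at n = 40 the same probability drops to roughly 1e-12, which is why interference, not mere superposition, carries the speed-up.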
Re: (Score:2)
What is proven is that a quantum computer can outperform a classical computer polynomially (in algorithms based on unstructured search) and that it can outperform the best currently known classical algorithms for some problems (factoring, quantum simulation) exponentially.
Not even for factoring, actually. Factoring is known to be sub-exponential. =)
Re: (Score:2)
Yes, you're right. I should have said "outperforms super-polynomially" (I guess it is poly(n) for a QC vs. n^log(n) classically, n being the input size in bits).
Re:28 Qubits ought to be enough for everybody (Score:4, Informative)
Actually, it doesn't matter how fast a classical computer operates, a quantum computer WILL go exponentially faster regardless.
I really, really hope that you failed whatever course the report was for. There is not, at present, any known problem for which a quantum-computing algorithm is known to be exponentially faster than the fastest classical algorithm.
Factoring is known to be sub-exponential, so Shor's O(n^3) quantum algorithm does not provide an exponential speedup.
The strongest known result in terms of speedup for quantum algorithms is for unordered search, from O(n) to O(sqrt(n)), which, again, is not an exponential speedup.
There are some intuitive arguments for why an exponential speedup might be possible. There are also some intuitive arguments for why it shouldn't be. There is no proof either way, as things currently stand.
(Note: I am not an expert in the field. This reflects my understanding, which was current as of about 5 years ago. To the best of my knowledge, things have not changed, but I don't read the literature like I do in other areas).
Re: (Score:2)
Wouldn't using quantum computing in the AI field more closely simulate actual physical brains (strong AI) than modern processors that only have 2 values?
A neuron has only two "values" -- signal or don't. Damn primitive technology...
Re: (Score:2, Funny)
Your statement is ironically close to the truth. Quantum computers actually perform comparably to conventional devices at the simple tasks they both handle, such as rendering intricate scenes or estimating series values. What quantum computers are better at is taking advantage of quantum effects to exponentially outperform conventional computers at things such as factoring immense integers. They will most likely be used for decryption and quantum simulations, or other mathematically novel applications. In other words, they benefit businesses and scientists the most. They will most likely have commercial value in the future, but that is when they develop more uses for it, such as emulating the human mind to make ultra-realistic (if not realistic) AI. At the moment, however, they are still at the computing equivalent of the useless-behemoth stage. Someone in some field will most likely make a huge discovery similar to the silicon transistors of the past, win a Nobel prize, and set the stage for a new revolution. It feels like a long way off, but I'll probably be proved wrong.
I actually am inclined not to agree with you. Back when people were making similar statements about the computer in general, computers weren't small enough, powerful enough, or cheap enough for anyone to afford who wasn't going to set up some sort of business unit around them. But I say give it 20-30 years. What will probably end up happening is that they'll be making quantum processors that run alongside traditional processors, working much like today's GPUs or yesterday's "math co-processors." Programme
Re: (Score:2)
My money says that 50-60 years from now, you'll be running a hybrid quantum/traditional computer on a mobile device you carry in your pocket.
50-60 years from now, i will most probably be dead, you insensitive clod!
Re: (Score:2)
Maybe when the technology is inexpensive we can all own such a rig!
But I'll bet that all kinds of research could benefit from use of this computer.
My best guess is that governments will control who has access to quantum computers, as they just could not stand communications that they could not spy on.
Re: (Score:2)
*whoosh*
Re: (Score:2, Insightful)
Re: (Score:2)
Re: (Score:2)
...with a probability of 0.25
A: "Oh, cool. A quantum computer! But does it run Linux?"
B: "Dunno. Let me see..."
Still not easy to build at home (Score:3, Interesting)
FTFA : "These things [quantum computers] can be very small and very cold, and they can be built out of exotic materials" - emphasis mine.
He makes this sound like a good thing.
It is if you are the NSA (Score:3, Interesting)
To keep our security agencies happy, quantum computers need to be almost impossible to make. The inventor of a really simple, cheap one is unlikely to have a successful career selling them to Joe Public.
Re: (Score:2)
Re:Still not easy to build at home (Score:5, Interesting)
Thanks to scam companies like this, more qualification is needed when referring to "quantum computing".
This is only a little better than the quacks who talk about "quantum healing energy"; they're exploiting the vague term "quantum computing" and the public's limited understanding of it to try and make a quick buck from investors.
Re: (Score:3, Informative)
Probably their Hamiltonian phase space is severely limited; i.e., their quantum computer can't explore all possible configurations of phase space.
That means it'll need a lot more qubits than an 'ideal' computer for some tasks.
Re: (Score:2)
The current chip, Leda, has 28 rings, giving 28 qubits, but they're not all interconnected to each other, only to a number of 'neighbours'. The Cooper pairs in the niobium are technically bosons, so they all exist in the same quantum state, Rose claims, which gives the entire superconductor quantum properties even without interconnecting every qubit.
From Rose himself; they're not interconnected, but they still have "quantum properties". So every other approach to quantum computing is wasting time, because you don't actually have to interconnect the qubits?
Doesn't that seem a little hard to believe, coming from a company that is trying to attract investors but won't open its hardware up to scientists?
I suggest getting your information from somewhere other than Rose's blog.
Re: (Score:2, Insightful)
I can say that I have direct experience that proves the existence of energy.
Well, that's a rigorous enough sample size for me; bring on the crystals and pyramids!
Re: (Score:2)
I don't think I'm so convincing that I can induce mass hallucinations at the drop of a hat.
I can. [ava7.com]
Re: (Score:2, Insightful)
I started reading your linked blog, but got stuck at this:
Re: (Score:2, Insightful)
Sounds like you are guilty of the last sentence as well.
What do you believe then? Superposition of wave functions is implicit in Schroedinger's equation. In fact, it's implicit in any differential equation of a wave function (if this is
Re: (Score:2)
FTFA : "These things [quantum computers] can be very small and very cold, and they can be built out of exotic materials" - emphasis mine.
He makes this sound like a good thing.
Oh, it is a great thing for the marketing folks.
But does it work? (Score:3)
Re:But does it work? (Score:5, Informative)
Re:But does it work? (Score:5, Funny)
All I know is that every time I even mention quantum computing my cat gets nervous and absolutely refuses to get in the box.
What does this mean for encryption? (Score:2)
Re:What does this mean for encryption? (Score:5, Informative)
No, their device is *NOT* a universal quantum computer. So far as I know, no reputable quantum physicist not in their employ has been allowed to examine what they actually do. Examples of performing calculations impractical on a classical computer are not available as far as I know.
They are something of a joke among the QC people I know. While people acknowledge that their device may be capable of doing some interesting things, in everything they do they are acting like they have something to hide.
Re: (Score:3, Insightful)
Not that I'm passing comment either way, as I don't know, but:
"acting like they have something to hide"
Something like intellectual property?
Re:Try "something which would stop the grant chequ (Score:5, Insightful)
I don't know who modded you Insightful, but AI research has produced many useful results. The fact that it hasn't produced HAL 9000 very much does not mean it's on the same level as parapsychology.
Similarly, quantum physics is a real field of science, and quantum computing is based on solid scientific principles. This company may be a bunch of frauds, but if you want to suggest quantum physics is a massive conspiracy among the physicists of the world you're going to need more than just handwaving and pointing to a field of pseudoscience that never had the support of mainstream scientists.
Re:What does this mean for encryption? (Score:5, Funny)
No, their device is *NOT* a universal quantum computer. So far as I know, no reputable quantum physicist not in their employ has been allowed to examine what they actually do.
Duh, of course you can't examine what a quantum computer is doing. That would change the outcome.
Re:What does this mean for encryption? (Score:5, Interesting)
The simplest example of a quantum computing algorithm is Deutsch's algorithm.
Here is how it works. Consider a simple boolean function b_out = f(b_in). It takes an argument that can be 1 or 0 and returns a 1 or 0. There are four possibilities: always zero, always one, the identity, and logical NOT.
Now imagine that I give you a black box that computes 'f'. However, it is very, very slow --maybe internally it is computing some NP-complete problem. If you want to know which of the four functions the box calculates, you need to run it twice, once for zero and once for one.
However, suppose you simply want to find out whether zero and one map to the same or different values, i.e., the parity of f. With classical computers, you are screwed. You still have to run the box twice to find that out, even though you only want a single bit of information.
However, you can do better if the black box I gave you is a quantum implementation of f(x). By feeding in an input state that is a superposition of 0 and 1, I can detect in a single evaluation, plus some simple operations, whether the function is constant or not. However, in doing so I get no information about the specific values. Effectively, I can ask any one-bit question about f(x) as efficiently as asking for a specific value.
It's unlikely this will ever be useful as stated. While it is known how to efficiently translate any classical algorithm into a quantum version, it is unlikely a real implementation would be within a factor of two in speed or cost. But I believe it illustrates the basic idea. The character of other quantum algorithms is similar: you often feed in a superposition of all possible inputs and read out a single output, which is the specific answer you want with high probability, without ever having to compute the values you don't want.
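The walkthrough above can be checked with a tiny state-vector simulation. This is a sketch under my own naming (the `oracle` and `deutsch` helpers are illustrative, not an established API): one call to the black box decides constant vs. balanced, where a classical test needs two evaluations of f.

```python
import numpy as np

# Toy state-vector simulation of Deutsch's algorithm (my own sketch).
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I = np.eye(2)

def oracle(f):
    """Unitary U_f |x>|y> = |x>|y XOR f(x)> as a 4x4 matrix."""
    U = np.zeros((4, 4))
    for x in (0, 1):
        for y in (0, 1):
            U[2 * x + (y ^ f(x)), 2 * x + y] = 1
    return U

def deutsch(f):
    state = np.kron([1, 0], [0, 1])            # start in |0>|1>
    state = np.kron(H, H) @ state              # superpose both inputs
    state = oracle(f) @ state                  # ONE call to f's black box
    state = np.kron(H, I) @ state              # interfere the two branches
    # probability that the first qubit measures 1 => f is balanced
    p1 = state[2] ** 2 + state[3] ** 2
    return "balanced" if p1 > 0.5 else "constant"

print(deutsch(lambda x: 0))      # constant
print(deutsch(lambda x: x))      # balanced
print(deutsch(lambda x: 1 - x))  # balanced
```

Note that the output tells you only the one-bit parity question; the individual values f(0) and f(1) are never revealed, matching the trade-off described above.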
Re: (Score:2)
So far as I know, no reputable quantum physicist not in their employ has been allowed to examine what they actually do.
Well if they did go ahead and examine it, wouldn't the system change anyway?
Re: (Score:2, Funny)
No impact on encryption, unless you use ROT-13.
Re:What does this mean for encryption? (Score:4, Funny)
Yeah; I sleep well at night knowing my secrets are safe.
Re: (Score:2)
This has no impact on cryptography whatsoever. Symmetric encryption has never been shown to be a problem that quantum computing can help with. A *large* QC would affect the use of public key algorithms as both factoring and discrete logs can be sped up.
However:
1. 28 is not a large number. Current asymmetric key sizes would take thousands of qubits.
2. This is not a "quantum computer". Shor's algorithm requires entangled qubits that stay coherent for the length of the algorithm. The 28 qubits in this syst
Re:What does this mean for encryption? (Score:4, Interesting)
it also seems pretty hard to add more bits to these quantum computers, so it looks like traditional encryption might be here to stay after all.
That is exactly the point. Quantum computers scale much, much worse than traditional computers. The problem is that two of these basically give you the same maximum problem size as one does. (For traditional computers you can break problems into smaller steps; for quantum computers you cannot, without losing all the advantages.) So you cannot just use more of them to break encryption. You need to build one with more qubits that are all entangled with each other. My present impression is that the effort of adding qubits grows quadratically or the like, as each qubit has to be entangled with every other qubit (that is n*n entanglements). If that is true, even 100 qubits are far out of reach. This means that all modern encryption is perfectly safe from this quantum nonsense.
Re: (Score:2, Informative)
The problem is that two of these basically give you the same maximum problem size as one does.
This only holds if the two computers can only communicate classically and have no prior entanglement -- and right now we're better at communicating qubits than at building them. (See "quantum encryption", which relies on precisely this operation!)
My present impression is that the effort of adding qubits grows quadratically or the like, as each qubit has to be entangled with every other qubit (that is n*n entanglements).
No. It's only a constant amount of work to entangle an unentangled n'th qubit with n-1 others -- any quantum operation will do it.
While you are producing the complicated state you can use error correction to preserve that state, which adds only a constant factor overhe
Re: (Score:2)
No. It's only a constant amount of work to entangle an unentangled n'th qubit with n-1 others -- any quantum operation will do it.
I am not talking about entangling them. I am talking about keeping them entangled and performing operations. Incidentally, the effort for error correction is only constant if the per-bit error probability is constant. I find it highly doubtful that the error probability can be made independent of the overall size for these devices.
Anyways, with what has been built so far, not even
How does it work? (Score:5, Insightful)
Can someone post a link that describes the benefits of a quantum architecture and how software can be written to take advantage of them?
And by "benefits", I don't mean hype.
Re: (Score:2)
Considering that UNIX was developed to play a game on a PDP-7, I'd say it doesn't much matter. It will grow.
Re:How does it work? (Score:4, Informative)
Can someone post a link that describes the benefits of a quantum architecture and how software can be written to take advantage of them?
And by "benefits", I don't mean hype.
http://en.wikipedia.org/wiki/Shor%27s_algorithm [wikipedia.org]
^The big one.
Re:How does it work? (Score:5, Informative)
The wikipedia article is not bad, though it is fairly technical.
A very small number of algorithms are known for universal quantum computers (which the D-wave device does not claim to be) that are asymptotically faster than any known algorithm for classical computers.
The most widely known of these is Shor's factoring algorithm. Mostly it would be useful for breaking public-key cryptography. The others are Grover's search algorithm, which can give a small speed boost to any classical algorithm that involves enumerating all possibilities and checking some property, and quantum simulation: simulating the behavior of systems of many particles where quantum effects are important.
In the past 10 years, considerable progress has been made, but nobody yet has a good handle on when scalable universal quantum computing might be a reality, though it no longer looks impossible -- only very hard. D-Wave does not claim their device is universal. In particular, they don't say they can do factoring. They claim to be able to efficiently do quantum simulation and also traveling-salesman-type optimization problems. Evidence of them actually solving any hard problems is not widely available.
Re: (Score:2)
I dunno... Producing a working device seems to be plenty hard. There's vaporware, and then there's university research that doesn't even bother to leave the lab. It's no use if it just remains theory.
Re: (Score:2)
Grover's search algorithm, which can give a small speed boost to any classical algorithm...
I wouldn't call the step from O(N) to O(sqrt(N)) a small speed boost. You can even be pretty fast when your qubits randomly (but not too quickly) change state, by using error-correction techniques. So you don't even need a perfect quantum computer to do useful work.
That said; I know a few people who do actual research on quantum computing and I've never even heard them talk about D-Wave.
Re:How does it work? (Score:4, Informative)
sqrt(N) is small compared to the other promised speedups of quantum computers, which are typically reductions from super-polynomial or exponential time to polynomial time.
The real crux is that the problems you often want to apply Grover's algorithm to are already O(2^n). Grover's algorithm reduces that to O(2^(n/2)). So with a similarly sized quantum computer you could only solve problems of roughly twice the size.
Still, it is interesting and potentially useful. The main advantage is its wide applicability: many classical algorithms can simply be translated directly into a quantum equivalent and then have Grover's algorithm applied. Finding a special-purpose quantum algorithm is typically very hard or impossible.
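The back-of-the-envelope numbers behind this can be written out. A sketch, with function names of my own; the ~(pi/4)*sqrt(N) oracle-query count for Grover's algorithm is the standard estimate:

```python
import math

# Classical brute force over an n-bit search space checks O(2^n) candidates;
# Grover needs about (pi/4) * sqrt(2^n) oracle queries.
def classical_queries(n):
    return 2 ** n

def grover_queries(n):
    return math.ceil(math.pi / 4 * math.sqrt(2 ** n))

for n in (16, 32, 64):
    print(n, classical_queries(n), grover_queries(n))

# A 64-bit Grover search costs on the order of 2^32 queries -- roughly the
# price of a 32-bit classical search, which is the "twice the size" point.
```

So the square root buys you a doubling of the feasible problem size, not the exponential collapse that factoring-style speedups promise.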
Re: (Score:2, Informative)
D-Wave does not claim their device is universal. In particular, they don't say they can do factoring. They claim to be able to efficiently do quantum simulation...
Being able to "efficiently do quantum simulation" makes a device a universal quantum computer. That is what "universal" means.
Re: (Score:2)
The wikipedia article is not bad, though it is fairly technical.
You must be new here. And to Wikipedia.
Sure... (Score:2)
Re: (Score:2)
Good luck with that - it's all hype.
Quantum computing is the new string theory, i.e., a theoretical-physics quagmire. It's soaking up funding and diverting graduate-student talent that could be better utilized in other areas.
Re:How does it work? Parallel universes. (Score:3, Informative)
A good reason to look there is to get an intuition of the concept of computing using parallel universes.
But can it run Q*bert? (Score:2)
But can it run Q*bert? ???
Re: (Score:2)
When someone ports MAME to it, yes.
28 qbits? Can do less than a pocket calculator.. (Score:2)
This is several orders of magnitude from a useful size. Invest the same money in a normal CPU and you get much, much more power, even if you use it to simulate the 28-qubit device.
Re:28 qbits? Can do less than a pocket calculator. (Score:3, Insightful)
By the same token, you could have performed calculations more easily on a slide rule than on the first binary computers built. I think the point of this is proof of concept of a new technology, rather than this particular unit taking over for modern systems.
If no one had bothered to use, abuse, and continue to develop binary computers half a century ago, then we'd still be using the abacus and slide rule to perform all our calculations.
Re: (Score:2)
This is very much not the case. Computing developed for entirely practical reasons, performing computations which were either difficult or impossible to perform without them: brute forcing the Enigma codes, calculating artillery tables, etc.
In any case, the summary (I haven't bothered to read the article) makes it sound like they're presenting it as a practical, useful device, in which case saying that it's too weak to be useful is an entirely valid criticism.
1st thing I'd get it to compute... (Score:4, Funny)
Re: (Score:3, Interesting)
I'd write a Jeopardy program and have the only clue be "42". I'd like to see what the thing churns out.
Re: (Score:2)
I'd write a Jeopardy program and have the only clue be "42". I'd like to see what the thing churns out.
Answer: "How many roads must a man walk down?"
D-Wave a bit of scam (Score:5, Interesting)
I work with the IQC; we specialize in quantum computing, quantum crypto, and many other things like that. We are also partially joined with the Perimeter Institute (they do mostly theoretical physics). Anyway, when I first joined the institute, we had a discussion about D-Wave. No one believed that it was real, and in fact we consider D-Wave to be bad for the field.

Many of you will probably remember the cold fusion controversy. What happened was that an experiment that could not be reproduced was published. This enraged the scientific community, led to massive funding cuts, and killed off the field. QC has a more stable base, but if D-Wave keeps being publicized like this and they can never prove their claims (remember that all the experiments and the functioning of the QC are considered "trade secrets"; they let no one look at it), then we may end up with skepticism from the funders. Keep in mind that the ones who donate usually have no clue what is happening in the field (politicians, CEOs, etc.), so they are "stupid" enough to be affected by this. Everyone in the field is, in the back of their head, hoping that it's real, but with that chance being so low, we want D-Wave to be forgotten.
Re: (Score:2)
So are you saying that they haven't yet passed a series of tests that would prove their computer is working?
One would think that it should be possible to design tests which they could pass if they possessed the working technology, without them having to reveal how exactly they achieved the result.
Very high level example: For instance, perform X number of Z type calculations in Y seconds, where Z type calculations would normally take present-day computers Y * 10 months of time but through quantum computing c
Re:D-Wave a bit of scam (Score:4, Interesting)
One would think that it should be possible to design tests which they could pass if they possessed the working technology, without them having to reveal how exactly they achieved the result.
This is actually quite hard (and I'm not sure any such test exists).
One can distinguish two scenarios: (1) a quantum computer as a box that gets a classical input, processes it, and outputs a classical result. Then the only distinction between classical and quantum is speed -- or rather "computational complexity", in the sense that the number of required computational steps scales differently with the size (in bits) of the input -- hence by sending a series of queries of varying length and plotting the scaling, one might conclude "this device is better than any known classical machine". But there are two caveats: one needs to go to really large inputs to see such scaling, and there is no proof that a clever classical algorithm with the same scaling does not exist.
(2) One can demand more of a quantum computer, namely the capability to perform a universal set of gates and therefore prepare a large class of quantum states. There are well-developed criteria to verify that such states have been produced and that certain gates have been performed. If a universal set of gates has been implemented with sufficient quality, one knows that the device is capable of performing quantum computations (though maybe this capability is not needed for QC). To apply this criterion, however, one needs to "look into the box" and perform measurements on the qubits.
This problem could be circumvented if their supposed quantum computer also had a "quantum interface" that allows input and output of quantum information (e.g., I send them a bunch of photons, they map their state into their computer, perform a set of operations I ask them to do, and then they write the state of the qubits back to photons and send them back to me for analysis). Then I could verify (not me, but experimentalists with the proper equipment) whether the desired operation has indeed been performed.
Of course, D-Wave does not claim that their device is a "universal quantum computer" or that it can prepare these kinds of states. How their claims can be verified without looking into their device, I don't know.
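The black-box scaling test from scenario (1) can be sketched as follows. This is my own illustration with made-up timing data, not anything D-Wave provides: fit the slope of log(time) against log(size) to estimate the polynomial degree of the device.

```python
import numpy as np

# Sketch of the scaling test from scenario (1) above. The timings are
# fabricated to mimic an O(n^3) device; a real test would measure them.
sizes = np.array([64, 128, 256, 512, 1024], dtype=float)
times = 1e-6 * sizes ** 3                   # pretend black-box runtimes

# On a log-log plot a polynomial-time device gives a straight line whose
# slope is the degree of the polynomial.
slope, intercept = np.polyfit(np.log(sizes), np.log(times), 1)
print(round(slope, 2))                      # 3.0 for this fake data
```

This also shows the first caveat concretely: distinguishing, say, n^3 from n^2.8 or from a large-constant polynomial needs many decades of input size, which is exactly what small prototype devices cannot provide.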
Re: (Score:2, Insightful)
[CITATION NEEDED]
You state that you work for the "Institute for Quantum Computing" [wikipedia.org]. How are we to know that you are not just badmouthing a company that may have gotten on your bad side?
Sources and facts, please.
Call me when it can run a useful program... (Score:2)
...then we'll discuss the word "successful".
Uh (Score:3, Informative)
It is particularly good at pattern recognition, it operates at 10 milliKelvin, and it is shielded to limit electromagnetic interference to one nanotesla in three dimensions across the whole chip. Could this be the first successful commercial quantum computer?
Based on that description? No. I don't even know what the fuck any of that stuff you just said even means, man (except for the bit about pattern recognition, which was an unquantified statement anyway and about as useful as "the computer is fast"). Speak in a language I can understand, like, the average framerate it can run Crysis at.
Re: (Score:2)
I'll translate for you:
It's fairly useless but less useless at pattern recognition than it is for anything else. To make it work we have to make it really really cold. It won't get cancer from cell phones. We won't make any money off of this particular machine.
Re: (Score:2)
Do you understand what it means to say that the CPU in your desktop has a 14-stage execution pipeline? That it has a TLB hit rate over 98%? That it has a double-pumped ALU?
Based on that description? No.
Whether or not you understand the specs has no relation to commercial viability. As you say, you (or Joe Average) only care how it will affect your frame rate in the latest FPS.
Re: (Score:2)
It might be valuable as a specialized research tool, or in industry as a component of production, but it isn't consumer-viable. It will never be a "final product" (as counted in the GDP), at least not in this decade.
Also, no, I didn't understand any of that stuff either, haha. I'm a student of economics and a consumer, not a hardware engineer.
Re: (Score:2)
Re: (Score:2)
Who said I didn't understand the concept of absolute zero? You're putting words in my mouth, again. Understanding absolute zero is one thing; understanding the technical significance of "it operates at 10 milliKelvins" is something else entirely.
Utilizing quantum entanglement to transfer information over great distances, instantly, is obviously a genius idea. But that article summary means nothing to anyone who isn't a hardware engineer. Whoever wrote it should have kept a more general audience in mind, and
Sounds interesting (Score:2)
But their claims are so far off everyone else's, and there are so few details about how it works, that it also sounds like an investment scam.
Quantum computer tech support (Score:5, Funny)
"Hello, Quantum Computer Tech Support"
"My new QC is not working, I'd like a replacement under the warranty"
"What makes you think it's broken?"
"It keeps giving wrong results"
"But it's giving the right results in lots of nearby parallel universes. The computer is not broken - you're not observing from the recommended viewing position. This is user error." CLICK.
No proof (Score:5, Informative)
See some skepticism here:
http://scottaaronson.com/blog/?p=306 [scottaaronson.com]
http://scottaaronson.com/blog/?p=291 [scottaaronson.com]
http://scottaaronson.com/blog/?s=d-wave [scottaaronson.com]
Re: (Score:2)
"D-Wave has provided neither proof nor convincing evidence..."
Quantum computing is not a black-or-white, yes-or-no type of field; D-Wave's response to your accusation could be 'Maybe'.
Re: (Score:2)
Re: (Score:2)
I'm a physicist. It seems that you are widely regarded as a crackpot. And, what, you're a programmer or something? And, if this one website is correct, you're trying to program a Christian AI based on Revelation?
Saying that belief or disbelief in major branches of physics, tested and proven numerous times over, is a matter of opinion is a major disservice to... science. I'm sorry. You're a crackpot.
Re: (Score:2)
You must build it (Score:2, Funny)
Sounds like smoke and mirrors/snake oil BS (Score:2)
D-Wave sounds like a classic scam to lure investors.
IF they really had a working QC, they could patent the tech and license the patents for billions of dollars.
All they're doing is saying 'Trust me, it works, ignore the man under the table'.
I bet that in 2 or 3 years we'll be reading a story like this about D-Wave.
http://trashotron.com/agony/columns/05-21-02.htm/ [trashotron.com]
Re: (Score:2)
Sorry, bad link...
this one should work
http://trashotron.com/agony/columns/05-21-02.htm [trashotron.com]
Call me when someone makes a 2048-qubit computer (Score:2)
Because I've got a date with the Xbox public key.
"milliKelvin?" (Score:2)
"it operates at 10 milliKelvin"
First off, "kelvin" as a unit of measure is not a proper noun, any more than "meter" is (read me). But even if it were, it'd be "Millikelvin"; you don't capitalize letters in the middle of words!
Re: (Score:3, Informative)
"A kelvin is a unit of temperature difference,"
It is a unit of thermodynamic temperature.
"a defined fraction of the temperature of the triple point of water above absolute zero."
Note the inclusion of absolute zero in the definition.
"It is not a scale referenced to Absolute Zero."
Yes, it is; it is a unit of thermodynamic temperature. Didn't you pay attention in your basic chemistry class?
"But the milli prefix is not capitalised, because capital M implies the Mega prefix"
If you spent more time reading my pos
No, this isn't commercially useful (Score:2)
Purely by chance, I recently ended up sitting next to a D-wave employee while travelling from Philly to New York, and we got to talking about commercial viability.
I was curious at what point they would reach the commercial tipping point, where it would be cheaper to use quantum computation than to do it on regular processors.
According to him, the point at which they planned to be commercially viable was somewhere in the vicinity of 512 qubits, at that point there was a number of problems that started to b
Re:Qbert vs. Qubit (Score:5, Funny)
^%$#@!
Re:Qbert vs. Qubit (Score:4, Funny)
Re: (Score:2)
Re: (Score:2)
But how much is it in degrees Rankine? /sarcasm
By the way, you cannot really understand what a temperature this close to 0 kelvin represents if you don't know what 0 K is; and once you know what 0 K is, the conversion becomes irrelevant anyway, because 10 mK = 0.01 K = 0.01 °C above absolute zero = 0.018 °F above absolute zero.
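The arithmetic above is easy to check: Kelvin and Rankine both start at absolute zero, and a Rankine (Fahrenheit-sized) degree is 5/9 of a kelvin, so a quick sketch is:

```python
# 10 mK expressed on the two absolute scales (Kelvin and Rankine).
mK = 10
kelvin = mK / 1000          # 0.01 K above absolute zero
rankine = kelvin * 9 / 5    # 0.018 degrees (Fahrenheit-sized) above absolute zero
print(kelvin, round(rankine, 3))  # 0.01 0.018
```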
Re: (Score:3, Funny)
Re: (Score:2)