IBM Touts Quantum Computing Breakthrough
Lucas123 writes "IBM today claimed to have been able to reduce error rates and retain the integrity of quantum mechanical properties in quantum bits or qubits long enough to perform a gate operation, opening the door to new microfabrication techniques that allow engineers to begin designing a quantum computer. While still a long ways off, the creation of a quantum computer would mean data processing power would be exponentially increased over what is possible with today's silicon-based computing."
And people say .... (Score:4, Funny)
Re: (Score:2)
Re: (Score:2)
Yes, Sheldon...sarcasm.
Bazinga.
Re: (Score:2)
Re: (Score:2)
It may be cyclic.
We started with mainframes, then went to PC's. From somebody else having your computing resources to you having them.
While the cloud may be inherently distributed, it is like the mainframe approach in so far as your interface is much like a mainframe interface, and someone else again has your computing resources.
Ironically, quantum computing might well cycle things back to the PC model, in so far as you might no longer need the power of the cloud to get the job done; it might be feasib
Re: (Score:2)
The Cloud is successful for a few reasons:
1. Most users don't back anything up.
2. Most users have no clue whatsoever how to set up a server of any kind, nor to properly configure their firewall/router for remote access.
3. It's easier to share things on a platform your friends and family are already using than for everybody to have their own.
4. Bandwidth has become cheap and prevalent enough that pushing data around public networks is relatively quick and convenient.
Everyone could have their own "Cloud" on t
Re: (Score:2)
The original quote had nothing to do with PCs, seeing as IBM started producing small PCs in 1975, years ahead of the competition (the IBM 5100). Instead, your misquote comes from Thomas Watson's supposed 1943 remark about electronic computers, and people have known since 1973 that it was a misquote and outright wrong.
If you're going to open your mouth (or in this case type), please read some history before you remove all doubt.
Re:And people say .... (Score:4, Funny)
I wonder what animal will go on the cover of O'Reilly Quantum Computing In a Nutshell; a cat in a box?
Re: (Score:3)
Re: (Score:3)
I wonder what animal will go on the cover of O'Reilly Quantum Computing In a Nutshell; a cat in a box?
You won't know till you open the book.
Re:And people say .... (Score:5, Insightful)
The depressing thing is that you will never see anything like this out of Apple. Billions of dollars in reserves and no "Jobs Labs".
Re: (Score:1, Insightful)
No, but Apple has been pretty focused on making technology cool and even desirable to the masses. While perhaps not as interesting to you as quantum computing, it's certainly important, and something that IBM was never able to do.
Re: (Score:2)
Wait, you mean International Business Machines doesn't make things for the masses? Who'd a thunk it?
Re: (Score:2)
.. focused on making technology cool and even desirable to the masses.
You probably mean the middle class
Re:And people say .... (Score:5, Insightful)
THIS, like times a million. NYTimes this weekend had an excellent article on the history of Bell Labs (the laser, the transistor, communications satellites, etc). HP, whatever else you may think of them, supported the pure research lab which brought forth the memristor. IBM can point to things such as this, its various efforts to simulate a brain, and Watson. Google, bless their souls, is pushing for automated driving (this may not sound in the same league, until you realize the consequences for everybody who drives or rides in an auto.)
Where is the pure research at Apple? Do they think they can get by on just making better UIs, for the rest of forever? Are they at all part of a larger community?
Re: (Score:1, Interesting)
That was the commercial I remember for several years.
It's not always about making cutting-edge, front-page-news breakthroughs; sometimes it's just about refining something until it's just right, after someone else made the breakthrough and then forgot about it because they moved on to the next shiny thing.
Both kinds of people/businesses are useful and needed, well at least until this utopian dream you have becomes reality and everyone work
BASF (Score:1)
BASF, we don't make the things you use.
We make the things you use BETTER.
That was the commercial I remember for several years.
I suppose they couldn't live up to this promise any more since they changed it to We don't just make chemicals, we make chemistry [youtube.com]
Re: (Score:1)
Re:And people say .... (Score:5, Interesting)
Apple doesn't do technological research. Instead, they pour all of that money into usage research, so that they can design an improved user experience.
It's not necessarily a bad thing. There's a place for both the technological side, and the usability side. Most tech companies focus on the technology side while neglecting the usability, which is why so much technology ends up unusable by laymen.
Microsoft actually does a lot of usability research too. But the difference between Microsoft and Apple is that Apple has (or had) someone steering the ship. They're a top-down dictatorship-style management house. Microsoft is more about internal competition to see who wins out. They're more of a survival-of-the-fittest, cream-of-the-crop-rises-to-the-top type of management house.
Re:And people say .... (Score:4, Informative)
But Apple SHOULD do technological research. Because it provides a long-term competitive edge for them, and because it's the right thing to do. Corporations, like people, live in a larger society, culture (and nation), and they benefit from those things. Apple would not exist were it not embedded in the Silicon Valley culture emanating from Stanford and Berkeley. Apple should give something back. Maybe Steve would not understand this, but surely Woz would.
Yeah, iPhones are great, but honestly, ten years from now, we'll be on to a newer, better UI (glasses, brain implants, holodecks, or whatever.) It turns out we're still using lasers and transistors and communications satellites, all invented by Bell Labs in the 60s.
Here, I'm pasting the best bit from the NYTimes/Bell Labs article, written by Jon Gertner;
"But what should our pursuit of innovation actually accomplish? By one definition, innovation is an important new product or process, deployed on a large scale and having a significant impact on society and the economy, that can do a job (as Mr. Kelly once put it) “better, or cheaper, or both.” Regrettably, we now use the term to describe almost anything. It can describe a smartphone app or a social media tool; or it can describe the transistor or the blueprint for a cellphone system. The differences are immense. One type of innovation creates a handful of jobs and modest revenues; another, the type Mr. Kelly and his colleagues at Bell Labs repeatedly sought, creates millions of jobs and a long-lasting platform for society’s wealth and well-being."
The whole article is here (paywall yadda-yadda)
http://www.nytimes.com/2012/02/26/opinion/sunday/innovation-and-the-bell-labs-miracle.html?pagewanted=all [nytimes.com]
Re: (Score:2)
Bullshit.
Apple invests in PATENTING UI components so they can SUE companies and people who use them.
IBM, on the other hand, sponsored, developed, published, and GAVE AWAY the Common User Interface Standard.
Apple has no intention of sharing anything with anyone. They want to OWN the market. All markets. And any device that makes the mistake of using a common-sense gesture, icon, or interface that anyone with a functioning brain cell could have come up with.
Apple is a pimply leech on the ass of compu
Re: (Score:2)
Given some of the unpolished turds Microsoft has put out in the past, that cream must be pretty curdled...
In reality though, the stuff Microsoft has put out makes me think more of an organization with tons of internal competition, yes, but one in which people sabotage each other or engage in politicking in order to force pet ideas into projects resulting in products clearly designed by committee and often containing so many compromises that whatever good points are often completely outshone by really horrif
Re: (Score:1)
It's not a zero-sum game. This would only be a problem if Apple was the only company that ever will exist. Let them do what they do well and let IBM do what they do well.
Re: (Score:2)
I don't know if they can, but they should be able to do just that. As long as they make the best UI (do they make it now?), if you want a nice UI, you go with them. The same way that if you want some revolutionary tech, you get it from IBM or some other company that invests in tech.
It is called specialization. It has been one of the biggest drivers of growth since the 18th century.
Re: (Score:2)
Because Apple isn't that kind of company and seems to hold no ambitions to become one?
Apple is really good at taking various pieces of existing technology and combining them in a way that nobody else has, or, if a similar product exists, taking it and refining the hell out of the user experience. That's it.
Why should they spend money on something that's outside their intended domains of expertise? It would be like complaining that the Gap doesn't spend money researching basic materials science so they can m
Re: (Score:2)
Why is that so depressing? Apple is design and marketing, not engineering or research. And they do a damned good job of it, too -- they do usually have the best designs if not the best engineering (e.g., iPhone antenna).
Re: (Score:2)
Apple's not in the business of researching, producing, and selling bleeding-edge computers. Apple's focus is primarily on pleasant design, intuitive user interfaces, and an overall integrated experience using commodity-grade hardware.
(I say this as someone who doesn't even buy their products.)
Re:And people say .... (Score:4, Insightful)
Apple will wait for everyone else to have quantum computing, and then release a device making the masses believe Apple invented quantum computing because they call it iQuantum.
But I agree. Apple has 98 billion in the bank and is worth over half a trillion on paper, yet they only focus on repackaging largely off-the-shelf components invented by other companies into fancy packages, and on spending way too much money designing retail stores that boast large sheets of seamless glass.
What strikes me as really depressing is that while Bill Gates is generally hated among Slashdot readers, he has given more back to the world in terms of his charity work. In his "retirement" he is focused on trying to solve some of the world's biggest issues in poverty and quality of life.
On the other hand, Steve Jobs stayed at Apple pretty much up until his deathbed, creating an empire where people just throw money at it to buy into a walled garden of content and hardware, while Apple shits on any other competitive product or company.
How has Apple given back to the world? Creating jobs where the pressure is so high that people kill themselves when they don't meet Apple's quotas or quality standards? Creating products people actually kill for? Creating a market of "want" that is never satiated until someone goes bankrupt?
Apple needs to start giving back: put some of those billions into charity and maybe try to invent something useful for the world that doesn't have an "i" in front of it.
I sincerely think that Apple has enough money to cure cancer, but the company is more interested in hoarding money and technology patents. It's a shame, really, that everybody's beloved Apple is probably one of the most evil, greedy, selfish and vindictive companies, wrapped in a protective bubble of smugness.
another misleading quantum computing article (Score:5, Insightful)
1) Repeated news about being able to perform some operation with a tiny number of qubits does not suggest that a useful quantum computer of practical size can probably be built;
2) It wouldn't mean data processing power would be "exponentially increased", but that certain algorithms could be executed asymptotically faster.
QC remains a second rate branch of mathematics for computer science types who don't want to apply themselves to less glamorous problems in the more mature and challenging fields of classical computing. For engineers, it's still in the nuclear fusion stage: kinda just possible in the right conditions, but under no conditions shown useful.
Re: (Score:2)
That is just the kind of attitude that holds back progress.
Re: (Score:2)
Exponentially? (Score:1, Informative)
data processing power would be exponentially increased over what is possible with today's silicon-based computing.
Please, please, please stop misusing the word "exponentially". It just means that something is increasing (or declining) at a constant rate, which is practically the opposite of what is meant here.
Re: (Score:2)
Your Ignorance is showing.
Well, somebody's certainly is. Suppose that something grows continuously at a constant rate of 10% per year. That is exponential growth, because it fits the relation x = e^(0.1*n). Try it.
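To make that concrete, here's a minimal numeric check (plain Python, arbitrary starting value): grow a quantity by a constant 10% per step and it traces out exactly the exponential curve x0*e^(n*ln 1.1) -- constant growth rate, exponential growth.
```python
import math

# Grow a quantity by a constant 10% per step and compare against the
# closed-form exponential x0 * e^(n * ln 1.1).  (Continuous compounding at
# rate 0.1 gives e^(0.1*n), the same shape.)
x = 1.0  # arbitrary starting value
for n in range(1, 51):
    x *= 1.10  # constant *rate*: each step adds 10% of the current value
    assert math.isclose(x, math.exp(n * math.log(1.10)))

print("value after 50 steps:", round(x, 2))  # ~117.4 -- clearly not linear
```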
Re: (Score:2)
I'm talking about the growth rate (i.e., the 0.1), not the slope of the resulting exponential curve.
Here, try this: Plot the curve of a quantity that grows by a fixed proportion from one point to the next. E.g., a population that grows by 10% per year. Is that curve linear over time? No, of course not: it's an exponential, and the exponent is the growth rate (multiplied by t), which is constant. It's like magic.
This is pretty elementary stuff, but before you start calling people a "stupid fuck", maybe you s
Re:Exponentially? Yes (Score:5, Informative)
Re:Exponentially? (Score:5, Informative)
The whole discussion is fubar
First of all, the derivative of e to the x (the "exponential function") is e to the x. Yeah, that's true: the derivative is the same as the function itself. Welcome to 1st semester calculus, kids. Not a constant, and I'm not even sure what "constantly increasing" means mathematically, although if the AC meant it's linear, that's a bucket of fail too.
The next fubar is that quantum computing doesn't provide a magic exponential speedup. There is a page-length summary on Wikipedia, but it should come as No Surprise Whatsoever to anyone in CS that different algorithm designs inherently have different big-O behavior, and magically sprinkling quantum pixie dust doesn't change that: some algorithms are linear, some polynomial, some constant, some exponential, and all quantum computing does is shuffle where some of them belong. Solving for X where X+1=2 is not gonna change much; factoring into primes is going to change quite a bit. Some of the most interesting problems are polynomial time rather than exponential in quantum computing. http://en.wikipedia.org/wiki/Quantum_computer#Potential [wikipedia.org]
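As a rough illustration of that last point, here's a small sketch (plain Python; the asymptotic figures are the commonly cited ones, not measurements) of how the speedup, if any, depends entirely on the problem:
```python
# Quantum computing does not hand out a blanket exponential speedup; the gain
# (if any) depends on the problem.  n = input size / search-space size.
comparisons = [
    # problem,               classical,                quantum
    ("solve x + 1 = 2",      "O(1)",                   "O(1)  (no change)"),
    ("unstructured search",  "O(n)",                   "O(sqrt(n))  (Grover)"),
    ("integer factoring",    "sub-exponential (GNFS)", "polynomial  (Shor)"),
    ("sorting",              "O(n log n)",             "O(n log n)  (no asymptotic win)"),
]

for problem, classical, quantum in comparisons:
    print(f"{problem:22s} classical: {classical:26s} quantum: {quantum}")
```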
Re: (Score:3)
Wouldn't this be a game-changer for encryption, though (if they can actually make it work, that is)? I mean, brute-force decryption seems like exactly the kind of computational task that a quantum computer could easily handle. So a brute-force attack on a key that may take hundreds of years on a current supercomputer could be done in a few minutes. No password would be safe from any organization with access to that kind of computing power. Or am I misunderstanding the potential?
Re: (Score:3)
Wouldn't this be a game-changer for encryption, though (if they can actually make it work, that is)? I mean, brute-force decryption seems like exactly the kind of computational task that a quantum computer could easily handle. So a brute-force attack on a key that may take hundreds of years on a current supercomputer could be done in a few minutes. No password would be safe from any organization with access to that kind of computing power. Or am I misunderstanding the potential?
Not necessarily, no. For any crypto application you can come up with a formula where you plug in the number of bits and it spits out how long it takes to crack. It's entirely about how the design scales. Double the input to a linear algorithm and it takes twice as long. Most (good) crypto scales exponentially, so tripling the number of bits doesn't triple the cracking time, it raises it to the third power (2^(3n) = (2^n)^3), or whatever. The deal is that for some crypto, quantum computing turns that exponential scaling into polynomial.
What no one wants to talk about is w
Re: (Score:2, Troll)
Oy... The rate is constant, meaning that the increase is in constant proportion to the value of the function at any given time. That's why calculations of continuous compound growth take exponential form, and it's a result of e^x being its own derivative, as you point out.
Of course neither the OP nor I were talking about the computational order ("Big-O") of a quantum algorithm, because no specific algorithm was under discussion. If such algorithms were typically exponential in N - i.e., O(e^N) - that wouldn
Re: (Score:2)
First of all, the derivative of e to the x (the "exponential function") is e to the x. Yeah, that's true: the derivative is the same as the function itself. Welcome to 1st semester calculus, kids
Me again... This discussion is long dead, but I have to point out that the exponential growth curve is not e^x but rather a*e^(x*t), where a is the initial quantity, t represents time and x is the rate constant. Now differentiate that with respect to t and you get a*x*e^(x*t): the slope at any moment is proportional to the current value, as you would expect from a constant rate x.
Hence, a population of 1000 that grows by 1% adds 10 times as many as a population of 100 that grows by the same rate. Constant growth rate, exponential growth. I don't see why this is so diffic
Re: (Score:2)
Re: (Score:2)
I think your main problem is that you said "constant rate". Rate is kind of ambiguous. To you, that meant increasing by a constant ratio, i.e. A(x+n)/A(x) = A(y+n)/A(y), i.e. exponential growth. To a few other people, it obviously meant increasing by a constant amount, i.e. A(x+n)-A(x) = A(y+n)-A(y), i.e. linear growth (as revealed by AC's introducing the derivative and thinking that proved his point).
And really, if someone said something was "accelerating at a constant rate", I'd typically assume they mean
Re: (Score:2)
Show us your sugar cube sized z-series mainframe! :0)
The Raspberry Pi isn't much bigger than a large sugar cube, and it's more powerful than any mainframe from 1960.
economist article more interesting (Score:4, Informative)
The Economist had an interesting article a couple days ago.. at least it's interesting if you don't really know the details of quantum computing:
Quantum computing: An uncertain future [economist.com]
Each extra qubit in a quantum machine doubles the number of simultaneous operations it can perform. It is this which gives quantum computing its power. Two entangled qubits permit four operations; three permit eight; and so on. A 300-qubit computer could perform more concurrent operations than there are atoms in the visible universe.
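A quick back-of-the-envelope check of that last sentence (assuming the usual order-of-magnitude estimate of ~10^80 atoms in the visible universe):
```python
# The state space of 300 entangled qubits has 2^300 dimensions; compare with
# the commonly quoted ~10^80 atoms in the visible universe.
atoms_in_visible_universe = 10 ** 80  # order-of-magnitude estimate only

print(f"2^300 is roughly 10^{len(str(2 ** 300)) - 1}")   # ~10^90
print(2 ** 300 > atoms_in_visible_universe)              # True
```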
Re: (Score:2)
As the AC said, that's misleading.
Another way to look at it: the computer does calculate over the entire domain at once (that is, it runs the calculation on all the numbers representable with X bits simultaneously), but when you read it out you can only get the result of one of those calculations. Which one you actually get is random, with a distribution that you can tune.
That is equivalent to constraining the operations. I think that way of thinking is more intuitive.
Bad explanation of qubit superposition (Score:3, Informative)
I find that to be a terrible explanation. What he said: "For each qubit you double the number of states you can be in at the same time." is also true for normal bits. Huh? Here is a better explanation: http://en.wikipedia.org/wiki/Qubit [wikipedia.org]
Re: (Score:1)
Actually, it is correct. An additional normal bit doubles the number of *possible* states. An additional (entangled) qubit doubles the "number of states you can be in at the same time" (with emphasis being on "at the same time"), which is a colloquial description of doubling the dimension of the state space.
Re: (Score:3)
No. With regular bits, you can only be in one state at once. Adding a bit roughly doubles the number of states you are not in (from 2^n - 1 to 2^(n+1) - 1).
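A toy way to see the distinction both posts are circling around, sketched in plain numpy (no quantum library, just state vectors): a classical n-bit register sits in exactly one of its 2^n states, while describing an n-qubit register takes 2^n amplitudes, and each extra qubit doubles that vector.
```python
import numpy as np

plus = np.array([1.0, 1.0]) / np.sqrt(2)  # one qubit in an equal superposition

state = plus
for n_qubits in range(1, 6):
    print(f"{n_qubits} qubit(s): described by {state.size} amplitudes; "
          f"a classical register occupies 1 of {2 ** n_qubits} states")
    state = np.kron(state, plus)          # bolt on one more qubit
```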
What really IS important: (Score:1)
Re: (Score:2)
And Minecraft. Does it run Minecraft? How does JAVA behave in it?
This does _not_ imply scalability! (Score:5, Interesting)
For conventional computers, as soon as you have "and" and "not" in gate-form, you can do everything, as you can just connect them together. For quantum computers that is not true, as all elements performing the complete computation need to be entangled the whole time.
IMO, there is now reason to believe that the real-world scalability of quantum computers is so bad that it negates any speed advantage. It seems the complexity of building a quantum computer that can do computations on inputs of size n is at least high-order polynomial or maybe exponential in n. That would explain why no significant advances have been made in keeping larger quantum computing elements entangled in the last 10 years or so and no meaningful sizes have been reached.
Keep in mind that, for example, to break RSA 2048 you have to keep > 2048 bits entangled while doing computations on them. And you cannot take smaller elements and combine them; the whole > 2048 bits representing the input must all be entangled with each other, or the computation does not work.
Re:This does _not_ imply scalability! (Score:4, Informative)
There's a nice wiki page with pages and pages of detailed explanation of what this post is talking about.
http://en.wikipedia.org/wiki/Quantum_decoherence [wikipedia.org]
Here's a nice analogy for quantum computing... it's a magic old-fashioned analog computer with serious reliability and I/O issues. Imagine at the dawn of the computer era you wanted to simulate the statics of a large railroad bridge. In 8 bits it would take a very long time, in 16 bits much longer... And to prevent rounding-error propagation you have issues. So why not simulate it with a thundering herd of analog op-amps which will "instantly" solve the bridge's static loads? OK, cool, other than that all the op-amps must work perfectly the entire time you take a measurement, which with vacuum tubes is questionable and with qubits maybe impossible. The other problem is that if you want 32-bit accuracy, your proto-computer engineer now needs to build a 32-bit A/D converter to connect to your analog computer... good luck... This is not a perfect quantum computing analogy, but it's pretty close in many regards.
There is a bad trend in computer science to assume "all computers and algorithm programming problems are about the same," which historically they have been, but in the real world they are not. So given two roughly identical algorithms and problems on two roughly identical computers, the smaller big-O notation wins every time, more or less. It is a huge mistake to carry that thinking across widely different architectures... OK, so factoring is exponential on classical computers, and everyone ignores I/O because that's constant with a normal bus design, or at worst linear. OK, so factoring is polynomial on quantum computers, hooray for us... whoops, looks like I/O might go exponential, and the constant factor might be years or decades to get the thing working.
The way to keep secure with a classical computer is to pick an algorithm that big O scales such that it can't be broken in this universe. The way to keep secure with a quantum adversary is to pick a key size that seems to make it an engineering impossibility to build a quantum computer, even if by some miracle a quantum computer could solve it in poly time if only it could somehow be built.
Re: (Score:2)
I like your analog computer analogy. Maybe for those that are not into electronics: building a working 32-bit A/D converter (i.e. one that has 32 bits of accuracy) is pretty much impossible; even at 24 bits the lower bits are only noise from several different noise sources. And op-amps are pretty noisy too when you get to that precision level. 16...20 bits is about the practical limit unless you do things like supercooling, and even then you only gain a few bits.
I also completely agree on the countermeasures. And
Re: (Score:2)
Close, but you missed a point in the system. If and only if the input signal is noiseless and/or the spectrum of the noise is white (it almost never is AWGN) and completely driftless over time.
which is a static system (time independent), and we can measure it for as long as we like
There is no such thing out there, and at the 32 bit level there is a lot of logic chopping and special circumstances required. Maybe something like counting 32 bits worth of individual photons in a laboratory setting...
I do agree with your analysis.... if the "raw analog signal" has a theoretical 16 bits of accuracy
Re: (Score:2, Insightful)
Another way of explaining this is that in order to take advantage of the exponential speed-up of quantum computing in practical applications, you need exponentially better management of entanglement and decoherence effects, which turns out to be a very difficult engineering problem. People keep proposing different models for quantum computing hoping that if they do these operations in solid state rather than via NMR, or in Bose-Einstein condensates, or using exotic pseudo-particles, or other means that the
Re: (Score:2)
For conventional computers, as soon as you have "and" and "not" in gate-form, you can do everything, as you can just connect them together. For quantum computers that is not true, as all elements performing the complete computation need to be entangled the whole time.
Actually for conventional computers, to implement any binary function you only need either NAND or NOR [uiowa.edu], the "universal gates".
For qubit-based quantum computing, the standard two-qubit universal gate is the Controlled-NOT (CNOT) [wikipedia.org] gate, which together with single-qubit gates can be used to realize any q
Re: (Score:2)
A matter of taste. I like to regard AND and NOT as different constructs, since one is unary and one is binary. May have to do with some background in modern algebra I have. Of course, you can combine them, but whether NAND/NOR is really less complex than AND/OR and NOT is up for debate. When implemented classically as TTL, NAND is easier than AND and only minimally more complex than NOT.
Anyways, entanglement is the primary foe of scalability, other problems are data input and output, since that has to be do
Welcome H(I)A(B)L(M) (Score:1)
I miss IBM PC's (Score:1)
Re: (Score:2)
I miss the days when IBM actually made PCs; they were always rock solid. You could beat someone to death with one of their laptops and, after wiping the blood off it, it would still work...
Model M keyboard with the steel backplate and buckling springs. Still use mine with a PS/2 to usb converter thing (not an adapter, a more expensive converter). Lack of a windows key didn't bother me until I switched to the "awesome" windowmanager which likes to use that key as a control key. Bummer.
Re: (Score:1)
Re: (Score:2)
Why I'll be... a brand new 104 key type M... that means a windoze key to drive "awesome" window manager with. I may have to retire my old PS/2 type M...
They're not expensive, they're only a hundred bucks. If they're as good as a real type M, your grandkids will be using them, which works out to "about a can of soda per month". Expensive is something like an all plastic "gamers keyboard" for $30 that only lives for 6 months before keys start sticking (true anecdotal story).
Re: (Score:1)
Please enlighten me : Quantum computers & MWI (Score:1)
If we consider the many worlds interpretation to be viable, from what I understand :
- when a scientist starts up the very first quantum computer for the first time -- say, a big 250-qubit computer -- and tests it against a big cipher or whatever, 2^250 univers
Re: (Score:1)
(In the other worlds, I'm better at learning foreign languages).
Re: (Score:2)
Re: (Score:2)
That's not how quantum computers work, despite what you might have read in popular-science articles. Quantum algorithms don't work by running the classical algorithm on "all possibilities at once"; that wouldn't work, because of the contradiction you described -- once you measure the result, all the other "possibilities" go away.
Quantum algorithms work by not only solving the problem, but also shifting the probabilities of the qubits in such a way that, when you measure it, you get a very high
Pre-emptive strike against wtf is a QC (Score:5, Informative)
Quantum Computers are not super-computers. On a bit-for-bit (or qubit-for-qubit) scale, they're not necessarily faster than regular computers, they just process info differently. Since information is stored in a quantum "superposition" of states, as opposed to a deterministic state like regular computers, the qubits exhibit quantum interference when mixed with other qubits. Typically, your qubit starts in 50% '0' and 50% '1', and thus when you measure it, you get a 50% chance of it being one or the other (and then it assumes that state). But if you don't measure, and push it through quantum circuits allowing them to interact with other qubits, you get the quantum phases to interfere and cancel out. If you are damned smart (as I realized you have to be, to design QC algorithms), you can figure out creative ways to encode your problem into qubits, and use the interference to cancel out the information you don't want, and leave the information you do want.
For instance, some calculations will start with the 50/50 qubit above, and end with 99% '0' and 1% '1' at the end of the calculation, or vice versa, depending on the answer. Then you've got a 99% chance of getting the right answer. If you run the calculation twice, you have a 99.99% chance of measuring the correct answer. However, the details of these circuits which perform quantum algorithms are extremely non-intuitive to most people, even those who study it. I found it to require an amazing degree of creativity to figure out how to leverage quantum interference constructively.
But what does this get us? Well, it turns out that quantum computers can run anything a classical computer can, and such algorithms can be written identically if you really wanted to, but doing so gets the same results as the classical computer (i.e., same order of growth). But the smart people who have been publishing papers about this for the past 20 years have been finding new ways to combine qubits, taking advantage of the nature of certain problems (usually deep, pure-math concepts), to achieve better orders of growth than are possible on a classical computer. For instance, factoring large numbers is difficult on classical computers, which is why RSA/PGP/GPG/PKI/SSL is secure. Its order of growth is e^( n^(1/3) ). It's not quite exponential, but it's still prohibitive. It turns out that Shor figured out how to get it down to n^2 on a quantum computer (which is the same order of growth as decrypting with the private key on a classical computer!). Strangely, guessing someone's encryption key, normally O(n) on classical computers (where n is the number of possible encryption keys), is only O(sqrt(n)) on QCs using Grover's algorithm. Weird (but sqrt(n) is still usually too big).
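To put rough numbers on the Grover part of that (a sketch only; constant factors are ignored and a 128-bit key is assumed purely for illustration):
```python
n_bits = 128
keyspace = 2 ** n_bits

classical_guesses = keyspace          # brute force: O(n) in the number of keys
grover_queries = 2 ** (n_bits // 2)   # Grover: O(sqrt(n)) = 2^(bits/2)

print(f"classical brute force: ~2^{n_bits}  (~10^{len(str(classical_guesses)) - 1})")
print(f"Grover search        : ~2^{n_bits // 2}   (~10^{len(str(grover_queries)) - 1})")
# Even ~2^64 quantum queries is far beyond anything practical, which is why
# "sqrt(n) is still usually too big".
```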
There's a vast number of other problems for which efficient quantum algorithms have been found. Unfortunately, a lot of these problems aren't particularly useful in real life (besides to the curious pure mathematician). A lot of them are better, but not phenomenal. Like verifying that two sparse matrices were multiplied correctly, which has order of growth n^(7/3) on a classical computer and n^(5/3) on a quantum computer. You can find a pretty extensive list by googling "quantum algorithm zoo." But the reality is that "most" problems we face in computer science do not benefit from quantum computers. In these cases, they are no better than a classical computer. But for problems like integer factorization, bringing the compute requirements down to polynomial time isn't just faster: it makes a problem solvable that wasn't before.
Unfortunately [for humanity], there is no evidence yet that quantum computers will solve NP-complete problems efficiently. Most likely, they won't. So don't get your hopes up about solving the traveling salesmen problem any time soon. But there is still a lot of cool stuff we can do with them. In fact, the theory is so far ahead of the technology, that we're anxiously waiting for breakthroughs like this, so we can start plugging problems through known algorithms.
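For anyone who wants to see the interference idea from the parent post in action, here's a self-contained toy simulation in plain numpy (no quantum hardware or library; it just pushes a vector of amplitudes around). It does Grover-style amplitude amplification on a 16-item search: start with every item at equal amplitude, flip the phase of the "answer," reflect everything about the mean, and after ~pi/4*sqrt(N) rounds the answer carries almost all the probability.
```python
import numpy as np

N = 16         # unstructured search over 16 items
marked = 11    # the index only the "oracle" knows

amps = np.full(N, 1 / np.sqrt(N))                 # uniform superposition

iterations = int(round(np.pi / 4 * np.sqrt(N)))   # ~pi/4 * sqrt(N) for Grover
for _ in range(iterations):
    amps[marked] *= -1                 # oracle: flip the phase of the answer
    amps = 2 * amps.mean() - amps      # diffusion: reflect amplitudes about the mean

probs = amps ** 2
print(f"after {iterations} iterations, P(measuring item {marked}) = {probs[marked]:.3f}")
```
Measuring at that point yields the marked item about 96% of the time; repeat the run a couple of times and you're effectively certain, which is the 99%/99.99% pattern described above.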
Re: (Score:1)
Re: (Score:1)
It's a very meaningful phrase. It describes all computing problems where both the input and the output are completely classical.
Re: (Score:2)
I think this description applies to any "computing problem".
It certainly applies to any "quantum computing problem" (i.e., any problem solvable by a quantum computer), because any quantum computer can be simulated by a classical computer. That is, any problem solvable by a quantum computer can also be solved by a classical computer.
The catch is that, as the size of the input grows, it might be the case that the amount of time and/or space needed by the classical computer grows exponentially (that's not prov
Re: (Score:2)
Your explanation was awesome. Thank you.
Re: (Score:2)
Re:Pre-emptive strike against wtf is a QC (Score:5, Informative)
Re: (Score:2)
encryption key layer (Score:1)
You have 2048 randomly entangled bits.
Somebody on the other side of the world has the matching pair of 2048 randomly entangled bits.
Not useful for communication per-se if you can't influence them, but if you could *READ* them without influencing them, they'd be darn spiffy for an encryption key or seed shared between two parties.
Simple XOR encryption would be awesome so long as you both have synchronized reading of the encrypted bits. Take message, XOR it against the encryption key, send to recipient, recip
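For what it's worth, the XOR scheme being described is just a one-time pad; here's a minimal sketch (plain Python; the locally generated random bytes stand in for the hypothetical pre-shared entangled bits, and the key must be as long as the message and never reused):
```python
import secrets

message = b"attack at dawn"
shared_key = secrets.token_bytes(len(message))   # stand-in for the pre-shared bits

ciphertext = bytes(m ^ k for m, k in zip(message, shared_key))    # sender XORs
recovered = bytes(c ^ k for c, k in zip(ciphertext, shared_key))  # recipient XORs

assert recovered == message
print(ciphertext.hex())
```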
Re: (Score:2)
[...] but if you could *READ* them without influencing them [...]
That would be great, but you can't. Once you read them, the entanglement is broken. As mathimus1863 wrote in the original message in this thread,
Typically, your qubit starts in 50% '0' and 50% '1', and thus when you measure it, you get a 50% chance of it being one or the other (and then it assumes that state). [my emphasis]
That means that once you measure a qubit, its state becomes exactly what you measured (this is commonly known as "wave function collapse"), and so it's not entangled anymore.
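A toy model of that collapse, in plain Python (not a real quantum simulation, just the bookkeeping): the first measurement of a 50/50 qubit is a coin flip, and every measurement after that repeats whatever came out the first time.
```python
import random

amplitude_of_one = 0.5 ** 0.5   # equal superposition: P('1') = |amplitude|^2 = 0.5
collapsed_value = None          # not yet measured

def measure():
    global collapsed_value
    if collapsed_value is None:                            # first look: genuinely random
        collapsed_value = int(random.random() < amplitude_of_one ** 2)
    return collapsed_value                                 # afterwards: always the same

print([measure() for _ in range(5)])   # e.g. [1, 1, 1, 1, 1] or [0, 0, 0, 0, 0]
```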
Re: (Score:1)
Imagine a machine (hidden from any observers) that flips a coin, cuts it in half, and puts the halves in two sealed boxes. The halves could be either heads up or tails up, you can't tell until you open a box. Both will be in the same state, no matter how far apart you move the boxes. You can drive all day with a half-coin in a box and it won't change. (Had to work a car in.) Knowing the state of one tells
good explanation (Score:2)
"On a bit-for-bit (or qubit-for-qubit) scale, they're not necessarily faster than regular computers, they just process info differently."
Thank you. I have been trying and failing (in tweets @DrEpperly) to explain the concept you describe very succinctly. I have a telecommunications background so we just think of it as having two channels...sort of like the old 'dual-mode' phones...
When you get published saying this please send me a link ;)
Re: (Score:2)
That was a great explanation.
Just one small nitpick: when you talk about factorization, you use "n" for the number of bits, and when you talk about guessing an encryption key, you use "n" for the number of possible keys, which makes things a little unnecessarily confusing. I'd change the second one to also use number of bits -- so it would be O(2^n) on classical computers and O(2^(n/2)) for quantum computers. This way it's also easier to see that the square root (i.e., the factor of 1/2 in the exponent) doe
IBM layoffs (Score:1)
This news story appears the day after IBM laid off a number of engineers in STG (Systems and Technology Group, the part of the company that works on operating systems and hardware like Power, blades, Z, etc.).
Not that IBM would be attempting to deflect any negative news stories which might range from the very tight lipped control on number of employees let go, forbidding those employees let go from talking to the press or lose their severance pay, current number of employees in the US, brain drain of engineer
"...properties in quantum bits or quibits" (Score:1)
Is "quibit" an accepted variant spelling, and, if so, where does the extra letter "i" come from?
Quantum Computing the new fusion (Score:1)
Re: (Score:1)
I don't know anything better, and you will be disappointed with scientific journals too, so get used to it.
Re: (Score:2)
Any suggestions for a good replacement tech news website?
You'll have to clarify your definition. If you want "tech news" as in news about "hard science" tech there are places like arxiv.org or PLOS for bio stuff. My best guess for IT type primary sources is maybe the debian-announce mailing list or the daily SANS ISC diary? There are no primary source places that I'm aware of with social media type features, not /. certainly not on G+ or whatever.
If by "tech news" you mean news about other tech news sites, if you prefer a weekly format thats "this week in tech
Re:So (Score:5, Funny)
has any1 else hear hurd of it????!?
FTFY.
Re: (Score:2)
I've always been partial to The Register (www.theregister.co.uk) which is snarky and British (and the home of the BOFH) and to Ars Technica (www.arstechnica.com) which tries to focus on actually writing interesting articles about news rather than just linking to things.
There is often some overlap between themselves and /. but it's not as bad as you'd expect.
Re: (Score:2)
"...also likely to reinforce any of your existing biases since they're never neutral."
I thought we were talking about science content? Why would biases be relevant?
Independently reproduceable results rule; soundbites drool.
Re: (Score:2)
Re: (Score:2)
Maybe it's more a reflection of the changes in the tech industry than in the site.
Re: (Score:1)
More than Anonymous Cowards.