China Makes Quantum Leap In Developing Quantum Computer (scmp.com) 70
hackingbear writes: Researchers at the University of Science and Technology of China have created a quantum device, called a boson sampling machine, that can carry out calculations for five photons at a speed 24,000 times faster than previous experiments. Pan Jianwei, the lead scientist on the project, said that although their device is (only) 10 to 11 times faster at running the classical algorithm than the first electronic digital computer, ENIAC, and the first transistor computer, TRADIC, their machine would eclipse all of the world's supercomputers in a few years. "Our architecture is feasible to be scaled up to a larger number of photons and with a higher rate to race against increasingly advanced classical computers," they said in the research paper published in Nature Photonics. This device is said to be the first quantum computer to beat a real electronic classical computer in practice. Scientists estimate that the current fastest supercomputers would struggle to simulate the behavior of 20 photons.
What about link to an actual article? (Score:2)
Just curious to read...
Paul B.
Re:What about link to an actual article? (Score:5, Informative)
Nope, I mean a link to the scientific paper in Nature Photonics, not the press release...
Like this: http://www.nature.com/nphoton/... [nature.com]
Paul B.
Re:What about link to an actual article? (Score:5, Funny)
It is both a link to the article and a link not to the article. Being quantum physics, you'll never know which it is until you click it.
Re: (Score:1)
Schrodinger's link
Here's a few more links... (Score:4, Informative)
Paper preprint [arxiv.org]...
Wikipage about boson sampling [wikipedia.org]...
In principle, a large-scale boson-sampling machine would constitute an effective disproof against a foundational tenet in computer science: the Extended Church-Turing Thesis, which postulates that all realistic physical systems can be efficiently simulated with a (classical) probabilistic Turing machine.
The machine may not have any practical use, but it is still an interesting theoretical advance that might serve to challenge our understanding of computability... Part of the theoretical importance of this area of research is the understanding of #P-complete [wikipedia.org] problems.
The Wikipedia article notes the theoretical significance of this...
A polynomial-time algorithm for solving a #P-complete problem, if it existed, would imply P = NP, and thus P = PH. No such algorithm is currently known.
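For the curious, here is a minimal Python sketch (mine, not from the paper) of the #P-hard core of boson sampling: each output amplitude of the machine is the permanent of a submatrix of the interferometer's unitary, and Ryser's inclusion-exclusion formula is essentially the best exact classical method, still exponential in the number of photons.

```python
# Exact matrix permanent via Ryser's formula -- the #P-hard quantity behind
# boson-sampling amplitudes. Illustrative sketch only; this plain version
# runs in O(2^n * n^2) time, so it blows up quickly as n (photons) grows.
from itertools import combinations
import numpy as np

def permanent(A):
    """perm(A) = (-1)^n * sum over nonempty column subsets S of
    (-1)^|S| * prod over rows of (sum of entries in columns S)."""
    n = A.shape[0]
    total = 0.0
    for k in range(1, n + 1):
        for cols in combinations(range(n), k):
            prod = np.prod(A[:, list(cols)].sum(axis=1))
            total += (-1) ** k * prod
    return (-1) ** n * total

print(permanent(np.eye(3)))              # identity: permanent is 1.0
rng = np.random.default_rng(0)
print(permanent(rng.random((5, 5))))     # a 5x5 case, like the 5-photon device
```

Note the 2^n column subsets: at 20 photons that is about a million terms per amplitude; at 30, about a billion.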
Re: Here's a few more links... (Score:2, Informative)
Researcher in computational complexity here. No one believes this machine (or any quantum machine, for that matter) will be able to solve #P-complete problems in full generality. There's strong evidence that even NP-complete problems are out of reach for quantum algorithms, and #P is (seemingly) way, way, way above NP in terms of computational difficulty.
Re: (Score:2)
Mod parent up, please. This person knows what they [aceseditors.org] are talking about.
Re: (Score:2)
Computer scientist : I only care about P, what's the biggest exponent?
Cryptographer: Is this BQP?
Complexity Theorist: Is this P or NP?
Slashdot (Score:2)
Slashdot makes quantum leap in writing quantum headlines quantum quantum quantum dark side.
Quantum Shave.
Re: (Score:2)
Yes, in other news, BlackBerry made a BlackBerry leap in developing BlackBerry computer:
https://en.wikipedia.org/wiki/... [wikipedia.org]
Re: (Score:2)
Let's not forget SuSE making a Leap [opensuse.org], too
Re: (Score:2)
hmm... for me your link freezes after the first redirect... Already slashdotted?
Here is another link I found about it:
http://www.cio.com/article/300... [cio.com]
Re: (Score:2)
A quantum leap is hardly noteworthy. Literally, it is the smallest possible motion; physics uses the term for the smallest leaps within an atom.
Not sure if we should blame /. editors or the submitter for that abuse of language.
Re:Slashdot (Score:4, Interesting)
A quantum leap is hardly noteworthy. Literally, it is the smallest possible motion; physics uses the term for the smallest leaps within an atom.
True, but a quantum leap can occur over an energy-boundary that classical physics would claim can't be overcome. I think that's why the metaphor is applied frequently to an unexpected advance in various fields outside of quantum mechanics.
Re: (Score:3)
Scott Bakula could not be reached for comment.
Re: (Score:1)
Before you buy
A quantum leap
Be sure you get
The cite complete
Burma Shave
AKA, I'll believe it when the paper is confirmed by researchers... somewhere else
Not general purpose? (Score:3)
I always struggle with understanding quantum computing concepts, but from the sound of things in the article, this is not some sort of general purpose quantum computer. Rather, it's a purpose-built computer dedicated to estimating the behavior of photons.
Why that specifically?
Based on what the article (and summary) said, modern computers struggle to estimate the behavior of 20 or more photons, but it's the sort of problem that quantum computers are theoretically capable of handling quite easily. Researchers are apparently suggesting that in order to disprove skeptics and bring in more support for quantum computing, we should build a quantum computer of this variety and then use it to estimate the behavior of 30 or more photons, because doing so would definitively prove to everyone that quantum computers can provide a massive advantage over traditional computing methods.
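To put rough numbers on that (my own back-of-the-envelope, not from the article): the best known exact classical cost per boson-sampling amplitude scales roughly like n * 2^n, which is why 20 photons is already painful and 30 is the proposed put-up-or-shut-up regime.

```python
# Rough classical cost per boson-sampling amplitude (Ryser-style scaling),
# assuming a hypothetical 10^9 op/s core. Numbers are illustrative only.
OPS_PER_SECOND = 1e9

for n in (5, 10, 20, 30, 40):
    ops = n * 2 ** n   # approximate operation count for one amplitude
    print(f"{n:2d} photons: ~{ops:.1e} ops, ~{ops / OPS_PER_SECOND:.1e} s/amplitude")
```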
Re: (Score:2)
over traditional computing methods... eventually, i mean. it's already faster than the first ever electronic computer. oh ENIAC, i miss you so.
Quantum Leap in quantum computing.. (Score:2)
LOL, I see what you did there and it is kind of funny.... BUT, does it compute?
They did not see what they did there (Score:3)
Re: (Score:2)
Excellent point.... Hadn't thought of that. So the headline really is a poke in the eye to the Chinese "invention" (assuming it actually exists).
Re: (Score:3)
In common usage, it's the opposite though.
Merriam-Webster: quantum leap: a sudden large change, development, or improvement
Re: (Score:2)
Babbage and the abacus were the original computer technology.
Vacuum tubes and stepper motors... Now THAT was a classic computer...
RTL, TTL, ECL stuff... That was the golden age....
VLSI CMOS that put a CPU on a chip is "modern" computer technology...
So, don't feel too old... unless you were alive during WW2 working at Bletchley Park or some other similar effort of the day.
Re: (Score:2)
So, don't feel too old... unless you were alive during WW2 working at Bletchley Park or some other similar effort of the day.
In the future he'll take a "quantum leap" to WW2 Bletchley, where he'll make "incredible breakthroughs" because he already knows the answers, and then kill himself because AC posters are, well, you know, ghey* [urbandictionary.com].
* ghey: Usurping the traditional term GAY to take the homosexual meaning out and leaving in the lame.
Nice, But... (Score:1)
The Chinese are expending significant resources building conventional supercomputers, which suggests that quantum computers are still far more promise than present reality.
https://www.top500.org/lists/2016/11/ [top500.org]
And that is true for all the computing leaders at present. We know how to build very effective supercomputers. We think that quantum computers might be great, the promise is there in theory, but you'd be a fool to ditch your conventional HPC systems.
And even if quantum computing becomes "a thing", suspicions a
How does this work? (Score:1)
Re: (Score:3)
You set a register of bits to all possible combinations of the bits at the same time -- all possible values from 0 to 2^N-1 are in superposition. Then you run them through some quantum logic operations that eliminate all impossible solutions to a problem from the set of possible combinations. Then you read the register. It collapses to *one* of the possible solutions when you read it. So for example, if you are factoring a large number, that solution will be one possible divisor of that number. So you divide the number by it to check (and to get the other factor).
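Here is a toy state-vector version of what the parent describes (illustrative only -- real algorithms like Shor's use interference to boost the good amplitudes, not a magic eliminate-the-wrong-answers gate):

```python
# Toy simulation of an N-qubit register: hold amplitudes for all 2^N basis
# states, "filter" them, then sample one outcome on measurement.
import numpy as np

N = 4
dim = 2 ** N
state = np.ones(dim) / np.sqrt(dim)        # uniform superposition over 0..2^N-1

# Stand-in for the "quantum logic" step: keep only multiples of 3
# (a made-up placeholder for "possible divisors").
mask = np.array([1.0 if x % 3 == 0 else 0.0 for x in range(dim)])
state *= mask
state /= np.linalg.norm(state)             # renormalize surviving amplitudes

p = np.abs(state) ** 2
p /= p.sum()
outcome = np.random.default_rng().choice(dim, p=p)   # "reading" collapses it
print(f"register collapsed to {outcome}")
```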
First rule (Score:2)
The first rule of quantum computing is that everyone talks about scaling their design up to more qubits, but no one actually does it.
Re: (Score:2)
I've been wondering whether the safety of "post-quantum" crypto functions was threatened more by quantum simulators than quantum computers.
As to "how do you know the output is correct?" well, pick a problem with a verification step that is not NP hard I guess...
Better cut more from science funding (Score:5, Insightful)
Re: (Score:1)
Because the first ones to produce viable quantum computers crack everyone else's encryption, and we are engaged in a lukewarm cyber war with the Chinese?