Hardware Science Technology

1 Molecule Computes Thousands of Times Faster Than a PC

alexhiggins732 writes with this tantalizing PopSci snippet: "A demo of a quantum calculation carried out by Japanese researchers has yielded some pretty mind-blowing results: a single molecule can perform a complex calculation thousands of times faster than a conventional computer. A proof-of-principle test run of a discrete Fourier transform — a common calculation used in spectral analysis and data compression, among other things — performed with a single iodine molecule went very well, putting all the molecules in your PC to shame."
This discussion has been archived. No new comments can be posted.

  • by Polarina ( 1389203 ) on Saturday May 08, 2010 @06:50PM (#32142254) Homepage
    This would more likely break Moore's Law since this molecule isn't a transistor.
  • by thms ( 1339227 ) on Saturday May 08, 2010 @06:58PM (#32142324)
    Off the top of my head, among these limitations are:
    • It won't solve NP-complete or even NP-hard problems more than a few orders of magnitude faster.
    • It is probabilistic, so you still need old-fashioned silicon around it, and all results will come with a p-value.
    • It needs quite serious cooling, as in liquid nitrogen.
  • by Anonymous Coward on Saturday May 08, 2010 @07:05PM (#32142372)

    "Quantum computers can turn some problems that require exponential time to solve into a polynomial time."

    If P = NP and BQP ⊆ NP, that would be false. It would also be false if BQP = P (is that possible?). Interestingly, we don't have polynomial-time quantum algorithms
    for any NP-complete problems.

  • by blair1q ( 305137 ) on Saturday May 08, 2010 @07:25PM (#32142524) Journal

    Moore's law isn't about the tip of high-tech research. It's about the leading edge of profitable manufacturing of computational devices.

    I.e., until someone like Applied Materials or KLA Tencor is done installing a fab line for this process node, you can't count it as a data point in the history of the law.

  • by king_nebuchadnezzar ( 1134313 ) on Saturday May 08, 2010 @07:50PM (#32142696)
    He is not saying that it can solve NP-complete problems; he is saying that things such as factorization, which are not thought to be in P, are definitely in BQP.
  • Re:Thats cheating (Score:3, Informative)

    by Interoperable ( 1651953 ) on Saturday May 08, 2010 @08:32PM (#32142986)

    That's like saying that the only thing a transistor can compute is how it will behave for given voltages applied across its base and collector. Strictly true, but it's a critical building block. Any time you can deterministically create a particular quantum state, allow it to evolve, and read the output, you can perform some quantum computations. Similarly, any classical system can perform some classical computations; the question is whether those computations are useful. Fraunhofer diffraction performs a Fourier transform and, as another poster pointed out, that can be useful.

    The key here is that, while it's easy to prepare a classical system and let it evolve, it's much harder to do with a quantum system. The experiment is a proof of principle that vibrational modes in molecules can be deterministically written to and remain undisturbed enough to evolve in a quantum fashion. So far, the only thing this quantum system can compute is how it will evolve, but, given appropriate input, other operations could be computed. The authors claim that a controlled-NOT (C-NOT) gate could be implemented, which is the only two-qubit gate needed (together with single-qubit operations) to build an arbitrary quantum algorithm.
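    For readers unfamiliar with the gate, here is a minimal sketch (my own illustration, not anything from the paper) of the C-NOT's action as a 4x4 matrix on the two-qubit computational basis:

```python
# Illustrative only: the C-NOT (controlled-NOT) gate as a 4x4 unitary
# acting on the two-qubit basis |00>, |01>, |10>, |11>.
import numpy as np

# C-NOT flips the target (second) qubit exactly when the control (first) qubit is 1.
CNOT = np.array([
    [1, 0, 0, 0],  # |00> -> |00>
    [0, 1, 0, 0],  # |01> -> |01>
    [0, 0, 0, 1],  # |10> -> |11>
    [0, 0, 1, 0],  # |11> -> |10>
])

ket_10 = np.array([0, 0, 1, 0])  # the basis state |10>
print(CNOT @ ket_10)  # [0 0 0 1], i.e. |11>
```

    Universality here means that C-NOT plus arbitrary single-qubit rotations suffice to approximate any quantum circuit.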

    The reason this paper isn't a huge breakthrough (Physical Review Letters is good, but it's no Nature or Science) is that the read and write stages are classical, so it can't be chained with other operations. Good-fidelity C-NOT gates can be built out of many quantum systems, but I think vibrational energy levels in molecules are a new one, which has many useful features but not, at the moment, quantum read-write. Reliable read-write operations with quantum light are common, but not for systems that have high-fidelity C-NOT protocols.

    People, especially people who read /., need to stop expecting quantum computers tomorrow. It turns out that they're really hard to build, but steps like this are solid progress. Give it time; quantum computers will come through a lot of incremental progress toward increased-fidelity operations in many areas of the field.

  • by RobVB ( 1566105 ) on Saturday May 08, 2010 @11:12PM (#32143816)

    It has to do with the complexity of calculations, and the time a computer needs to find the solution for a problem with n variables/elements. For a certain way of solving a problem, increasing the amount of variables (n) increases the complexity, and thus the calculating time.

    An example: simulating a traffic situation with n cars. Doing the simulation with 11 cars is more complex than with 10 cars, because there's one extra car that's interacting with all the other cars.

    If a problem is of the order of complexity of 2^n, increasing n by 1 doubles the calculating time. For example: if n increases from 10 to 11, the complexity increases from 2^10 or 1024 to 2^11 or 2048, an increase of 100% (in this case it will always be 100%, no matter the value of n).

    If a problem is of the order of complexity of n^3, the increase in calculating time is much smaller: from 10^3 or 1000 to 11^3 or 1331, an increase of 33% (different values of n will give different percentages here: if n = 1000, it's only about 0.3%).

    As Vellmont said:

    Quantum computers can turn some problems that require exponential time to solve into a polynomial time. So instead of taking 2^n time, it might take n^3 time.

    Quantum computers have a different way of approaching the problem, which affects the order of complexity. This means they're better at solving "larger" problems: problems with more variables and higher values of n.
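    The difference between 2^n and n^3 scaling is easy to check numerically; a quick sketch (illustrative only, the function names are my own):

```python
# Illustrative only: how much the cost grows when one more variable is added,
# for an exponential-time vs. a polynomial-time algorithm.
def pct_increase(cost, n):
    """Percent increase in cost when the problem grows from n to n + 1."""
    return 100.0 * (cost(n + 1) - cost(n)) / cost(n)

exponential = lambda n: 2 ** n  # e.g. brute-force search
polynomial = lambda n: n ** 3   # e.g. the hypothetical n^3 quantum algorithm

print(pct_increase(exponential, 10))    # 100.0 (1024 -> 2048)
print(pct_increase(exponential, 1000))  # still 100.0
print(pct_increase(polynomial, 10))     # 33.1 (1000 -> 1331)
print(pct_increase(polynomial, 1000))   # ~0.3
```

    The exponential cost doubles with every added variable no matter how large n already is, while the polynomial penalty shrinks as n grows.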

  • by blankinthefill ( 665181 ) <blachancNO@SPAMgmail.com> on Saturday May 08, 2010 @11:39PM (#32144022) Journal
    Agree 100%! I mean, the first transistor was invented in 1947, and the first integrated circuit wasn't introduced until 1959, and the integrated circuit took even more years to make it into computing devices... and then even more years to evolve to a complexity that allowed the creation of the PC. And the science and engineering involved in those was kid stuff in comparison to many of these inventions. We're not even to the point of the transistor in quantum computing... This is probably more closely related to Babbage's analytical engine!
  • As I understand it, there's a trade-off between uncertainty and speed in quantum computing. You can get results faster, but you'll have a higher probability that your machine returns 2+2=5.

    The same goes for conventional computing. No computer is error-free, and bit errors can and do happen. There are unsolved/unsolvable problems in electronics, like metastability, that always come with a probability of failure, one you can make as small as you want by trading off speed.

    Conventional computers are tuned so that the error rates are small enough for people to live with (e.g. once every few months for crappy consumer hardware, or hopefully once a decade or more for proper servers). The question is whether quantum computing will still be faster after being tuned to similar error rates. There are also tricks you can use, such as ECC and other types of parity for conventional computers. For example, with quantum computing you could have several computers running the same problem and then require that they agree on the result.
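    The majority-vote idea can be sketched in a few lines (a toy model with a made-up noisy_adder, not an actual quantum error-correction scheme):

```python
# A toy model (not a real error-correction scheme): repeat a noisy
# computation and take a majority vote to shrink the error rate.
import random
from collections import Counter

def noisy_adder(a, b, error_rate=0.2):
    """A 'computer' that returns a + b, but is wrong error_rate of the time."""
    result = a + b
    if random.random() < error_rate:
        result += random.choice([-1, 1])  # an occasional off-by-one error
    return result

def majority_vote(a, b, runs=101):
    """Run the noisy computation many times and keep the most common answer."""
    tally = Counter(noisy_adder(a, b) for _ in range(runs))
    return tally.most_common(1)[0][0]

print(majority_vote(2, 2))  # almost always 4, despite the 20% per-run error rate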

  • by Serious Callers Only ( 1022605 ) on Sunday May 09, 2010 @06:58AM (#32145794)

    Moore's law

    Moore's 'law' isn't a law of nature (or of humans) in any meaningful sense. It's a conjecture, a guess, a prediction, and nothing more. Why people who are supposedly rational cling to it as some unchanging constant of nature mystifies me. Why even bother to argue about whether it is true or not? It's already completely out of date, in that he wisely limited his guess to 10 years, up to 1975.

    If Moore's conjecture is broken, or has already been, so what? Have any fundamental laws of physics been violated, has our understanding of the world changed one iota? It was an interesting guess in its time about the progress of technology, and was not, so far as I know, intended to last forever.
