
Two-Photon Walk a Giant Leap For Quantum Computing

ElectricSteve writes "Research conducted at the University of Bristol means a number of quantum computing algorithms may soon be able to execute calculations of a complexity far beyond what today's computers allow us to do. The breakthrough involves the use of a specially designed optical chip to perform what's known as a 'quantum walk' with two particles ... and it suggests the era of quantum computing may be approaching faster than the scientific establishment had predicted. A random walk – a mathematical concept with useful applications in computer science – is the trajectory of an object taking successive steps in a random direction, be it over a line (with only two possible directions) or over a multi-dimensional space. A quantum walk is the same concept, but translated to the world of quantum computing, a field in which randomness plays a central role. Quantum walks form an essential part of many of the algorithms that make this new kind of computation so promising, including search algorithms that will perform exponentially faster than the ones we use today."
Comments Filter:
  • Will we soon need to be certified in quantum-mechanical logic to write software?
    Any good resources?

    • Re: (Score:1, Informative)

      Most software can't benefit from quantum logic...
      • Most software can't benefit from quantum logic...

        Are you a professional programmer?

        Predicting randomness would've saved me a lot of time in my projects ;)

        • Re: (Score:1, Insightful)

          I'm a senior engineer.

          How would quantum logic creating randomness help you predict randomness?

          • by Cstryon ( 793006 )

            I'm glad you are a senior engineer. Mind if I pick your brain?

            I won't pretend to understand quantum logic. But how would arranging for photons (or electrons, or whatever object you like) to take a random step in a random direction help computing? Doesn't computing depend on expected actions with expected results, as opposed to random possible actions with maybe even more random possible results?

            Or would this new machine expect that position of particle 1 would be in position A, or B? Observe and

            • I'm confused about what makes you glad I have a job, but I'll give you the quantum mechanics 101:

              It's not about taking singular steps; it's about taking all possible steps in the same time it would take to take a single step. Think about parallel processing... not all applications lend themselves to benefiting from parallelization... in fact, almost all procedural processes can't benefit at all. The same is true for quantum logic.

              One of the most important applications of quantum logic is reversing encryption.
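
              To make the parallel-processing analogy concrete, here is a minimal numpy sketch (an illustration added for context, not anything from the thread or the article): applying a Hadamard gate to every qubit of a simulated n-qubit register puts it into a uniform superposition over all 2^n bit strings, which is the sense in which a quantum machine "takes every possible step at once", while a classical simulation has to track all 2^n amplitudes explicitly.

                  import numpy as np

                  n = 10                                        # number of simulated qubits
                  H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

                  # Start in |00...0>: a length-2^n state vector with amplitude 1 in slot 0.
                  state = np.zeros(2 ** n)
                  state[0] = 1.0

                  # Apply a Hadamard to each qubit in turn; the register ends up in a uniform
                  # superposition over all 2^n basis states ("every bit string at once").
                  for qubit in range(n):
                      op = np.array([[1.0]])
                      for k in range(n):
                          op = np.kron(op, H if k == qubit else np.eye(2))
                      state = op @ state

                  print(len(state))   # 1024 amplitudes to track for just 10 qubits
                  print(state[:4])    # each amplitude is 1/sqrt(1024) ~= 0.031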

    • Re: (Score:2, Funny)

      by Tablizer ( 95088 )

      Microsoft is coming out with Windows Quantum Edition. It only BSODs when you are observing it, and is in an undetermined state when you are not. How is that different from regular Windows, you ask? Well......um...

  • When can I have one on my desktop, and will it make completely immersive games possible? Sure, search algorithms are great and all, and so are the other scientific wonders that can come from quantum computing, but unless I can have fun with it at home it does fuck all for me.

    HEX

    • Re: (Score:3, Insightful)

      Unless they perfect a neural interface I'm pretty sure you won't be getting completely immersive games... But I'd be interested in seeing what kind of crazy fractal-based graphics and random world maps they can make with this tech.
      • Re: (Score:3, Informative)

        Unless they perfect a neural interface.

        I believe you mean until they perfect the neural interface. If the story is true, the neural interface seemed a lot closer to reality than practical quantum computing until about 3 mins ago.

        • Been waiting for this for over 20 years. Though I'll be sure to skip rev. 1.
          • "It's safe to turn of the universe now" -Windows, no wait, Wormhole 2095.

            "General Protection Fault?!?! You- You.... AAAAAAAAAAAAAAAAAHHHHH!!!!!"

        • ... the neural interface seemed a lot closer to reality than practical quantum computing until about 3 mins ago.

          So what you're saying is that we couldn't know where we were until we opened our eyes?

      • by lxs ( 131946 )

        Don't do it! I've tried one of those newfangled Quantum Consoles and ended up in this Universe. Can I get out now? Please?

    • by AvitarX ( 172628 )

      Better search should help with fun at home

    • Re: (Score:1, Funny)

      by Anonymous Coward

      Screw immersive gaming. How is this going to help me watch porn on the interweb?

      • Screw immersive gaming. How is this going to help me watch porn on the interweb?

        I'm glad you've asked that question, here's how it works:

        First you have to rub one off, then by the Magic of Quantum(TM) the girl inside the video gets a bunch of milk products on some part of her body right afterwards. So it's like being there, but the technology scales simultaneously to millions of web surfers in their parents' basements!

    • Based on this [slashdot.org] article, I would count on quantum computing having a big impact on computer graphics. A quantum algorithm that can crunch matrices exponentially faster than current techniques would be as important for graphics (and many other fields) as a quantum computer's ability to quickly factor large numbers would be for cryptography.
  • by Anonymous Coward on Thursday September 16, 2010 @07:13PM (#33606190)

    into a bar... wait... where am I?

    • by marcosdumay ( 620877 ) <(marcosdumay) (at) (gmail.com)> on Thursday September 16, 2010 @07:27PM (#33606270) Homepage Journal

      Just open your eyes, and see where you are. After seeing it you are not going to be anywhere else, but before looking, I can't really tell you.

      • Re: (Score:3, Informative)

        by Nemyst ( 1383049 )
        But by looking you change where you are, so that doesn't work...
          • I wonder about that: if by observing you change your location, do you observe the old or the new location? (If you observe the new one, nothing bad happens, really; you know where you are.) And if you keep observing, do you also keep changing your location? (In other words, is the location change edge-triggered on the act of observing, or a continuous side effect of the observing?)

            Bah, if this means I have to get my head around quantum physics to continue working as a programmer I'd better start learning a

            • Bah, if this means I have to get my head around quantum physics to continue working as a programmer I'd better start learning a new job...

              Nah. Assuming the 10-year prediction comes true, there will be a select few applications where supergeek programmers manage to make this thing work. Then about 5 years later a double plus good supergeek, who double majors in quantum mechanics and computer engineering (but never learns to tie his shoes), will invent a beautifully elegant programming language to do all the heavy lifting for you. Unfortunately, his work will get caught up in IP conflicts, and the dev tools will cost you one year's salary.

          • Ok, getting serious here :)

            Before you observe, you don't know where the particle is. Worse yet, its position isn't a point (it isn't just sitting at some unknown place). After observation, the particle is at a defined position, and you may or may not know it (you can always throw the data away), but every time you observe after that, you'll see it at the same position. That is, unless it's moving, of course, and you can't be sure that it isn't moving anymore, since you observed its position...

            Then, you observ

        • With your eyes open or closed, you're still in your parents' basement.
        • In one universe you open your eyes and in the other you don't, so there are three universes now. It's like playing God :D

      • But you won't see much with only two photons around.

    • by JamesP ( 688957 )

      Actually they walk in, order, drink, get trashed, get into a fight and are thrown out

      ALL AT THE SAME TIME

    • I believe the technical jargon makes it sound much more complicated than it is. Understanding what the scientists did requires knowing about 'random walks' and their significance. Think of a typical processor working on a problem that involves random walk sequences. Now imagine if that was replaced by getting 2 photons to calculate the 'random walk' part of the problem -- speed is massively increased, and quantum superpositions are now hopefully being utilized. The problem, in many cases, may have just
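
      (Illustrative sketch, not from the article: the kind of classical one-dimensional random-walk loop being described above, i.e. the piece a photonic quantum walk would replace.)

          import random

          def random_walk(steps):
              """Final position after `steps` random +/-1 moves on a line."""
              position = 0
              for _ in range(steps):
                  position += random.choice((-1, 1))
              return position

          # Averaging many walks shows the classical spread grows like sqrt(steps).
          walks = [random_walk(100) for _ in range(10_000)]
          print(sum(abs(p) for p in walks) / len(walks))
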
  • by countertrolling ( 1585477 ) on Thursday September 16, 2010 @07:18PM (#33606230) Journal

    Here's how it looks under a microscope [craftzine.com]

  • Don't think PC (Score:2, Redundant)

    by T Murphy ( 1054674 )
    According to the quantum computing video from a while ago (I think it was 90 minutes or something; I just watched 20), a quantum computer is designed for the problem it solves; they aren't general purpose like the processors in use today. As far as I understood from the video*, quantum computers are mostly just useful for doing calculations related to quantum physics.

    *If I'm wrong/misleading, please correct me.
    • Re: (Score:3, Informative)

      You can make general purpose quantum computers if you have a working set of "quantum gates" or similar -- much like you can make a general purpose classical computer if you have a working set of classical gates.
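
      (A toy illustration of that analogy, added for context and not part of the Bristol work: composing a small gate set, Hadamard plus CNOT, with numpy to turn |00> into an entangled Bell state, much as classical circuits are built by wiring up a complete set of logic gates.)

          import numpy as np

          H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard
          I = np.eye(2)
          CNOT = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0],
                           [0, 0, 0, 1],
                           [0, 0, 1, 0]])

          state = np.array([1.0, 0.0, 0.0, 0.0])    # two qubits in |00>
          state = np.kron(H, I) @ state             # Hadamard on the first qubit
          state = CNOT @ state                      # CNOT entangles the pair

          print(state)   # [0.707, 0, 0, 0.707] = (|00> + |11>)/sqrt(2), a Bell state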

    • Re: (Score:3, Informative)

      by kmac06 ( 608921 )
      A quantum computer able to do useful classical computing (e.g., factoring large numbers) would have to have a large number of qubits (512-1024, very far away by any metric). A quantum computer able to do simulations of quantum systems beyond what current supercomputers can do would have to have maybe 10 qubits (maybe not too far away).
    • by Nemyst ( 1383049 )
      Currently the most usable way would indeed be to make a quantum processing unit that latches onto an otherwise classical computer, a bit like how a graphics card works. However, quantum computers are useful for way, way more than just quantum physics. Quantum crypto and solving NP-complete problems faster would be just two small examples of what we can do with it, but remember: quantum physics, and particularly quantum computing, is a young field. You should expect more and more possibilities as we move on, es
  • Well, it seems there is no article yet to be read, and I couldn't understand anything from the press release. What does a "one-photon quantum walk" mean, and how does it differ from any other kind of transformation that happens to a photon? Also, what is the difference between a "two-photon quantum walk" and normal interference?

    Or, in other words, what did they actually do?

    • Re: (Score:3, Informative)

      by c0lo ( 1497653 )

      What does a "one-photon quantum walk" mean

      Conceptually, no different from a "one-ball-in-the-maze random walk" - it can have a single state.

      ...and how does it differ from any other kind of transformation that happens to a photon?

      Again, no difference: the photons will random walk the maze independently (entanglement is not a requirement).

      Also, what is the difference between a "two-photon quantum walk" and normal interference?

      a. Conceptual: while walking the maze (and solving your problem), the photons will be particles, thus interference is not an issue to consider.
      b. The maze you make the photons walk through (instead of just two slits) should be programmable (model the system for which you want to compute the answer).
      c. on
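
      (For the single-walker case, a coined discrete-time quantum walk is easy to simulate; the toy sketch below is added for context and is not the article's experiment, which works with photons in waveguide structures rather than a coin. It shows the interference-driven spreading that separates a quantum walk from a classical random walk.)

          import numpy as np

          STEPS = 50
          N = 2 * STEPS + 1                                 # positions -STEPS..+STEPS
          COIN = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard "coin"

          # amp[pos, c]: amplitude of being at `pos` with coin state c (0=right, 1=left)
          amp = np.zeros((N, 2), dtype=complex)
          amp[STEPS, 0] = 1.0                               # start at the origin

          for _ in range(STEPS):
              amp = amp @ COIN.T                            # flip the quantum coin at every site
              shifted = np.zeros_like(amp)
              shifted[1:, 0] = amp[:-1, 0]                  # coin 0 moves right
              shifted[:-1, 1] = amp[1:, 1]                  # coin 1 moves left
              amp = shifted

          prob = (np.abs(amp) ** 2).sum(axis=1)
          print(round(prob.sum(), 6))       # 1.0: probability is conserved
          print(prob.argmax() - STEPS)      # peaks far from the origin, unlike a classical walk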

      • So, they did a programmable maze with some interference at the last stage so they could apply something like the last step of Shor's algorithm? I couldn't read that in the article.

    • Re: (Score:3, Informative)

      by Bigjeff5 ( 1143585 )

      See the wikipedia link in the summary. It 'splains it.

      • By the explanation there, any kind of quantum transformation would cause a quantum walk. Is that right? If so, what is new?

  • by mathimus1863 ( 1120437 ) on Thursday September 16, 2010 @08:30PM (#33606586)
    Because people always get it wrong every time a QC article hits slashdot, here's a link to my previous, highly-modded (upwards) post on QC:

    http://slashdot.org/comments.pl?sid=1285849&cid=28520061 [slashdot.org]

    Quantum computers can do some cool things, but mostly solve problems no one cares much about (except a few of us mathematicians)
    • Very nice summary, thanks.

      Kind of reminds me of what photonic computing can do. Because photonic interference takes place more or less for free, if you can arrange your problem in a clever way, you can get the photons to do your work for you.

    • Re: (Score:3, Insightful)

      by urusan ( 1755332 )

      Quantum computers can do some cool things, but mostly solve problems no one cares much about (except a few of us mathematicians)

      That is until some practical application is found that uses the solution. From what I've heard, Boolean algebra was thought to have no utility for a very long time after it was discovered, but nowadays...

  • by mathimus1863 ( 1120437 ) on Thursday September 16, 2010 @08:38PM (#33606612)
    Summary is wrong. Quantum algorithms cannot provide "exponential" speedup of any problem. If they could, we would be able to [probably] solve NP-complete problems with quantum computers, and that hasn't been proven yet. The best they can do is "super-polynomial" speedup of classical algorithms.

    Google "quantum algorithm zoo" to see all the known algorithms and their speedups (and how unexciting most of them are).
    • Re: (Score:3, Interesting)

      by Twinbee ( 767046 )

      How about raytracing or particle physics?

      • by m50d ( 797211 )
        Um, what? What about them? Are you claiming you have an algorithm that gives an exponential speedup for those problems? If so, publish it, and collect millions of dollars.

        If not, are you saying we should assume that a quantum computer would be better for those problems? Why?

        • by Twinbee ( 767046 )

          I'm just asking if quantum processors could benefit those tasks. I'm not assuming anything.

          • Not raytracing. Maybe particle physics could get some nice speed-ups from a quantum computer; that depends on the specific problem.

    • by Catullus ( 30857 ) on Friday September 17, 2010 @03:36AM (#33608558) Journal

      This comment isn't accurate. There are problems for which quantum computers are indeed exponentially faster than our best known algorithms running on a standard computer. The most important of these is probably simply quantum simulation - i.e. simulating quantum mechanical systems. This has umpteen applications to physics, chemistry and molecular biology (e.g. drug design).

      • I knew you'd have to correct at least one person in this story. Seriously, get out now while you still can.... ;)

  • by Anonymous Coward on Thursday September 16, 2010 @08:56PM (#33606668)

    Seems like it's about time to start putting 5 years of real-world quantum programming experience on the old resume.

    • I could have any number of years of real world quantum computing experience. Unfortunately, there is no way to determine how many before you hire me. I can, however, provide you with some probabilities.
  • Peter Gibbons: What would you do if you had a million dollars?

    Lawrence: I'll tell you what I'd do, man: two photons at the same time, man.

  • Randomness (Score:1, Redundant)

    by Lotana ( 842533 )

    I just can't comprehend quantum computing and quantum mechanics in general.

    What absolutely derails me is the talk about randomness, probability and statistics inherent with this field. The word chance gets mentioned a lot and that just stops me in my tracks.

    In programming there is simply no room for chance. An algorithm must always return the same result given the same parameters. 1 + 1 must always return an exact, perfect value of 2 no matter how many times it is executed.

    But from what I read, in quantum world yo

  • This isn't new (Score:2, Informative)

    Come on, Scott Bakula was taking random quantum walks back in the late 80s, get with the times people!
  • Grover's search algorithm [wikipedia.org] gives only a quadratic speedup.
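
    (To make "only quadratic" concrete, here is a small illustrative numpy simulation, not taken from the comment or the article: Grover's algorithm on a 256-item search space needs roughly (pi/4)*sqrt(N) oracle-plus-diffusion iterations, versus about N/2 classical queries on average.)

        import numpy as np

        n = 8                     # qubits
        N = 2 ** n                # 256 items in the search space
        target = 123              # index of the single marked item (arbitrary choice)

        state = np.full(N, 1 / np.sqrt(N))          # uniform superposition

        iterations = int(round(np.pi / 4 * np.sqrt(N)))
        for _ in range(iterations):
            state[target] *= -1                     # oracle: flip the marked amplitude
            state = 2 * state.mean() - state        # diffusion: invert about the mean

        print(iterations)                           # 13 quantum iterations...
        print(round(state[target] ** 2, 2))         # ...find the target with probability ~0.99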

  • by Anonymous Coward

    Two photons walk into a bar, order a round of IPA and ask, "How much do we owe?"
    The barkeep says, "For you? No charge."

  • Finally we'll be able to run bogosort efficiently [wikipedia.org]
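
    (For reference, a minimal sketch of the classical algorithm the joke is about; the "efficient" quantum version is left to the multiverse.)

        import random

        def is_sorted(xs):
            return all(a <= b for a, b in zip(xs, xs[1:]))

        def bogosort(xs):
            """Shuffle until sorted; expected shuffles grow factorially with len(xs)."""
            xs = list(xs)
            shuffles = 0
            while not is_sorted(xs):
                random.shuffle(xs)
                shuffles += 1
            return xs, shuffles

        print(bogosort([5, 3, 1, 4, 2]))   # the sorted list plus the shuffle count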

  • Some background (Score:5, Informative)

    by Interoperable ( 1651953 ) on Friday September 17, 2010 @04:44AM (#33608888)

    Let me provide some context. This research group specializes in manufacturing arbitrary waveguide structures on chips, then coupling particular quantum states of light into them. The idea is to turn a large optical table's worth of mirrors into a tiny chip. What they have done here is allow a two-photon input state to interfere with itself in the waveguide structure.

    While interesting technically, it isn't exactly a huge leap forward because the interaction is linear. What's needed for deterministic quantum computation with light is a very non-linear process. The waveguide structure can replace a large number of mirrors and compact the optics into a tiny space but, at the end of the day, mirrors aren't all that interesting for quantum computation. It is, however, worthwhile because of the impressive miniaturization and the technical challenge of working with quantum light in such tiny structures. A strong non-linear component will be needed for true optical quantum computation, but chips like these show a lot of promise for handling state preparation and measurement.

  • Two photons walk into a bar...

    Wait, didn't Brian Greene already tell this joke?

  • So far, claims from quantum computing researchers hungry for funding resemble those of some other areas that have consistently not delivered for the last 50 years or so. All these isolated demonstrations mean nothing for quantum computing because, unlike normal computers, you cannot build them up from parts you understand. A quantum computer is always only one unit; there are no modules or sub-components. Compare that with a traditional computer and it becomes obvious that the only proof of scalabili

    • (...) it is quite possible that quantum computers of meaningful power are fundamentally infeasible in this universe.

      That's true, but in order for quantum computers to be fundamentally impossible, there must exist something in the laws of nature that we haven't observed yet. Quantum computers being possible is actually the most boring thing that can happen from the point of view of theoretical physics (i.e., if there are no surprises and what we currently know is pretty much the way things are).

      As for a quantum computer not being made of sub-components, I don't quite understand what you mean. From what we currently unders

      • by gweihir ( 88907 )

        That's true, but in order for quantum computers to be fundamentally impossible, there must exist something in the laws of nature that we haven't observed yet.

        Actually, no. There is a finite amount of matter and energy in the universe. It is, for example, possible that the scalability (given the need for error correction, e.g.) is so bad that using all that is available is still not enough for a meaningful size. (By meaningful, I mean here performing far better than a conventional computer built with about

        • It is, for example, possible that the scalability (given the need for error correction, e.g.) is so bad that using all that is available is still not enough for a meaningful size. (By meaningful, I mean here performing far better than a conventional computer built with about the same effort.)

          OK, but if that's the case, there has to be a reason why scalability is that bad. Shor's algorithm can factor n-bit numbers with ~4n qubits. Laflamme's error correction works by encoding each logical qubit in 5 physical qubits. So, to factor a 2048-bit number in polynomial time, we "only" need about 40,000 qubits. We currently know nothing in nature that prevents us from achieving that in principle.

          The problem is entanglement. (...) With a quantum computer, you always have to look at the whole, due to entanglement, hence all known to w
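
          (A quick back-of-the-envelope check of the 40,000 figure above, using only the constants quoted in the comment: ~4n logical qubits for Shor on an n-bit number, and a 5-qubit code per logical qubit.)

              n_bits = 2048                          # RSA modulus size mentioned above
              logical_qubits = 4 * n_bits            # ~4n logical qubits for Shor (figure quoted in the comment)
              physical_qubits = 5 * logical_qubits   # 5-qubit error-correcting code per logical qubit

              print(logical_qubits, physical_qubits) # 8192 logical, 40960 physical (about 40,000)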
