
Separating Hope From Hype In Quantum Computing

Posted by timothy
from the semi-un-non-deterministic dept.
pgptag writes "This talk by Dr. Suzanne Gildert (video) explains why quantum computers are useful, and also dispels some of the myths about what they can and cannot do. It addresses some of the practical ways in which we can build quantum computers and gives realistic timescales for how far away commercially useful systems might be."
This discussion has been archived. No new comments can be posted.


  • by NoSleepDemon (1521253) on Tuesday September 07, 2010 @10:32AM (#33497822)
    Upon observation, this post has collapsed into the first post state.
    • Re: (Score:1, Informative)

      by Anonymous Coward

      The direct link appears:

      http://blip.tv/file/get/Telexlr8-vbSuzanneGildertOnQuantumComputingInTeleplaceSeptember4640.flv

  • Would this sort of thing ever become useful for personal use? Or is Quantum Computing strictly a commercial endeavor?

    • by tom17 (659054) on Tuesday September 07, 2010 @10:50AM (#33497954) Homepage
      Definitely commercial-only. The world only needs five quantum computers.
      • Re:question: (Score:4, Insightful)

        by mcgrew (92797) * on Tuesday September 07, 2010 @11:29AM (#33498324) Homepage Journal

        It appears that the moderators don't know any history. You're obviously making a joke based on the observation in the early 1950s that "the worldwide market for computers is about ten." It's funny now, but back then computers weren't very useful for anybody without huge number-crunching and database needs and a multi-million dollar budget. At the time, a computer took an entire building to house and a whole lot of personnel to operate. The most powerful computer in existence was less powerful than a singing Hallmark card.

        So the joke's on the mods, who actually believe it. Of course, right now the worldwide market is zero, since they haven't actually constructed one yet. If and when they accomplish the feat, it's possible that in the future all computers will be quantum computers. I doubt I'll live long enough to see it (I'm not young any more). [kuro5hin.org]

        That link will give the mods a little computer history if they're interested.

        • by Bigjeff5 (1143585)

          You've got to give these things time, my man. This is Slashdot. Just an hour after your post (and about an hour and 40 minutes after the original) it is up to +5, Funny, where it belongs.

          • by mcgrew (92797) *

            It was at +4, Insightful when I made the post; it's now +5, Funny (50% insightful and 50% funny according to the "score" link).

        • Re:question: (Score:5, Informative)

          by houghi (78078) on Tuesday September 07, 2010 @12:50PM (#33498992)

          The quote debunked http://en.wikipedia.org/wiki/Thomas_J._Watson#Famous_misquote [wikipedia.org]

          Some facts:
          1) The misquote is "I think there is a world market for maybe five computers,"
          2) The story had already been described as a myth in 1973
          3) Correct quote: "IBM had developed a paper plan for such a machine and took this paper plan across the country to some 20 concerns that we thought could use such a machine. [...] But, as a result of our trip, on which we expected to get orders for five machines, we came home with orders for 18."

          • Re: (Score:3, Funny)

            by Ihmhi (1206036)

            Geez, thanks for ruining a good meme with facts. Next thing you know we'll find out all those cats have been misquoted time and time again.

            • by jesset77 (759149)

              Geez, thanks for ruining a good meme with facts. Next thing you know we'll find out all those cats have been misquoted time and time again.

              Or that Lemmings don't commit mass suicide [snopes.com], or that virtually all cartographers knew the world was round throughout the dark ages, or that glass isn't a liquid, yeah I've been noticing lots of this lately xD

              I'm beginning to waft back to the old maxim: "It's not true unless it's boring"

          • by mcgrew (92797) *

            Thank you for the correction. The wiki article is interesting; this line "All these early quotes are questioned by Eric Weiss, an Editor of the Annals of the History of Computing in ACS letters in 1985" caught my eye - Harry Houdini's [wikipedia.org] real name was Eric Weiss, according to biographies I've read, although wikipedia says "Harry Houdini was born as Erik Ivan Weisz (he would later spell his birth name as Ehrich Weiss)". It's magic!

            As you say, Watson didn't say it, but from the wiki article:

            I went to see Profess

        • by tibit (1762298)

          I have an offtopic question. ENIAC is your contemporary, and it was used to do a bunch of numerical calculations. How did it compare to what Feynman and the technologically apt teenagers under his direction did for the Manhattan Project? Does anyone know a rough order of magnitude of multiplies-and-adds that both projects had to go through, and the time it took? IIRC, Feynman's boys had figured out pretty much every basic contemporary CPU/GPU design trick (pipelining, interleaving/scheduling, speculative execution

        • Of course, right now the worldwide market is zero, since they haven't actually constructed one yet.

          Except that in the video she clearly talks about several quantum computers that have been built and have actually solved problems.

        • by arth1 (260657)

          That link starts out with "ENIAC, the first electronic programmable computer", and goes downhill from there. Sure, let's forget Colossus Mark I and II.

          The rest of the article is a jump between family anecdotes and quite limited personal experiences that I do not think "will give the mods a little computer history". Where are TI, Fairchild and Motorola? Or HP? Were the DEC PDPs not worth mentioning? Didn't anything happen between 1974 and 1982 that deserved more than a single combined sentence? To me,

          • by mcgrew (92797) *

            It's a little computer history, not an exhaustive history. Note the title is "Growing up with computers"; it's a personal chronicle to give a little insight to younger folks.

        • Re: (Score:3, Interesting)

          Actually, based on TFA, I'd say we're more likely to see a multi-core processor with some quantum and some classic cores. Kind of like the old floating point co-processors, or going back still further, the TI-99/4A architecture which was made up of a CPU with dedicated video, audio, and peripheral co-processors.

          • by jesset77 (759149)

            Actually, based on TFA, I'd say we're more likely to see a multi-core processor with some quantum and some classic cores. Kind of like the old floating point co-processors, or going back still further, the TI-99/4A architecture which was made up of a CPU with dedicated video, audio, and peripheral co-processors.

            Yis, because Parsec with Speech Synthesis wasn't enough, now it's going Quantum. 8D

    • Perhaps (Score:3, Funny)

      by sycodon (149926)

      Maybe in the future, a Quantum Computer running Windows x.x will be able to harness its power to show the contents of a folder in less than the 30 seconds it takes now.

      • Re:Perhaps (Score:4, Funny)

        by Whalou (721698) on Tuesday September 07, 2010 @10:58AM (#33498040)
        On the contrary, observing the content of a folder would change its state.


        I'm not a quantum physics expert and I don't play one on television

        And if I did, the show would have been canceled.
        • On the contrary, observing the content of a folder would change its state.

          I know windows 2.0 was a while back, but the good news is that they're finally going to get this bug fixed for Windows 8.

      • Re:Perhaps (Score:5, Funny)

        by daveime (1253762) on Tuesday September 07, 2010 @11:07AM (#33498098)

        If you'd categorized your porn collection properly, it wouldn't need to all be in one folder :-(

      • by ByOhTek (1181381)

        It takes less than 30 seconds now, for those of us running on hardware more advanced than an 80286.

        • Re: (Score:3, Funny)

          by sycodon (149926)

          Here we are having fun and you have to go throw your superior hardware in our face.

          I bet you're real fun at parties.

        • by toastar (573882)

          It takes less than 30 seconds now, for those of us running on hardware more advanced than an 80286

          It takes more than 30 seconds for a 486 to index a TB of porn... And that's even with turbo

        • It feels like 30 seconds when you have a badass computer at home with 4 cores, 8gb ram, and an SSD.... and then you go to work. 5 seconds of explorer refreshing seems like 5 minutes.
  • "I hope this thing is really fast running my Beowulf cluster" or "oh man, these things will run a Beowulf cluster 10000x faster than today's machines! "

    • Re: (Score:1, Insightful)

      by Anonymous Coward
      I think that either English is not your first language, or you don't know what a Beowulf cluster is.
    • by Bigjeff5 (1143585)

      You obviously don't know what a Beowulf cluster is.

      The joke is "imagine a Beowulf cluster of those!" for a reason.

      The quantum computers wouldn't run your Beowulf cluster, they would be your Beowulf cluster.

      And the first ones will probably be slow as shit anyway (but catch up much faster than current tech).

      • The quantum computers wouldn't run your Beowulf cluster, they would be your Beowulf cluster.

        Unless you're running a Beowulf cluster emulator on them, of course.

  • by bricko (1052210) on Tuesday September 07, 2010 @10:35AM (#33497844)
    Oops. Thought this thread title was about Obama... sorry.
    • Nah, it's about Fox News. By balancing out mostly truth with mostly fiction, their audience doesn't know the exact state of the union. Because they don't know, they don't languish into depression and become unproductive. However, given their politics, if they did become unproductive there, we would be better off and more productive overall. It doesn't bother me since on a long enough timeline, everything collapses into one state.

  • Video? (Score:1, Funny)

    by Anonymous Coward

    What video? There's no video on that page, only a huge blank gap sponsored by Adobe.

    • Re:Video? (Score:4, Funny)

      by bhartman34 (886109) on Tuesday September 07, 2010 @10:38AM (#33497858)
      Is that on the iPad or iPod Touch? :)
  • by Zontar_Thing_From_Ve (949321) on Tuesday September 07, 2010 @10:38AM (#33497856)
    We can't even get people to read the articles referenced in submissions. It's wildly optimistic to expect us to watch a video that is over 2 hours long.

    This is begging for an "executive summary" from anyone who has time to watch it, if there is such a person.
    • by JoshuaZ (1134087) on Tuesday September 07, 2010 @10:48AM (#33497940) Homepage
      I don't have time to watch this right now, but if I have to guess, the primary points are going to be about the common misconceptions about quantum computers. The most common such belief seems to be that a quantum computer can solve NP-complete problems in polynomial time. This is false, although some problems that are believed to be in NP but not in P are efficiently solvable with a quantum computer. The most prominent example is integer factoring, since the difficulty of factoring large integers is something many cryptosystems (such as RSA) depend on. It probably also addresses the idea that consciousness has something to do with quantum effects in the human brain; structures there are generally too warm and too large to sustain meaningful quantum entanglement.
      • Re: (Score:2, Funny)

        by BergZ (1680594)
        Warm, wet, and squishy doesn't seem to be a limiting factor on quantum mechanical behavior anymore: Untangling the Quantum Entanglement Behind Photosynthesis [sciencedaily.com].
      • The most common such belief seems to be the belief that a quantum computer can solve NP-complete problems in polynomial time.

        Allow me to expand a bit on that.

        There's a complexity class known as BQP, which is defined to be what quantum computers can do in polynomial time (hence the Q and P; the B is for Bounded error probability, i.e. algorithms succeed with probability at least 2/3; if you want better, repeat and take a majority vote).

        It is known that BQP contains P and BPP (randomized poly-time Turing machines), and is contained within PSPACE (which contains NP).

        It is conjectured that P != NP and that BQP contains some but not al
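The "repeat and take a majority vote" amplification in the BQP definition above can be checked numerically. A quick sketch (the 2/3 single-run success probability comes from the definition; the run counts here are arbitrary):

```python
from math import comb

def majority_success(p_single: float, runs: int) -> float:
    """Probability that a strict majority of `runs` independent trials
    succeed, when each trial succeeds with probability p_single."""
    need = runs // 2 + 1
    return sum(comb(runs, k) * p_single**k * (1 - p_single)**(runs - k)
               for k in range(need, runs + 1))

# A BQP algorithm succeeds with probability at least 2/3 on a single run;
# majority voting over repeated runs drives the error down exponentially.
for runs in (1, 11, 101):
    print(runs, majority_success(2 / 3, runs))
```

With an odd number of runs there are no ties, and by a Chernoff bound the failure probability shrinks exponentially in the number of runs.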

    • Quantum computers are useful for the following class of problem:

      1. The only way to solve it is to guess answers repeatedly and check them,
      2. There are n possible answers to check,
      3. Every possible answer takes the same amount of time to check, and
      4. There are no clues about which answers might be better: generating possibilities randomly is just as good as checking them in some special order.

      If your problem doesn't look
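For that guess-and-check class of problem, the payoff is Grover's quadratic speedup: a classical unstructured search checks about n/2 candidates on average, while Grover's algorithm needs roughly (π/4)·√n oracle queries. A back-of-the-envelope comparison (the sample sizes are arbitrary):

```python
from math import pi, sqrt

def classical_expected_checks(n: int) -> float:
    # Unstructured search: on average the answer turns up halfway through.
    return n / 2

def grover_queries(n: int) -> float:
    # Grover's algorithm needs roughly (pi/4) * sqrt(n) oracle queries.
    return (pi / 4) * sqrt(n)

for n in (10**6, 10**12):
    print(f"n={n}: classical ~{classical_expected_checks(n):.0f} checks, "
          f"Grover ~{grover_queries(n):.0f} queries")
```

Note the speedup is quadratic, not exponential: √n still grows without bound, which is why Grover alone doesn't break well-sized symmetric keys.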

      • Re: (Score:3, Informative)

        by Tacvek (948259)

        That is obviously not the only thing it can do. In polynomial time it can solve P problems (much like a classical computer, though potentially in roughly the square root of the classical time, if the problem meets the above requirements). You can use quantum computing to find (with any chosen probability less than one) the solution to a BPP problem in polynomial time, which is again just like a classical computer. What is new is the ability to solve BQP problems (with any chosen probability less than one) in polynomial time.

        That last one is the

    • by gandhi_2 (1108023) on Tuesday September 07, 2010 @10:54AM (#33498008) Homepage

      I just want to know what exactly is added to this presentation by using an avatar on a virtual stage.

      People want to bash powerpoint but someone takes up half the video area with superfluous (and bad) VR and no one minds?

      • Re: (Score:2, Informative)

        by pgptag (782470)
        @gandhi re added value of avatar on virtual stage: this is an online talk with a participative audience in realtime telepresence. The second half of the video shows a very lively Q/A session and discussion, with a lot of people asking a lot of questions.
      • by ceoyoyo (59147)

        It makes it a video. And videos are cool.

    • by mcgrew (92797) * on Tuesday September 07, 2010 @10:57AM (#33498028) Homepage Journal

      Indeed; is there a printed transcript anywhere? I can read a lot faster than I can listen, with a lot better comprehension.

    • We can't even get people to read the articles referenced in submissions. That's wildly optimistic to expect us to watch a video that is over 2 hours long.

      As long as nobody watches it, we can't really say for certain what's in it.

    • There are many myths about quantum computers. The most prevalent myths are that they will break all cryptographic protocols, be exponentially faster, do all calculations simultaneously, and solve NP-Complete problems in polynomial time. These are all untrue to various degrees.

      A quantum computer is a computer that uses at least one quantum effect to solve problems. Currently quantum computers are leveraging either superposition or entanglement. A difficult hurdle to scaling quantum computers is decohe

    • Re: (Score:3, Insightful)

      by CarpetShark (865376)

      Yeah. A cute, fresh-faced, geeky female doctor with glasses, summarising quantum computers in about an hour. Nah, no one here wants to watch that ;)

    • by PCM2 (4486)

      Not only that, but whatever crappy player they're using doesn't seem to want to let you seek. No matter where you move the marker, the whole presentation just starts over from the beginning -- complete with the audience jabbering right over the speaker.

      • Re: (Score:2, Informative)

        by pgptag (782470)

        Not only that, but whatever crappy player they're using doesn't seem to want to let you seek. No matter where you move the marker, the whole presentation just starts over from the beginning -- complete with the audience jabbering right over the speaker.

        Go to the source http://telexlr8.blip.tv/file/4083093/ [telexlr8.blip.tv] open the Files and Links box in the right column and download the original .mp4 video file.

  • by Drakkenmensch (1255800) on Tuesday September 07, 2010 @10:45AM (#33497910)
    The quantum computer is both a realistic ideal and vaporware hype, until a computer journalist examines the claims.
  • I'd position her qubits any day of the week!
  • Irony? (Score:2, Informative)

    by gotfork (1395155)
    Someone from D-wave is giving a talk called "separating hope from hype": http://arstechnica.com/hardware/news/2007/02/quantum.ars [arstechnica.com] http://www.technologyreview.com/computing/20587/ [technologyreview.com] http://en.wikipedia.org/wiki/D-Wave_Systems [wikipedia.org]
  • The real question is whether there's some significant use case not already covered by current methods, like RSA and AES for encryption. Sure, quantum encryption has some nice theoretical properties, but most things are not 100% secure. You can still bribe people, extort people, plant spies, record passwords and so on. I doubt that for almost any system pure crypto is the weakest link in the chain anymore. Maybe, just maybe, there's a quantum code-cracking computer deep in the halls of the NSA, but it won't be any o

  • by Danathar (267989) on Tuesday September 07, 2010 @11:35AM (#33498390) Journal

    I was going to listen, but the dude yakking in the background was totally oblivious (well... not totally oblivious, as he asked himself why he could hear himself talking) to the fact that his mic was broadcasting right over the speaker. Dumb.

  • W/O RTFA (Score:5, Informative)

    by mathimus1863 (1120437) on Tuesday September 07, 2010 @11:36AM (#33498398)
    I took a class on quantum computing and studied many specific QC algorithms, so I know a little bit about them. If you don't want to RTFA, then read this: quantum computers are not super-computers. On a bit-for-bit (or qubit-for-qubit) scale, they're not necessarily faster than regular computers; they just process information differently.

    Since information is stored in a quantum "superposition" of states, as opposed to the deterministic state of a regular computer, qubits exhibit quantum interference with other qubits. Typically, your qubit starts as 50% '0' and 50% '1', so when you measure it you get a 50% chance of either outcome (and it then assumes that state). But if you don't measure, and instead push it through quantum circuits that let it interact with other qubits, the quantum phases interfere and cancel out. If you are damned smart (as I realized you have to be, to design QC algorithms), you can figure out creative ways to encode your problem into qubits and use the interference to cancel out the information you don't want, leaving the information you do want. For instance, some calculations will start with the 50/50 qubit above and end with 99% '0' and 1% '1', or vice versa, depending on the answer. Then you've got a 99% chance of measuring the right answer; run the calculation twice and you have a 99.99% chance. However, the details of the circuits that perform quantum algorithms are extremely non-intuitive to most people, even those who study them. I found it takes an amazing degree of creativity to figure out how to combine qubits so that the interference works constructively.

    But what does this get us? It turns out that quantum computers can run anything a classical computer can, and such algorithms can be written identically if you really want to, but doing so gets the same results as the classical computer (i.e. the same order of growth). However, the smart people who have been publishing papers on this for the past 20 years keep finding new ways to combine qubits that exploit the nature of certain problems (usually deep, pure-math concepts) to achieve better orders of growth than are possible on a classical computer. For instance, factoring large numbers is difficult on classical computers, which is why RSA/PGP/GPG/PKI/SSL is secure. Its order of growth is e^(n^(1/3)); not quite exponential, but still prohibitive. Shor figured out how to get it down to n^2 on a quantum computer (the same order of growth as decrypting with the private key on a classical computer!). Strangely, guessing someone's encryption key, normally O(n) on classical computers (where n is the number of possible encryption keys), is only O(sqrt(n)) on QCs. Weird (but sqrt(n) is still usually too big).

    There's a vast number of other problems for which efficient quantum algorithms have been found. Unfortunately, a lot of them aren't particularly useful in real life (except to the curious pure mathematician). A lot are better, but not phenomenal: verifying that two sparse matrices were multiplied correctly has order of growth n^(7/3) on a classical computer, n^(5/3) on a quantum computer. You can find a pretty extensive list by googling "quantum algorithm zoo." Unfortunately (for humanity), there is no evidence yet that quantum computers will solve NP-complete problems efficiently. Most likely, they won't, so don't get your hopes up about solving the traveling salesman problem any time soon. But there is still a lot of cool stuff we can do with them. In fact, the theory is so far ahead of the technology that we're anxiously waiting for breakthroughs like this, so we can start plugging problems through known algorithms.
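The interference described above can be seen in the smallest possible case with a plain numpy sketch of the Hadamard gate (numpy is assumed; this is an illustration, not anything from the talk): one application turns |0> into the 50/50 superposition, and a second application makes the two paths to '1' cancel, so measurement yields '0' with certainty.

```python
import numpy as np

# A qubit as a 2-vector of amplitudes; measurement probabilities are |amp|^2.
ket0 = np.array([1.0, 0.0])          # the definite '0' state

# Hadamard gate: maps |0> to an equal superposition of |0> and |1>.
H = np.array([[1.0,  1.0],
              [1.0, -1.0]]) / np.sqrt(2.0)

superposed = H @ ket0
print(np.abs(superposed) ** 2)       # ~[0.5 0.5]: the 50/50 qubit

# A second Hadamard makes the two paths to |1> interfere destructively,
# so the '1' outcome vanishes entirely.
back = H @ superposed
print(np.abs(back) ** 2)             # ~[1. 0.]: '0' with certainty
```

A classical coin flip composed with another coin flip stays 50/50; it is the signed amplitudes cancelling that makes the quantum case different.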
    • by kurokame (1764228)

      They're hypothetically faster in the case of quantum-quantum operations, since they're analog with hypothetically infinite data density (where a binary bit stores a 0 or a 1, a qubit stores any value between 0 and 1). But without improved ways to interface with this, it's of fairly limited use. Nature simulates itself with perfect fidelity, which is not really of help to us unless we can find a reliable way to reduce the answer to something consistent and human-understandable.

      It's true but potentially mislead

    • Interesting post. Please consider using paragraphs in the future. They help readability significantly.
    • by astar (203020)

      Yeah, I did not RTFA either. But hey, you might know something. So here is where I am coming from: where do you use a kinematic causality model vs. a dynamic causality model? There was an odd Slashdot article recently on someone who built a quantum computer that was reliable enough to get some statistics on. So I guess he did a thousand runs. He got the right answer 60% of the time and something apparently random 40% of the time. If I think kinematics, I think machine. And I wonder, was the quantum co

  • "Why do I hear my voice?" during a video conference.

    Makes me feel hella smart. :D

    • by pgptag (782470)

      "Why do I hear my voice?" during a video conference.

      Makes me feel hella smart. :D

      I was surprised to hear my voice during a video conference, because I was not speaking! Somebody in the audience had started playing a video clip he had recorded a few minutes before.

  • Oh not again (Score:3, Informative)

    by iris-n (1276146) on Tuesday September 07, 2010 @04:11PM (#33501686)

    These crooks from D-Wave just won't give up. A 128-qubit quantum computer!? Pics or it didn't happen.

    For more info: http://en.wikipedia.org/wiki/D-Wave_Systems [wikipedia.org]
