Hardware Science

First Von Neumann Architecture Quantum Computer

Posted by timothy
from the my-kids-better-be-smarter-than-I-am dept.
holy_calamity writes "The first computers with a von Neumann architecture, in which a processor has access to RAM, appeared in the 1940s. Now the first quantum computing system with a von Neumann design has been built, at the University of California, Santa Barbara. Their quantum processor, made up of two superconducting quantum bits, can use a 2-bit "quantum RAM" into which it can save entangled bit values."
  • Will it run Quark?
    • by Anonymous Coward

      I don't know, but someone will try to run Doom on it.

      • I can't help but wonder what a 16-petabyte RAM configuration would look like; sweet.
        • by mikael (484)

          You would have every possible Quake, Doom level as well as all the cheat codes.

        • by hairyfeet (841228)

          If it is like the PCs I work on, most of it would be twiddling its thumbs waiting for something to do, or being a "just in case" cache of pretty much everything ever run on the machine. I generally like to slam the crap out of my PC, but since going to 8GB I've found most of it is just cache, because I simply don't have enough data being worked on at any one time to need it. Right now, checking ResMon, I'm looking at 6300MB being used for cache and only 1800MB in actual use.

          As for TFA while i'm sure it'll be useful

          • by X0563511 (793323)

            This stuff generally wouldn't find a use at home. Least not for a long time. It would be in the hands of researchers... you know, actually doing stuff.

            • I concur, but the goofy thing about making a computer capable of handling 16 Petabytes of RAM, is that someone will start making an App for it, that will require those 16 Peta's, and possibly a little more, thank you.

              Oh ya, I spit up my coffee when I read your sig, now I have a mess to clean up...
          • by luxifr (1194789)

            If it is like the PCs I work on, most of it would be twiddling its thumbs waiting for something to do, or being a "just in case" cache of pretty much everything ever run on the machine. I generally like to slam the crap out of my PC, but since going to 8GB I've found most of it is just cache, because I simply don't have enough data being worked on at any one time to need it. Right now, checking ResMon, I'm looking at 6300MB being used for cache and only 1800MB in actual use.

            Mine says: In Use: 6981MB, Cached: 8149MB, Free: 1220MB... and I don't do CAD, 3D modelling, video editing, or anything like that.

            As for TFA, while I'm sure it'll be useful for someone like the military to simulate nukes going off, for most folks it would be a big old "meh," as they simply don't have enough work for even their duals and triples, much less something thousands of times faster. There just hasn't been a "killer app" that would really slam a CPU in so long it isn't even funny; that is why Nvidia and AMD are pushing 3D and Eyefinity, because even games just aren't slamming the shit out of chips like the old days.

            Not true either. NVIDIA pushes 3D because they can and because 3D currently is the hottest new shit there is. AMD, on the other hand, pushes Eyefinity not only for enthusiast gamers but for business, too. If you've ever worked with SAP, Datev or other business software that needs huge amounts of screen real estate you might imagine how people could possibly appreciate having 3 or

        • That sounds like a sufficient amount of Doom if you ask me.

    • by geekoid (135745)

      I suppose, but I don't think a garbage scow needs that much power~

      • by ogdenk (712300)

        Is this coming from an official "Garbage Scow Captain"?

        Speaking of which I can't believe the modern "Atari" trashed my favorite childhood game recently with that horribly s***ty remake.

  • by Anonymous Coward on Thursday September 01, 2011 @05:49PM (#37280382)

    To be considered a Von Neumann architecture, the program code needs to be stored in the same ram as the working data. That's the whole point of it. Otherwise, it's a Harvard architecture.

    Given that the total ram capacity seems to be 2 bits (though, in all fairness, the bits are qu), it seems implausible that a useful program could fit in it.

    Though I have not actually read TFA, I'd say be skeptical about this. What they probably meant is "It has RAM." Unfortunately, they used a completely wrong term for it.
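To make the stored-program distinction concrete, here is a minimal classical sketch of the von Neumann idea (the opcodes and memory layout are invented purely for illustration): one flat memory holds both the instructions and the data they operate on, which is exactly what a Harvard machine forbids.

```python
# Toy von Neumann machine: a single flat memory holds code AND data.
# Invented opcodes for this sketch: 1=LOAD addr, 2=ADD addr, 3=STORE addr, 0=HALT.

def run(memory):
    """Execute the program starting at address 0 and return final memory."""
    pc, acc = 0, 0
    while True:
        op = memory[pc]
        if op == 0:                     # HALT
            return memory
        arg = memory[pc + 1]
        if op == 1:                     # LOAD:  acc = memory[arg]
            acc = memory[arg]
        elif op == 2:                   # ADD:   acc += memory[arg]
            acc += memory[arg]
        elif op == 3:                   # STORE: memory[arg] = acc
            memory[arg] = acc
        pc += 2

# Code lives at addresses 0-6, data at addresses 7-9 -- the same memory.
mem = [1, 7,    # LOAD  cell 7
       2, 8,    # ADD   cell 8
       3, 9,    # STORE cell 9 (result)
       0,       # HALT
       7, 35, 0]
print(run(mem)[9])  # 42
```

In a Harvard machine the fetch from cell 7 would have to hit a physically separate data memory, and the program could never be treated as data (or vice versa).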

    • by Baloroth (2370816)
      The way they designed it, the RAM can store instructions in addition to data (they specifically state that in TFA), meaning it will be able, when made a little larger, to store programs in RAM. Also, once you can store 2 qubits, it's just a matter of scaling it up, same as regular transistor computers (keep in mind the first RAM was only a few bits too, and the first chips had just a few transistors each). So I think they really did use the correct term, and this is a (moderately) significant step forward.
      • I haven't read the whole posting, but I highly disagree with the use of the term moderately with regards to significant in this context. First I would need to understand your usage of the term significant. Are you the type of person that considers a Hummer to be a monster truck or an economy vehicle for city driving? That would give me a better idea of how to interpret your use of the word significant. Then in relation, the use of the optional term moderately can be applied, but again... I'd have to know if
    • by mestar (121800)

      "Given that the total ram capacity seems to be 2 bits (though, in all fairness, the bits are qu), it seems implausible that a useful program could fit in it."

      How do you know? It's not like you went through all possible programs and checked which are useful and which are not...

      • by X0563511 (793323)

        00 - useful? No.
        01 - useful? No.
        10 - useful? No.
        11 - useful? No.

        Stick another possible value in there and it's still not useful.

  • Would half a nibble be enough for any serious operation? I know these are quantum bits so it is equivalent to the superposition of every possible value that those two bits could possibly represent which may be a lot but still it seems pretty useless at this point. I think it is too early to imagine what we could do with such a computer but the possibilities are there, maybe not just yet.
    • Re:2 bits? (Score:4, Funny)

      by somersault (912633) on Thursday September 01, 2011 @06:01PM (#37280514) Homepage Journal

      May be a lot? Try it yourself.. 2 times 2.
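The parent's arithmetic in full: n qubits give a state vector of 2^n complex amplitudes, so two qubits give 2 x 2 = 4. A plain-Python sketch (no quantum libraries; the function names are invented, but the gates are the standard Hadamard and CNOT acting on a 4-amplitude vector) of entangling two qubits, the kind of state TFA's processor stores:

```python
import math

# State of 2 qubits: 4 complex amplitudes, ordered |00>, |01>, |10>, |11>.
state = [1.0, 0.0, 0.0, 0.0]          # start in |00>

def hadamard_on_first(s):
    """Apply H to the first qubit: |0> -> (|0>+|1>)/sqrt(2), |1> -> (|0>-|1>)/sqrt(2)."""
    r = 1 / math.sqrt(2)
    return [r * (s[0] + s[2]), r * (s[1] + s[3]),
            r * (s[0] - s[2]), r * (s[1] - s[3])]

def cnot(s):
    """Flip the second qubit when the first is 1: swaps the |10> and |11> amplitudes."""
    return [s[0], s[1], s[3], s[2]]

bell = cnot(hadamard_on_first(state))
# bell is ~[0.707, 0, 0, 0.707]: an entangled Bell state, where measuring
# one qubit fixes the value of the other.
```

Four amplitudes already take more classical numbers to describe than the two bits they are built from, which is the whole scaling argument for quantum computing.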

    • Not "nibble," it's "nybble," 'cause it's "byte," not "bite."

      • by Anonymous Coward

        Just like it's byts, not bits? The spelling of nibble/nybble never really got nailed down like byte did. Both spellings are common. Read on. [wikipedia.org]

    • ... which may be a lot but still it seems pretty useless at this point.

      There was a time when people struggled to put 10 transistors on a chip. Valve-based computers of that era ran rings around the puny and useless transistorized systems. You have to start somewhere ...

      • Agree with the sentiment. But just to nitpick, tube-based computers were pretty cleanly on the way out by the time the IC was invented (1958-1960). Even a discrete point-contact transistor is smaller, more reliable and less power hungry than a tube.
  • A few things to note (Score:4, Informative)

    by JoshuaZ (1134087) on Thursday September 01, 2011 @05:51PM (#37280398) Homepage

    The term "Von Neumann architecture" has a variety of meanings. One common meaning is an architecture in which instruction and data retrieval share a common bus. The original meaning was a bit more specific, referring to a system that had a CPU, a memory (separate from the CPU) holding both data and instructions, and input/output capability. The real step forward was storing data and instructions together and treating them, in some sense, the same way, which allowed a lot more flexibility in programming. Treating data and instructions the same way is something that still creates issues; SQL injection attacks are essentially just this: adding data that is formatted to look like instructions. But the upshot is that this use of the term, taking "Von Neumann architecture" to mean just having a working memory, is a less common one.
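The SQL injection aside can be made concrete with Python's built-in sqlite3 module (the table and inputs are invented for illustration): when a query is built by string-pasting, attacker-supplied data gets parsed as instructions, while a parameterized query keeps it as pure data.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (name TEXT, secret TEXT)")
db.execute("INSERT INTO users VALUES ('alice', 'hunter2')")

evil = "nobody' OR '1'='1"  # attacker-supplied "data"

# Vulnerable: string-pasting splices the data into the instruction stream,
# so the WHERE clause becomes: name = 'nobody' OR '1'='1'
leaked = db.execute(
    "SELECT secret FROM users WHERE name = '%s'" % evil).fetchall()
print(leaked)   # [('hunter2',)] -- alice's secret leaks

# Safe: the ? placeholder keeps the input as pure data.
safe = db.execute(
    "SELECT secret FROM users WHERE name = ?", (evil,)).fetchall()
print(safe)     # [] -- no user is literally named that
```

Same characters, same database; the only difference is whether the input crosses the line from data to instructions.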

    Moving on from there, the system in question uses superconductors to control qubits. This is one of a variety of different systems being proposed. For example, the most recent quantum computing article on Slashdot, http://hardware.slashdot.org/story/11/08/31/1844252/Record-Low-Error-Rate-For-Qubit-Processor [slashdot.org], used ion traps. It is important to realize that different systems cannot be used together in any meaningful way. This means that improvements on any one type don't really carry over to the others. This is important if one is thinking in terms of when all this research will come together. A really good example of this is how early quantum computing used NMR systems http://en.wikipedia.org/wiki/Nuclear_magnetic_resonance_quantum_computer [wikipedia.org] which were then abandoned due to scaling and other issues. A lot of what was learned with NMR systems could not be applied to later quantum computers.

    • by notany (528696)

      Correct me if I'm wrong, but a Von Neumann architecture quantum computer is not very usable. There isn't any way to program it in a usable way, nor is there a way to get results out.

      • 99.9% (did I miss any 9?) of the programmers out there don't seem to have many problems doing it.

    • by Anonymous Coward

      The processor architecture has no relevance at the level of SQL. Even if you were running a Harvard architecture with all program memory in read-only mode, SQL injection would work just like it does right now: you introduce unexpected characters in a text string and the SQL interpreter ends up doing something that the programmer did not want.

      However, buffer overflows are an intrusion technique that actually does sometimes depend on Von Neumann architecture, but even they can be performed by modifying only

  • If you can't copy a quantum state ...

    • My understanding was that you can copy a quantum state, it's just a destructive copy. Also commonly known as "teleportation" to the sensationalist press.
      • by geekoid (135745)

        But you can copy its results once they have been rendered readable.

      • You can copy it, but it copies both a one and a zero. You don't know which it is until you look.

      • by blueg3 (192743)

        Quantum teleportation is a much more boring operation in quantum computing: it's the exchange operator, where the state of qubit A and the state of qubit B are swapped. This is a valid and boring procedure.

        Further, you can make it destructive, since you can set either of the two qubits to the zero state. To set qubit A to zero, you measure qubit A, which puts it in either the state 0 or the state 1 (with probability depending on its original state). You then apply the controlled-not operator so that it's bit-fli
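The measure-then-flip reset being described can be sketched as a classical simulation (not real hardware; the function name is invented, a single qubit is modelled as a pair of amplitudes, and a plain conditional flip stands in for the classically controlled NOT):

```python
import math
import random

def reset_to_zero(amplitudes, rng=random):
    """Measure a simulated qubit, then flip it if the outcome was 1.
    Either way it ends up in |0> -- the reset trick described above."""
    a0, a1 = amplitudes
    outcome = 1 if rng.random() < abs(a1) ** 2 else 0   # measurement collapse
    state = [0.0, 1.0] if outcome == 1 else [1.0, 0.0]  # post-measurement state
    if outcome == 1:
        state = [state[1], state[0]]                    # NOT (X) gate: |1> -> |0>
    return state

r = 1 / math.sqrt(2)                  # equal superposition (|0>+|1>)/sqrt(2)
print(reset_to_zero([r, r]))          # always [1.0, 0.0], i.e. |0>
```

The measurement outcome is random, but the final state is not: whichever way the coin lands, the conditional flip leaves the qubit in |0>.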

    • is that because copying is stealing?

    • by aglider (2435074)

      For sure he cannot copy links into Slashdot.

    • I can't copy a bank note (well, with enough effort I might be able to, but it would be illegal anyway). Still, I can store one in my wallet.

  • When I first saw the article, I thought they had created a quantum version of THIS:-

    http://en.wikipedia.org/wiki/Von_Neumann_probe#Von_Neumann_probes [wikipedia.org]

    Oh well, we'll have to destroy the universe using "classical" replicating machines instead.

  • Everything I know about quantum computing leads me to believe this is a silly exercise.

    - There's no benefit to having memory on the same chip, as it's easier and more reliable to frame the problem and process the results with a non-quantum computer.
    - Having anything that close to the qubits makes it that much harder to handle decoherence which remains an unsolved problem on large scales.
    - "Conventional electrical circuits" aren't going to scale and if your quantum computing model can't scale, it's trash.

    Publ

    • by meza (414214)

      I'm not sure I understand your criticism completely, but please note that it is not a normal memory they put on the chip but a "quantum memory" that can actually store the state of a qubit.

    • How do you know what you can and can't do with this type of architecture if you don't make an attempt to turn whiteboard theories into real-world proof? They are not building this thing to run Windows; it is a research tool.
      • by byeley (2451634)

        Quantum computing has been a hot research topic for 3 decades now. Do you really think no one considered "Conventional electrical circuits" before moving on to the elaborate qubit registers that current mainstream models use? Issues included error rate and scalability.

        There's nothing new about this model except that they're trying to make it programmable. Quantum computers don't need to be programmable; they're best suited for solving a small set of specific problems.

        • My only point is that people are still conducting this type of research for purely research purposes and not commercial applications at this time. In their efforts to make the model programmable, who's to say they won't run into problems that end up leading them in an entirely different direction in their research.
        • by ogdenk (712300)

          Quantum computers don't need to be programmable; they're best suited for solving a small set of specific problems.

          The same things were said about the first electromechanical and electronic computers, due to the extreme costs involved and limited initial commercial interest. I'm sure good old Konrad Zuse and Turing had to do an awful lot of convincing to sell people on the idea of funding such pipedreams. Remember, ENIAC was only programmable in an extremely limited fashion and didn't become truly "programmable" until much later, and even then its program was read-only.

          Quantum computing now looks like the stone age o

  • Their quantum processor made up of two superconducting quantum bits can use a 2-bit "quantum RAM" to save entangled bit values into.

    And you thought your 32-bit system needed upgrading.

  • by Anonymous Coward

    The Manchester Baby of 1948 and the EDSAC of 1949 were the world's first computers with internally stored programs. They implemented a concept frequently (but erroneously) attributed to a 1945 paper of John von Neumann and colleagues. Von Neumann's own papers give proper credit to Alan Turing, and the concept had actually been mentioned earlier by Konrad Zuse himself, in a 1936 patent application (which was rejected).

    -- http://en.wikipedia.org/wiki/Z3_(computer)#Relation_to_other_work [wikipedia.org]

  • by jensend (71114) on Thursday September 01, 2011 @06:50PM (#37280946)

    That's what this computer is good for.

  • That's one small step forwards, backwards, both or neither for electrons and one indeterminate state for quantumkind.

  • Yes, but does it run Linux?
    • by ogdenk (712300)

      Heh I hope you're joking..... that's as bad as all the monkeys who wouldn't buy the Amiga or Atari ST because Lotus 1-2-3 and DOS wouldn't run without add-on hardware.

  • Where has all this quantum computing development brought writing devices to now? Is there cheap (<$500) tech that can set the quantum spin value of electrons, photons, or any other particle? Is there any spin-setting device that, in order to set a state, consumes less energy than the energy differential between the spin states it sets (in addition to that differential itself)?

  • Seriously, if the US had not kidnapped his ass he'd have ended up in Nuremberg for all those rockets they sent at London.
    • by rubycodez (864176)
      You are confused, or need new material for your stand-up act. Von Neumann was Jewish, from Hungary, where he lived until his 20s; he was a professor in Berlin from 1926 to 1930, came to the U.S. in 1930, and was a professor at Princeton from then on. In World War Two he did some work for the Allied side in the USA.
    • by Noughmad (1044096)

      von Neumann != von Braun. Not every "von" is the same.

  • It's not the one billion entangled bits so much. It's the terabyte of entangled kittens...
