
Researchers Create First All Optical Nanowire NAND Gate

mhore writes "Researchers at the University of Pennsylvania have created the first all-optical, nanowire-based NAND gate, which paves the way towards photonic devices that manipulate light to perform computations. From the release: 'The research team began by precisely cutting a gap into a nanowire. They then pumped enough energy into the first nanowire segment that it began to emit laser light from its end and through the gap. Because the researchers started with a single nanowire, the two segment ends were perfectly matched, allowing the second segment to efficiently absorb and transmit the light down its length.' The gate works by shining light on the nanowire structure to switch the information transported through the wire on and off. The research appeared this month in Nature Nanotechnology (abstract)."
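As a purely illustrative aside: logically, the device described above behaves like any other NAND gate, with light standing in for voltage. The sketch below is a truth-table model only, not the team's device physics; the function name and the gate/probe framing are made up for the example.

    # Illustrative truth-table model of an all-optical NAND gate (not the
    # actual device physics): the two "gate" beams suppress transmission of
    # a probe signal through the wire only when both are present.

    def optical_nand(gate_a: bool, gate_b: bool, probe_on: bool = True) -> bool:
        """Return whether light exits the wire for the given gate inputs."""
        if not probe_on:                   # no probe light, nothing to transmit
            return False
        return not (gate_a and gate_b)     # NAND: blocked only when both gates are on

    if __name__ == "__main__":
        for a in (False, True):
            for b in (False, True):
                print(f"A={int(a)} B={int(b)} -> out={int(optical_nand(a, b))}")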
  • Wrong direction (Score:4, Insightful)

    by nurb432 ( 527695 ) on Saturday September 08, 2012 @01:50PM (#41275017) Homepage Journal

    I think we are wasting the potential of future optics if we think in binary, as this team is doing.

    Optics scream for multilevel logic.

    • If we can do binary, then we don't need to completely rethink the programming languages on top of testing out the new hardware.

      • Re: (Score:2, Interesting)

        by Teresita ( 982888 )
        Betcha didn't know all the computations for the holodeck were done inside the holograms themselves.
      • by Anonymous Coward

        If we can do binary, then we don't need to completely rethink the programming languages on top of testing out the new hardware.

        But that's what we need to do. Current software technology won't measure up to the new hardware technology. As it is, current development languages don't even use multiprocessor CPUs efficiently, and those same tools are going to be horribly inadequate here.

        And we will see that most of what we know about computer science becomes obsolete with this new kind of computational machine.

        But that's the way it goes. People have to frame new technology in old paradigms because humans aren't that adaptable. But one day, some

        • by Anonymous Coward

          Computer science is not going to be made obsolete by optical logic.

          • by nurb432 ( 527695 )

            That isn't exactly what he meant. Current computer science will be mostly obsolete, but not the concept of computer science. It will adapt, it has to.

        • by mo ( 2873 )
          It's not that humans are not adaptable; it's that parallel computing is hard for humans to figure out. Linear execution lends itself to all kinds of easy abstractions: loops, branches, methods, etc. Parallel computing, not so much. Mutexes are awful. The best we've got is message passing and functional programming, but even those are hard to design so that the result is both understandable and able to exploit the inherent parallelism (a sketch of the message-passing style follows this sub-thread).

          Y'know what's even harder to design? Analog computing. Holy cow. Remember, digital
          • Y'know what's even harder to design? Analog computing. Holy cow. Remember, digital computing was invented by Touring before we even had built a computer. It's easy to visualize how it works. My brain explodes though trying to imagine a fuzzy-logic analog equivalent of a touring machine.

            His name was Alan Turing. Honor him by at least spelling his name correctly...
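An aside on mo's point about message passing above: the sketch below, in Python purely for illustration, coordinates a worker by sending messages through queues instead of sharing state behind mutexes. The names and the squaring task are arbitrary.

    import queue
    import threading

    def worker(inbox: queue.Queue, outbox: queue.Queue) -> None:
        """Receive numbers and send back their squares; no shared mutable state."""
        while True:
            item = inbox.get()
            if item is None:               # sentinel: shut down cleanly
                break
            outbox.put(item * item)

    if __name__ == "__main__":
        inbox, outbox = queue.Queue(), queue.Queue()
        t = threading.Thread(target=worker, args=(inbox, outbox))
        t.start()
        for n in range(5):
            inbox.put(n)                   # communicate by sending messages...
        inbox.put(None)
        t.join()
        print([outbox.get() for _ in range(5)])   # ...and receiving them: [0, 1, 4, 9, 16]

Each piece of data has exactly one owner at a time, which is what makes this style easier to reason about than lock-protected shared memory.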

      • by nurb432 ( 527695 )

        Never tried to imply it would be painless or trivial, but sometimes the end results are worth the effort to truly advance to a new level.

    • by Anonymous Coward

      I think we are wasting the potential of future optics if we think in binary, as this team is doing.

      I agree with this statement about 22% (orange).

    • by Anonymous Coward

      You can do that with electric current, too. There's a reason analog computers died out.

      • by nurb432 ( 527695 )

        Yes there was, but it was due to technology/cost not keeping up with the far simpler digital world, not due to an inherent problem with the concept.

        I'm also not talking about analog, but about multilevel logic. There is a difference: you can still have your 'digital' accuracy, but increase your bandwidth several times over.
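A back-of-the-envelope illustration of that bandwidth claim, assuming an arbitrary 10 Gsymbol/s line rate: a symbol with L discrete levels carries log2(L) bits, so the data rate grows with the number of levels while the signal stays 'digital'.

    import math

    symbol_rate_hz = 10e9                  # assumed 10 Gsymbol/s line, for illustration only
    for levels in (2, 4, 8, 16):
        bits_per_symbol = math.log2(levels)
        print(f"{levels:>2} levels -> {bits_per_symbol:.0f} bits/symbol "
              f"-> {symbol_rate_hz * bits_per_symbol / 1e9:.0f} Gbit/s")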

    • Computers used to be analog, but they screamed for digital logic. We live in a digital era for a reason.

    • by Kergan ( 780543 )

      Not sure what you mean by multilevel logic, but I'd suggest it screams for multiplexed logic. (By this, I mean using the same gates several times at once by multiplexing over, who knows, different wavelengths, polarizations or angular momenta.)
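A rough sketch of that multiplexing idea in numbers, using standard telecom figures (a roughly 4.4 THz wide C-band, 50 GHz channel spacing) and an assumed 10 Gbit/s per channel, purely for illustration:

    c_band_width_hz = 4.4e12        # the ~1530-1565 nm C-band is roughly 4.4 THz wide
    channel_spacing_hz = 50e9       # ITU-style 50 GHz channel spacing
    per_channel_gbps = 10           # assumed per-channel rate

    channels = int(c_band_width_hz // channel_spacing_hz)
    print(f"{channels} channels x {per_channel_gbps} Gbit/s "
          f"= {channels * per_channel_gbps / 1000:.2f} Tbit/s aggregate")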

    • by Anonymous Coward

      > Optics scream for multilevel logic.

      Dunno where you are getting that from. Transistors are analogue as well (as used in amplifiers); we just chose to use them binarily (not-a-word) because it's easier to deal with. The Russians tried some stuff with ternary systems (more efficient because base 3 is closer to base e), but they were abandoned.
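The 'closer to base e' remark refers to radix economy: the cost of representing numbers in base b scales roughly as b/ln(b), which is minimized at b = e, and base 3 comes closer to that minimum than base 2 does. A quick check:

    import math

    for base in (2, 3, 4, 10):
        print(f"base {base:>2}: relative cost b/ln(b) = {base / math.log(base):.3f}")
    print(f"base  e: relative cost e/ln(e) = {math.e:.3f}  (theoretical minimum)")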

    • by jovius ( 974690 ) on Saturday September 08, 2012 @02:56PM (#41275383)

      There already are 10 levels.

    • Re:Wrong direction (Score:5, Insightful)

      by perl6geek ( 1867146 ) on Saturday September 08, 2012 @03:05PM (#41275465)
      I've worked two years on a PhD thesis involving all-optical signal processing (though I worked on all-optical signal regeneration, not logic gates), and one of my conclusions is that multi-level is an order of magnitude more challenging than two values. The reason is that if you do multiple processing steps, you usually pick up some random fluctuations, so you need components that fix that, i.e. restore the signal to a defined level.

      Now you have basically two options: you can encode your information in the phase or in the amplitude/power. In the case of power levels you can use something like nonlinear loop mirrors, but they have the problem that they change the power ratio between the states. In the case of phase-encoded signals, you can use a saturated phase-sensitive amplifier (for example with two symmetric pumps), but they require quite high powers, you have to injection-lock the pumps to compensate for phase drifts, and they still only work for two levels. There is exactly one scheme that works for multiple levels (see http://eprints.soton.ac.uk/336325/1.hasCoversheetVersion/Thesis.pdf [soton.ac.uk] for a PhD thesis about it), but it turns phase noise into amplitude noise, so you need an amplitude regenerator after it.

      So binary logic is plenty of challenge to get working; once that's established, we can still think about multiple levels. (A rough numerical illustration of why regeneration limits the level count follows this sub-thread.)
      • by Anonymous Coward

        Now, is that a binary or decimal order of magnitude? :-)
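The rough numerical illustration mentioned above: a toy model, not a simulation of any real regenerator. Gaussian noise is added at each of a chain of stages, a 'regenerator' snaps the signal to the nearest allowed level after every stage, and the same noise that binary decisions shrug off starts causing frequent symbol errors once four or eight levels have to share the same amplitude range. The noise level, stage count, and trial count are arbitrary.

    import random

    def run(levels: int, stages: int = 20, sigma: float = 0.08, trials: int = 2000) -> float:
        """Symbol error rate after a chain of noisy stages with ideal level regeneration."""
        grid = [i / (levels - 1) for i in range(levels)]   # allowed levels in [0, 1]
        errors = 0
        for _ in range(trials):
            sent = random.choice(grid)
            x = sent
            for _ in range(stages):
                x += random.gauss(0.0, sigma)              # noise picked up in this stage
                x = min(grid, key=lambda g: abs(g - x))    # regenerate: snap to nearest level
            errors += (x != sent)
        return errors / trials

    if __name__ == "__main__":
        random.seed(1)
        for lv in (2, 4, 8):
            print(f"{lv} levels: symbol error rate ~ {run(lv):.3f}")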

  • by frovingslosh ( 582462 ) on Saturday September 08, 2012 @02:59PM (#41275423)

    I remember reading, over twenty years ago, about an all-optical NAND gate. This was pre-web, so it might not be easy to find, but I remember the article. The gate was much, much larger, although the developers (of course) said that they expected to be able to shrink it to suitable dimensions. And a good part of the article was a prediction of an all-optical computer within 5 years. The logic behind this prediction was that every step in creating the state of technology we had then was achieved very incrementally: we used hand labor to create the first ICs, we used those to create powerful computers, and we created CAD software for those to develop even smaller computers. But since all of those steps had already been done and were in place, the prediction was that it should take no more than 5 years to substitute optical technology for electronic technology in computer design software and start cranking out optical devices.

    Not sure what happened to that optical NAND gate from over two decades ago. Maybe it just couldn't be shrunk down. Maybe it was just falsified. Or maybe someone already has optical computers but won't share them with us conspiracy theorists. But I'm jaded now and not so inclined to get excited about yet another "first" announcement.

  • The two major disadvantages of all-optical processing in the past were:

    1/ The wavelength of light is much larger than the structures used in modern-day chips, so the optical circuitry wouldn't be as dense as modern-day electronics.

    2/ When you turn off an optical signal it gets turned into heat (i.e. the "transistor" goes black); that's not true in electronics, where there is no current flow when the transistor is off. This would cause a theoretical optical device to run hotter than an electronic one, further hurting density.
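Some ballpark numbers behind point 1, taking a 2012-era 22 nm CMOS node and common optical wavelengths, purely for illustration:

    telecom_wavelength_nm = 1550     # common fiber-optic wavelength
    visible_wavelength_nm = 500      # green light
    feature_size_nm = 22             # leading-edge CMOS node circa 2012

    for name, wl in (("telecom", telecom_wavelength_nm), ("visible", visible_wavelength_nm)):
        print(f"{name}: wavelength / feature size ~ {wl / feature_size_nm:.0f}x")

Even before diffraction limits are considered, the raw length scales differ by one to two orders of magnitude.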
  • by Anonymous Coward

    Moore's law will be saved by rainbows and kittens.
