Researchers Create First All-Optical Nanowire NAND Gate
mhore writes "Researchers at the University of Pennsylvania have created the first all-optical, nanowire-based NAND gate, which paves the way towards photonic devices that manipulate light to perform computations. From the release: 'The research team began by precisely cutting a gap into a nanowire. They then pumped enough energy into the first nanowire segment that it began to emit laser light from its end and through the gap. Because the researchers started with a single nanowire, the two segment ends were perfectly matched, allowing the second segment to efficiently absorb and transmit the light down its length.' The gate works by shining light on the nanowire structure to turn on and off information transported through the wire. The research appeared this month in Nature Nanotechnology (abstract)."
Wrong direction (Score:4, Insightful)
I think we are wasting the potential of future optics if we think in binary, as this team is doing.
Optics scream for multilevel logic.
Re: (Score:2)
If we can do binary, then we don't need to completely rethink our programming languages on top of testing out the new hardware.
Re: (Score:2, Interesting)
Re: (Score:2)
Betcha didn't know the holodeck is made-up technology.
Sure does speed up debugging!
Yes, we need to revisit everything. (Score:1)
If we can do binary, then we don't need to completely rethink our programming languages on top of testing out the new hardware.
But that's exactly what we need to do. Current software technology won't measure up to the new hardware technology. As it is, current development languages don't even use multiprocessor CPUs efficiently, and these same tools are going to be horribly inadequate.
And we will see that most of what we know about computer science becomes obsolete with this new kind of computational machine.
But that's the way it goes. People have to frame new technology in old paradigms because humans aren't that adaptable. But one day, some
Re: (Score:1)
Computer science is not going to be made obsolete by optical logic.
Re: (Score:3)
That isn't exactly what he meant. Current computer science will be mostly obsolete, but not the concept of computer science. It will adapt; it has to.
Binary will go away. (Score:1)
CompSci isn't married to electronics.
Binary math maps perfectly onto switches: relays, then vacuum tubes, and then transistors.
Binary will fail miserably in a quantum computational environment.
Operating System theory will be thrown out the door. So will networking.
Data structures will definitely have to be reworked ...
CS as we know it is a goner, which is wonderful! I wish I could be around for it all. Alas, I'll probably be long gone by the time this technology makes it to the stage where it's useful to CS people.
Re: (Score:3)
Y'know what's even harder to design? Analog computing. Holy cow. Remember, digital
Re: (Score:1)
Y'know what's even harder to design? Analog computing. Holy cow. Remember, digital computing was invented by Touring before we even had built a computer. It's easy to visualize how it works. My brain explodes though trying to imagine a fuzzy-logic analog equivalent of a touring machine.
His name was Alan Turing. Honor him by at least spelling his name correctly...
Re: (Score:2)
And "honor" is the US spelling of the UK "honour". Your reaction is unproductive, childish and cowardly.
Re: (Score:2)
Never tried to imply it would be painless or trivial, but sometimes the end results are worth the effort to truly advance to a new level.
Re: (Score:1)
I think we are wasting the potential of future optics if we think in binary, as this team is doing.
I agree with this statement about 22% (orange).
Re: (Score:1)
You can do that with electric current, too. There's a reason analog computers died out.
Re: (Score:2)
Yes, there was, but that was due to technology/cost not keeping up with the far simpler digital world, not due to an inherent problem with the concept.
I'm also not talking about analog, but about multilevel logic. There is a difference. You can still have your 'digital' accuracy, but increase your bandwidth several times over.
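A quick sketch (my own illustration, not the poster's) of that bandwidth point: if a signal can be resolved into k well-separated levels per symbol, each symbol carries log2(k) bits, so the same symbol rate moves several times more data than a two-level signal.

```python
# Illustration (not from the thread): bits carried per symbol when a signal
# is quantized into k distinguishable levels instead of just two.
import math

for levels in (2, 4, 8, 16):
    print(f"{levels:>2} levels -> {math.log2(levels):.0f} bits per symbol")
# 2 levels -> 1 bit, 16 levels -> 4 bits: more levels, more data per symbol.
```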
Re: (Score:2)
Computers used to be analog, but they screamed for digital logic. We live in a digital era for a reason.
Re: (Score:3)
Not sure what you mean by multilevel logic, but I'd suggest it screams for multiplexed logic. (By this, I mean using the same gates several times at once by multiplexing over, who knows, different wavelengths, polarizations, or angular momenta.)
Re: (Score:1)
> Optics scream for multilevel logic.
Dunno where you are getting that from. Transistors are analogue as well (as used in amplifiers); we just chose to use them binarily (not-a-word) because it's easier to deal with. The Russians tried some stuff with ternary systems (more efficient because base 3 is closer to base e), but those were abandoned.
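The "closer to base e" remark refers to the radix-economy argument; here is a small sketch of it (my own, not from the thread): representing M values in radix r takes about log_r(M) digits with r possible states each, so the hardware cost scales as r·log_r(M) = (r/ln r)·ln M, and r/ln r is minimized at r = e ≈ 2.718, which base 3 approaches more closely than base 2.

```python
# Radix-economy sketch (my own, not from the thread): the per-value cost
# factor r / ln(r) is minimized at r = e, and radix 3 comes closest to it.
import math

for r in (2, 3, 4, 10):
    print(f"radix {r:>2}: cost factor r/ln(r) = {r / math.log(r):.3f}")
# radix 2: 2.885, radix 3: 2.731 (best), radix 4: 2.885, radix 10: 4.343
```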
Re:Wrong direction (Score:4, Funny)
There already are 10 levels.
Re:Wrong direction (Score:5, Insightful)
Re: (Score:1)
Now, is that a binary or decimal order of magnitude? :-)
Re:NAND? Sounds like an AND gate to me... (Score:5, Informative)
Not sure where you read this... Per TFA:
A NAND gate, which stands for “not and,” returns a “0” output when all its inputs are “1.”
And the Nature Nanotechnology article's summary says nothing specific.
Re: (Score:2)
NAND is read as Not-AND.
For an AND gate, the output is on only if both inputs are on; the output is off in all other conditions (including both inputs being off).
A NAND gate is the reverse of that: its output is off only when both inputs are on, and on in all other cases.
The most important detail about the NAND gate is that you can build all the other gates from nothing but NAND gates.
NAND and NOR gates are the only two logic gates that share this ability (a quick sketch below shows how).
This article shows all the base logi
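A minimal sketch (my own, not from the article or the comment above) of that universality claim, expressing the other basic gates purely as compositions of a two-input NAND:

```python
# Building the other basic gates out of nothing but a two-input NAND
# (a sketch of the universality claim made in the comment above).

def nand(a: bool, b: bool) -> bool:
    return not (a and b)

def not_(a: bool) -> bool:
    return nand(a, a)                    # NOT(a) = NAND(a, a)

def and_(a: bool, b: bool) -> bool:
    return not_(nand(a, b))              # AND = NOT(NAND)

def or_(a: bool, b: bool) -> bool:
    return nand(not_(a), not_(b))        # OR via De Morgan: NAND of the inversions

def xor_(a: bool, b: bool) -> bool:
    c = nand(a, b)
    return nand(nand(a, c), nand(b, c))  # classic four-NAND XOR

if __name__ == "__main__":
    for a in (False, True):
        for b in (False, True):
            assert and_(a, b) == (a and b)
            assert or_(a, b) == (a or b)
            assert xor_(a, b) == (a != b)
    print("AND, OR and XOR all reproduced from NAND alone.")
```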
Been there (Score:3)
I remember reading, over twenty years ago, about an all-optical NAND gate. This was pre-web, so it might not be easy to find, but I remember the article. The gate was much, much larger, although the developers (of course) said that they expected to be able to shrink it to suitable dimensions. And a good part of the article was a prediction of an all-optical computer within 5 years. The logic behind this prediction was that all the steps that created the state of technology we had then were done very incrementally: we used hand labor to create the first ICs, we used those to create powerful computers, and we created CAD software on those to develop even smaller computers. Since all of those steps had already been done and were in place, the prediction was that it should take no more than 5 years to substitute optical technology for electronic technology in computer design software and start cranking out optical devices.
Not sure what happened to that optical NAND gate from over two decades ago. Maybe it just couldn't be shrunk down. Maybe it was just falsified. Or maybe someone already has optical computers but won't share them with us conspiracy theorists. But I'm jaded now and not so inclined to get excited over yet another "first" announcement.
Density and No True Off (Score:2)
1/ The wavelength of light is much larger than the structures used in modern-day chips, so optical circuitry wouldn't be as dense as modern-day electronics (rough numbers below).
2/ When you turn off an optical signal, it gets turned into heat (i.e., the optical "transistor" goes dark); that's not true in electronics, where there is essentially no current flow when the transistor is off. This would cause a theoretical optical device to run hotter than an electronic one, further hurting density.
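A rough back-of-the-envelope for point 1 (my own order-of-magnitude numbers, not from the comment): conventional optics can't resolve features much smaller than about half a wavelength, while modern electronic feature pitches are a few tens of nanometres.

```python
# Back-of-the-envelope comparison (rough, assumed numbers): a diffraction-
# limited optical feature versus a modern electronic feature pitch.

wavelength_nm = 500                        # visible light, order of magnitude
optical_feature_nm = wavelength_nm / 2     # ~lambda/2 diffraction limit
electronic_pitch_nm = 50                   # rough pitch of a recent CMOS process

penalty = optical_feature_nm / electronic_pitch_nm
print(f"optical feature:    ~{optical_feature_nm:.0f} nm")
print(f"electronic pitch:   ~{electronic_pitch_nm} nm")
print(f"linear density hit: ~{penalty:.0f}x  (~{penalty**2:.0f}x in area)")
```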
Re:Density and No True Off (Score:4, Insightful)
Nyan gate (Score:1)
Moore's law will be saved by rainbows and kittens.