Asynchronous Logic: Ready For It?
prostoalex writes "For a while, academia and R&D labs have explored the possibilities of asynchronous logic. Now Bernard Cole from Embedded.com tells us that asynchronous logic might receive more acceptance than expected in modern designs. The main advantages, as the article states, are 'reduced power consumption, reduced current peaks, and reduced electromagnetic emission', to quote a prominent researcher from Philips Semiconductors. Earlier, Bernard Cole wrote a column on self-timed asynchronous logic."
ok, but... (Score:1, Insightful)
We all know about the advantages async logic has over clocked logic in many respects. The problem is, the async logic *design* tools are nowhere near as good or as numerous as the tools available for designing clocked logic.
Chicken and egg problem? Maybe, or maybe just another untapped opportunity for those crazy software people...
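For the software people, the canonical async building block is the Muller C-element, whose output changes only when both inputs agree. Here's a toy Python model of one (the names and structure are mine, purely illustrative, not from any real EDA tool):

    # Toy Muller C-element: the basic primitive of much self-timed logic.
    # The output follows the inputs only when they agree; otherwise it
    # holds its previous value, which is what lets async stages wait
    # for each other without a clock.
    class CElement:
        def __init__(self):
            self.out = 0

        def step(self, a, b):
            if a == b:
                self.out = a   # inputs agree: output follows them
            return self.out    # inputs disagree: hold last value

    c = CElement()
    for a, b in [(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)]:
        print(a, b, "->", c.step(a, b))  # -> 0, 0, 1, 1, 0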
Re:What's wrong with synchronous? (Score:5, Insightful)
You're just talking about I/O. Of course I/O has to be synchronous, because it involves handshaking.
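To make the handshaking point concrete, here's a toy Python walk-through of the classic four-phase (return-to-zero) handshake that self-timed blocks use to pass data without a shared clock. The function and signal names are mine, purely for illustration; real hardware does this with wires and latches:

    # Four-phase handshake between an async sender and receiver,
    # stepped through in protocol order.
    def four_phase_transfer(data, ch):
        ch["data"] = data
        ch["req"] = 1          # 1. sender asserts request
        received = ch["data"]  # 2. receiver latches data...
        ch["ack"] = 1          #    ...and asserts acknowledge
        ch["req"] = 0          # 3. sender sees ack, drops request
        ch["ack"] = 0          # 4. receiver drops ack; channel idle again
        return received

    ch = {"req": 0, "ack": 0, "data": None}
    print(four_phase_transfer(42, ch))  # -> 42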
I think there are some general misconceptions about what "asynchronous" means. Seriously, all I'm seeing are comments from people without a clue about chip design, other than what they read at arstechnica.com or aceshardware.com. And if you don't know anything about the *real* internals of synchronous chips, then how can you blast asynchronous designs?
So-called asynchronous processors have already been designed and prototyped. Chuck Moore's recent (as in "ten years old") stack processors are mostly asynchronous, for example. Most people are only familiar with the x86 line, to a lesser extent the PowerPC, and to a much, much lesser extent the Alpha and UltraSPARC. Unless you've done some research into a *variety* of processor architectures, please refrain from commenting. Otherwise you come across like some kind of "Linux rules!" weenie who doesn't have a clue what else is out there besides Windows, MacOS, and UNIX variants.
What if? (Score:5, Insightful)
Seriously though, does the temperature affect the switching time? Or does the liquid nitrogen trick just prevent meltdown of an overclocked chip?
Re:Some further information (Score:2, Insightful)
BTW, he goes by different names, usually ones with the word "Physics" in them.
Here's another example of his copy and pasting:
This post: http://developers.slashdot.org/comments.pl?sid=42
is copied from this web page:
http://www.intuitor.com/moviephysics/mpmain.html [intuitor.com]
Take a look for yourself at his post history, the wide range of topics, and supposed knowledge.
Dave
Re:Some further information (Score:4, Insightful)
Those of us who have been around the block more than twice know that asynchronous design has been the technology of the future for a long, long time. My personal experience goes back to the mid-seventies, but I'm sure there were asynch he-men doing their thing with vacuum tubes and RTL. :-)
The catch, then as now, is that asynch logic is just plain more difficult for our tiny little human brains to grok. This was true back in the days when humans designed their own logic, and it is even more true now when 99%+ of all logic is designed not by humans, but by logic synthesis software (Synopsys DC and Cadence PKS).
That said, there are always folks out there doing Cool Stuff w/asynch circuits. I hope Ivan Sutherland's group [sun.com] at Sun Labs survives Sun's recent massive layoffs [theregus.com].
Re:Kurzweil (Score:3, Insightful)
First off, there's no proof of this. The brain certainly appears to be asynchronous, but there's no evidence to suggest that there isn't some kind of internal, distributed clocking mechanism that keeps the individual parts working together. There's not enough evidence either way.
Async logic might very well bring large neural net research into practicality.
Why does everyone seem to think that ANNs are the way toward "true AI"? ANNs are superb pattern-matching machines. They can predict, and are resilient to link damage to some degree. But they do not think. ANNs have nothing to do with what's really going on in a biological brain, except that they are made of many interacting simple processing elements. The biological brain inspired the ANN, but that's all.
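To illustrate the "resilient to link damage" part: knock out one unit's outgoing links in a small net and the output degrades gracefully instead of collapsing. A quick sketch (the weights are arbitrary, purely for the demo):

    # Graceful degradation: zeroing one hidden unit's outgoing
    # weights shifts the output a little rather than destroying it,
    # because the representation is spread across many units.
    import numpy as np

    rng = np.random.default_rng(0)
    W = rng.normal(size=(8, 4)) / 8.0  # 8 redundant units -> 4 outputs
    x = rng.normal(size=8)

    print(x @ W)     # intact network
    W[3, :] = 0.0    # "damage": cut one unit's outgoing links
    print(x @ W)     # output shifts slightly, doesn't collapse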
Re:Kurzweil (Score:5, Insightful)
Fourth, when an ANN is trained, every weight in the network is changed. In a biological brain, particular links form and are destroyed, but learning is not a global process (see the sketch at the end of this post). I'm not a neuroscientist, so if I'm wrong, someone please point that out.
Fifth, you can ask a human why he/she came to a particular conclusion. You can't ask an ANN why it reached a particular conclusion. Sometimes, analysis is possible on smaller networks. But for multi-layer networks with thousands of hidden units, this becomes impossible. I really don't think it's a question of computational power. I have a deep sense that somehow, biobrains are fundamentally different from their mathematical cousins.
I won't claim that ANNs have no place in thinking machines. But having worked with them extensively, I feel that, although they are extremely valuable computational tools, they are not a magic wand. Many pattern recognition and data organization tasks can be much better performed by traditional symbolic algorithms.
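As promised above, here's a quick way to see the "every weight changes" point: one backprop step on a tiny two-layer net produces a nonzero gradient for essentially every parameter, which is nothing like a few synapses strengthening locally. A sketch in plain NumPy with a squared-error loss (all names are mine, purely illustrative):

    # One gradient step touches the whole network -- contrast with
    # a biological brain, where individual links form and die.
    import numpy as np

    rng = np.random.default_rng(1)
    W1 = rng.normal(size=(3, 4))   # input -> hidden weights
    W2 = rng.normal(size=(4, 1))   # hidden -> output weights
    x = rng.normal(size=(1, 3))
    y = np.array([[1.0]])

    h = np.tanh(x @ W1)            # forward pass
    out = h @ W2
    err = out - y                  # d(loss)/d(out) for squared error

    gW2 = h.T @ err                            # backprop to W2
    gW1 = x.T @ ((err @ W2.T) * (1 - h**2))    # backprop to W1

    print(np.count_nonzero(gW1), "of", gW1.size)  # 12 of 12 nonzero
    print(np.count_nonzero(gW2), "of", gW2.size)  # 4 of 4 nonzero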