Asynchronous Logic: Ready For It? 192
prostoalex writes "For a while academia and R&D labs explored the possibilities of asynchronous logic. Now Bernard Cole from Embedded.com tells us that asynchronous logic might receive more acceptance than expected in modern designs. The main advantages, as article states, are 'reduced power consumption, reduced current peaks, and reduced electromagnetic emission', to quote a prominent researcher from Philips Semiconductors. Earlier Bernard Cole wrote a column on self-timed asynchronous logic."
alright, question... (Score:2, Interesting)
timerless design elements in pentium4? (Score:4, Interesting)
Isn't this the same as having a CPU without a timer? (i.e. no MHz/GHz rating).
What's wrong with synchronous? (Score:5, Interesting)
The advantage outlined here seems to be independent functionality between different areas of the PC. It would be nice if the components could work independently and time themselves, but is there really a huge loss in sustained synchronous data transfer?
From what I've understood, synchronous data communication is preferable in most aspects of computing: network cards, sound cards, printers, etc. Don't better models support bi-directional synchronous communication?
Cyclic History (Score:5, Interesting)
We even did some design work in async. Cool stuff. Easy to do, fast as hell...
Never did figure out why it never caught on, except for the difficulty of making it general purpose, which is so easy a job with sync logic. And I guess it does take a certain mind-set to follow it.
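The classic primitive behind self-timed design is the Muller C-element: its output rises only once both inputs are high, falls only once both are low, and otherwise holds, which is how an async stage waits for all of its inputs to arrive before signalling completion. A minimal toy model (my own sketch, not from any of the designs mentioned here):

```python
# Toy model of a Muller C-element, the standard building block of
# self-timed (asynchronous) logic. Names and structure are illustrative.

class CElement:
    def __init__(self, state=0):
        self.state = state

    def step(self, a, b):
        if a == b:          # both inputs agree: follow them
            self.state = a
        return self.state   # inputs disagree: hold previous output

c = CElement()
assert c.step(1, 0) == 0    # only one input high -> output holds at 0
assert c.step(1, 1) == 1    # both inputs arrived -> output rises
assert c.step(0, 1) == 1    # inputs disagree again -> output holds at 1
assert c.step(0, 0) == 0    # both returned low -> output falls
```

Chained together with request/acknowledge wires, elements like this let each stage fire exactly when its data is ready, with no global clock.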
Intel and asynch clocks (Score:4, Interesting)
Intel recognizes clockless design as the future, and hence the P4 actually has portions that are clockless.
Before the know-it-alls follow this up with "but it runs at 2.xx GHz", let them please read an article about how much of the chip is actually oscillating at that immense speed.
As it's said in the EE industry, "oh god, imagine if that clock speed were let loose on the whole system."
might be a while... (Score:2, Interesting)
Beowulf redundant once you have async chips (Score:4, Interesting)
In this context, your notions of parallel computing will change greatly. Currently, individual nodes in a cluster are islands of computation, separated by (comparably) vast distances. Messages between nodes take orders of magnitude more time than messages within a node.
When you set out to build a supercomputing cluster in the asynchronous world, ideally the entire cluster would be within a single die. Then the latency between nodes would be reduced to microseconds or nanoseconds, and nodes could split work more effectively. The high-speed buses and complex arbitration schemes required for asynchronous computing will be equally useful for designing massively parallel clusters-on-a-chip.
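The "orders of magnitude" claim is easy to put numbers on. A back-of-the-envelope comparison, using rough figures I'm assuming for illustration (a ~100 microsecond LAN round trip vs. ~1 nanosecond between blocks on the same die), not measurements from the article:

```python
# Rough latency comparison: messages between cluster nodes over a
# network vs. signals between blocks on a single die.
# Both figures are assumed ballpark values for illustration only.

ethernet_rtt_s  = 100e-6   # ~100 microseconds across a LAN
on_die_signal_s = 1e-9     # ~1 nanosecond between on-die blocks

ratio = ethernet_rtt_s / on_die_signal_s
print(f"on-die messaging is roughly {ratio:,.0f}x faster")
```

Even if the assumed numbers are off by an order of magnitude either way, the gap stays large enough to change how finely work can be split between nodes.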
Asynchronous logic vs radiation ? (Score:4, Interesting)
On a synchronous circuit, most of the time such a glitch won't do anything, because it won't occur at the same moment the clock "rings," so the incorrect transient value will be ignored.
As the feature size of circuits gets smaller and smaller, every circuit will need to be hardened against radiation, not only circuits destined for space or aircraft.
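The point about clocked logic masking glitches can be sketched in a few lines: a flip-flop samples its input only on clock edges, so a transient pulse that settles before the next edge never reaches the stored state. All timing values here are invented for illustration:

```python
# Why a synchronous register often masks a radiation glitch: the
# flip-flop only looks at its input on clock edges, so a transient
# that has died out by the next edge is never latched.
# Times and values are made up for the sketch.

def sample_on_edges(signal, edge_times):
    """Return the values a flip-flop would latch at each clock edge."""
    return [signal(t) for t in edge_times]

def data_with_glitch(t):
    # Correct value is 0; a particle strike flips it to 1 between t=3 and t=4.
    return 1 if 3 <= t < 4 else 0

edges = [0, 10, 20]   # clock edges well away from the glitch window
print(sample_on_edges(data_with_glitch, edges))   # glitch never latched
```

An async circuit, by contrast, may treat that transient pulse as a genuine completion or data event, which is exactly the hazard the parent raises.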
asynchronous logic? (Score:2, Interesting)
But seriously, isn't that an oxymoron?
At first, I thought it meant that we take a program, break it up into logic elements and scramble them like an egg. That won't work.
But after reading, I see it means that everything isn't pulsed by the same clock. So, if a circuit of 1,000 transistors needs only 3 picoseconds to do its job, while another 3,000 transistors actually need 5 picoseconds, then the entire 4,000 transistors are clocked for 5 picoseconds. So, the faster 1,000 transistors are needlessly powered for 2 picoseconds.
This adds up when we're talking 4 million transistors and living in the age of the Gigahertz.
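The arithmetic behind that, spelled out with the comment's own toy numbers (the percentages are just derived from those figures, not from any real chip):

```python
# With one global clock, every block runs at the pace of the slowest
# one, so the fast block idles for part of every cycle.
# Toy numbers taken from the comment above.

slow_block_ps    = 5      # slowest circuit sets the clock period
fast_block_ps    = 3      # fast circuit finishes early
fast_transistors = 1000

idle_ps = slow_block_ps - fast_block_ps
idle_fraction = idle_ps / slow_block_ps
print(f"{fast_transistors} transistors idle for {idle_ps} ps, "
      f"{idle_fraction:.0%} of every cycle")
```

Multiply that idle fraction across millions of transistors and billions of cycles per second and the wasted switching power is exactly the overhead async designs try to reclaim.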
Re:Cyclic History (Score:4, Interesting)
I think the internet is a good metaphor for this technology. Take Quake 3, for example. Think about all it takes to get several people playing over the net: they all have to respond within a certain timeout, for adequate performance they have to respond in a fraction of the timeout time, and there's a whole lot of code dedicated to making sure that when I fire my gun, 200 ms later it hits the right spot and dings the right player for it.
It works, but the logic to make that work is FAR more complex than the logic it takes to make something like a 'clocked internet' work. The downside, though, is that if you imagine what a clocked internet would be like, you'd understand why Q3 wouldn't work at all. In other words, the benefits would probably be worthwhile, but it's not a simple upgrade.
Async research group at Caltech (Score:3, Interesting)
They've even built a nearly complete MIPS 3000 compatible processor [caltech.edu] using async logic.
Seems pretty cool, but I'm waiting for some company to expend the resources to implement a more current processor (such as the PowerPC 970 perhaps) in this fashion.
Re:Kurzweil (Score:2, Interesting)
I'm not sure quite how this can be. ANNs are inspired by and based on the biological brain, but they are not related? ANNs are just pattern matchers, and our brains are nothing like that? I beg to differ: ANNs are very similar to our brains. Humans are giant pattern matchers. How do we learn? Does stuff just pop into our heads and !BANG! we know it? No, we discover it or are told first. Science is based on being able to reproduce the results of an experiment, matching a cause->effect pattern. Speech is matching sounds to meanings; not everyone equates a sound to the same meaning, as the patterns people learned can be different.

Animals are great examples of pattern machines. How are animals trained? Most often by positive and negative reinforcement, which is, essentially, conditioning. We do the same thing to our ANNs: it gets a cookie if it's right, nothing if it's wrong. The close matches are kept, the others are thrown out.

So, in what way are ANNs nothing like a biological brain? Our ANNs today are tiny and don't do too much compared to the standard of a brain, which is layers upon layers of interlaced patterns. ANNs use simple structures as the basis for their overall structure; are our brain cells not very similar? To me, it seems that they are incredibly similar, just on different scales.
Re:Kurzweil (Score:5, Interesting)
I couldn't agree more. I remember reading a comparison between the current state of AI and the state of early Flight Technology. (it may have even been here, I don't recall. I make no claim to thinking this up myself. Perhaps someone can point me to a link discussing who first thought of this?)
One of the reasons that early attempts at flight did not do well is because the people designing them merely tried to imitate things that fly naturally, without really understanding why things were built that way in the first place. So, people tried to make devices with wings that flapped fast, and they didn't work. It wasn't until someone (Bernoulli?) figured out how wings work - the scientific principles behind flight - that we were able to make flying machines that actually work.
Current AI and "thinking machines" are in a similar state as the first attempts to fly were in. We can do a passable job at using our teraflops of computing power to do a brute-force imitation of thought. But until someone understands the basic scientific principles behind thought, we will never make machines that think.
Cute example of Asynchronous logic? (Score:3, Interesting)
As the rat speeds up or slows down the chip compensates for it.
It's not often that you can play with toy mice and call it research.
Re:timerless design elements in pentium4? (Score:3, Interesting)
I think you parsed my sentence incorrectly.
I could say "The US might be a free country, but we still have laws to protect others' freedoms from impinging on our own."
Does this mean that I am in doubt, as to whether the US is a free country? No; the US is a free country by definition, the Constitution having defined it as such.
The "Might...but" construct is frequently used in the English language to introduce a fact, and then qualify it.
Article in Engineering Magazine (Score:1, Interesting)
1) Power usage and heat are important in many devices. The Via Cyrix chips do reasonably well, even without the power of Intel and AMD designs.
2) Fully asynchronous chips do have the capacity to perform far better than synchronous designs.
The article also discusses various designs for asynchronous chips. A good read, if you can find it. The magazine came out sometime towards the end of 2000 (I'm thinking November, but I'm not sure).