Hardware

Asynchronous Logic: Ready For It?

prostoalex writes "For a while, academia and R&D labs have explored the possibilities of asynchronous logic. Now Bernard Cole from Embedded.com tells us that asynchronous logic might receive more acceptance than expected in modern designs. The main advantages, as the article states, are 'reduced power consumption, reduced current peaks, and reduced electromagnetic emission', to quote a prominent researcher from Philips Semiconductors. Earlier, Bernard Cole wrote a column on self-timed asynchronous logic."
This discussion has been archived. No new comments can be posted.

  • alright, question... (Score:2, Interesting)

    by Anonymous Coward on Monday October 21, 2002 @11:19AM (#4495645)
    So how long do you think it will take before this is implemented on any sort of large scale?
  • by sp00nfed ( 518619 ) on Monday October 21, 2002 @11:21AM (#4495665) Homepage
    Don't the Pentium 4 and most CPUs nowadays have at least some elements of asynchronous logic? I'm pretty sure that certain circuits in most current CPUs are implemented asynchronously.

    Isn't this the same as having a CPU without a timer (i.e., no MHz/GHz rating)?

  • by phorm ( 591458 ) on Monday October 21, 2002 @11:24AM (#4495693) Journal
    On the flip side, the millions of simultaneous transitions in synchronous logic beg for a better way, and that may well be asynchronous logic.

    The advantage outlined here seems to be independent functionality between different areas of the PC. It would be nice if the components could work independently and time themselves, but is there really a huge loss in sustained synchronous data transfer?

    From what I've understood, in most aspects of computing, synchronous data communication is preferable, e.g. network cards, sound cards, printers, etc. Don't better models support bi-directional synchronous communication?
  • Cyclic History (Score:5, Interesting)

    by nurb432 ( 527695 ) on Monday October 21, 2002 @11:27AM (#4495725) Homepage Journal
    Isn't this where the idea of digital logic really got started? At least that's how it was taught when I was in school.

    We even did some design work in async. Cool stuff. Easy to do, fast as hell...

    Never did figure out why it never caught on, except for the difficulty of being general purpose... such an easy job with sync logic. And I guess it does take a certain mind-set to follow it.
  • by catwh0re ( 540371 ) on Monday October 21, 2002 @11:28AM (#4495739)
    A while back I read an article about Intel making clockless P2 chips that performed roughly 3 times faster (in MHz terms, not overall performance).

    Intel recognises clockless design as the future, and hence the P4 actually has portions that are clockless.

    Before the know-it-alls follow this up with "but it runs at 2.xx GHz", let them please read an article about how much of the chip is actually oscillating at that immense speed.

    As they say in the EE industry, "oh god, imagine if that clock speed were let free on the whole system".

  • might be a while... (Score:2, Interesting)

    by jaredcoleman ( 616268 ) on Monday October 21, 2002 @11:31AM (#4495764)
    These chips are great for battery-powered devices, such as pagers, because they don't have to power a clock. That extends the battery life at least 2x. But even if the advantages are superior to clocked chips for larger markets, how do you market something like this to people who want to see "Pentium XXXVIV 1,000,000 GHz" on the packaging?
  • In a way, an asynchronous circuit design already is a parallel computer. An asynchronous machine contains many (largely) independent components that communicate with each other in order to solve computational problems more efficiently by breaking them down into small pieces and working on them in parallel.

    In this context, your notions of parallel computing will change greatly. Currently, individual nodes in a cluster are islands of computation, separated by (comparably) vast distances. Messages between nodes take orders of magnitude more time than messages within a node.

    When you set out to build a supercomputing cluster in the asynchronous world, ideally the entire cluster would be within a single die. Then the latency between nodes would be reduced to microseconds or nanoseconds, and nodes could split work more effectively. The high-speed buses and complex arbitration schemes required for asynchronous computing will be equally useful for designing massively parallel clusters-on-a-chip.
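
    To make the handshake idea concrete, here is a minimal Python sketch (all names and stage functions are invented for illustration): self-timed pipeline stages that advance whenever their input is ready, with bounded queues standing in for request/acknowledge signalling instead of a shared clock.

        import queue
        import threading

        # Toy model of self-timed pipeline stages: each stage consumes a value
        # as soon as its predecessor produces one (the "request"), and the
        # bounded queue provides backpressure (the "acknowledge"). No stage
        # waits on a global clock; each runs at its own natural speed.

        def stage(work, inbox, outbox):
            while True:
                item = inbox.get()       # block until the predecessor offers data
                if item is None:         # sentinel: propagate shutdown downstream
                    outbox.put(None)
                    return
                outbox.put(work(item))   # hand the result to the next stage

        source = queue.Queue(maxsize=1)
        middle = queue.Queue(maxsize=1)
        sink = queue.Queue(maxsize=1)

        threading.Thread(target=stage, args=(lambda x: 2 * x, source, middle)).start()
        threading.Thread(target=stage, args=(lambda x: x + 1, middle, sink)).start()

        for v in [1, 2, 3, None]:
            source.put(v)

        while (result := sink.get()) is not None:
            print(result)                # prints 3, 5, 7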
  • by renoX ( 11677 ) on Monday October 21, 2002 @11:39AM (#4495838)
    I'm wondering how asynchronous logic stands up against transient errors induced by cosmic rays.

    On a synchronous circuit, most of the time such a glitch won't do anything, because it won't occur at the same moment the clock "rings", so the incorrect transient value will be ignored.

    As the "drawing size" (feature size) of circuits gets smaller and smaller, every circuit must be hardened against radiation, not only circuits which must go into space or fly in planes...
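
    The parent's point about clock-edge filtering can be illustrated with a toy Python sketch (timings and the zero-width glitch model are invented simplifications): a synchronous flip-flop only latches a glitch that happens to coincide with a sampling edge, while an element that reacts to every transition has no such masking window.

        import random

        # Toy model: a transient glitch lands at a random instant. A
        # synchronous flip-flop samples only on clock edges, so a glitch
        # that misses every edge is never captured. An asynchronous
        # element that treats any transition as an event has no such
        # masking window.

        CLOCK_PERIOD = 10     # arbitrary time units between sampling edges
        SIM_LENGTH = 1_000    # simulated time span
        TRIALS = 100_000

        def captured_synchronously(glitch_time):
            # Latched only if the glitch coincides with a clock edge.
            return glitch_time % CLOCK_PERIOD == 0

        hits = sum(captured_synchronously(random.randrange(SIM_LENGTH))
                   for _ in range(TRIALS))

        print(f"synchronous: ~{hits / TRIALS:.1%} of glitches latched")
        print("asynchronous: every glitch is a potential event")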
  • asynchronous logic? (Score:2, Interesting)

    by snatchitup ( 466222 ) on Monday October 21, 2002 @11:58AM (#4496012) Homepage Journal
    Sounds like my wife.

    But seriously, isn't that an oxymoron?

    At first, I thought it meant that we take a program, break it up into logic elements and scramble them like an egg. That won't work.

    But after reading, I see it means that everything isn't pulsed by the same clock. So, if a circuit of 1,000 transistors only needs 3 picoseconds to do its job, while another 3,000 transistors actually need 5 picoseconds, then the entire 4,000 transistors are powered for 5 picoseconds. So, 1,000 transistors are needlessly powered for 2 picoseconds.

    This adds up when we're talking 4 million transistors and living in the age of the gigahertz.
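
    As a back-of-the-envelope check, here is the same arithmetic in a few lines of Python (figures taken from the numbers above; "transistor-picoseconds" is just a rough proxy for wasted switching activity):

        # Numbers from the example above: a 1,000-transistor block finishes
        # in 3 ps, a 3,000-transistor block needs 5 ps, and a shared clock
        # keeps all 4,000 transistors powered for the full 5 ps cycle.

        fast_transistors = 1_000
        slow_transistors = 3_000
        fast_time_ps = 3
        cycle_time_ps = 5

        wasted = fast_transistors * (cycle_time_ps - fast_time_ps)
        total = (fast_transistors + slow_transistors) * cycle_time_ps

        print(f"wasted activity: {wasted} transistor-ps per cycle "
              f"({wasted / total:.0%} of the total)")
        # -> wasted activity: 2000 transistor-ps per cycle (10% of the total)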
  • Re:Cyclic History (Score:4, Interesting)

    by Anonvmous Coward ( 589068 ) on Monday October 21, 2002 @12:29PM (#4496335)
    "Never did figure out why it never caught on."

    I think the internet is a good metaphor for this technology. Take Quake 3, for example. Think about what it takes to get several people playing over the net. They all have to respond within a certain time-out window; for adequate performance they have to respond in a fraction of the timeout time; and there's a whole lot of code dedicated to making sure that when I fire my gun, 200ms later it hits the right spot and dings the right player for it.

    It works, but the logic to make that work is FAR more complex than the logic it takes to make something like a 'clocked internet' work. The downside, though, is that if you imagine what a clocked internet would be like, you'd understand why Q3 wouldn't work at all. In other words, the benefits would probably be worthwhile, but it's not a simple upgrade.
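
    For a rough sense of the trade-off being described, here is a tiny Python sketch (the event list is invented): a "clocked" loop advances tick by tick whether or not anything happened, while an event-driven loop jumps straight from message to message, at the cost of its own ordering bookkeeping.

        import heapq

        # Invented messages, each tagged with its arrival time.
        events = [(12, "player A fires"), (3, "player B moves"), (7, "player C jumps")]

        # "Clocked": visit every tick in lockstep, mostly doing nothing.
        for tick in range(15):
            for t, msg in events:
                if t == tick:
                    print(f"tick {tick:2d}: {msg}")

        # Event-driven: process each message as it arrives, in time order.
        heapq.heapify(events)
        while events:
            t, msg = heapq.heappop(events)
            print(f"t={t:2d}: {msg}")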

  • by mfago ( 514801 ) on Monday October 21, 2002 @12:38PM (#4496414)
    Here at Caltech [caltech.edu] the CS department is into this kind of thing.

    They've even built a nearly complete MIPS R3000-compatible processor [caltech.edu] using async logic.

    Seems pretty cool, but I'm waiting for some company to expend the resources to implement a more current processor (such as the PowerPC 970 perhaps) in this fashion.
  • Re:Kurzweil (Score:2, Interesting)

    by LordKane ( 582228 ) <.moc.liamtoh. .ta. .66696enak.> on Monday October 21, 2002 @01:01PM (#4496659)
    "ANNs have nothing to do with what's really going on in a biological brain, except that they are made of many interacting simple processing elements."

    I'm not sure quite how this can be. ANNs are inspired by and based on the biological brain, but they are not related? ANNs are just pattern matchers, and our brains are nothing like that? I beg to differ. ANNs are very similar to our brains. Humans are giant pattern matchers. How do we learn? Does stuff just pop into our heads and !BANG! we know it? No, we discover it or are told first. Science is based on being able to reproduce the results of an experiment, matching a cause->effect pattern. Speech is matching sound to meanings. Not everyone equates a sound to the same meaning, as the patterns that people learned can be different.

    Animals are great examples of pattern machines. How are animals trained? Most often, by positive and negative reinforcement, which is essentially conditioning. We do the same thing to our ANNs: it gets a cookie if it's right, nothing if it's wrong. The close matches are kept, the others are thrown out. So, in what way are ANNs nothing like a biological brain? Our ANNs today are tiny and don't do too much compared to the standard of a brain, which is layers upon layers of interlaced patterns. ANNs use simple structures as the basis for their overall structure; are our brain cells not very similar? To me, it seems that they are incredibly similar, just on different scales.
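
    The "cookie if it's right" training loop described above can be sketched in a few lines of Python (a toy random-search learner on an invented task, not any particular ANN framework): propose a random tweak, keep it if the score holds or improves, discard it otherwise.

        import random

        # Reward-style training in the spirit of the comment: random weight
        # tweaks are kept when they match the patterns at least as well as
        # before (the "cookie"), and thrown out otherwise. The task here is
        # the AND function, standing in for an arbitrary pattern.

        samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

        def predict(w, x):
            return 1 if w[0] * x[0] + w[1] * x[1] + w[2] > 0 else 0

        def score(w):
            return sum(predict(w, x) == y for x, y in samples)

        best = [0.0, 0.0, 0.0]
        while score(best) < len(samples):
            candidate = [wi + random.uniform(-0.5, 0.5) for wi in best]
            if score(candidate) >= score(best):   # keep close matches
                best = candidate

        print(best, [predict(best, x) for x, _ in samples])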
  • Re:Kurzweil (Score:5, Interesting)

    by imadork ( 226897 ) on Monday October 21, 2002 @01:02PM (#4496667) Homepage
    Why does everyone seem to think that ANNs are the way toward "true AI?" ANNs are superb pattern matching machines. They can predict, and are resilient to link damage to some degree. But they do not think. ANNs have nothing to do with what's really going on in a biological brain, except that they are made of many interacting simple processing elements. The biological brain inspired ANN, but that's all.

    I couldn't agree more. I remember reading a comparison between the current state of AI and the state of early flight technology. (It may have even been here; I don't recall. I make no claim to thinking this up myself. Perhaps someone can point me to a link discussing who first thought of this?)

    One of the reasons that early attempts at flight did not do well is that the people designing them merely tried to imitate things that fly naturally, without really understanding why those things were built that way in the first place. So, people tried to make devices with wings that flapped fast, and they didn't work. It wasn't until someone (Bernoulli?) figured out how wings work - the scientific principles behind flight - that we were able to make flying machines that actually work.

    Current AI and "thinking machines" are in a state similar to that of the first attempts at flight. We can do a passable job at using our teraflops of computing power to do a brute-force imitation of thought. But until someone understands the basic scientific principles behind thought, we will never make machines that think.

  • by brejc8 ( 223089 ) on Monday October 21, 2002 @01:39PM (#4497119) Homepage Journal
    This [man.ac.uk] is my favorite example of asynchronous logic.
    As the rat speeds up or slows down, the chip compensates for it.
    It's not often that you can play with toy mice and call it research.
  • I don't see how your statement is in any way different from mine. I brought up the fact that CPUs make use of asynchronous logic within a single clock cycle, and explained that this does not make them asynchronous machines.

    I think you parsed my sentence incorrectly.

    I could say "The US might be a free country, but we still have laws to protect others' freedoms from impinging on our own."

    Does this mean that I am in doubt, as to whether the US is a free country? No; the US is a free country by definition, the Constitution having defined it as such.

    The "Might...but" construct is frequently used in the English language to introduce a fact, and then qualify it.
  • by Anonymous Coward on Monday October 21, 2002 @04:21PM (#4498727)
    In an article in Engineering magazine (it is distributed to all engineers at Kansas State University and probably many others), an Intel representative (I can't recall the name) claimed that anyone who came up with a completely asynchronous x86 solution would have a significant advantage. He is right for several reasons.

    1) Power usage and heat are important in many devices. The Via Cyrix chips do reasonably well, even without the power of Intel and AMD designs.

    2) Fully asynchronous chips do have the capacity to perform far better than synchronous designs.

    The article also discusses various designs for asynchronous chips. A good read, if you can find it. The magazine came out some time towards the end of 2000 (I am thinking November), but I'm not sure.
