Intel Hardware

Introducing Probability into Chip Design

prostoalex writes "The August issue of Intel Developer Update has an interview with Shekhar Borkar, Intel Fellow and Director of Circuit Research at Intel Corp., talking about the future of microprocessor design and what goes on inside Intel Labs. Borkar explains why we need even faster processors and how probability will make its way into future chip designs: 'It's like the shift from Newtonian mechanics to quantum mechanics. We will shift from the deterministic designs of today to probabilistic and statistical designs of the future.'"
This discussion has been archived. No new comments can be posted.

  • 1 + 1 (Score:5, Funny)

    by rastos1 ( 601318 ) on Monday August 25, 2003 @07:10AM (#6782936)
    1 + 1 = 2. However, there is a 0.0009% probability of it being 1.999999999.

    Sorry could not resist.

    • by TrekkieGod ( 627867 ) on Monday August 25, 2003 @07:15AM (#6782969) Homepage Journal
      Is that .9 repeating? If so, there's a 100% probability that 1 + 1 = 1.9999...

      .999... is exactly equal to 1. To the non-believers out there, consider that 1/3 = .333..., and that 1/3 + 1/3 + 1/3 = 1.

      • 1 is the canonical representation of 0.9999... in the decimal system. Incidentally, it is also the canonical representation of 1.00000...1. Just to throw a spanner in the works, what's the largest real number smaller than 1? Most folks want to say 0.99999... Any mathheads out there who've done calculus or analysis more recently than I have want to take a crack at it? (FWIW, I think the supremum of the set of all numbers less than 1 is 1, but the set itself doesn't contain a maximum.)
        • by blancolioni ( 147353 ) on Monday August 25, 2003 @07:40AM (#6783088) Homepage

          The supremum of all reals less than one is one. The set itself, as you said, doesn't have a maximum element.

          In a not-at-all-patronising way, I'm surprised that this is even up for discussion on /. but that's probably my bad. Anyway, say X was the maximum real number less than one. Let Y = 1 - (1 - X) / 2. Now clearly Y is less than 1, but also Y - X = (1 - X) / 2 which is > 0 since 1 - X > 0, so Y > X, and therefore X is not the maximum.
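          The same argument written out compactly (just the parent comment's proof in LaTeX, nothing new): suppose $X < 1$ were the maximum and let $Y = 1 - \frac{1-X}{2}$. Then
          $$Y - X = \frac{1-X}{2} > 0 \qquad \text{and} \qquad 1 - Y = \frac{1-X}{2} > 0,$$
          so $X < Y < 1$, contradicting maximality; hence $\sup\{x \in \mathbb{R} : x < 1\} = 1$ and the set has no maximum element.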
      • It's true! ...and the same goes for any numeric base:

        In base 7, 0.66666666... is equal to 1
        In base 2, 0.11111111... is equal to 1
        etc.

        Notice that, in binary (base 2):
        1/3 = 0.010101010101...
        2/3 = 0.101010101010...
        3/3 = 0.111111111111...
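        A quick way to check all of these is the geometric series (standard math, just making the parent comment's claims explicit): in base $b$,
        $$0.\overline{(b-1)}_b = \sum_{k=1}^{\infty} \frac{b-1}{b^k} = (b-1)\cdot\frac{1/b}{1-1/b} = 1,$$
        and likewise $0.\overline{01}_2 = \sum_{k=1}^{\infty} 4^{-k} = \frac{1/4}{1-1/4} = \frac{1}{3}$ and $0.\overline{10}_2 = \frac{1/2}{1-1/4} = \frac{2}{3}$.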

      • This is in fact not true; however, the proof, which I have witnessed on at least two occasions, requires calculus, which I have done my best to remove all traces of from my mind. You will just have to trust me, as well as take into account the fact that all proofs given so far to the contrary are flawed.
    • Re:1 + 1 (Score:5, Funny)

      by SpaghettiPattern ( 609814 ) on Monday August 25, 2003 @07:24AM (#6783016)
      That was done before in the first batches of Pentium 0.99999999.
      • That was done before in the first batches of Pentium 0.99999999.

        So that would be why the first batches of Pentium chips are actually 486.99999-class processors.

    • There is a 90% chance that you are correct. But there is only a 50% chance of that.
    • Well, if you want to build a real improbability engine you need to be able to do that kind of math. One step at a time...

  • by Wameku ( 629442 ) on Monday August 25, 2003 @07:10AM (#6782940)
    Um, Ford, there's an infinite number of monkeys outside that want to talk to us about a script for Hamlet they've hammered out. PROBABILITY FACTOR OF 1 TO 1: any other problems are your own lookout.
  • Is this new? (Score:5, Insightful)

    by Jugalator ( 259273 ) on Monday August 25, 2003 @07:11AM (#6782944) Journal
    "We will shift from the deterministic designs of today to probabilistic and statistical designs of the future"

    Doesn't branch prediction in current processors use probabilities already?
    • Re:Is this new? (Score:5, Interesting)

      by bentini ( 161979 ) on Monday August 25, 2003 @07:13AM (#6782951)
      Probabilities that will always be the same if you run the exact same sequence of commands.

      What he appears to be suggesting is transistors that we acknowledge to be based in an analog world -- their state depends not only on the data you feed them, but also on the temperature they operate at, and so on.

      • Re:Is this new? (Score:5, Informative)

        by Jugalator ( 259273 ) on Monday August 25, 2003 @07:28AM (#6783034) Journal
        Oh, I see... The page has now loaded for me, and it seems they're simply considering the fact that previously, hardware performance didn't vary that much, but now that we've got down to really small components -- down to the atomic level -- that are packed closely together, the probability that the chips will behave differently as the environment changes becomes greater. And as the probability of chips "misbehaving" increases, there will also be an increasing need for chips that can take this probability of fluctuations into account.

        I first thought the article was about speeding up stuff by probabilities and statistics, but it's mostly about solving a currently theoretical problem that might soon become an actual, real world problem. And to solve that problem, we might even have to move away from some of the computer architecture as we know it today.
    • Re:Is this new? (Score:3, Informative)

      by mgessner ( 46612 ) *
      I can't speak for all branch prediction models, but in the PowerPC the answer is yes, though the prediction is static.

      In the PowerPC, unless a hint is given by the programmer/compiler, forward branches (positive offset) are predicted as NOT taken, and backward branches are predicted as BEING taken.

      This is simply because lots of branching (aside from function calls) takes place in for, while and do-while loops (or for, while, and repeat-until for you Pascal geeks :)
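
      A rough C sketch of how those static rules line up with real code. The likely/unlikely macros are hypothetical names wrapping GCC's __builtin_expect, which is just one way a programmer can pass such a hint to the compiler; it is not the PowerPC hint bit itself:

      #include <stdio.h>

      /* Hypothetical hint macros built on GCC's __builtin_expect; the compiler
         may use them to arrange branches so the static prediction is usually right. */
      #define likely(x)   __builtin_expect(!!(x), 1)
      #define unlikely(x) __builtin_expect(!!(x), 0)

      int main(void) {
          int sum = 0;
          for (int i = 0; i < 1000; i++) {   /* loop-back: a backward branch, predicted taken */
              if (unlikely(i == 999))        /* forward branch, predicted not taken */
                  printf("last iteration\n");
              sum += i;
          }
          printf("sum = %d\n", sum);
          return 0;
      }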

  • I remember... (Score:5, Interesting)

    by Muad'Dave ( 255648 ) on Monday August 25, 2003 @07:17AM (#6782979) Homepage

    ...back in the heady days of Concurrent Computer, their top-of-the-line 3280 processor had "usual branch" instructions. The compiler could use the usual branch instructions to give the processor hints about the probability of a branch being taken. In a loop, for instance, you'd use a "usual branch not equal" (UBNE) instruction to send execution back to the top. This would indicate to the processor that it should preemptively invalidate the cache and pipeline.

    I'm sure many mainstream processors have this now, but it's funny to think that CCUR had this technology in the late 1980s.

    • I always figured this was used for stuff like this:

      for (i = 0; i < 10; i++)
      { // do something
      }

      The branch to jump outside the loop is the unlikely one. This is probably a good assumption to make for loops.
    • Why are you invalidating the cache and pipeline on a branch? Wow.

      Anyway, static branch prediction has been around for a long, long time. That's not what Intel is talking about.
  • Hey, this is nothing new as anyone who owned an original Pentium can tell you. It probably gave you the right answer, except for the occasional FDIV.
  • it is an old story (Score:5, Informative)

    by wannasleep ( 668379 ) on Monday August 25, 2003 @07:26AM (#6783024)
    In the interview, a lot of things have been left out. The topic is first and foremost old; it goes back to the '80s. Statistical variations have always been taken into account by using worst cases. The problem is that the worst-case approach sucks in the latest technologies, so more sophisticated methods have to be used. There has been a lot of research in the last 10 years (check American, German, and Italian universities, just to name a few).
    Also, the problem is old in the sense that analog designers have had to deal with these issues since the early days (example: the offset in operational amplifiers is caused by transistor performance mismatch). Now digital designs are affected too -- first the clocking network and now everything else. Furthermore, it is widely known (in the community) that interconnect variations are of the same order of magnitude as the device (i.e., transistor) performance variations, and on top of that, dynamic effects (like crosstalk) may severely affect the performance.
    I don't agree with him that all the variations are Gaussian; there is plenty of literature that states the contrary, and the major chip makers know it very well.
    Last but not least, there are already tools that deal with statistical variations, although none of them can handle a microprocessor, as they are mostly circuit-simulation-based. All in all, the good news is that awareness is spreading among designers.
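    As a back-of-the-envelope illustration of why pure worst-casing gets so pessimistic (assuming, purely for illustration, independent Gaussian gate delays -- which, as just noted, real variations often are not): for a path of $n$ gates with delays $d_i \sim \mathcal{N}(\mu, \sigma^2)$, a worst-case budget charges every gate its $3\sigma$ corner, while a statistical budget works with the distribution of the sum:
    $$T_{\text{worst}} = n(\mu + 3\sigma), \qquad T_{\text{stat}} = n\mu + 3\sigma\sqrt{n},$$
    so the worst-case guard band grows like $n$ while the statistical one grows only like $\sqrt{n}$.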
    • by fupeg ( 653970 )
      I think there is a slight misunderstanding here. The "variance" of analog systems is really just manufacturing fault tolerance. The manufacturing process is imperfect, and so there is a standard error associated with each characteristic of a given component. This is a little different from the kind of variance that is related to quantum mechanics.

      Imagine an AND gate that is a single silicon atom. For such a gate to be "open", a single electron would have to "flow" through it. This requires the electron to bon
  • Old news... (Score:3, Funny)

    by qtp ( 461286 ) on Monday August 25, 2003 @07:27AM (#6783029) Journal
    Didn't Intel already do this with the original Pentium [electronicreferences.com]?

  • How is this like the transition from Newtonian to quantum mechanics? Hasn't computing always been quantized? I for one would love to see a change from quantum-type computing to Newtonian-type computing, using smooth instead of discrete values. Ready for infinite FPS? How about ultra-realistic physics models? It seems to me like this Intel fellow is merely riding the quantum-this, quantum-that hype.
  • About time! (Score:3, Funny)

    by sco08y ( 615665 ) on Monday August 25, 2003 @07:34AM (#6783063)
    Kinda like we've been releasing software that "probably" works for the past 40 years?

    It's good to see computer engineering is finally catching up with computer science!
  • HAL (Score:2, Funny)

    by Anonymous Coward
    This will make HAL even worse:

    OPEN the DOOR HAL!

    Probably not, Dave
  • from the interview...
    Look at the whole proactive computing model, where computers will anticipate our needs and sometimes take action on our behalf. That's one.

    so I can relax and count on my Opera v40 to post the first post for me?
  • by PsiPsiStar ( 95676 ) on Monday August 25, 2003 @08:18AM (#6783312)
    Isn't probability already a part of chip design?
    "Our new P4 has a 40% probability of being out in May, a 20% chance of being out in June..."
    • Our new P4 has a 40% probability of being out in May, a 20% chance of being out in June

      Exactly, and a 95% chance of being delayed until at least September.

  • by elwinc ( 663074 ) on Monday August 25, 2003 @08:27AM (#6783379)
    I believe the kind of probabilistic computing Intel's talking about is analogous to error correction. On your average data CD, about 15% of the bits are redundant and devoted to error correction. This reduces the probability of erroneously reading the CD, although the probability of error is still non-zero. Same deal with ECC memory. I'm guessing Intel is looking at ways to apply that kind of trick to the computational logic.
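
    A toy sketch of that redundancy trick at the bit level -- textbook Hamming(7,4), the idea behind ECC-style memories, and purely illustrative rather than anything Intel has announced:

    #include <stdio.h>

    /* Encode 4 data bits d[0..3] into a 7-bit codeword c[1..7] (c[0] unused).
       Parity bits sit at positions 1, 2 and 4. */
    static void encode(const int d[4], int c[8]) {
        c[3] = d[0]; c[5] = d[1]; c[6] = d[2]; c[7] = d[3];
        c[1] = c[3] ^ c[5] ^ c[7];   /* parity over positions 1,3,5,7 */
        c[2] = c[3] ^ c[6] ^ c[7];   /* parity over positions 2,3,6,7 */
        c[4] = c[5] ^ c[6] ^ c[7];   /* parity over positions 4,5,6,7 */
    }

    /* Return the position (1..7) of a single flipped bit, or 0 if the word is clean. */
    static int syndrome(const int c[8]) {
        int s1 = c[1] ^ c[3] ^ c[5] ^ c[7];
        int s2 = c[2] ^ c[3] ^ c[6] ^ c[7];
        int s4 = c[4] ^ c[5] ^ c[6] ^ c[7];
        return 4 * s4 + 2 * s2 + s1;
    }

    int main(void) {
        int d[4] = {1, 0, 1, 1}, c[8];
        encode(d, c);
        c[6] ^= 1;                    /* simulate one flipped bit */
        int pos = syndrome(c);
        if (pos) c[pos] ^= 1;         /* detect and correct it */
        printf("error at position %d, corrected\n", pos);
        return 0;
    }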
  • by mr_luc ( 413048 ) on Monday August 25, 2003 @08:30AM (#6783398)
    Listen to that guy. He just GETS it.

    I am actually, to some extent, inspired by that article. Corporate BS policies aside, whatever you think of Intel or AMD or any other company as a company, as a political entity, or as a producer of consumer goods, you still have to feel good that there are people like that, people who just GET the overriding vision of advancing technology and are actively working to advance it.

    I don't have time to advance technology much in my current job. I don't have the mind or the skills or the time for boundary-pushing endeavors. Some at /. do, and contribute all their mind and skills and time to furthering open source and other efforts, and that is very commendable.

    But as we often lament, it sometimes seems like the Big Boys don't have the same spark. Let's not forget that somewhere within the pudge of even the fattest multinational technology company, there are brilliant, passionate minds working to further everything we hold dear. These are people who aren't just brilliant scientists or passionate geeks -- they're both. And they're on our side. :)
  • And we'll all be traveling in flying cars while eating meals in pill form!!
  • No wonder we'll need faster processors. The more times you run the same calculation, the more certain you can be of the result. For critical applications that need 10-20 sigma reliability, we'll need very fast processors indeed.
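    For what it's worth, the run-it-several-times idea can be quantified (my numbers, not the article's): if a single run is wrong with probability $p$ and you take the majority of $n$ independent runs, the chance the majority is wrong is
    $$P_{\text{fail}} = \sum_{k > n/2} \binom{n}{k} p^k (1-p)^{n-k},$$
    so even $p = 0.01$ with $n = 3$ already gives $P_{\text{fail}} \approx 3 \times 10^{-4}$.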
  • by Theovon ( 109752 ) on Monday August 25, 2003 @09:18AM (#6783739)
    Random number generators are used in ASIC and FPGA logic placement and interconnect routing.

    The goal is to place and route logic in a way that meets the designer's timing and area constraints. The problem is that finding an optimal solution deterministically is NP-complete. Instead of considering all possibilities, a number of randomly generated possibilities are considered, with some ability to make adjustments when one is chosen.

    The randomly-generated possibilities, of course, are not completely random -- it's a matter of multiple gates competing for the same fixed-location logic cell, etc. Who gets the one closest to where they all want to be? Where do you place the rest? What about others competing for THOSE locations? It's complicated. :)
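
    A toy sketch of that randomized flavor of placement -- simulated annealing over a one-dimensional row of cells, with a made-up netlist and cost model, nowhere near what real place-and-route tools do:

    #include <stdio.h>
    #include <stdlib.h>
    #include <math.h>

    #define CELLS 8
    #define NETS  6

    /* Each net connects two cells; cost is total slot distance over all nets. */
    static const int net[NETS][2] = {{0,3},{1,2},{2,7},{4,5},{5,6},{0,7}};

    static int cost(const int slot[CELLS]) {
        int c = 0;
        for (int n = 0; n < NETS; n++)
            c += abs(slot[net[n][0]] - slot[net[n][1]]);
        return c;
    }

    int main(void) {
        int slot[CELLS];
        for (int i = 0; i < CELLS; i++) slot[i] = i;   /* initial placement */

        double T = 10.0;                               /* annealing "temperature" */
        for (int iter = 0; iter < 20000; iter++) {
            int a = rand() % CELLS, b = rand() % CELLS;
            int before = cost(slot);
            int tmp = slot[a]; slot[a] = slot[b]; slot[b] = tmp;   /* try a random swap */
            int delta = cost(slot) - before;
            /* keep a worsening swap only with probability exp(-delta/T),
               which lets the search escape local minima */
            if (delta > 0 && exp(-delta / T) < (double)rand() / RAND_MAX) {
                tmp = slot[a]; slot[a] = slot[b]; slot[b] = tmp;   /* undo */
            }
            T *= 0.9995;                               /* cool down */
        }
        printf("final total wire length: %d\n", cost(slot));
        return 0;
    }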
  • They should just paint the chip pink and put an SEP (Somebody Else's Problem) field around it - saves mucking about with all of that improbability stuff.
  • A first step? (Score:3, Interesting)

    by djeaux ( 620938 ) on Monday August 25, 2003 @09:40AM (#6783941) Homepage Journal
    Assuming that the overarching goal of computer (and software) design is to emulate the human brain -- or even the brain of a flatworm -- hardware is going to have to break free of the confines of binary true-false logic, tight tolerances, etc., and embrace variation.

    Since physical science (and by extrapolation, engineering) is built on a "reductionist" paradigm where every problem is broken to its simplest components & solved piecemeal at that level, it makes sense for a "probabilistic" approach to chip design to happen some time. Might as well be now.

    But when we operate under the reductionist model, we forget emergent properties at the system level. In developing a "learning" system -- which again, I assume to be the overarching goal -- we have to learn to deal with variation. Situations are almost never exactly the same. In the beginning of a "learning" system, things probably (pun intended) do look random. But as special cases, exceptions, subtle cues, etc. are encountered by the system & incorporated into the decision-making process, things appear to become increasingly deterministic.

    So, if a "probabilistic" chip design is implemented properly, it likely will look pretty "deterministic" to the end-user, who expects certain kinds of results.

    The problem now is that the hardware is "deterministic" & any attempt to create a "probabilistic" learning system has to happen in software. Right now, the limit to AI, IMO, is simply that chips aren't even in the same league with neurons. "Learning" software built on "learning" hardware ought to be a pretty powerful concept.

    Of course, this may just be a way to get around the fact that manufacturing may be pushing the limits for tight tolerances & probabilistic chip design is the only out. Whatever it takes to force a paradigm shift.

    "Most places a paradigms won't buy you a cup of coffee..."

  • Probability (Score:3, Funny)

    by geekoid ( 135745 ) <{moc.oohay} {ta} {dnaltropnidad}> on Monday August 25, 2003 @09:41AM (#6783958) Homepage Journal
    Intel:
    The addition is probably right.

    AMD:
    It will probably melt through your desk.

    Me:
    I will probably be modded to Hell.
  • "We will shift from the deterministic designs of today to probabilistic and statistical designs of the future"

    Statistics? There are lies, damn lies, and 824633702441(WARNING! STAT FLOATING POINT ERROR)
  • The article mentions the increasing productivity and quality of life that increasing speeds will bring. Yet computers are becoming noisier all the time; for some, this is a reduction in their productivity and quality of life.
  • by MojoRilla ( 591502 ) on Monday August 25, 2003 @11:02AM (#6784560)
    Q4: What are some other applications that need more power?

    Look at the whole proactive computing model, where computers will anticipate our needs and sometimes take action on our behalf. That's one.


    When he said this, all I could think of was: yeah, computers need more power to run the heavy virus workload and still remain usable.
  • He's not talking about non-deterministic computing. He's talking about ways to salvage the chip if one or more subcircuits don't function correctly. The article isn't very technical, but this probably alludes to having redundant circuits, possibly even taking the answer that the majority of the redundant circuits produce.

    I'm not a smart enough man to know whether or not this is feasible. Keep in mind that introducing these redundancy checks actually increases the "length" of the circuit, increasing propagation delay.
    • In many computer systems designed for orbit that use, say, 486s, they'll put in three of them. Any given calculation gets run through all three. Whichever answer is most popular among the three, that's the answer used.

      Why? Because 1 + 1 = 2, unless a stray bit of radiation or cosmic energy has flipped a bit in your processor.
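
      A minimal sketch of that voting scheme (triple modular redundancy; the corrupted value is staged by hand just to show the vote):

      #include <stdio.h>

      /* Return the value at least two of the three copies agree on
         (falls back to b if all three disagree). */
      static int vote(int a, int b, int c) {
          return (a == b || a == c) ? a : b;
      }

      int main(void) {
          int r1 = 1 + 1;
          int r2 = 1 + 1;
          int r3 = 3;      /* pretend a cosmic ray corrupted this copy */
          printf("voted result: %d\n", vote(r1, r2, r3));
          return 0;
      }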

  • ... and wasn't it named "The Pentium floating point bug"??? Damn it, I don't want a chip that probably gives the right answer, I want a chip that deterministically gives the right answer!
  • "It's like the shift from Newtonian mechanics to quantum mechanics. We will shift from the deterministic designs of today to probabilistic and statistical designs of the future."

    So these processors will work, but, as quantum physics states, we cannot know why? Or will they just work in an alternate universe with threads to this one? Very interesting, but I smell some very unexpected results. Disappearing sales revenue is the first that comes to mind; ridiculous development costs are on the plate as well! I wish succ

  • ...making Windows completely reliable, and the only blue screen will display a huge "42".
  • Basically this guy is saying that the variance of transistor parameters will increase in the future, so more attention must be paid to circuit design to avoid producing a lot of chips that perform poorly. People already use Monte Carlo sims to determine the effects of process variation in analog designs, and to bound digital design performance. Digital circuit designers must look at new ways to overcome the issue of transistor performance variations, rather than accepting them as inevitable, is the message, I
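
    A rough sketch of the Monte Carlo idea mentioned above (all of the numbers -- gate count, mean, sigma, timing target -- are invented for illustration; real flows use actual process models and circuit simulation):

    #include <stdio.h>
    #include <stdlib.h>
    #include <math.h>

    /* Standard normal sample via the Box-Muller transform. */
    static double gauss(void) {
        double u1 = (rand() + 1.0) / (RAND_MAX + 2.0);
        double u2 = (rand() + 1.0) / (RAND_MAX + 2.0);
        return sqrt(-2.0 * log(u1)) * cos(6.283185307179586 * u2);
    }

    int main(void) {
        const int gates = 20, trials = 100000;
        const double mean = 50.0, sigma = 5.0;   /* per-gate delay in ps, made up */
        const double target = 1020.0;            /* path timing budget in ps, made up */
        int pass = 0;

        for (int t = 0; t < trials; t++) {
            double delay = 0.0;
            for (int g = 0; g < gates; g++)      /* sum the path's randomly varying gate delays */
                delay += mean + sigma * gauss();
            if (delay <= target) pass++;
        }
        printf("estimated timing yield: %.2f%%\n", 100.0 * pass / trials);
        return 0;
    }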

  • Probably.

    Maybe.

    Will I be /.'d today?

    Oh, that's a certainty.

"Trust me. I know what I'm doing." -- Sledge Hammer

Working...