
Unpredictability in Future Microprocessors 244

prostoalex writes "A Business Week article says that the increase in chip speeds and in the number of transistors on a single microprocessor leads to varying degrees of unpredictability, which used to be a no-no word in the microprocessor world. However, according to scientists from Georgia Tech's Center for Research in Embedded Systems & Technology, unpredictability can become a great asset, leading to energy conservation and increased computation speeds."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by swg101 ( 571879 ) on Saturday February 12, 2005 @10:37PM (#11656198)
    Degrees of probability and uncertainty have been a given in the communications industry for quite some time. This just seems to point out that the same ideas can be applied to the actual processing of the data.

    Now that I think about it, it does seem to make some sense. I am not sure that I would want to program on such a chip right now, though (I imagine that debugging could become a nightmare really quickly!).
  • by Anonymous Coward on Saturday February 12, 2005 @10:50PM (#11656270)
    Signed,

    FDIV
  • by Inmatarian ( 814090 ) on Saturday February 12, 2005 @11:27PM (#11656482)
    Intel has hit a brick wall in terms of their processors. They invested heavily in their processor fabrication centers and are now coming to terms with the fact that they won't be able to produce reliably anymore. That said, let's discuss the nature of 1s and 0s. Typically, a 0 is broadcast across a chip as a lack of voltage, and a 1 is broadcast as +5 volts. Each transistor has to behave just enough like a resistor not to degrade the +5 volts. Here's where "unpredictability" comes into play: you have a handful of volts to play with. The article's talk of unpredictable algorithms is the press agent not knowing what he's talking about, but certainly allowing a voltage threshold within the confines of the transistors is an okay thing. The only problem is when it's across a lot of serial lines, because that compounds into significant loss (see the sketch below). This is just my opinion, but I think this guy is talking about chip designs where the data isn't broadcast in 1s and 0s anymore, but in whatever multiples of electronvolts would correspond to a number. I'm not comfortable with this, and I would like someone to tell me I'm just paranoid.
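    A toy simulation of that compounding effect, with made-up numbers (a 5 V swing, Gaussian noise per hop, a 2.5 V receive threshold) that are purely illustrative and not from the article:

    ```python
    import random

    # Toy model: a '1' is driven at 5.0 V, a '0' at 0.0 V. Each hop of a serial
    # chain adds Gaussian noise; the receiver thresholds at 2.5 V.
    def transmit(bit, hops, noise_sigma=0.4):
        v = 5.0 if bit else 0.0
        for _ in range(hops):
            v += random.gauss(0, noise_sigma)  # noise accumulates hop by hop
        return 1 if v > 2.5 else 0             # threshold decision at the far end

    trials = 100_000
    for hops in (1, 5, 25):
        errors = sum(transmit(1, hops) != 1 for _ in range(trials))
        print(f"{hops:2d} hops: error rate {errors / trials:.4%}")
    ```

    The longer the chain, the more the accumulated noise eats into the threshold margin, which is the poster's point about serial lines.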
  • more info (Score:5, Informative)

    by mako1138 ( 837520 ) on Saturday February 12, 2005 @11:42PM (#11656569)
    This article left me rather unsatisfied, so I looked for a better one. I found it here [gatech.edu]: a collection of papers on the subject, with real-world results, it seems. The first article is a nice overview, and there are some pics of odd-looking silicon. They have funding from DARPA, interestingly enough.
  • by theguywhosaid ( 751709 ) on Sunday February 13, 2005 @12:20AM (#11656752) Homepage
    heh, 63 is the median in your given array. since the returned value does not need to be in the array, just add 1 and return that. or multiply by some large constant. or just return the sum of all the numbers you encountered. there's lots of options here.
  • by slashnot007 ( 576103 ) on Sunday February 13, 2005 @12:53AM (#11656911)
    Problem: find a number larger than the median

    Proposed solution: pick 1000 entries at random and retain the highest.

    Analysis: at first glance the problem seems ill-formed, since the size of the array is not specified. But note that this is not a parametric problem. You are asked about the median, so the actual numerical values in the array are irrelevant; only the rank order matters. Some wiseguys here have suggested returning the largest double-precision number as a guaranteed bound. While a wise-ass answer, it does raise a second interesting false lead. Even if the numbers were represented in infinite precision, and thus could be arbitrarily large or small, the proposed solution does not care. Again, this is because all that matters is the ranking of the numbers, not their values.

    Consider the proposed solution. Pick any cell at random and examine the number. If this number is returned, there is a 50% chance it is equal to or greater than the median of the set. (If this is not obvious, dwell on the meaning of the word median: half the numbers are above it and half below.) So the chance it is below the median is 0.5. If you choose 1000 numbers, the chance that all are below the median is 0.5^1000, which is roughly 1 in 10^301 (far smaller than one part in a googol). A sketch of the algorithm is below.

    So the author is right: this algorithm fails less often than the probability that a cosmic ray corrupts the calculation, or there is a power blackout in the middle of it, or you have a heart attack.
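    A minimal sketch of that scheme (the function name and the use of Python's random module are my own, not the poster's):

    ```python
    import random

    def probably_at_least_median(arr, samples=1000):
        """Return a value >= the median of arr with overwhelming probability.

        Each random pick falls below the median with probability 0.5, so the
        max of `samples` independent picks fails only if every single pick
        landed in the lower half: probability 0.5**samples.
        """
        return max(random.choice(arr) for _ in range(samples))

    print(0.5 ** 1000)  # ~9.3e-302, the failure probability from the post
    ```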
  • Re:Another use (Score:3, Informative)

    by PingPongBoy ( 303994 ) on Sunday February 13, 2005 @02:54AM (#11657484)
    Probably an even bigger boon for encryption and key-generation.
    I vote key-generation and not encryption. Otherwise, how would you decrypt it?


    Unpredictability really is useful for encryption, because good random numbers are essential to strong encryption.

    The second application that comes to mind is the one-time pad. Of course you have to save the random pad data somewhere, but you always had to do that. The unpredictability just makes the one-time pad that much better (see the sketch below).

    Random numbers may be used to generate keys that people can't guess. Of course you have to memorize the key.
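    A minimal one-time-pad sketch, using Python's secrets module as a stand-in for a hardware entropy source (the function names are my own):

    ```python
    import secrets

    def otp_encrypt(plaintext: bytes):
        pad = secrets.token_bytes(len(plaintext))  # unpredictable pad, same length as the message
        ciphertext = bytes(p ^ k for p, k in zip(plaintext, pad))
        return ciphertext, pad                     # the pad must be saved, as noted above

    def otp_decrypt(ciphertext: bytes, pad: bytes) -> bytes:
        return bytes(c ^ k for c, k in zip(ciphertext, pad))

    ct, pad = otp_encrypt(b"attack at dawn")
    assert otp_decrypt(ct, pad) == b"attack at dawn"
    ```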
