
'Inexact' Chips Save Power By Fudging the Math

Barence writes "Computer scientists have unveiled a computer chip that turns traditional thinking about mathematical accuracy on its head by fudging calculations. The concept works by allowing processing components — such as hardware for adding and multiplying numbers — to make a few mistakes, which means they are not working as hard, and so use less power and get through tasks more quickly. The Rice University researchers say prototypes are 15 times more efficient and could be used in some applications without having a negative effect."
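
The article gives no code, but the effect of "fudging" arithmetic can be modeled in software. The sketch below is a hypothetical illustration, not the Rice design: an adder that simply ignores the lowest few bits of its operands, the software analogue of leaving out part of the carry chain to save switching power.

    def approx_add(a: int, b: int, drop_bits: int = 4) -> int:
        """Add two non-negative integers while ignoring the lowest
        `drop_bits` bits. In hardware, skipping the low-order carry
        logic saves power; here we only model the numerical effect."""
        mask = ~((1 << drop_bits) - 1)  # zero out the low bits
        return (a & mask) + (b & mask)

    # The relative error shrinks as the operands grow, which is one
    # reason "inexact" arithmetic can be tolerable in practice.
    exact = 1_000_003 + 2_000_005
    fudged = approx_add(1_000_003, 2_000_005)
    print(exact, fudged, f"relative error {(exact - fudged) / exact:.6%}")
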
  • Graphics cards (Score:3, Interesting)

    by Anonymous Coward on Friday May 18, 2012 @10:17AM (#40040645)

    Don't they do this too, but fudge the maths so they can be a bit faster?

    Hence CUDA needed special modes to operate on full IEEE floats, etc.

  • by dkleinsc ( 563838 ) on Friday May 18, 2012 @10:22AM (#40040701) Homepage

    Well, really this is following in a long and glorious tradition of fuzzily incorrect arithmetic [wikipedia.org].

  • by bjourne ( 1034822 ) on Friday May 18, 2012 @10:30AM (#40040801) Homepage Journal
    Before someone comes up with that stupid remark, not much. :) If the chips are 15 times as efficient as normal ones, you could run, for instance, four in parallel and rerun any calculation on which one of them differs. That way you would get both accurate results and power savings. Adjust the number of chips running in parallel depending on the accuracy and efficiency needed.
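
    The scheme described above might look something like this toy sketch (here `inexact_add` is a software stand-in that injects occasional errors; real inexact hardware would behave differently):

        import random
        from collections import Counter

        def inexact_add(a: float, b: float, err_prob: float = 0.01) -> float:
            """Stand-in for an inexact hardware adder: occasionally wrong."""
            result = a + b
            if random.random() < err_prob:
                result += random.choice([-1.0, 1.0])  # small injected fault
            return result

        def voted_add(a: float, b: float, replicas: int = 4) -> float:
            """Run several inexact adds and accept the majority answer,
            falling back to one exact computation when they disagree."""
            tally = Counter(inexact_add(a, b) for _ in range(replicas))
            value, votes = tally.most_common(1)[0]
            return value if votes > replicas // 2 else a + b  # exact fallback

        print(voted_add(2.0, 3.0))  # almost always 5.0

    Whether four inexact units plus a voter still come out ahead on power is exactly the kind of trade-off the parent describes.
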
  • by Anonymous Coward on Friday May 18, 2012 @10:54AM (#40041117)

    GPS. I don't need to know that I'm precisely in the middle of the left lane of a four-lane highway going 59.2 MPH. I'd rather it use the processor for screen refreshes and for finding a better route around Dallas or Chicago at rush hour.

    Scales at the checkout - the faster it gets a reading on how much my apples weigh, the faster I get away from the "People of Wal-Mart," and I'll bet there's less than a penny's difference anyway.

    Video Games (see GPS) - many already switch to integer math for speed; how about fuzzy integers? ;) (A sketch of the idea follows this comment.)

    DHS airport scanners - the faster they scan, the less I'll glow in the dark
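
    On the "fuzzy integers" quip above: games already trade floating point for fixed point, and the same trick extends to deliberately coarse fixed point. A hypothetical sketch (the Q8.8 format and helper names are assumptions, not any particular engine's code):

        FRAC_BITS = 8  # Q8.8 fixed point: 8 integer bits, 8 fractional bits

        def to_fixed(x: float) -> int:
            return int(round(x * (1 << FRAC_BITS)))

        def from_fixed(x: int) -> float:
            return x / (1 << FRAC_BITS)

        def fuzzy_mul(a: int, b: int, keep_frac: int = 4) -> int:
            """Fixed-point multiply that keeps only `keep_frac` fractional
            bits of each operand before multiplying (assumes
            keep_frac >= FRAC_BITS // 2 so the final shift is non-negative)."""
            drop = FRAC_BITS - keep_frac
            return ((a >> drop) * (b >> drop)) >> (2 * keep_frac - FRAC_BITS)

        a, b = to_fixed(3.3), to_fixed(2.7)
        print(from_fixed(fuzzy_mul(a, b)))  # 8.734375 vs the exact 8.91: ~2% off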

  • by cpu6502 ( 1960974 ) on Friday May 18, 2012 @10:56AM (#40041147)

    I envision the "less precise" CPUs being used in consumer laptops where people are just watching movies or listening to music.

    It does not matter if the MPEG4 conversion is slightly off with the color, because the consumer's eye won't detect it. The selling point will be a laptop or tablet that lasts 10x longer on a battery charge.

  • Re:AI Chip (Score:4, Interesting)

    by SharpFang ( 651121 ) on Friday May 18, 2012 @11:03AM (#40041247) Homepage Journal

    Your definition of math is very limited. Descriptive Geometry is math too.

    Pathfinding systems may use an imprecise weight function when making decisions - calculating the weights is a major burden.

    Using cameras involves image analysis. In essence, you're working with what is a bunch of noise, and you analyze it for meaningful data by averaging over a whole bunch of unreliable samples. If you can do it 15 times faster at the cost of introducing 5% more noise in the input stage, you're a happy man.

    In essence, if the input data is burdened by noise of any kind (and only "pure" data, typed in or read from disk, isn't; any real-world data such as sensor readouts, images or audio contains noise), the algorithm must already be resistant to that noise, and a little more of it coming from arithmetic mistakes can't break it. Only after the algorithm reduces, say, 1MB of raw pixels into 400 bytes of vectorized obstacles might you want to be more precise... and even then small skews won't break it completely. (A sketch of this averaging argument follows this comment.)

    Also, what about genetic algorithms, where "mistakes" are introduced into the data artificially? They are very good at some classes of problems, and unreliable calculations at certain points would probably be advantageous to the final outcome.
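
    The averaging argument above is easy to demonstrate: zero-mean arithmetic error washes out over many samples, just like the sensor noise that's already there. A toy sketch (the 5% figure mirrors the comment; everything else is assumed):

        import random

        def noisy(x: float, rel_err: float = 0.05) -> float:
            """Simulate an inexact operation whose result is off by up to
            +/- rel_err, on top of whatever noise the data already has."""
            return x * (1 + random.uniform(-rel_err, rel_err))

        true_value = 10.0
        samples = [noisy(true_value + random.gauss(0, 1.0)) for _ in range(10_000)]
        estimate = sum(samples) / len(samples)
        print(f"estimate {estimate:.3f} vs true {true_value}")
        # The estimate stays near 10.0: the extra zero-mean error from
        # "inexact" math averages out along with the sensor noise.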

  • by MonsterTrimble ( 1205334 ) <monstertrimble&hotmail,com> on Friday May 18, 2012 @12:19PM (#40042223)

    Hardly. With engineering projects (especially those where lives are at stake) you ALWAYS build in safety factors - large ones, in fact. If you are within (from the article) 0.54% of the limits of the material, you have a lot bigger problems than the processor.

    Secondly, we are talking about low-power hardware here, not a software application. I see these chips being pushed into tablets and mobile devices, not laptops and desktops that do some serious mathematical lifting.
