'Inexact' Chips Save Power By Fudging the Math
Barence writes "Computer scientists have unveiled a computer chip that turns traditional thinking about mathematical accuracy on its head by fudging calculations. The concept works by allowing processing components — such as hardware for adding and multiplying numbers — to make a few mistakes, which means they are not working as hard, and so use less power and get through tasks more quickly. The Rice University researchers say prototypes are 15 times more efficient and could be used in some applications without having a negative effect."
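The summary describes adder and multiplier hardware that is allowed to make small mistakes in exchange for power and speed. A minimal software sketch of that idea (hypothetical, not the actual Rice design) is an adder that simply ignores the low-order bits, the part of the carry chain that costs the most to get exactly right:

```python
def approx_add(a: int, b: int, drop_bits: int = 4) -> int:
    """Add two non-negative ints while ignoring the lowest `drop_bits` bits.

    A software analogue of an approximate hardware adder: the low-order
    carry chain is simply not computed, trading a small relative error
    for (in silicon) area, power, and latency.
    """
    mask = ~((1 << drop_bits) - 1)
    return (a & mask) + (b & mask)

exact = 1000 + 777                 # 1777
approx = approx_add(1000, 777)     # 1760
relative_error = abs(exact - approx) / exact   # under 1%
```

The point of the sketch: the error is bounded and relative, so for many workloads it disappears into the noise already present in the data.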
Graphics cards (Score:3, Interesting)
Don't they do this too, but fudge the maths so they can be a bit faster?
Hence the CUDA stuff needing special modes to operate with IEEE-compliant floats, etc...
Re:Ok, now move the decimal point left.. (Score:4, Interesting)
Well, really this is following in a long and glorious tradition of fuzzily incorrect arithmetic [wikipedia.org].
Whatcouldpossiblygowrong (Score:4, Interesting)
Re:Turtles all the way down (Score:3, Interesting)
GPS. I don't need to know that I'm precisely in the middle of the left lane of the 4-lane highway going 59.2 MPH. I'd rather it use the processor for screen refreshes and finding a better route around Dallas or Chicago at rush hour.
Scales at the checkout - the faster it gets a reading on how much my apples weigh, the faster I get away from the "People of Wal-Mart," and I'll bet there's less than a penny difference anyway.
Video Games (see GPS) - many switch to integer maths already for speed, how about fuzzy integers? ;)
DHS airport scanners - the faster they scan, the less I'll glow in the dark
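Games that "switch to integer maths for speed" usually mean fixed-point arithmetic: scale every value by a power of two and do plain integer ops. A minimal sketch of the common Q16.16 format (my illustration, not from the article):

```python
# Q16.16 fixed-point: a real number x is stored as round(x * 2**16).
# Addition is ordinary integer addition; multiplication needs one shift back.
SHIFT = 16
ONE = 1 << SHIFT

def to_fix(x: float) -> int:
    """Convert a float to Q16.16 fixed-point."""
    return int(round(x * ONE))

def fix_mul(a: int, b: int) -> int:
    """Multiply two Q16.16 values; the product has 32 fractional bits,
    so shift back down to 16."""
    return (a * b) >> SHIFT

def to_float(x: int) -> float:
    """Convert Q16.16 back to a float for display."""
    return x / ONE

product = to_float(fix_mul(to_fix(3.25), to_fix(2.5)))  # 8.125
```

The "fuzzy integers" quip would correspond to also letting those integer ops err in the low bits, exactly what the chip in the article does in hardware.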
Re:Turtles all the way down (Score:5, Interesting)
I envision the "less precise" CPUs being used in consumer laptops where people are just watching movies or listening to music.
It does not matter if the MPEG4 conversion is slightly off with the color, because the consumer's eye won't detect it. The selling point will be a laptop or tablet that lasts 10x longer on a battery charge.
Re:AI Chip (Score:4, Interesting)
Your definition of math is very limited. Descriptive Geometry is math too.
Path-finding systems may use an imprecise weight function when making decisions - calculating the weights is a major burden.
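To make the path-finding claim concrete, here is a small sketch (my own toy graph, not from the comment) of Dijkstra's algorithm where each edge weight can be perturbed by a few percent, standing in for an imprecise weight function; the cheap path still wins when the alternatives aren't razor-close:

```python
import heapq
import random

def dijkstra(graph, start, goal, weight_noise=0.0):
    """Shortest-path cost from start to goal via Dijkstra's algorithm.

    Each edge weight is optionally perturbed by up to +/- weight_noise
    (as a fraction) to mimic an imprecise weight function.
    """
    dist = {start: 0.0}
    pq = [(0.0, start)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == goal:
            return d
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        for nxt, w in graph.get(node, []):
            w *= 1 + random.uniform(-weight_noise, weight_noise)
            nd = d + w
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                heapq.heappush(pq, (nd, nxt))
    return float("inf")

graph = {
    "A": [("B", 1.0), ("C", 4.0)],
    "B": [("C", 1.0), ("D", 5.0)],
    "C": [("D", 1.0)],
}
# Exact shortest A->D is 3.0 (A-B-C-D); with 5% noisy weights the same
# route is chosen and the reported cost stays within a few percent of 3.0.
```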
Using cameras involves image analysis. In essence, you're working with what is a bunch of noise, and you analyze it for meaningful data by averaging over a whole bunch of unreliable samples. If you can do it 15 times faster at the cost of introducing 5% more noise in the input stage, you're a happy man.
In essence, if input data carries noise of any kind (and only "pure" data, typed in or read from disk, doesn't; any real-world data like sensor readouts, images, or audio contains noise), the algorithm must already be resistant to that noise, so a little more of it coming from math mistakes can't break it. Only after the algorithm reduces, say, 1MB of raw pixels to 400 bytes of vectorized obstacles might you want to be more precise... and even then small skews won't break it completely.
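The averaging argument can be sketched numerically (a toy model I made up, not from the comment): a sensor reading already carries noise, an extra multiplicative error stands in for inexact arithmetic, and averaging many samples washes both out:

```python
import random

def measure(true_value=10.0, sensor_noise=1.0, math_noise=0.0):
    """One 'sensor reading': the true value plus Gaussian sensor noise,
    then an extra multiplicative error standing in for inexact arithmetic."""
    reading = true_value + random.gauss(0.0, sensor_noise)
    return reading * (1 + random.uniform(-math_noise, math_noise))

def averaged(n, **kw):
    """Average n independent readings."""
    return sum(measure(**kw) for _ in range(n)) / n

random.seed(42)
clean = averaged(10_000)                    # sensor noise only
sloppy = averaged(10_000, math_noise=0.05)  # plus 5% arithmetic error
# Both averages land very close to the true value of 10.0: the extra
# arithmetic noise is zero-mean, so it averages away like the sensor noise.
```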
Also, what about genetic algorithms, where "mistakes" are introduced into the data artificially? They are very good at some classes of problems, and unreliable calculations at certain points would probably be advantageous to the final outcome.
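Genetic algorithms really do inject "mistakes" deliberately, as mutation. A minimal sketch (my toy example, nothing to do with the chip itself) that keeps the fitter half of a population and refills it with mutated copies:

```python
import random

def evolve(fitness, pop_size=50, generations=200, mutation=0.5):
    """Toy genetic algorithm over real numbers: each generation, keep the
    fitter half of the population and refill it with mutated copies
    (the deliberate 'mistakes' that drive the search)."""
    pop = [random.uniform(-10.0, 10.0) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]           # elitism: keep the best
        pop = survivors + [x + random.gauss(0.0, mutation) for x in survivors]
    return max(pop, key=fitness)

random.seed(1)
best = evolve(lambda x: -(x - 3.0) ** 2)
# best converges close to the optimum x = 3
```

If the hardware itself is slightly unreliable, some of the mutation comes for free, which is the commenter's point.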
Re:Turtles all the way down (Score:4, Interesting)
Hardly. With engineering projects (especially ones where lives are at stake) you ALWAYS build in safety factors. Large ones, in fact. If you are within (from the article) 0.54% of the limits of the material, you have a lot bigger problems than the processor.
Secondly, we are talking about low-power hardware here, not a software application. I see these chips being pushed into tablets and mobile devices, not things like laptops & desktops that do the serious mathematical lifting.