'Approximate Computing' Saves Energy
hessian writes "According to a news release from Purdue University, 'Researchers are developing computers capable of "approximate computing" to perform calculations good enough for certain tasks that don't require perfect accuracy, potentially doubling efficiency and reducing energy consumption. "The need for approximate computing is driven by two factors: a fundamental shift in the nature of computing workloads, and the need for new sources of efficiency," said Anand Raghunathan, a Purdue Professor of Electrical and Computer Engineering, who has been working in the field for about five years. "Computers were first designed to be precise calculators that solved problems where they were expected to produce an exact numerical value. However, the demand for computing today is driven by very different applications. Mobile and embedded devices need to process richer media, and are getting smarter – understanding us, being more context-aware and having more natural user interfaces. ... The nature of these computations is different from the traditional computations where you need a precise answer."' What's interesting here is that this is how our brains work."
Numerical computation is pervasive (Score:5, Informative)
This is not about data centers and databases. This is about numerically intensive computation -- video and audio processing, physics simulation, and the like.
The idea of doing a computation approximately first, and then refining the results only in the parts where more accuracy is useful, is an old one; one manifestation is the family of multigrid [wikipedia.org] algorithms.
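As a toy illustration of that approximate-then-refine pattern (adaptive quadrature rather than true multigrid, and entirely a sketch of my own), the integrator below computes a cheap estimate everywhere and subdivides only where the local error estimate says more accuracy is worth paying for:

```python
# Sketch: adaptive Simpson quadrature in the spirit of
# "approximate first, refine only where needed".
def adaptive_simpson(f, a, b, tol=1e-6):
    """Approximate the integral of f on [a, b], subdividing only
    the subintervals whose local error estimate exceeds tol."""
    def simpson(fa, fm, fb, width):
        return width / 6.0 * (fa + 4.0 * fm + fb)

    def recurse(a, b, fa, fm, fb, whole, tol):
        m = (a + b) / 2.0
        lm, rm = (a + m) / 2.0, (m + b) / 2.0
        flm, frm = f(lm), f(rm)
        left = simpson(fa, flm, fm, m - a)
        right = simpson(fm, frm, fb, b - m)
        # Standard error estimate for Simpson refinement: if the
        # finer answer barely moved, the coarse one was good enough.
        if abs(left + right - whole) < 15.0 * tol:
            return left + right  # good enough: stop refining here
        return (recurse(a, m, fa, flm, fm, left, tol / 2.0) +
                recurse(m, b, fm, frm, fb, right, tol / 2.0))

    m = (a + b) / 2.0
    fa, fm, fb = f(a), f(m), f(b)
    return recurse(a, b, fa, fm, fb, simpson(fa, fm, fb, b - a), tol)
```

Smooth regions get answered cheaply; only the badly-approximated spots pay for extra function evaluations, which is the whole pitch of approximate computing in miniature.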
Fuzzy Logic anyone? (Score:4, Informative)
While the concept was interesting, it never really caught on. Progress in silicon devices simply made it unnecessary. It ended up being used as a buzzword for a few years and then quietly died away.
I wonder if this is going to follow the same trend.
Re:meanwhile... (Score:5, Informative)
The majority of CPU cycles in data centers is going to be looking up and filtering specific records in database
Approximate computing is especially interesting in databases. One of the coolest projects in this space is Berkeley AMPLab's BlinkDB [blinkdb.org]. Their canonical example should give you a good idea of how and why it's useful.
Their benchmarks show that approximate computing to within 1% error is about 100X faster than Hive on Hadoop.
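A hedged sketch of the idea behind that kind of approximate aggregation: answer from a small uniform sample and report an error bound instead of scanning every row. (BlinkDB itself uses precomputed stratified samples; `approx_avg` below is my own illustrative function, not its API.)

```python
import math
import random
import statistics

def approx_avg(data, sample_frac=0.01, z=1.96):
    """Estimate the mean of `data` from a small random sample,
    returning (estimate, +/- error bound at ~95% confidence)."""
    n = max(2, int(len(data) * sample_frac))
    sample = random.sample(data, n)
    est = statistics.fmean(sample)
    # Normal-approximation error bound: z * standard error of the mean.
    err = z * statistics.stdev(sample) / math.sqrt(n)
    return est, err

# Demo on a synthetic table of 1,000,000 "session durations":
data = [random.gauss(100.0, 15.0) for _ in range(1_000_000)]
est, err = approx_avg(data, sample_frac=0.001)
print(f"avg ~ {est:.2f} +/- {err:.2f}, after scanning 0.1% of the rows")
```

Scanning 1,000x less data for an answer that is provably within a stated error band is exactly the trade BlinkDB's "bounded errors, bounded response times" slogan describes.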
Re:meanwhile... (Score:5, Informative)
BlinkDB: Queries with Bounded Errors and Bounded Response Times on Very Large Data
Re:Analog (Score:4, Informative)
This is also how analog computers work. They're extremely fast and efficient, but imprecise. It had a bit of traction in the old days, but interest seems to have died off.
Analog is not imprecise. Analog computing can be very precise and very fast for complex transfer functions. The problem with analog is that it is hard to change the output function, and the output is subject to drift caused by things like temperature changes or induced noise. So the issue is not precision.
Clive Sinclair did this in 1974. (Score:5, Informative)
Due to ROM and cost limitations, the original Sinclair Scientific calculator only produced approximate answers, to maybe three or four digits.
This was still far more accurate than the answers given by a slide rule.
For more info have a look at this page Reversing Sinclair's amazing 1974 calculator hack - half the ROM of the HP-35 [righto.com]
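The flavor of that trade-off can be shown with a deliberately cheap approximation. This is not Sinclair's actual algorithm (the linked page describes his iterative digit-by-digit method); it is just my own illustration of buying tiny code size with a few lost digits:

```python
import math

def approx_sin(x):
    """sin(x) from only three Taylor terms: roughly 3 digits near
    pi/2, better for small angles, with almost no code or tables."""
    x2 = x * x
    # Horner form of x - x**3/6 + x**5/120
    return x * (1.0 - x2 / 6.0 * (1.0 - x2 / 20.0))

# Compare with the full-precision library routine at 30 degrees:
x = math.radians(30.0)
print(approx_sin(x), math.sin(x))  # both close to 0.5
```

A few digits of accuracy from a handful of multiplies is the same bargain the Sinclair Scientific struck to fit a scientific calculator into half the ROM of an HP-35.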