from the for-sufficiently-large-values-of-1 dept.
Barence writes "Computer scientists have unveiled a computer chip that turns traditional thinking about mathematical accuracy on its head by fudging calculations. The concept works by allowing processing components, such as the hardware for adding and multiplying numbers, to make a few mistakes. Because the circuits are not working as hard, they use less power and get through tasks more quickly. The Rice University researchers say prototypes are 15 times more efficient and could be used in some applications without a noticeable negative effect on results."
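The trade-off can be sketched in software. A minimal illustration, assuming a hypothetical adder that keeps the high-order bits exact but replaces the low-order carry chain with a bitwise OR (one common approximation technique; the function name and bit width here are illustrative, not from the article):

```python
def approximate_add(a, b, truncated_bits=4):
    """Sketch of an inexact adder for non-negative integers.

    The high-order bits are summed exactly; the low-order bits are
    simply OR-ed together, which never generates a carry. In hardware,
    skipping the low-order carry chain is what saves power and time,
    at the cost of a bounded error (< 2**truncated_bits here).
    """
    mask = ~((1 << truncated_bits) - 1)
    high = (a & mask) + (b & mask)   # exact sum of the upper bits
    low = (a | b) & ~mask            # cheap, carry-free lower bits
    return high + low

# The result is only slightly off: 100 + 23 = 123 exactly,
# while the approximate adder returns a nearby value.
print(approximate_add(100, 23))
```

For workloads like audio or image processing, an error confined to the least significant bits is often imperceptible, which is the kind of application the researchers have in mind.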