from the for-sufficiently-large-values-of-1 dept.
Barence writes "Computer scientists have unveiled a computer chip that turns traditional thinking about mathematical accuracy on its head by fudging calculations. The concept works by allowing processing components, such as the hardware for adding and multiplying numbers, to make a few small mistakes; because the circuits do less work, they use less power and get through tasks more quickly. The Rice University researchers say prototypes are 15 times more efficient and could be used in applications where the occasional error has no noticeable effect on the result."
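The trade-off is easy to model in software. The sketch below is only an illustration of the general idea, not the Rice design: it mimics a "pruned" adder by discarding the k least significant bits of its operands before adding. In real inexact hardware the logic for those bits would be left out entirely, which is where the power and speed savings come from; here the point is just to show how small the resulting error stays.

```c
#include <stdio.h>
#include <stdint.h>

/* Hypothetical model of an inexact adder: zero out the k low-order
 * bits of each operand, then add. A hardware version would simply
 * omit the adder stages for those bits. */
static uint32_t approx_add(uint32_t a, uint32_t b, unsigned k)
{
    uint32_t mask = (k == 0) ? ~0u : ~((1u << k) - 1u);
    return (a & mask) + (b & mask);
}

int main(void)
{
    uint32_t a = 123456, b = 654321;
    uint32_t exact = a + b;

    /* Show how the error grows as more low bits are pruned. */
    for (unsigned k = 0; k <= 8; k += 2) {
        uint32_t approx = approx_add(a, b, k);
        printf("k=%u  approx=%u  exact=%u  error=%u\n",
               k, approx, exact, exact - approx);
    }
    return 0;
}
```

For workloads like audio or image processing, an error confined to the bottom few bits is typically below the noise floor, which is why this kind of chip can get away with being "wrong" some of the time.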