


Whither Moore's Law; Introducing Koomey's Law

Joining the ranks of accepted submitters, Beorytis writes "MIT Technology Review reports on a recent paper by Stanford professor Dr. Jon Koomey, which claims to show that the energy efficiency of computing doubles every 1.5 years. Note that efficiency is considered in terms of a fixed computing load, a point likely to be lost on the mainstream press. Also interesting is a graph in a related blog post that highlights the meaning of the 'fixed computing load' assumption by plotting computations per kWh vs. time. An early hobbyist computer, the Altair 8800, sits right near the Cray-1 supercomputer of the same era."
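The doubling claim in the summary is easy to turn into a back-of-the-envelope calculation. A minimal sketch (the 1.5-year doubling period is from the summary; the function name and baseline are illustrative assumptions):

```python
def koomey_efficiency_multiplier(years, doubling_period=1.5):
    """Factor by which computations-per-kWh grows after `years`,
    assuming a doubling every `doubling_period` years (Koomey's law)."""
    return 2 ** (years / doubling_period)

# After a decade, efficiency grows by 2^(10/1.5), roughly a factor of 100.
decade = koomey_efficiency_multiplier(10)
print(f"10-year gain: {decade:.0f}x")
```

Note the contrast with Moore's law: this measures energy per fixed computation, not transistor count, which is exactly why the "fixed computing load" caveat matters.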


  • by PaulBu ( 473180 ) on Tuesday September 13, 2011 @05:32PM (#37392522) Homepage

    Yes, there is if you "erase" intermediate results -- look up 'von Neumann-Landauer limit', kT*ln(2) energy must be dissipated for non-reversible computation.

    Reversible computation can theoretically approach zero energy dissipation.

    Wikipedia is your friend! :)

    Paul B.

  • by MajroMax ( 112652 ) on Tuesday September 13, 2011 @05:44PM (#37392614)
    Without reversible computing [wikipedia.org], there is indeed a fundamental limit to how much energy a computation takes. In short, "erasing" one bit of data adds entropy to the system, so it must dissipate at least kT ln 2 of energy as heat. This is a striking intersection between the information-theoretic notion of entropy and the physical one.

    Since the energy is only required when information is erased, reversible computing can get around this requirement. Aside from basic physics-level problems with building these logic gates, the problem with reversible computing is that it effectively requires keeping each intermediate result. Still, once we get down to anywhere close to the kT ln 2 physical constraint, reversible logic is going to look very attractive.
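The kT ln 2 figure from the comments above is tiny at room temperature; a quick sketch of the arithmetic (the bit-rate used for scale is an illustrative assumption, not a figure from the thread):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # room temperature, K

# Landauer limit: minimum energy dissipated per irreversibly erased bit
e_bit = k_B * T * math.log(2)   # joules per bit, ~2.9e-21 J

# For scale: erasing an (assumed) 1e15 bits per second would dissipate
# only microwatts at this theoretical floor.
power_floor = e_bit * 1e15
print(f"Landauer limit at 300 K: {e_bit:.2e} J/bit")
print(f"Floor for 1e15 bit erasures/s: {power_floor:.2e} W")
```

Real gates today dissipate many orders of magnitude more than this, which is why the limit is a long-term concern rather than a present one.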

  • by bunratty ( 545641 ) on Tuesday September 13, 2011 @06:07PM (#37392788)
    Yes, reversible computation can theoretically approach zero energy dissipation, but if you use no energy, the computation is just as likely to run forwards as backwards. You still need to consume energy to get the computation to make progress in one direction or the other. Richard Feynman has a good description of this idea in his Lectures on Computation.
