Hardware

Research Reveals Mislaid Microprocessor Megahertz

ransom1982 noted a new article on The Register that says "Not only are chip companies regularly releasing ever-faster microprocessors, but new research has revealed that modern CPUs actually lose megahertz over time." This makes it even more complicated to compare the performance of Intel and AMD CPUs, since you have differing architectures, clock speeds, AND year of manufacture to consider. Buyer beware!
This discussion has been archived. No new comments can be posted.

Research Reveals Mislaid Microprocessor Megahertz

Comments Filter:
  • by Technician ( 215283 ) on Sunday April 01, 2007 @01:39PM (#18567611)
    I'm too dumb to figure out if this is an April fools joke or not.

    Let me update you. In simplistic terms, on each clock cycle the CPU performs an operation. Using just that metric, it would not be the chip that changed; if the clock slowed down, so would the chip. Now let's get just a little more complicated. On each clock cycle, the chip requests information, sends requests, computes results, stores results, etc. Sometimes the chip has to wait several clock cycles, such as when requesting data from memory or the hard drive. An old Pentium III at a 1 GHz clock, for example, would typically have used PC-100 memory. From the time the CPU requested a memory fetch, at least 10 clock cycles may have passed before the data was delivered. Due to the time the memory takes to set up the address, fetch the result, present the result, and tell the CPU the data is there and valid, many more clock cycles may have passed. If the transistors have gotten weak, it may take an additional couple of clock cycles of rise time before the CPU accepts the data as valid. This is an age-related slowdown.
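    The memory-stall point above can be sketched with a toy benchmark (my own hedged illustration, not anything from the article): chasing a dependent chain of indices through a large list forces every access to wait on the previous one, much as a CPU stalls on a memory fetch, so a cache-hostile access order costs extra wall-clock time without doing any extra work.

    ```python
    import random
    import time

    def chase(order):
        """Follow a chain of indices; each load depends on the previous result."""
        i = 0
        start = time.perf_counter()
        for _ in range(len(order)):
            i = order[i]
        return time.perf_counter() - start

    N = 500_000

    # Cache-friendly chain: each slot points to the next one.
    sequential = list(range(1, N)) + [0]

    # Cache-hostile chain: one big random cycle through the same slots.
    nodes = list(range(N))
    random.shuffle(nodes)
    scattered = [0] * N
    for k in range(N):
        scattered[nodes[k]] = nodes[(k + 1) % N]

    t_seq = chase(sequential)    # mostly sequential memory traffic
    t_rand = chase(scattered)    # mostly cache misses
    ```

    Both loops execute exactly N dependent loads; on most machines the scattered chain takes noticeably longer, and that difference is waiting on memory, not computing.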

    Another age-related slowdown is built into some chips (feature, not flaw). The SpeedStep technology would be like having a car that, when the radiator became plugged, automatically reduced power on a long, steep hill to keep the temperature below boiling. It's better to go up the hill slowly and not boil over than to stay at full speed and blow a head gasket due to warped heads.

    As the heatsink compound dries out and the face of the heatsink develops an insulating layer of oxide, some chips slow down under intense computing to prevent destruction by overheating. Cleaning and replacing the heatsink grease (and maybe replacing a fan with worn bearings) will restore like-new operation.
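    On a Linux box you can actually watch this throttling by polling the clock the kernel reports for a core (a minimal sketch, assuming the standard cpufreq sysfs path; the function returns None where that path doesn't exist, e.g. on non-Linux systems):

    ```python
    from pathlib import Path

    def current_cpu_khz(cpu=0):
        """Current clock of one core in kHz, per the Linux cpufreq driver.

        Returns None if the sysfs file is missing or unreadable
        (non-Linux system, or no cpufreq support).
        """
        path = Path(f"/sys/devices/system/cpu/cpu{cpu}/cpufreq/scaling_cur_freq")
        try:
            return int(path.read_text().strip())
        except (OSError, ValueError):
            return None
    ```

    Poll this in a loop while running a heavy load: on a machine with dried-out heatsink grease you'll see the reported clock sag as the chip hits its thermal limit, and recover after the heatsink is serviced.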

    The article appears to be an April Fools prank, but there are valid reasons systems slow down. Most come down to this: every task takes longer to complete because all the Windows tasks keep stacking up, so there is just so much more to do. Have you looked at the number of patches and hotfixes out for XP these days? This is in addition to any rootkits (Sony and others), toys (Weatherbug, Google Toolbar, internet radio), or other things the user may have added, which run constantly, using up clock cycles. You think Automatic Updates doesn't require CPU time?
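    To put a number on how busy the machine already is before you ask it to do anything, you can sample the kernel's aggregate CPU counters twice and take the difference (a hedged sketch against Linux's /proc/stat; it returns None where that file doesn't exist):

    ```python
    import time

    def cpu_busy_fraction(interval=0.25):
        """Fraction of CPU time spent non-idle over a short window (Linux /proc/stat)."""
        def snapshot():
            with open("/proc/stat") as f:
                # First line: "cpu  user nice system idle iowait ..."
                fields = [int(x) for x in f.readline().split()[1:]]
            idle = fields[3] + (fields[4] if len(fields) > 4 else 0)  # idle + iowait
            return idle, sum(fields)

        try:
            idle1, total1 = snapshot()
            time.sleep(interval)
            idle2, total2 = snapshot()
        except OSError:
            return None
        dt = total2 - total1
        return 1.0 - (idle2 - idle1) / dt if dt else 0.0
    ```

    Run it on an "idle" desktop and the result is rarely zero: background services, update checkers, and toolbars are all taking their cut of the clock cycles.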

    Let's face it, your CPU is very busy even before you log in and request it to do something else in its busy schedule. Most days your CPU gets more done before you log in than you get done all day.
