Hardware Technology

CPU DB: Looking At 40 Years of Processor Improvements 113

Posted by timothy
from the why-in-my-day-we-didn't-have-binary dept.
CowboyRobot writes "Stanford's CPU DB project (cpudb.stanford.edu) is like an open IMDB for microprocessors. Processors have come a long way from the Intel 4004 in 1971, with a clock speed of 740 kHz, and CPU DB shows the details of where and when the gains have occurred. More importantly, by looking at hundreds of processors over decades, researchers are able to separate the effect of technology scaling from improvements in, say, software. The public is encouraged to contribute to the project."
This discussion has been archived. No new comments can be posted.

CPU DB: Looking At 40 Years of Processor Improvements

  • by mikael (484) on Saturday April 07, 2012 @12:06PM (#39606895)

    It's deliberate. There was a podcast interview with some Microsoft engineers regarding future COM enhancements. They were waiting for the hardware to get faster and for memory to increase just so they could give every class member its own lock.

  • by mccalli (323026) on Saturday April 07, 2012 @12:06PM (#39606897) Homepage
    It's slashdotted so I can't tell, but any definitive database really needs MOS and Zilog in there as well. The home and micro computer revolution depended on them: Zilog's Z80 and MOS's 6502.

    Cheers,
    Ian
  • by nurb432 (527695) on Saturday April 07, 2012 @12:24PM (#39606983) Homepage Journal

    Same problem here... can't see it.

    And especially don't forget Motorola, IBM, DEC, Sun, RCA, Intersil, TI, MIPS, etc. Even within the Intel camp, I'd hope they branch into architectures other than x86, such as the i432 and i960, for example.

  • by hendrikboom (1001110) on Saturday April 07, 2012 @12:33PM (#39607025)

    They were smart. At least the one I had documentation for was. And the programmer could override it if he thought he was smarter. But the assembler needed a newer model of computer than the one I had -- one that could actually read and write letters on its typewriter instead of just digits. (though u, v, w, x, y, z counted as hexadecimal digits)

    The machine, in case anyone is interested, was a Bendix G-15d.

    That was in 1962, if I remember correctly. I think the machine was about 5 years old. The next year the university got an IBM 1620, with alphanumeric I/O and 20,000 digits of actual core memory. Change was relentlessly fast in those days, too. The big difference is that every few years we got qualitative, not just quantitative, change.

    -- hendrik

  • I can't even look at what Stanford is trying to do right now, but there have existed for years at least two online CPU "museums" that serve this goal. The one that springs most readily to mind, and the one I've used most myself, is CPU-World [cpu-world.com]. It has extensive coverage of all the major CPU lineages, including photos submitted by users, and even includes some non-CPU silicon. It seems to be largely the creation of one guy, Gennadiy Shvets, with eager collaboration from a lot of fellow enthusiasts, and there seems to be no profit motive to the site that I've ever noticed. He even thanks the most prolific contributors by name.

    WHY would Stanford feel it was necessary to "divide and conquer" this enthusiasm by creating an entirely new site and museum, rather than focusing the collective interest by contributing to or partnering with the one(s) that have already existed for many years? On the face of it this effort looks like either ignorance or pointless competition.

  • by hendrikboom (1001110) on Saturday April 07, 2012 @10:37PM (#39610203)

    The point of strong typing is not to tell the compiler how to make your program efficient. That's a pleasant side effect, but it's not the point at all. The point is to tell the compiler, possibly redundantly, what kind of values you intend to have in variables, so it can tell you when you get it wrong. Proper strong typing can catch almost all coding errors before your program is ever executed. It takes longer to code it and get it through the compiler, sure, but the time you lose this way is far outweighed by the reduced time you spend debugging.

    In addition, the explicit type information on all function definitions makes it vastly easier to understand a program you've never seen before when it's handed to you.

    Yes, there are a few situations where you can't specify types statically. They are pretty rare in a properly designed type system. A good language has mechanisms to handle the few cases that still remain.

    -- hendrik
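
    A minimal sketch of the parent's point, using Python type hints as a stand-in for a statically typed language (the function and names here are illustrative, not from any project mentioned above):

    ```python
    from typing import List

    def average(samples: List[float]) -> float:
        """Mean of a non-empty list of numeric readings."""
        return sum(samples) / len(samples)

    # The annotation states the intent: a list of floats in, a float out.
    # A static checker such as mypy would flag average("abc") before the
    # program ever runs, and a reader handed this code can see what the
    # function expects without reading its body.
    print(average([1.0, 2.0, 3.0]))  # 2.0
    ```

    The redundancy is the feature: the signature repeats what the body implies, so the checker can tell you when the two disagree.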

You are in a maze of little twisting passages, all alike.
