Intel's 4004 Microprocessor Turns 40 126
harrymcc writes "On November 15th 1971, Intel introduced the 4004 — the first single-chip microprocessor. Its offspring, needless to say, went on to change the world. But first, Intel tried using the 4004 in a bunch of products that were interesting but often unsuccessful — like a pinball machine, an electronic vote-counting machine, and Wang's first word processor. Technologizer's Benj Edwards is celebrating the anniversary with an illustrated look back at this landmark chip." Here's another nostalgic look back at V3.co.uk, and one at The Inquirer. And an anonymous reader points out another at ExtremeTech, from which comes this snippet: "Designed by the fantastically-forenamed Federico Faggin, Ted Hoff, and Stanley Mazor, the 4004 was a 4-bit, 16-pin microprocessor that operated at a mighty 740KHz — and at roughly eight clock cycles per instruction cycle (fetch, decode, execute), that means the chip was capable of executing up to 92,600 instructions per second. We can’t find the original list price, but one source indicates that it cost around $5 to manufacture, or $26 in today’s money."
Who could ever need more than 740KHz? (Score:4, Funny)
Oh wait, that was something else...
Re:Who could ever need more than 740KHz? (Score:5, Funny)
Re: (Score:3)
740kHz? Sounds great if you're making eMPty3's for your iPod.
I use at least 1.48MHz, 256-bit. And even then I can tell it's digital. It just doesn't have the warmth and rich sound of vinyl or that mix tape I made in 1993.
Re: (Score:3)
1.48 MHz? 256-bit? Wow! I can't believe you can hear anything through the distortion and aliasing at that rate!
I sample with at least 2.4GHz and 2048-bit accuracy. And maybe, just maybe, I can get a basic approximation to what I get from my 1928 Victrola. Now, maybe that lack of sound quality is because I'm only using Monster cables for my monitors right now. The good news is that I'm on a waiting list to sell a kidney so I can get these [avland.co.uk] - a bargain at $21K for 3 meters! But I have to believe that the samp
Re: (Score:2)
Steve Martin would just tell you to do googolphonic sampling using a moon-rock needle.
Re: (Score:2)
Sounds like shit.
You must be another old fart around here.
Re: (Score:2)
Well, it's OK for the car.
Re: (Score:2)
Oh, you couldn't be more wrong! The rotating platter will smear the zero bits, causing them to look like ones to the DAC.
I'm afraid the only proper way to store digital audio is on punch cards.
Re:Who could ever need more than 740KHz? (Score:5, Informative)
There's actually a sane reason for 95kHz sampling - filtering.
Before any ADC, you need to put in an analog filter (anti-aliasing filter - basically it ensures that the signal going to the ADC is bandlimited below the Nyquist frequency).
The problem with filters is that it's very hard to get the desired characteristics (flat response in frequency band, narrow transition band) without doing a ton of work (lots of parts, etc).
With 44kHz or 48kHz sampling, if you want to capture up to 20kHz your filter must "brickwall" in the 2-4kHz region between 20kHz and 22/24kHz. This is extremely hard to do, and the end result is you usually get rolloff around 16kHz or so, or a frequency response that gets wilder and definitely isn't flat.
Using 96kHz or even 192kHz means the anti-aliasing filter can be much gentler and designed with more suitable characteristics. When you can have rolloff start at 20kHz and extend all the way out to 48/96kHz, you can make some very nice filters indeed - flat from 20Hz-20kHz, very little phase distortion, etc.
The other end benefits as well - the antialiasing filter on the DAC side can be a lot nicer as well.
And of course, the more bandwidth you have to play with, the cheaper and simpler the filters get (which also means less distortion).
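For anyone curious, you can put rough numbers on this with the textbook Butterworth order formula. This is a back-of-the-envelope sketch, not a real filter design; the 1 dB passband ripple and 96 dB stopband attenuation are assumptions I picked to match 16-bit audio:

```python
import math

def butterworth_order(f_pass, f_stop, ripple_db=1.0, atten_db=96.0):
    """Minimum Butterworth low-pass order that stays within ripple_db
    up to f_pass and is at least atten_db down by f_stop."""
    num = (10 ** (0.1 * atten_db) - 1) / (10 ** (0.1 * ripple_db) - 1)
    return math.ceil(math.log10(num) / (2 * math.log10(f_stop / f_pass)))

# 44.1 kHz sampling: flat to 20 kHz, must be down hard by Nyquist (22.05 kHz)
n_cd = butterworth_order(20_000, 22_050)
# 96 kHz sampling: same 20 kHz passband, but Nyquist is way out at 48 kHz
n_hi = butterworth_order(20_000, 48_000)
print(n_cd, n_hi)  # 121 vs 14
```

A 121st-order analog filter versus a 14th-order one is exactly the "brickwall vs. gentle rolloff" difference the parent is describing.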
Re: (Score:2)
Using 44kHz or 48kHz sampling, it means if you want to capture to 20kHz, your filter must "brickwall" in the 2-4kHz region between 20kHz and 22/24kHz. This is extremely hard to do
Huh? A simple bandpass filter does the trick. Higher sampling rate, higher frequency filter. You realize that there are radio frequency filters, which are far higher frequencies than sound?
The higher your sampling rate, the less aliasing you get. A 44.1 kHz sample rate has only three data points to describe a 15 kHz wave's shape. That
Re: (Score:2)
Huh? A simple bandpass filter does the trick. Higher sampling rate, higher frequency filter. You realize that there are radio frequency filters, which are far higher frequencies than sound?
The maximum achievable dynamic range of an ADC is determined by the bit depth of the converter. In practice, this dynamic range is limited by the steepness of the roll-off of the anti-aliasing filter - you can have a 16 bit converter, which implies 16*6 = 96 dB of dynamic range, but if your low pass filter only rolls off 12 dB before you hit the Nyquist frequency, your dynamic range will be limited to 12 dB. The OP is working under the assumption that the "final" sample rate, as indicated by the informat
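The parent's 6 dB/bit rule of thumb drops straight out of 20·log10(2^N); a quick sketch:

```python
import math

def dynamic_range_db(bits):
    """Ideal dynamic range of an N-bit quantizer: 20*log10(2**N),
    i.e. about 6.02 dB per bit (a full-scale sine adds ~1.76 dB on top)."""
    return 20 * math.log10(2 ** bits)

print(round(dynamic_range_db(16), 1))  # 96.3 dB, the parent's 16*6 = 96 dB
print(round(dynamic_range_db(24), 1))  # 144.5 dB
```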
Re: (Score:2)
Re: (Score:2)
The maximum achievable dynamic range of an ADC is determined by the bit depth of the converter.
I completely forgot about dynamic range. Yes, the greater the bit depth the greater the range, and the bit depth does indeed have to match all the way through the system; a chain is only as good as its weakest link. But the dynamic range of CDs, greater than that of vinyl, isn't ever used that I've seen. In fact, the CD I made from Boston's first LP has more dynamics than the CD I bought (the band itself complaine
Re: (Score:2)
Re: (Score:2)
In the early days of CDs, a brick-wall analog filter was relatively cheap compared to doubling the bitrate of the laser... of course, that's all turned on its head now.
Cue Kurzweil... (Score:2, Redundant)
If we've come this far in 40 years, where will we be in 40 more?
Re: (Score:2)
Nearly 70 and doing everything I can to avoid a computer for my entire retirement?
Re:Cue Kurzweil... (Score:5, Insightful)
Nearly 70 and doing everything I can to avoid a computer for my entire retirement?
You missed the Kurzweil reference: if medical progress keeps pace, 70 will be young.
I think the half-way mark 1991 makes an interesting reference point: in 1991, my desktop PC at work cost 2 months salary, it was a 16MHz 386 with a 640x480 resolution 15" color monitor. My desktop PC at work today cost about 3 days pay and is a 2+GHz i5 with two 1920x1080 24" flat panels.
Re: (Score:2)
Good thing you got that raise!
Re: (Score:3)
Good thing you got that raise!
More like, my boss in 1991 believed in the value of cutting edge hardware (more than the value of paying people anything above minimum market rate...) Today I work for a startup that supplies us from the lower end of the Dell catalog. Not that I'm complaining, the hardware is no longer the limiting factor, although I could use a third screen....
Re: (Score:2)
No, you'll be avoiding some tech that's indispensable to modern life but that we don't have yet.
My dad is one of those guys, he's 80. "I lived eighty years without a computer and cell phone and I don't need one now." My maternal grandfather was the same way about indoor plumbing! Even after my uncle installed a bathroom in Grandpa's house, Grandpa still used the outhouse.
I'm hopeful I haven't inherited the "get off my lawn" gene. At least I don't have a nice yard so far (and don't give a damn if someone w
Re: (Score:2)
Be glad you got to know your grandparents... some people miss out on that experience entirely.
Re: (Score:2)
Re: (Score:1)
Predicting that the new ten billion core processors will usher in the year of Linux on the desktop and IPv6 finally overtaking IPv4.
Re: (Score:2)
Hmm probably 80 years.
Re: (Score:2)
Well, at this rate the last die shrink will be in the 2020s as we're down to single atoms, no more magic computers that get faster every year. The rest is a bit like that flying car, even 40 years ago that 4004 would have been a helluva human coprocessor for math but progress on any real cyborg functionality has never materialized. And if it does, it'll have almost nothing to do with CPU development as such.
Re: (Score:2)
Laughing at pictures with people who are using the huge bulky iPhones. Or... Huddled around the fire trying to keep warm.
Re: (Score:2)
Re: (Score:2)
http://singinst.org/overview/whatisthesingularity/ [singinst.org]
The question is, will the "smarter than human intelligence" see any reason to improve human lifespan?
Visions of the powercells in the Matrix just popped to mind....
Re: (Score:2)
The Matrix was a pretty twisted and mostly bad concept, but I give them credit for one thing: We can't predict what machine intelligence will do, any more than tuna fish could have predicted fish farming. What do the whales think of us? They've been watching us for tens of thousands of years, all of a sudden we start crossing the ocean instead of staying near shore and then we start being a voracious predator of theirs for a few hundred years, when they are close to extinction, we back off, mostly, but sc
Re: (Score:3)
We'll be in 2051.
Re:Cue Kurzweil... (Score:4, Insightful)
If we've come this far in 40 years, where will we be in 40 more?
CMOS process shrinks will probably poop out around 2020. Intel claims to have things figured out until 8nm. When the CMOS process shrinks cease, there will be no more massive numbers of "free" transistors every year. Intel and others will likely start playing with gallium arsenide and other stuff to try to squeeze more performance out of stagnated process sizes. Once those tricks are played out, it could very well be the end until radical new alternative technology is developed.
Re: (Score:3)
Although it's nice to call it CMOS, and indeed both N and P channel devices are used, all the fastest silicon processors use N channel devices for the logic path. CMOS runs about 1/4 the speed on NMOS.
My understanding of gallium arsenide MOS (and I could easily be wrong) is that its speed advantage for logic started running out at about the 0.35 micron (350 nm) node, which is where Vitesse gave up and very nearly went out of business. The future might not be silicon, but there's little change of it being Ga
Re: (Score:2)
Egads, I must be more careful. 1/4 the speed of NMOS
little chance of it being GaAs.
Sorry
Re: (Score:1)
Although it's nice to call it CMOS, and indeed both N and P channel devices are used, all the fastest silicon processors use N channel devices for the logic path. CMOS runs about 1/4 the speed on NMOS.
Not any more. Intel's 45nm and 32nm hi-K gate dielectric / metal gate processes delivered two interesting things: a dramatic reduction in leakage current, and PMOS transistors nearly as strong as the NMOS. As a result, everything Intel's designed starting with the Nehalem generation (the first Core i7) uses pure CMOS logic.
As I understand it, dynamic logic based mostly on NMOS transistors (like fast domino logic) is still somewhat faster than complementary in Intel HK/MG, but insufficiently so to make up
Re: (Score:2)
My understanding of gallium arsenide MOS (and I could easily be wrong) is that its speed advantage for logic started running out at about the 0.35 micron (350 nm) node, which is where Vitesse gave up and very nearly went out of business. The future might not be silicon, but there's little change of it being GaAs.
Intel has recently been talking about using GaAs on future processes. Everything old is new again.
http://nextbigfuture.com/2011/06/intel-talks-about-8-nanometer-nodes-for.html [nextbigfuture.com]
Re: (Score:2)
If we've come this far in 40 years, where will we be in 40 more?
I'll be dead. At least I hope I'll be dead, I have no wish to live to be a hundred. Like my grandmother said at age 95, "I don't know why everybody wants to live to be a hundred. It ain't no fun bein' old."
But man, look at how far things have progressed. Forty years ago the pot was only 1/4th as potent (and 1/20th the price); there were no flat screen TVs, no microwave ovens, no VCRs, self-opening doors were brand new, there were no cell phones
Re: (Score:2)
I was four in 1971... you could still get lead in your gasoline, you could dream of being an astronaut when you grew up, you could imagine surviving an all out thermonuclear war, jet travel was still expensive for most people, and nobody had the faintest idea what caused ulcers...
I mentioned Kurzweil mostly because he's pushing the idea that, due to exponential progress, people will live forever, real soon now. He is kinda quiet about how much money (power, whatever) you're going to need to do it. Persona
Re: (Score:2)
Reminiscing on the old chip my pacemaker used to use.
And that's Word Wang! (Score:1)
Wait, would that be Word Wang? The sequel to the off season hit Number Wang?
Link to 35th Anniversary site (Score:5, Interesting)
http://www.4004.com/ [4004.com]
In particular, that fully-functional 4004 mock-up someone made by using 1G TTL chips on a large circuit board is absolutely awesome.
Re: (Score:2)
Re:Link to 35th Anniversary site (Score:5, Interesting)
In 1971, an Intel 4004 had 2300 transistors, on a die size 12mm square (144mm^2).
In 2011, an Intel i7 had 560,000,000 transistors on a die size 296mm^2
Going by those dimensions, you could get 24378 4004's into the die size of an i7. Or the i7 mockup would be 24000 times the area of the 4004 mockup. If you were to build the i7 with the same brass and copper technology as a Difference Engine, it would probably fill Manhattan.
For comparison, an Intel 80386 had 275000 transistors, and an 80486 had 1,180,000 transistors.
For those CPU's, you could get 2000 80386's into the die size of an i7, and 474+ 80486's into the same die size.
I'd guess in reality that would be less because you would need cache management for all those processors.
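The transistor-count ratios are easy to sanity-check; the die-area comparison is murkier (as the replies note, it hinges on whether the 4004 was 12 mm square or 12 square mm), so this sketch only does the transistor arithmetic. By transistor count the 4004 ratio comes out near 243,000, which suggests the 24378 above dropped a digit:

```python
I7_TRANSISTORS = 560_000_000
CHIPS = {"4004": 2_300, "80386": 275_000, "80486": 1_180_000}

# How many of each older design fit inside one i7's transistor budget
ratios = {name: I7_TRANSISTORS // count for name, count in CHIPS.items()}
print(ratios)  # {'4004': 243478, '80386': 2036, '80486': 474}
```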
Re: (Score:3)
The original 4004 was about 12 mm^2 on a 10micron process, 2300 transistors. To make it work 3 other chips of about the same size are needed. The linear shrink factor would be 312.5, and the original die was about 3.5 mm, so a 4004 at 32nm would be about 11 microns on a side, (just larger than the original line width!), or 22 microns with the other chips included. The clock speed is not easy to figure, but with such a small circuit should be on the order of 1-10GHz. Figuring 7.4 GHz (10000x faster) and 25,0
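The parent's shrink arithmetic checks out; here's the same calculation as a rough sketch (ignoring that wires and transistors don't actually scale identically across four decades of process nodes):

```python
OLD_PROCESS_NM = 10_000   # 10 micron, the 4004's original process
NEW_PROCESS_NM = 32
ORIGINAL_DIE_EDGE_MM = 3.5  # approximate 4004 die edge, per the parent

shrink = OLD_PROCESS_NM / NEW_PROCESS_NM             # linear shrink factor
new_edge_um = ORIGINAL_DIE_EDGE_MM * 1000 / shrink   # shrunken die edge, microns
print(shrink, round(new_edge_um, 1))  # 312.5 11.2
```

So an 11-micron-wide 4004 at 32nm really would be barely larger than one of its own original 10-micron lines.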
Re: (Score:2)
If the 4004 was actually 12 mm square rather than 12 square mm, then the above numbers need to be adjusted by a factor of 12.
Re: (Score:2)
"How big would a model of an i7 be if made at the same scale as the mock-up?"
The working replica appears to be about 56cm on a side, (50,000x what it would be on a 32nm process) an i7 is about 15mm on a side, so 15mm x 50000 = 750m
Re: (Score:1)
Re: (Score:2)
Re: (Score:3)
Hey, what happened to all the Apple fans saying the Motorola chips where better?
I don't know, but this is surely cool:
http://www.visual6502.org/ [visual6502.org]
Re: (Score:2)
That's a Rockwell chip. Mot was 6800.
Re: (Score:2)
That's a Rockwell chip. Mot was 6800.
True, that, but it was also the chip in the Apple II.
Re: (Score:2)
The flame wars back in the day were between sixers and eighters. TRS-80 & CP/M vs C64 & Apple folks, and even before that was the Altair & IMSAI vs SWTPC & AIM-65 people. The main difference was a Harvard vs Von Neumann architecture design in the CPU.
Re: (Score:2)
Yeah, I had the (6502 based) Atari 800 when I was about 14, and I never could (and still can't) understand who would ever want Harvard Architecture, or the insane segmented addressing of the 80286....
FIXED THAT FOR YOU (Score:1)
Hey, what happened to all the Apple fans saying the Motorola chips were better?
Re: (Score:2)
For a long time, they were. Things changed. Freescale (Motorola spinoff) and IBM started devoting themselves to embedded systems, game consoles, big iron -- rather than tailoring their chips to Apple's needs. Notably, the G5 was too hot and power-hungry for notebooks and IBM just didn't care to fix it. It was wise to jump ship at that point.
Re: (Score:2)
Frak that. Forget about PowerPC. Let's take it way back.
IBM should have chosen the 68000 for the PC.
How would starting the 32-bit age in 1981 sound to you?
While the 68K had a 24-bit address bus and 16-bit data bus, all of the internal registers were 32-bit, aside from the CCR. That meant any non-droolingly-retarded-code would run just fine when the 68020 was released with complete and total 32-bit capabilities.
If the cost was too high (and that's utter BS right there, we're talking about an $80 part in a
Programmable Calculator (Score:3)
Re: (Score:2)
Re:Programmable Calculator (Score:4, Funny)
Yes, he does. Her name is Radio Shack EC-4004.
Interesting typo* (Score:2, Interesting)
From the technologizer article:
as Intel churned out more powerful chips throughout the rest of the 1970s—the predecessors of the ones inside every current Windows PC and Mac.
Really? I was pretty sure my computer has an AMD inside.
*Well, not really a typo but more of a poorly considered sentence.
Re:Interesting typo* (Score:4, Insightful)
It's not poorly worded. The history of AMD is poorly understood (by you, not them).
Re:Interesting typo* (Score:5, Informative)
There *is* an unbroken chain of compatibility from the latest AMD processors back to the 8008, which was Intel's first 8-bit microprocessor (the design of which was actually started before the 4004 design, IIRC). So they were indeed "predecessors".
Not to mention that AMD got its start in the PC business by being an officially licensed 2nd source for Intel's 8086 chips.
Re: (Score:3, Informative)
Actually, AMD processors are not 100% compatible. There are differences in behavior.
For example, everyone knows an x86 resets at FFFF:0000. But an AMD processor will throw an exception if somewhere along the line, it doesn't encounter a branch and ends up wrapping to 0000:0000. An Intel processor doesn't generate the exception. (This is because way back when, instead of putting ROM at the end of memory, designers co
AMD64 (Score:2)
There is an unbroken chain of compatibility from the latest AMD processors back to the 8008...
Of course, no discussion about Intel vs AMD compatibility would be complete without some jerk* remarking that with x86-64, Intel is actually cloning AMD's 64-bit extensions to x86. One of the more ironic twists in that race, to be sure.
* Today's jerk being played by DragonHawk
Re: (Score:2)
Well, I counted Intel providing official tools to automatically translate assembler code to the newer chips as a form of compatibility. It may have sucked, but running real mode 16-bit code on an Opteron would suck, too.
Still Kickin' (Score:3)
Failed cuz.... (Score:1)
It mostly failed because it was put in a 16-pin package, meaning that all the addresses and data had to be shuffled out and in through a narrow bus. This is a slow process. That also means you had to surround the chip with a lot of decoders and latches and buffers to hold the memory and I/O addresses and shuffle the data in and out. Same downside to the 8008. You needed like 30 chips around the CPU chip just for the very basics of generating an actual memory address and data bus.
Re: (Score:3)
But, you missed the business model. They already had (and sold) decoders and latches and buffers that "digital designers" were using for other purposes. This was the one chip to rule them all, one chip to find them, one chip to bring them all and in the darkness require them (thus driving sales...)
PCs waited until 8bits came out couple years later (Score:2)
Re:PCs waited until 8bits came out couple years la (Score:4, Informative)
8008, 6800, and 8086
Eh? While there were a few designs using 8008 and 6800, I don't think any of them was successful; high volume commercially available PCs used Z80s (the TRS-80, the Sinclair ZX-80 and Spectrum, the MSX machines) or 6502s (Apple II, Atari, Commodore). The successor of the 6800, the excellent 6809 was used in the TRS-80 Color Computer; years later, when IBM launched their PC, they used the reduced data bus version of the 8086, that is the 8088.
Re: (Score:1)
What was the largest number that it could handle? (Score:2)
Could it handle at least 16-bit numbers?
Re: (Score:1)
Wrong question. I've seen 32-bit PCs handle arbitrary precision (with some appropriate library of course), not only 32 bits. Okay, there is a limit of the size of available RAM ...
Re: (Score:3)
You are thinking the wrong way.
It was used in calculators.
4 bits are enough to encode 0-9. The rest was done in software (using arbitrary precision math, although for very limited values of "arbitrary", given the constraints of the era...
Re: (Score:3)
Correct. It's probably better to describe the 4004 as BCD (Binary Coded Decimal) rather than as "four bit." Storing a number larger than 9 requires eight bits: the first four store the first digit, and the second four store the second digit. The bit patterns 0xA through 0xF were actually special patterns used for various things (like marking negative numbers).
Since the original purpose of the 4004 was a calculator, this system makes a lot of sense. It might not be the most efficient use of bits (an eig
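Here's a toy sketch of how multi-digit math works when the hardware only hands you one decimal digit (one nibble) at a time, roughly the way a 4004-class calculator chip would do it. This is pure illustration in Python, not actual 4004 code:

```python
def to_bcd(n):
    """Pack a non-negative integer into 4-bit decimal digits, least significant first."""
    digits = []
    while True:
        digits.append(n % 10)
        n //= 10
        if n == 0:
            return digits

def bcd_add(a, b):
    """Add two BCD digit lists one nibble at a time, propagating a decimal carry,
    the way a 4-bit ALU with a decimal-adjust step would."""
    out, carry = [], 0
    for i in range(max(len(a), len(b))):
        s = (a[i] if i < len(a) else 0) + (b[i] if i < len(b) else 0) + carry
        out.append(s % 10)   # each result digit still fits in 4 bits
        carry = s // 10
    if carry:
        out.append(carry)
    return out

def from_bcd(digits):
    return sum(d * 10 ** i for i, d in enumerate(digits))

print(from_bcd(bcd_add(to_bcd(739), to_bcd(486))))  # 1225
```

Chain enough nibbles together and the precision is limited only by memory, which is exactly how a 4-bit chip ran a many-digit calculator.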
Re: (Score:2)
I meant native support for numbers. If you wrote an assembler program for this kind of processor, what would be the biggest type you can use to store integers, pointers, etc.? On 8-bit processors it was 16 bit numbers I think, which is restricting but still reasonable. If you can access at most 8 bits for pointers and native integers however, I can't imagine how this would work.
Re: (Score:2)
There are encoding schemes that allow arbitrary precision on any sized processor. The Atari 800 series running on 8 bit 6502s used a BCD encoding, it is a little slower, but fractional results are more predictable than binary representations. You can implement something like that to an arbitrary number of digits... as many bits as you have memory for (and time to wait for processing).
You might not realize.... (Score:3)
Re: (Score:1)
The R&D needed for the moon landing actually unleashed a whole lot of new stuff which shaped the tech scene for the years to come
Re: (Score:3)
...that we landed on the MOON before the invention of microprocessors! Now that's scary.
Mostly scary if you're the guy in the rocket capsule steering it by hand. Kind of comforting when you think about the accuracy of enemy ICBM targeting capabilities.
Re: (Score:2)
...that we landed on the MOON before the invention of microprocessors! Now that's scary.
Correction: they faked the moon landings before the invention of microprocessors. That's why it's so obvious. If they had today's CGI, we would be far less likely to know the truth [badastronomy.com].
HTTP Error? (Score:2)
I clicked on the link and only saw a 4004 ;)
DIY (Score:2)
Re: (Score:3)
Why not get a Raspberry Pi and be done? If you want to play at making computer systems, I'd recommend getting into the Altera / Nios design software... I'm working on a triple core system right now with each core customized (by our team) to a particular task...
Ah good times. (Score:2)
They mention photolithography. It was still being used in the mid 80's at Texas Instruments. I went from micrographics to photolithography in '84-ish. You shoot rubyliths with room-sized cameras, then stack the negatives, positives, or both, on top of each other on a reducer. It was all .001 tolerance work.
Re: (Score:2)
If the same rate of price reduction could be applied to 4004, then without inflation in today's money 4004 would have cost literally 0. With inflation it's less than 0, but that makes no sense
I think you need to work on your compound interest.... if year 1 is 26 and year 2 = year1*0.71, then year 40 is four thousandths of a cent.
Which is probably a fair price, compared to the cost/performance ratio of something like a pic 10f220 series chip.
One thing economists are remarkably poor at understanding, is something that cannot go on forever, eventually stops.
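For the record, the parent's arithmetic holds up. Assuming the same 29%-per-year price decline (the 0.71 factor above):

```python
start_price = 26.0      # estimated 1971 cost in today's dollars, from the summary
annual_factor = 0.71    # price falling 29% per year, per the parent's assumption

price_year_40 = start_price * annual_factor ** 39  # 39 steps from year 1 to year 40
print(f"{price_year_40 * 100:.4f} cents")  # about 0.004 cents
```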
Re: (Score:1)
Yes, obviously it's a tiny number, thousandths of a cent. It's zero for all practical purposes.
Re: (Score:3)
Obligatory:
Imagine a Beowulf cluster of zero cost 4004s....
Re: (Score:3)
So the value of dollar went down by over factor of 5 since 1971.
In 1971 the US dollar was pegged to gold at $35 per oz; it's ~$1700 today. I don't remember exactly when during the Nixon administration the US decoupled the dollar from gold, but I think it was after the election in 1972. At any rate, an oz of gold would just about buy a 4004 in 1971 and a ~3.5GHz 6-core Xeon today.
Re: (Score:2)
I don't remember exactly when during the Nixon administration the US decoupled the dollar from gold but I think it was after the election in 1972.
Actually it was August 15, 1971, three months prior to the release of the 4004. http://en.wikipedia.org/wiki/Nixon_Shock [wikipedia.org]
Re: (Score:2)
Our progress is being destroyed by our monetary policy.
Money is nothing more than some arbitrary magnetic patterns on some mainframe disk drive. Don't waste your time obsessing over it.
You don't keep money around for decades; It's a short-term medium used to buy real investments.
In the meantime, if you want progress, quit whining and start working towards progress.
Re: (Score:2)
It's a short-term medium used to buy real investments.
Yeah you try eating your illiquid assets one day when you're in a pinch. The more money you have, the more wealth you have to keep around as cash. And that's when inflation bites you in the ass.
Re: (Score:2)
Try eating your money.
Re: (Score:2)
Hebrews 13:5
Re: (Score:2)
I will never leave thee, nor forsake thee.
I claim breach of contract.
Re: (Score:3, Funny)
You would have been if your computer didn't run a 4004 microprocessor.