Intel's 4004 Microprocessor Turns 45 (4004.com)

mcpublic writes: Tuesday marked the 45th anniversary of the 4004, Intel's first microprocessor chip, announced to the world in the November 15, 1971 issue of Electronic News. It seems that everyone (except Intel) loves to argue whether it was truly the "first microprocessor"... But what's indisputable is that the 4004 was the computer chip that started Intel's pivot from a tiny semiconductor memory company to the personal computing giant we know today. Federico Faggin, an Italian immigrant who invented the self-aligned silicon-gate MOS transistor and buried-contacts technology, joined Intel in 1970. He needed both his inventions to squeeze the 4004's roughly 2,300 transistors into a single 3x4 mm silicon die. He later went on to design the Intel 8080 and the Zilog Z80 with Masatoshi Shima, a Japanese engineer with a "steel trap mind," the once-unsung hero of the 4004 team [YouTube].
Long-time Slashdot reader darkharlequin also flags the "fascinating, if true" story of Wayne D. Pickette, who was hired by Intel in 1970, worked on the 4004 project, and according to ZDNet "claims that prior to that, during his job interview with Intel founder Bob Noyce, he showed the company a block diagram of a microprocessor he'd started to work on three years previously when he was 17."

  • by Anonymous Coward

    Does anyone know if any 4004s are still functioning in a regular everyday way (not museums / collections)?

    • Re:Any still used? (Score:4, Interesting)

      by Lisandro ( 799651 ) on Saturday November 19, 2016 @03:48PM (#53322693)

      Don't think so. Around 1980 a lot of devices turned to the Z80 for a cheap 8-bit microprocessor - that one, yes, is still widely used.

      • by Anonymous Coward

        Around the '80s there was also a lot of 8-bit stuff from Intel, but it was mostly based on the 8085 (as was the Z80, from an ABI perspective, if I recall correctly - of the Z80's ~153 instructions, 70-something were the complete 8085 instruction set). Some of this tech is still widely available today (albeit with completely different manufacturing processes and usually a ton of additions) as the 80x31/80x51 line of microcontrollers (and a ton of clones that extend the base instruction set).

      • Re:Any still used? (Score:4, Interesting)

        by MichaelSmith ( 789609 ) on Saturday November 19, 2016 @05:01PM (#53322975) Homepage Journal

        Yeah I am pretty sure a lot of the road traffic monitoring/incident detection systems in my city run on Z80s.

    • They were used for the controllers in high-end Litton microwave ovens built in the mid-1970s, so it is possible.
    • My best guess would be in some industrial setting like an early CNC machine.

    • Yes, Cateye bicycle speed indicators.

    • I doubt it. It was made for a specific product, for which it was inadequate. It was barely able to implement a four-function calculator, and needed a lot of support logic to do that.
    • Yes, the Voyager spacecraft is still running (though difficult to communicate with, as it has left our solar system) and uses an Intel 4004.

      • by ogdenk ( 712300 )

        Wrong, Voyager used a mutant custom architecture. The first real microprocessor in space was the RCA 1802, which flew on MAGSAT, Galileo, and even Hubble. The 1802 was also the first microprocessor built in a radiation-hardened version. Kinda weird, but it was a cool CPU and very power-efficient, though a bit slower than the 8080 or 6502.

        • The RCA 1802 (COSMAC) was a CMOS micro, which gave it the capability of low power and static operation. Most early micros were NMOS, but the 4004 was PMOS.
      • This is a myth. Viking and Voyager used custom processors.
    • The 4004 was used in the first electronic taximeter, the Argo Kienzle 1140, and these were in service for many decades, but have since been largely replaced. I'd speculate that there are still some traffic lights that have an Intel 4004 inside. The chip went out of production in 1986.
      • Finally, a tech story - a rarity for /. these days

        I wonder if one could take the 4004 and all its supporting logic, along w/ even some RAM, and put it together in an FPGA? Along w/ maybe code to control things like traffic signals? That could then get a huge market in the Third World
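
        One could prototype the software side first. Here's a minimal C sketch of a 4004-style core, modeling just three instructions - the LDM/XCH/ADD encodings below follow the published 4004 opcode map as best I can tell, but treat it as illustrative, not a verified model:

        /* Hypothetical sketch: a 4004-like core with a 4-bit accumulator
           and sixteen 4-bit registers, fetching 8-bit opcodes from "ROM". */
        #include <stdio.h>
        #include <stdint.h>

        int main(void) {
            uint8_t acc = 0, carry = 0;   /* 4-bit accumulator + carry  */
            uint8_t reg[16] = {0};        /* sixteen 4-bit registers    */
            /* LDM 7; XCH 0; LDM 5; ADD 0  ->  acc = 5 + 7 = 12 */
            const uint8_t rom[] = { 0xD7, 0xB0, 0xD5, 0x80 };
            for (size_t pc = 0; pc < sizeof rom; pc++) {
                uint8_t op = rom[pc], n = op & 0x0F;
                switch (op & 0xF0) {
                case 0xD0: acc = n; break;               /* LDM: load immediate   */
                case 0xB0: { uint8_t t = reg[n];         /* XCH: swap acc <-> reg */
                             reg[n] = acc; acc = t; } break;
                case 0x80: { uint8_t s = acc + reg[n] + carry;  /* ADD with carry */
                             acc = s & 0x0F; carry = s >> 4; } break;
                }
            }
            printf("acc=%u carry=%u\n", acc, carry);     /* prints acc=12 carry=0 */
            return 0;
        }

        Wrap the same fetch/decode loop in Verilog and you'd have the FPGA version; the supporting logic is the hard part, not the core.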

  • Wow! (Score:2, Troll)

    by backslashdot ( 95548 )

    FFFUCK. This thing is amazing! It's got a whopping sixteen 4-bit registers! Finally ... I have been waiting a while for this. And I can't believe they've actually got the accumulator and push-down stack on ONE chip ... not to mention the 4-bit parallel adder. Can't wait to purchase.

    • by Adam C ( 4306581 )
      You may mock, but I cut my teeth on the 6502 and Z80, which weren't that much more advanced than this chip.

      It's not what you have, it's how you use it -.o;
      • by _merlin ( 160982 )

        Yeah, sixteen registers is actually quite a lot - for a long time you were lucky to get eight general-purpose registers.

      • I wrote assembly for the Z-80 back in 1978 on a TRS-80.

        I also wrote articles for 80-Microcomputing.

        I designed a thermometer and a battery checker using an A-D converter from Analog Devices.

        I wrote a sales rep in Houston asking if I could buy a single chip.

        They sold for $22 each, in lots of 1,000.

        He agreed to sell me one in exchange for rights to use the article after I published.

        Great times.

        I had heard of the 4004, but I never messed with it.

      • Ehh, I wasn't mocking it at all, but I can see how it can come off that way to some people.

      • Maybe, but those were far better. For all the nostalgic fondness, the 4004 had a lot of compromises in order to make it viable. When you have 8 bits to play with you can actually do something useful in a single register. They're useful enough that they're both still in production, while the 4004 hasn't been since 1981 or something.
    • Sixteen four-bit registers is not what you think ...

      That holds ONE sixteen-digit decimal number - or, marginally more usefully, two eight-digit ones (unsigned, of course). The instructions were 8 bits wide. That is what you call RISC! (A concrete example of juggling those nibbles is sketched below.)

      If you really want to experience the true horrors of an early 4-bit micro, you can probably still get the National Semiconductor COP range. I used the high-end parts to implement ASCII pagers, an ECG, and several selective-calling radios (a kind of primitive cellphone) - inc
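
      To make that register arithmetic concrete, here's a plain-C sketch (not actual 4004 code) of what calculator firmware had to do: two unsigned 8-digit numbers, one BCD digit per 4-bit register, added digit by digit with a decimal carry:

      #include <stdio.h>
      #include <stdint.h>

      int main(void) {
          /* reg[0..7] = 12345678, reg[8..15] = 87654321,
             one decimal digit per 4-bit register, least digit first */
          uint8_t reg[16] = { 8,7,6,5,4,3,2,1,  1,2,3,4,5,6,7,8 };
          uint8_t carry = 0;
          for (int i = 0; i < 8; i++) {          /* sum overwrites operand 1 */
              uint8_t d = reg[i] + reg[8 + i] + carry;
              carry = d > 9;                     /* decimal carry-out */
              reg[i] = carry ? d - 10 : d;       /* keep each digit in 0-9 */
          }
          for (int i = 7; i >= 0; i--)
              printf("%u", reg[i]);              /* prints 99999999 */
          printf(" carry=%u\n", carry);
          return 0;
      }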

    • Dude, this is 1970. Before that we had mainframes the size of rooms, punch-card storage, and computers with boards and boards stacked up to the size of a fridge. That one we called a minicomputer, because it was so small and had up to 1 meg of RAM.

      • There were computers smaller than a fridge or a room before 1970.

        Here's a famous one:

        https://en.wikipedia.org/wiki/... [wikipedia.org]

  • The first era of Silicon Valley was the commercialization of the transistor between the 1950s and early 1970s by Shockley and his renegades from the East Coast. This established the culture of quick startup companies and nomadic engineers you really hadn't seen elsewhere in the world. Then, when the number of transistors on a single chip exceeded a thousand using cheap MOS technology, you could put a whole CPU on a chip and a complete computer in a box for a couple thousand dollars. This led to the second era of th
  • by XNormal ( 8617 ) on Saturday November 19, 2016 @04:29PM (#53322877) Homepage

    I was born on the same date (... 4-digit Slashdot ID checks out...). I have been using microcomputers since I was 10. I have never worked at anything other than software and hardware development.

    Our contemporary computing ecosystem has evolved from the microcomputers I was born with. They actually have some architectural details that can be traced to the 4004's successor, the 8008.

    Our computers are not descendants of the mainframes that came before them. By now, they have acquired many of the advanced features of mainframes. Implemented badly, several decades later. It is fascinating to learn about the history of mainframes. It is also somewhat depressing.

    Those who do not learn from history are doomed to repeat it. Those who do learn are doomed to watch everyone else repeat it.

    • People want their computers to be mobile, and this means they can't do much computation. Also, it's easier to make money when stuff is in the cloud, or so it seems at least.

      • I think a modern phone could out-crunch an IBM System/370 mainframe quite easily. The question is, why would anyone want that?

        • Could, but usually doesn't. As the hardware was more costly and slower, and labor relatively cheaper, mainframes ran in some sense "better" code with far less bloat and frillage. An A was just an A (ASCII or Baudot or EBCDIC) - not a picture of a letter in some font taking many times the bits to store and draw, for just one example. Audio or video, which were (and still are) largely irreducible to small bits/second, were right out for real-time use. Mainframes had "acceleration" hardware to compensate. Line printers took a few bits and did the drawing parts (as did plotters for other uses).
          • Could, but usually doesn't. As the hardware was more costly and slower, and labor relatively cheaper, mainframes ran in some sense "better" code with far less bloat and frillage. An A was just an A (ASCII or Baudot or EBCDIC) - not a picture of a letter in some font taking many times the bits to store and draw, for just one example. Audio or video, which were (and still are) largely irreducible to small bits/second, were right out for real-time use. Mainframes had "acceleration" hardware to compensate. Line printers took a few bits and did the drawing parts (as did plotters for other uses).

            The old line printers didn't do any drawing of pixels; they just had a hammer slam a piece of metal, with a character formed on it, against a ribbon, hitting a piece of paper. Laser printers like the IBM 3800 just let the processor in the printer draw all those bits, rather than doing it in the main CPU.

            (And, in a file, an A is still just an A, these days - although if it's UTF-16, it's two bytes rather than one. Those pictures aren't stored in most documents.)
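
            (The two-byte claim is easy to check, e.g. in C11 - a quick illustrative snippet, nothing more:)

            #include <stdio.h>
            #include <uchar.h>   /* char16_t, C11 */

            int main(void) {
                const char     a8[]  = "A";   /* UTF-8/ASCII: 1 byte + NUL   */
                const char16_t a16[] = u"A";  /* UTF-16: 2 bytes + 2-byte NUL */
                printf("UTF-8: %zu byte, UTF-16: %zu bytes\n",
                       sizeof a8 - 1, sizeof a16 - sizeof(char16_t));
                return 0;        /* prints: UTF-8: 1 byte, UTF-16: 2 bytes */
            }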

    • Our computers are not descendants of the mainframes that came before them. By now, they have acquired many of the advanced features of mainframes. Implemented badly, several decades later. It is fascinating to learn about the history of mainframes. It is also somewhat depressing.

      And sometimes it goes the other way; the IBM z13 microprocessor cracks z/Architecture instructions into micro-ops and schedules and executes the micro-ops, just as the Pentium Pro and later x86 microprocessors do.

      But you're probably referring to system architecture characteristics, in addition to CPU characteristics.

      (Speaking of system architecture characteristics, the z13 has I/O instructions to bang on PCI space, so you could plug PCI devices in and have minicomputer/microcomputer-style drivers, rather

    • They actually have some architectural details that can be traced to the 4004's successor, the 8008.

      If this is true, then it is coincidental. The 8008 was originally designed by Datapoint and came in "sideways" to Intel. The 8008 was not an evolution of the 4004.

      • the chip was commissioned by Computer Terminal Corporation (CTC) to implement an instruction set of their design for their Datapoint 2200 programmable terminal.

        - Wikipedia

  • by Ungrounded Lightning ( 62228 ) on Saturday November 19, 2016 @05:49PM (#53323145) Journal

    Federico Faggin ... later went on to design the Intel 8080 and the Zilog Z80 with Masatoshi Shima, a Japanese engineer with a "steel trap mind ...

    Which leaves out the fact that the 8008 was not an in-house-conceived upgrade of the 4004. Instead it was a commission, from Datapoint Corporation, to implement the instruction set of their Datapoint 2200 terminal as a microprocessor chip.

    A failed commission at that: TI dropped out early, and Intel got theirs to work, but with a chip that came in late, and slower than Datapoint's 100-ish chip TTL design (even though the latter's ALU was serial rather than parallel). So Datapoint and Intel agreed to settle the contract, with Datapoint being refunded the costs and Intel getting to sell the chip as their own when they got it finished, and make derivatives.

    Great deal for Intel. Not so hot for Datapoint, whose flagship terminal was now facing competition based on their own instruction set and designs.

    When you cut a deal with a big semiconductor house, you have to watch out for this sort of thing. As I understand it, the TI calculators came from a similar situation where TI built a 4-bit processor as a commission for a calculator manufacturer, then built and sold their own products around it and its follow-ons.

    Similarly with Ford and Motorola. Ford commissioned the processor for the EEC-III without including an option for a spin to incorporate design upgrades identified as very-useful-to-necessary. They found several things that would make the chip better, so they reported them to Motorola in the hopes they'd be incorporated in a follow-on, despite no contractual obligation to do so. Motorola did make a follow-on with the improvements, which they sold to GM. B-b

    So, as with a Deveel, if you think you cut a good deal with a semiconductor company, be sure to count your fingers, then your toes, then your relatives...

    • Not so hot for Datapoint, whose flagship terminal was now facing competition based on their own instruction set and designs.

      I went to work for Datapoint in 1978 and was put to work writing software for the 1500 which, ironically, used a Z80 processor. So Datapoint actually used their own design, two generations removed (i.e. 8008 -> 8080 -> Z80).

      The 1500 wasn't all that successful because it was too affordable. Datapoint salesmen preferred selling the 5500 and 6600, which were much more profitable (commission-wise).

    • by epine ( 68316 )

      A failed commission at that: TI dropped out early, and Intel got theirs to work, but with a chip that came in late, and slower than Datapoint's 100-ish chip TTL design

      That's an extremely loaded definition of "failure".

      Order-of-magnitude integration density improvement using a new and relatively unproven technology—with one foot now securely fastened on Gordon Moore's neck-breaking turbo-lift.

      Nevertheless, I'm sure some impatient PHB with his neck in the traditional delivery noose managed to turn chick

  • by Anonymous Coward

    In 1970 a reasonable CPU would need ~20k transistors, and often had some user-visible registers. In that year a single RAM chip could hold 256 bits of static RAM (registers), the limit being chip area and power dissipation. Furthermore, 16-pin packages were practical, but more pins were too costly. Ted Hoff's trick was using dynamic RAM for the CPU's registers (16 x 4 bits, plus the PC and stack), and a multiplexing scheme so that only 16 pins were needed for each of the 3 chips: CPU, ROM, RAM. Note: the applications for the microco
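
    A sketch of that multiplexing trick, with all timing, sync, and pin details omitted: the 4004 sent a 12-bit ROM address over its single 4-bit bus as three nibbles on successive clock cycles, and this only shows the split and reassembly:

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        uint16_t addr = 0xABC;                   /* 12-bit program address */
        uint8_t  bus[3];                         /* 4-bit bus, 3 cycles    */
        for (int cyc = 0; cyc < 3; cyc++)        /* CPU drives nibbles out */
            bus[cyc] = (addr >> (4 * cyc)) & 0x0F;
        uint16_t seen = 0;                       /* ROM latches them back  */
        for (int cyc = 0; cyc < 3; cyc++)
            seen |= (uint16_t)bus[cyc] << (4 * cyc);
        printf("sent 0x%03X, ROM saw 0x%03X\n", addr, seen);
        return 0;
    }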

    • put I/O on the ROM and RAM chips

      Like the 6502, which only had one bus for ROM, RAM and I/O. But memory-mapped video RAM was obviously I/O as well.
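
      In C, memory-mapped I/O looks exactly like a memory write - a sketch for a hypothetical 6502-era target (the 0x0400 base is borrowed from the C64's default screen memory; on a hosted OS this would simply fault):

      #include <stdint.h>

      /* Hypothetical MMIO base: video RAM sits on the one shared bus. */
      #define VIDEO_RAM ((volatile uint8_t *)0x0400)

      void put_char_at(int offset, uint8_t screen_code) {
          VIDEO_RAM[offset] = screen_code;   /* a plain store IS the I/O */
      }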

  • Imagine a Beowulf Cluster of these!!!

  • If bipolar transistors are faster than IC (MOS) transistors, then why don't they try to make bipolar chips now, seeing as manufacturing may have improved and R&D goes further since chips are big biz? Or do other factors matter more?

    • MIPS or flops/watt, and easy-cheap manufacture. I did some work in the real bipolar world, from discrete to ECL to IIL (integrated injection logic, a TRW thing)... it was "hot stuff". Easier to make complementary CMOS, such that you can make an inverter with just two transistors by tying the gates together - gotta drive both high and low some way. Enhancement-mode CMOS is a lot easier to do that with.
    • Bipolar transistors are current-controlled current sources, which means that bipolar digital logic is always pulling current. Complex bipolar digital logic thus needs a lot of power, so much so that a bipolar equivalent of an Intel i7-series chip is simply not possible. Bipolar logic also tends to require more components per logic function.

      By way of contrast, MOS transistors are voltage-controlled current sources, and the logic circuits are designed so that current is pulled only during transitions.
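
      (First-order, that's the standard dynamic-power model: a CMOS gate dissipates roughly P = a * C * V^2 * f, where a is the fraction of gates toggling each cycle, C the switched capacitance, V the supply voltage, and f the clock - near zero when nothing switches. Bipolar logic adds a static I*V term that never goes away, which is why ECL ran hot even at idle.)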

"If it ain't broke, don't fix it." - Bert Lantz

Working...