
Intel Allows Release of Full 4004 Chip-Set Details 124

Posted by timothy
from the cool-move-from-the-valley dept.
mcpublic writes "When a small team of reverse engineers receives the blessing of a big corporate legal department, it is cause for celebration. For the 38th anniversary of Intel's groundbreaking 4004 microprocessor, the company is allowing us to release new details of their historic MCS-4 chip family announced on November 15, 1971. For the first time, the complete set of schematics and artwork for the 4001 ROM, 4002 RAM, 4003 I/O Expander, and 4004 Microprocessor is available to teachers, students, historians, and other non-commercial users. To their credit, the Intel Corporate Archives gave us access to the original 4004 schematics, along with the 4002, 4003, and 4004 mask proofs, but the rest of the schematics and the elusive 4001 masks were lost until just weeks ago when Lajos Kintli finished reverse-engineering the 4001 ROM from photomicrographs and improving the circuit-extraction software that helped him draw and verify the missing schematics. His interactive software can simulate an ensemble of 400x chips, and even lets you trace a wire or click on a transistor in the chip artwork window and see exactly where it is on the circuit diagram (and vice-versa)."
  • Awesome! (Score:1, Funny)

    by Anonymous Coward
    Maybe that means that computer architecture classes can finally start using real chips to study rather than made up chip designs?

    One of the things I hated most about my computer arch class was that we had to learn about a completely made up system design which didn't translate to ANYTHING in the real world. Oh yeah, and it was RISC. *Snoooreeee*
    • by CityZen (464761)

      I don't think you can get more RISC than the 4004's instruction set. Remember, it's a 4-bit CPU!

      • Re: (Score:3, Interesting)

        by CityZen (464761)

        I guess I'm wrong. They crammed 45 instructions into the architecture using instruction words of varying width.

        • Re: (Score:3, Interesting)

          Real programmers use wave diagrams - far more subtle than butterflies.

          I have an original hardcopy Intel 4004 User's Guide I nabbed from the 1970 Wescon exhibition. Reading through that - butterflies. Yes, the quantum weather software butterfly would have been an easier IDE.

        • IMHO the 400x designs should have fallen into the public domain long ago, i.e. the government-granted monopoly on the design revoked after 28 years (per the original 1790 copyright act).

    • Re: (Score:3, Informative)

      by m85476585 (884822)
      Real chips were made up at some point. Computer architecture classes should teach you the concepts, then when you go work for Intel you can find out all about the latest secret architectures, and you can apply what you learned in CA to making them better. Obviously you can't expect Intel to give out schematics for Core i7's or they would quickly go out of business.
      • Re: (Score:1, Offtopic)

        by seifried (12921)
        Yeah, because there are a lot of microchip pirates with several billion dollars lying around to create a modern chip fab and copy CPUs willy-nilly, putting Intel out of business inside of a few weeks probably (heck, with the speed of modern chip pirates probably a few days!).
        • Yeah, because there are a lot of microchip pirates with several billion dollars lying around to create a modern chip fab and copy CPUs willy-nilly, putting Intel out of business inside of a few weeks probably (heck, with the speed of modern chip pirates probably a few days!).

          You're being facetious, but you're forgetting that the folks at AMD, NVidia, VIA, IBM, and ARM would all love to get a look at the inner workings and design specifications of the latest Core i7. There's only so much that looking at one

          • by m85476585 (884822)
            I thought AMD was starting to outsource its fab work to save money. Link [infoworld.com]

            That means whoever they are outsourcing to probably has 45nm (newest Phenoms) and certainly 65nm capability. Maybe no one with a 45nm process would clone an Intel chip (if all the 45nm fabs are in countries where there would be a risk of lawsuit, for example), but someone with a 65nm process could clone a slightly older Core2Quad, which are still fairly competitive with the i7's.
            • I thought AMD was starting to outsource its fab work to save money

              Not exactly. AMD spun off its fab business (as The Foundry Company) to make it easier for AMD to use multiple companies for production and for The Foundry Company to get business from multiple chip designers. This means that when the former fab arm of AMD is having problems filling capacity on its latest process, it can now use the excess capacity on the older process to produce chips for other people, and AMD can get other companies to fab its chips.

      • Re:Awesome! (Score:5, Informative)

        by loose electron (699583) on Monday November 16, 2009 @04:11PM (#30121508) Homepage

        For the most part, newer digital designs are language driven, not schematic driven. The advent of Verilog and VHDL led to purely digital designs being done entirely in code.

        Some special devices are still done with transistor-level design, but synchronous logic these days is an HDL (hardware description language) followed by gate-level synthesis, then automatic placement and routing.

        High-performance parts do get a lot of fine tuning along the way, but for the most part, digital chips are created as a coding exercise.

        • by m85476585 (884822)
          Of course--A graphical schematic with 600 million transistors would be useless.

          Intel probably has a custom HDL compiler/synthesizer, which they use to create the actual gates (or a description of gates they send to be manufactured, to be precise). If someone wanted to make an exact copy of an Intel chip, they would need the output, so a listing of gates and wires and their positions, not the code that went into Intel's compiler/synthesizer (unless they had access to that too). Otherwise there's no way
        • That is way oversimplifying what is needed to make a competitive chip. If it was that easy, there would be a lot of people doing it, giving Intel a lot more competition than they have. And it wouldn't take ~2-3 years per generation.

          In order to get high performance (== high frequency, and == reasonable die size), you cannot rely completely on automated tools.

      • by hairyfeet (841228)

        I wouldn't expect them to give out Core, or hell, even the "smoking hot" Netburst P4 (damn, that thing was a space heater!), but why not the old x86 designs? I mean, is there anybody out there who has a real use for a 286 or 386 except for history class? It would be nice to check out those old designs, and I doubt they'd be giving away any trade secrets on Core with chips that old.

        In fact it would be cool, at least IMHO if we could see Intel 386 VS AMD VS Cyrix VS WinChip, just to see how each company

        • by mattack2 (1165421)

          I mean is there anybody out there that has a real use for a 286 or 386 except for history class?

          Aren't many microcontrollers based on older CPUs than that? e.g. 8 bit microcontrollers.

          • by hairyfeet (841228)

            Actually I think most of those you'll find are built on the classic Zilog Z80 [wikipedia.org] arch. They are extremely low power, used in everything from classic PCs to MIDI designs so the chips are well known, and can be made royalty free with a license from Zilog.

            So if you are looking at the embedded space, I'm betting the good old Z80 is what you would find the most. From talking to engineering buds who have worked with it, they say it is a sweetheart to program for and use. The only thing I remember the old 286/386 de

    • by ByOhTek (1181381)

      Likewise for me, something like "SAM". It was a nice simple case, but not terribly interesting.

      But maybe that's why they do the fake arch - because a real arch would be too complex? At least, that would explain undergraduate classes.

    • Re: (Score:3, Interesting)

      by tehSpork (1000190)
      Unfortunately the Intel 4004 is much less sophisticated than even the simplistic models I studied as an undergrad. Not to mention that real chips suffer from real compromises and real problems, something our academic fantasy-land models never had to deal with. The simple models allow the students to learn the important concepts (such as multi-cycle instructions, pipelining, caching) without having to worry about why it was implemented a certain way; the concepts are what counted.

      In my computer architectu
      • by nurb432 (527695)

        IA32?

        Damn kids these days.. Back when I was your age we had 8 bits and appreciated it!

        All kidding aside, learning the Z80 inside and out ( and designing my own 8 bit machine later ) didn't hurt me one bit.

        • Did it hurt you eight bits? :-)

          Kidding aside, I was a 6502 guy back in the day. There was a book called "How to build a microcomputer and really understand it" (or something along those lines) that took you through what all the control lines did and how the interrupts worked, etc. That, and the reading through the OS listing for the Atari 400/800 really gave me a firm grasp on how it all fits together.

          I'd really like to get a copy of that book again (loaned it out, never got it back). It had printed
    • Re:Awesome! (Score:5, Interesting)

      by dissy (172727) on Monday November 16, 2009 @04:46PM (#30122188)

      One of the things I hated most about my computer arch class was that we had to learn about a completely made up system design which didn't translate to ANYTHING in the real world. Oh yeah, and it was RISC. *Snoooreeee*

      That's only because you dropped out before getting to the FPGA [wikipedia.org] classes!

      Any functional CPU design (technically non-functional ones too, for whatever good that would do) can be flashed into an FPGA and become as real as any other silicon chip.

      And just as with pseudocode, a pseudo chip design can be translated into any real code/fab language by anyone who knows basic design and the target language. You were supposed to be learning the basic design part, so once you got to using a real language used in the real world, you would have some clue what to do with it.

      • Very true

        One of the extra courses I could take was making our 32-bit MIPS design run on FPGAs. In that class the teachers would give us pre-designed modules for the memory controller, IO (keyboard), and video, to boot a very simple OS on them.

        Didn't take that course though.

    • by Alioth (221270)

      Learning about designing your own CPU from scratch? Snore?

      I think you may be on the wrong course.

  • by CityZen (464761) on Monday November 16, 2009 @03:39PM (#30121050) Homepage

    When we get the Core i7 details, will it seem as quaint as the 4004 does now?

    • Re:So in 2047... (Score:4, Interesting)

      by V!NCENT (1105021) on Monday November 16, 2009 @04:00PM (#30121368)

      At that point in time retired Intel employees would say: "It was all binary... You know, ones and zeros on silicon *audience laughs*, which was basically a bunch of sand. Heh... And at that time we were bumping against the limits of this technology, so we decided to bake a multitude of them on a single die. Haha... dear God... can you imagine? *audience laughs* Programming this was, well, you can imagine, not so pretty. Taking advantage of this technology was still very hard at that time, but OpenCL largely made up for it, so... Any questions?"
      -"I worked for a RAM company at that time. And I realised that while the CPU was in fact doing everything in parallel, the RAM was actually serially read out. What was your stand on this?"
      "Uhm... *audience laughs* That question is for [person sitting next to the speaker]. *audience laughs harder*"

      I think that the Core i7 is a little bit too complex to understand right away. I mean, with the 4004 everything was really, really basic. It had a design team consisting of four people. Nowadays it takes a whole team to improve it all. So I guess the answer is no.

      • by Elder Entropist (788485) on Monday November 16, 2009 @04:19PM (#30121608)

        I mean, with the 4004 everything was really, really basic. It had a design team consisting of four people. Nowadays it takes a whole team to improve it all.

        Yes, one person for each bit. Nowadays you need 64 or 128 person teams.

        • by nschubach (922175)

          How are you supposed to find out if the chip is working right if you don't have enough people to stand or sit based on their current instruction?

          • That's why Intel's HR department has such a high turnover rate. Scheduling vacation time is a massive headache, let alone the unexpected family emergencies. They've tried to automate it, but there's a lot sitting in the inbox to process at any time.

        • Re: (Score:1, Funny)

          by Anonymous Coward

          And quantum computers require one person per qubit.

          The only problem is they're both working on it and not working on it at the same time ... if you know what I mean.

          • by frozenray (308282)

            > The only problem is they're both working on it and not working on it at the same time

            I imagine this poses one hell of a problem for middle management when it comes to year-end reviews. I don't know, do they put their developers into boxes containing poison gas flasks linked to Geiger counters in order to determine who's slacking off and who's actually working?

    • Re: (Score:3, Interesting)

      by MobyDisk (75490)

      No. (I know the question was rhetorical, but I can't resist answering).

      The 4004 had 2,300 transistors. A college student can create and debug a processor more powerful than that in a semester. It is possible to memorize the entire thing. A Core i7 has around 300 million transistors. Unless human intelligence changes significantly, one human could not memorize and understand 300 million transistors.
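
      The scale gap described above is easy to quantify with a quick sketch (the 2,300 figure is the 4004's documented transistor count; the 300 million i7 figure is the commenter's rough number, not an exact spec):

      ```python
      # How many entire 4004-sized designs fit inside one Core i7,
      # comparing by raw transistor count.
      T_4004 = 2_300          # Intel 4004 transistor count
      T_I7 = 300_000_000      # rough Core i7 figure quoted in the comment above

      ratio = T_I7 // T_4004
      print(ratio)            # ~130,434 whole 4004s' worth of transistors
      ```
      
      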

      • Re: (Score:1, Insightful)

        by Anonymous Coward

        Suicide: commit it.

      • by CityZen (464761)

        > Unless human intelligence changes significantly...

        Ah, so now we get to the meat of the matter!

      • by Alioth (221270)

        Although a great many of those transistors will be the same thing over and over again - the cache.

  • Italian business (Score:5, Interesting)

    by VincenzoRomano (881055) on Monday November 16, 2009 @03:44PM (#30121116) Homepage Journal
    It'd be nice to remember that the Italian Business [wikipedia.org] was a good thing in this case at least!
  • I wonder what clockspeed it would get. I know it's completely useless/pointless, but I'd be interested to see anyway.

    • Re: (Score:3, Interesting)

      by hydromike2 (1457879)

      Better question: how would they physically handle a processor that small? The 4004 has 2,300 transistors, http://en.wikipedia.org/wiki/Intel_4004 [wikipedia.org], and the i7 has 731 million transistors at 45nm on a 263 mm^2 die, http://www.legitreviews.com/article/824/1/ [legitreviews.com]. So by those numbers, the 4004 on a 45nm process would have an area of 0.00082749 mm^2, or 1/317826th the physical size of an i7 die. Disclaimer: this is a very rough calculation, but in any case it is more than 5 orders of magnitude smaller than an i7. On the other
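
      The back-of-envelope arithmetic above can be reproduced with a short sketch (assuming straight transistor-density scaling, which ignores pads, I/O drivers, and wiring overhead):

      ```python
      # Rough die-area scaling: assume the 4004's 2,300 transistors occupy
      # silicon at the same density as the i7's 731 million in 263 mm^2.
      I7_TRANSISTORS = 731_000_000
      I7_AREA_MM2 = 263.0
      T4004_TRANSISTORS = 2_300

      density = I7_TRANSISTORS / I7_AREA_MM2      # transistors per mm^2 at 45nm
      area_4004 = T4004_TRANSISTORS / density     # hypothetical 4004 die area
      ratio = I7_AREA_MM2 / area_4004             # how many 4004s per i7 die

      print(f"4004 at 45nm: {area_4004:.8f} mm^2")   # ~0.00082749 mm^2
      print(f"i7 die is ~{ratio:,.0f}x larger")      # ~317,826x
      ```
      
      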

      • I thought Intel was already doing something like this? It was going to be somewhat similar to a Cell processor, except with something like 128 Pentium 1 cores on it.

    • Re: (Score:3, Informative)

      Probably the same 740kHz that the original 4004 had.

      The manufacturing process used has nothing to do with the maximum clock speed a chip can achieve. It's about energy bleeding (heat loss) and the transistor density. If you manufacture a 4004 using 1950's-era technology, with actual honest-to-goodness 1mm-thick copper wire and large physical transistor switches, it'd be a *lot* bigger, but it'd achieve the same 740kHz that the design allows for.

      The reason using a smaller manufacturing process translates int

      • by mako1138 (837520) on Monday November 16, 2009 @04:54PM (#30122302)

        This means that you can cram more transistors in to the same area of silicon, allowing you to complete more operations per clock cycle.

        This is true, but smaller process nodes also produce faster transistors. When you make things on the chip smaller, you have the practical effect of reducing parasitic capacitance in transistors and interconnect. Lower capacitance means a smaller RC time constant (using a first-order model), so logic will work faster. Intel's 45nm process can create an inverter with a delay of less than 5 ps.

        Your statements imply that transistors have a fixed speed, and that the only way to improve performance is parallelism. This is false.
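
        The first-order RC argument above can be sketched numerically (the resistance and capacitance values below are invented illustrative numbers, not real 45nm parameters):

        ```python
        # First-order RC gate-delay model: delay ~ 0.69 * R * C.
        # A process shrink lowers parasitic capacitance C, so delay drops
        # even with the same effective on-resistance R.
        def rc_delay(r_ohms: float, c_farads: float) -> float:
            """Time (s) for an RC node to cross the 50% switching point."""
            return 0.69 * r_ohms * c_farads

        R = 10_000.0      # effective transistor on-resistance (illustrative)
        C_OLD = 2e-15     # parasitic capacitance at the older node (illustrative)
        C_NEW = 1e-15     # roughly halved after a process shrink

        print(rc_delay(R, C_OLD))   # 13.8 ps
        print(rc_delay(R, C_NEW))   # 6.9 ps: same R, half C, half delay
        ```
        
        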

        • by Dadoo (899435)

          smaller process nodes also produce faster transistors

          I was thinking the same thing. In fact, I'd be inclined to believe that, since the resulting chip would be so small, you could actually get it up to a higher clock speed than a current CPU. However, you wouldn't be able to interface it with anything, because you'd never get I/O signals at that frequency off-chip, without ruining them. You'd need to have at least some memory and some type of I/O controller on the same chip, to make it work.

    • by Hatta (162192)

      For that matter, what if you made a CPU with a hundred million [slashdot.org] of these?

    • by mako1138 (837520)

      This would be an interesting homework problem for a digital design class. First, find the single-cycle instruction that will take the longest amount of time. Then, figure out the critical path. Find the logic delay given a particular modern standard cell [wikipedia.org] library.
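
      The homework problem sketched above boils down to a longest-path search over a gate-level DAG; here is a minimal sketch, with made-up gate names and delay values rather than any real cell library:

      ```python
      # Critical-path delay through a small combinational netlist, modeled
      # as a DAG of gates with per-gate delays in picoseconds.
      # Gate names and delay numbers are invented for illustration.
      from functools import lru_cache

      DELAY_PS = {"in": 0.0, "nand1": 9.0, "nand2": 9.0, "nor1": 11.0, "out": 0.0}
      DRIVES = {                 # gate -> gates its output feeds
          "in": ["nand1", "nand2"],
          "nand1": ["nor1"],
          "nand2": ["nor1"],
          "nor1": ["out"],
          "out": [],
      }

      @lru_cache(maxsize=None)
      def path_delay(gate: str) -> float:
          """Worst-case delay from this gate's input to any circuit output."""
          downstream = max((path_delay(g) for g in DRIVES[gate]), default=0.0)
          return DELAY_PS[gate] + downstream

      print(path_delay("in"))   # 20.0 ps: in -> nand -> nor1 -> out
      ```
      
      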

  • by Anonymous Coward

    Imagine a beowulf cluster of 4004 emulators...

  • by filesiteguy (695431) <kai@perfectreign.com> on Monday November 16, 2009 @03:52PM (#30121260) Homepage
    ...run Linux?

    j/k

    This should actually be quite cool. I can see garage-based tinkerers messing with this chip and its registers, and even coming up with a retro user group.
  • Imagine a beowulf cluster of these!
    • In Soviet Russia, 4004 releases details of you!
    • by NoYob (1630681)

      Imagine a beowulf cluster of these!

      First Post said just that.

      I know that most first posts are GNAA trolls, or something else pretty obtuse, but come on! You're waaaay down here and you honestly thought you'd be the first one to post that?!

      There's already been a "Does it run Linux?" post and if I dug into the -1s, I'm sure there would be a "In Soviet Russia, 4004 processes you!" or some such thing about Cowboy Neal's something using 4004 in the description.

      These are things one learns in the first few days of Slashdotting.

      Man, go and read "S

      • by Splab (574204)

        You are of course aware that Slashdot presents posts in a different order depending on your settings? So while his post might be "way down there" in your view, in other views it will show up at the top.

        Perhaps you yourself should go look for that elusive slashdot for beginners...

  • by Anonymous Coward

    Cruising over to 4004.com gives "page cannot be displayed". While I'm sure it's slashdotted, I can't help but wonder if they used one for their web server......

  • by Anonymous Coward on Monday November 16, 2009 @04:29PM (#30121806)

    http://www.intel4004.com/ [intel4004.com] goes into much greater detail about Federico Faggin (primary co-developer and project leader), and the story of his accomplishments before and at Intel, his physical signature on all 4000 series chips, Intel's successful attempt to discredit him and patent his invention (the buried gate) that he invented at Fairchild before coming to Intel, and his departure to found Zilog with some members of his older design team.

    Intel has been playing their game their way for a very long time.

    • http://www.intel4004.com/ [intel4004.com] goes into much greater detail about Federico Faggin (primary co-developer and project leader), and the story of his accomplishments before and at Intel, his physical signature on all 4000 series chips, Intel's successful attempt to discredit him and patent his invention (the buried gate) that he invented at Fairchild before coming to Intel, and his departure to found Zilog with some members of his older design team.

      Intel has been playing their game their way for a very long time.

      It's a pity and a big mistake that such a great engineer has not yet received a Nobel Prize. He revolutionized our world.

      • Engineers don't get Nobel Prizes. The prizes are for science not engineering.
        • Engineers don't get Nobel Prizes. The prizes are for science not engineering.

          In fact I meant the Nobel in physics. The silicon gate technology, invented by Faggin, is essential to the rise of microprocessor technology, but it is indeed a physics discovery.

          • No, it's an invention. Inventions are engineering, discoveries and theories are science. Process technologies are definitely not scientific discoveries. The photovoltaic effect is a physics discovery, an efficient photovoltaic cell is an engineering invention. The former can get a Nobel Prize, the latter can not.
            • No, it's an invention. Inventions are engineering, discoveries and theories are science. Process technologies are definitely not scientific discoveries.

              It seems that you are right; the Nobel committee emphasizes discoveries over inventions, but this was not the original intention of Alfred Nobel [wikipedia.org].
              So, if Sir Alfred were still alive, he could give him the prize :-D

    • by Alioth (221270)

      The site doesn't make it clear - was Faggin shafted by Intel while working for them (and left to form Zilog as a consequence), or did Faggin leave and start Zilog, and then Intel tried to discredit him out of sour grapes?

  • by SwedishChef (69313) <craig.networkessentials@net> on Monday November 16, 2009 @04:31PM (#30121850) Homepage Journal

    In the very early 70s our engineering group was interested in using the new 4004 to simplify the production of control systems for heavy machinery (windlasses, hydraulic systems, etc). The machinery itself was slightly different from contract to contract and even from item to item within a contract so we had to design a new control system for each unit. When the 4004 came out we were excited to see if we couldn't do it cheaper and faster using a microprocessor.

    We had moved from relays and discrete wiring to CMOS components on printed circuit boards and thought that was a big step. CMOS could be run at 15vdc which meant that the noise inherent in the environments our machinery worked in would not be quite as big a problem.

    Unfortunately we discovered that we had several problems, including the limited instruction set and memory capabilities of the 4004, along with the lower voltages needed, so we stuck with CMOS until I left a couple of years later.

    Still, the 4004 was my introduction to microprocessors and that changed the course of my career from electronics and electronic control systems to digital control systems and computers.

    It's been an exciting ride, too. I am grateful to have grown up with the technology.

    • By the time I hit the streets, the 80386 was the hotness and the 486 was just around the corner. I love hearing tales of the trenches from the Good Old Days. As exciting as technology remains to me, working in the field and using it constantly, I still miss the simpler times. Maybe it's because what I know now would have made me a Technology God 20 years ago, or maybe it's just because there was something different about a time where things were changing rapidly but the field was still on a scale where s

  • by serviscope_minor (664417) on Monday November 16, 2009 @05:38PM (#30122940) Journal

    Available for non-commercial use? Are they even entertaining the possibility that someone might try to profit from the design?

    • Given the morbid fascination the geek world often has with retro computing, it's not something I'd rule out myself.

    • by sootman (158191)

      Yes! I intend to use this documentation as a starting point for my own product line. I hope to learn quickly and make more advanced designs, which will also be smaller, and I will compete directly with Intel. I will call my company Advanced Micro Designs. [wikipedia.org]

  • Somewhere around 1975 or 1976 I worked at the micro-electronics lab at Point Mugu Naval Air Station. We did a number of projects using a 4004 and those awful 1702 EPROMs. I remember using one to run an X/Y table and sensor probe to test thick film (might have been thin film) resistor wafers. If a chip wasn't in tolerance, a drop of magnetic ink would be dropped on it.

    We used a timeshare service via a Model 33 teletype with acoustic modem to access a 4004 assembler. It would spit out a paper tape that we wou
  • I just need to build my own chip fab in my garage and acquire hundreds of hazardous chemicals that are sure to get me on DHS's shit list....
