Intel Allows Release of Full 4004 Chip-Set Details 124
mcpublic writes "When a small team of reverse engineers receives the blessing of a big corporate legal department, it is cause for celebration. For the 38th anniversary of Intel's groundbreaking 4004 microprocessor, the company is allowing us to release new details of their historic MCS-4 chip family announced on November 15, 1971. For the first time, the complete set of schematics and artwork for the 4001 ROM, 4002 RAM, 4003 I/O Expander, and 4004 Microprocessor is available to teachers, students, historians, and other non-commercial users. To their credit, the Intel Corporate Archives gave us access to the original 4004 schematics, along with the 4002, 4003, and 4004 mask proofs, but the rest of the schematics and the elusive 4001 masks were lost until just weeks ago when Lajos Kintli finished reverse-engineering the 4001 ROM from photomicrographs and improving the circuit-extraction software that helped him draw and verify the missing schematics. His interactive software can simulate an ensemble of 400x chips, and even lets you trace a wire or click on a transistor in the chip artwork window and see exactly where it is on the circuit diagram (and vice-versa)."
Awesome! (Score:1, Funny)
One of the things I hated most about my computer arch class was that we had to learn about a completely made up system design which didn't translate to ANYTHING in the real world. Oh yeah, and it was RISC. *Snoooreeee*
Re: (Score:1)
I don't think you can get more RISC than the 4004's instruction set. Remember, it's a 4-bit CPU!
Re: (Score:3, Interesting)
I guess I'm wrong. They crammed 45 instructions into the architecture using instruction words of varying width.
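For anyone curious what "varying width" means in practice, here is a minimal Python sketch of decoding a few MCS-4 instructions. The opcode encodings shown (LDM = 0xDn, XCH = 0xBn, ADD = 0x8n, JUN = 0x4n plus an address byte) follow the published 4004 instruction set, but this is an illustrative fragment covering only a handful of the 45 instructions.

```python
# Minimal sketch of the 4004's variable-width instruction words.
# Opcode encodings (LDM = 0xDn, XCH = 0xBn, ADD = 0x8n, JUN = 0x4n
# plus an address byte) follow the published instruction set, but
# this decodes only a handful of the 45 instructions.
ONE_BYTE = {0x0: "NOP", 0x8: "ADD", 0x9: "SUB", 0xA: "LD",
            0xB: "XCH", 0xD: "LDM"}

def decode(program):
    """Return (mnemonic, operand) pairs from a byte stream."""
    out, i = [], 0
    while i < len(program):
        hi, lo = program[i] >> 4, program[i] & 0xF
        if hi == 0x4:                       # JUN: two-byte jump, 12-bit addr
            out.append(("JUN", (lo << 8) | program[i + 1]))
            i += 2
        elif hi in ONE_BYTE:                # register index or 4-bit immediate
            out.append((ONE_BYTE[hi], lo))
            i += 1
        else:                               # everything else: not modeled
            out.append(("DB", program[i]))
            i += 1
    return out

# LDM 5; XCH 2; ADD 2; JUN 0x123
print(decode([0xD5, 0xB2, 0x82, 0x41, 0x23]))
```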
Re: (Score:3, Interesting)
I have an original hardcopy Intel 4004 User's Guide I nabbed from the 1970 Wescon exhibition. Reading through that - butterflies. Yes, the quantum weather software butterfly would have been an easier IDE.
Re: (Score:2)
IMHO the 400x designs should have fallen into the public domain long ago, i.e., the government-granted monopoly on that design should have been revoked after 28 years (per the original 1790 Copyright Act).
Re: (Score:3, Informative)
Re: (Score:1, Offtopic)
Re: (Score:2)
You're being facetious, but you're forgetting that the folks at AMD, NVidia, VIA, IBM, and ARM would all love to get a look at the inner workings and design specifications of the latest Core i7. There's only so much that looking at one
Re: (Score:2)
That means whoever they are outsourcing to probably has 45nm (newest Phenoms) and certainly 65nm capability. Maybe no one with a 45nm process would clone an Intel chip (if all the 45nm fabs are in countries where there would be a risk of lawsuit, for example), but someone with a 65nm process could clone a slightly older Core2Quad, which are still fairly competitive with the i7's.
Re: (Score:2)
I thought AMD was starting to outsource its fab work to save money
Not exactly. AMD spun off its fab company (as The Foundry Company) to make it easier for AMD to use multiple companies for production and for The Foundry Company to get business from multiple chip designers. This means that when the fab part of AMD is having problems getting enough capacity on their latest process they can now use their excess capacity on the older process to produce chips for other people and AMD can get other companies to fab their chips.
Re: (Score:2)
Can't touch the China i7 though...
Re:Awesome! (Score:5, Informative)
For the most part, newer digital designs are language-driven, not schematic-driven. The advent of Verilog and VHDL led to purely digital designs done up in code.
Some special devices are still done using transistor-level design, but synchronous logic these days is an HDL (hardware description language), followed by gate-level synthesis, and then automatic place-and-route.
High-performance blocks do get a lot of hand-tuning along the way, but for the most part, digital chips are created as a coding exercise.
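As a toy illustration of what "gate-level synthesis" produces (a sketch of the concept, not any vendor's actual flow), the RTL for a one-bit full adder can be flattened into a named-gate netlist and evaluated in Python:

```python
# Toy illustration of gate-level synthesis output: the RTL statements
#   sum  = a ^ b ^ cin
#   cout = (a & b) | (cin & (a ^ b))
# flattened into a list of named gates and nets, which a place-and-route
# tool would then position and wire on the die.
GATES = [
    ("XOR", "t1",   ("a", "b")),
    ("XOR", "sum",  ("t1", "cin")),
    ("AND", "t2",   ("a", "b")),
    ("AND", "t3",   ("t1", "cin")),
    ("OR",  "cout", ("t2", "t3")),
]

def evaluate(netlist, inputs):
    """Propagate input values through the netlist in listed order."""
    nets = dict(inputs)
    for kind, out, (x, y) in netlist:
        a, b = nets[x], nets[y]
        nets[out] = {"XOR": a ^ b, "AND": a & b, "OR": a | b}[kind]
    return nets

nets = evaluate(GATES, {"a": 1, "b": 1, "cin": 0})
print(nets["sum"], nets["cout"])   # 1 + 1 + 0 -> sum 0, carry 1
```

The synthesizer's real output is exactly this kind of flat gate/net listing (in a standard-cell library's vocabulary), which is why having the HDL alone still leaves the physical design undone.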
Re: (Score:2)
Intel probably has a custom HDL compiler/synthesizer, which they use to create the actual gates (or a description of gates they send to be manufactured, to be precise). If someone wanted to make an exact copy of an Intel chip, they would need the output, so a listing of gates and wires and their positions, not the code that went into Intel's compiler/synthesizer (unless they had access to that too). Otherwise there's no way
Re: (Score:1)
That is way oversimplifying what is needed to make a competitive chip. If it was that easy, there would be a lot of people doing it, giving Intel a lot more competition than they have. And it wouldn't take ~2-3 years per generation.
In order to get high performance (== high frequency, and == reasonable die size), you cannot rely completely on automated tools.
Re: (Score:2)
Re: (Score:2)
Aren't many microcontrollers based on older CPUs than that? e.g. 8 bit microcontrollers.
Re: (Score:2)
Re: (Score:1)
Intel is making tons of money off their chips, and they have little competition. There's AMD, but they are only barely keeping up. If the Core i7 schematics were released, any old fab company could start making their own i7's for next to nothing. There would be no R&D to pay for and almost no cost to them other than what actually goes
Re:Awesome! (Score:5, Informative)
> If the Core i7 schematics were released, any old fab company could start making their own i7's for next to nothing.
Wrong.
Let's even imagine for the moment that you really meant that they'd release the verilog/vhdl, instead of schematics. There are still a few minor problems in the way:
1 - Intel really does have absolutely top-notch processing capability. Typically their top-end CPU pushes their top-end process for all it's worth, both in performance and capacity. (I'll add the caveat that "all it's worth" is a moving bar, which is why speed bumps and die shrinks come along as a process and design mature.) Chances are most fabs in the world simply won't be able to handle the Core i7 - not enough transistors.
2 - Let's pretend that you have a fab that can put out bigger-than-postage-stamp-sized chips, and they can handle the sheer number of transistors. Most likely you still can't hand over such HDL, push a button, and have a layout come out, even a bigger and slower one. For one thing, a significant fraction of those transistors is in cache - probably SRAM. HDL doesn't build SRAM; it instantiates it. You need either a memory compiler or an SRAM design team to get the caches built, and they have to be specifically matched to the interface the HDL is expecting - these aren't garden-variety commodity SRAMs, by any means.
3 - So let's pretend we have SRAMs too, and that the design we had in our back pocket could be tweaked to meet the interface requirements of the Core i7. We have datapath/dataflow problems. In the first place, those datapaths are highly regular - kind of like bit-slices. A lot like bit slices, in fact. Most likely the design was carefully partitioned into functional blocks, and those functional blocks were further partitioned, etc. Then they were floorplanned with an eye to the final design. Far from the smallest concern was getting all of those bits from point-A to point-B to point-C. These things have some pretty big buses inside, and just about everything is high-performance.
In short, a schematic, even verilog/vhdl is a far cry from the whole picture. Even in today's push-button world, you don't push-button a thing like the Core i7, or even latest-generation AMD CPUs, to be fair. You need to have a talented, experienced physical design team, and there's as much work there, maybe more, than simply coming up with the logical design. Then again, frequently the logical and physical design may not be that separated - a really tight feedback loop between the two can work well.
So go back to your super-sized non-optimized chip done with push-button tools - oh and by the way, you may have a hard time finding such tools with enough capacity. The resulting chip won't be a little bigger and a little slower - it'll be a LOT bigger and a LOT slower.
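The bit-slice regularity mentioned above can be sketched with a toy ripple-carry adder: one identical slice per bit position, tiled side by side. Real high-performance datapaths replace the ripple carry with faster schemes; this only illustrates the partitioning idea.

```python
# Toy bit-slice datapath: an N-bit adder tiled from one identical
# 1-bit slice, carry rippling slice to slice. Real designs use faster
# carry schemes, but the physical regularity -- identical slices
# stacked per bit -- is what makes the floorplanning tractable.
def full_adder_slice(a, b, cin):
    s = a ^ b ^ cin
    cout = (a & b) | (cin & (a ^ b))
    return s, cout

def ripple_add(x, y, width=8):
    carry, result = 0, 0
    for i in range(width):                  # one identical slice per bit
        s, carry = full_adder_slice((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result, carry                    # width-bit sum, carry-out

print(hex(ripple_add(0x5A, 0xA7)[0]))       # 0x5A + 0xA7 = 0x101 -> 0x1, carry 1
```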
Does anyone know what the technology was for the 4004? (Is that metal-gate, with double-metal, or polysilicon gate with single-poly, single-metal?)
Re: (Score:2, Interesting)
Does anyone know what the technology was for the 4004? (Is that metal-gate, with double-metal, or polysilicon gate with single-poly, single-metal?)
Well, I do look at photomask stacks as part of my job from time to time as a process integration engineer (mask bugs do make it past design rule checking and tapeout sometimes) but I will start with a disclaimer that this chip and process was designed before I was alive.
It looks from the composite drawing that this is a single poly/single metal/self aligned doped poly/source/drain. That should have existed at the time and to my knowledge no metal gate process has been in wide use because of manufacturabili
Re: (Score:2)
It looked like that to me, too. But then I remembered that I was working with a metal-gate technology later in the '70s, so I wasn't sure. I agree with the red being gate, blue metal, and green diffusion. I didn't take a lot of time looking at the contacts - now that I think a bit more, the metal-gate technology I was working with in the '70s of course had a gate mask, and I saw no such relief on those images. Doh!
Re: (Score:2)
Thought about it again. You're right - single-poly self-aligned gate. Of course there would be no gate level visible in relief, because this was a mask shot, not a photo, and there was no visible gate mask level. I actually RTFA just a little, and there are bootstrap loads in there. Layout for bootstrap loads is a dead giveaway for the difference between self-aligned silicon-gate and metal-gate with a gate mask. Just bootstrap loads, though. No Mostek bootstrap drivers, but I'm not sure when those cam
Re: (Score:1)
So go back to your super-sized non-optimized chip done with push-button tools - oh and by the way, you may have a hard time finding such tools with enough capacity. The resulting chip won't be a little bigger and a little slower - it'll be a LOT bigger and a LOT slower.
I'm not sure that's true. I read that most of the logic on an Intel chip is synthesized; only a small amount is done by hand. In a way it's a bit like applications - most code is written in a high-level language and compiled, and the most critical one percent or less is done in assembly. Actually, my guess is that now that the world is moving from x86 to x64, computers are so fast and compilers so smart that that's not true for most mainstream applications, and the whole thing is written in a high-level language just be
Re: (Score:2)
I wasn't so much talking about the detail gate layout as I was the high-level floorplanning. Certainly the design has logical partitioning, but there are occasions where a concise logical partitioning divides things one way, but good physical partitioning wants to be different. A good simplified example would be the classical bitslice. Logically you'd like to have an ALU, a shifter, a register file, and a mux. Physically it works out better to partition into bitslices, each with a 1 (or few) bit adder,
Re: (Score:1)
There are synthesizable cores sold - Phoronix just had an article about AMD coming out with a new one to compete with Atom. ARM has a whole business model based on them. I don't know whether they're blind pushbutton, or whether they come along with hints on how to guide place and route for best power/performance. Now that I think of it, I know a few people a few floors up that I can ask. I'm in a different part of the business, but not ignorant of that side.
Actually most ARM cores are hard macros - i.e. ARM port the CPU to a specific process, e.g. the latest TSMC one. Of course you can buy synthesizable cores too - then you get the HDL and it's up to you to do the layout. Those are more expensive. Finally, and most expensive of all, you can license the patents on the architecture and make your own chips.
Qualcomm have an architecture license and built their own ARM from scratch (with help from ARM), I think TI did the synthesis themselves and tweaked the layout.
Re: (Score:2)
Back to the "cheap Core i7 clone from the HDL" though, I suspect Core i7 is, or at least was, a stretch for even Intel's fabs. There probably are other fabs in the world that could build it, but not that many. ARM is a much easier target - that's part of its strength. The new small core from AMD will be quite interesting, in that respect.
Re: (Score:1)
Back to the "cheap Core i7 clone from the HDL" though, I suspect Core i7 is, or at least was, a stretch for even Intel's fabs. There probably are other fabs in the world that could build it, but not that many.
Well but you wouldn't clone an i7 - you'd take some of the clever features and reimplement them in your core. Or just build one i7 core rather than four on a chip. I'd much rather one i7 class core than an Atom in a netbook for example.
Re: (Score:2)
> Well but you wouldn't clone an i7 - you'd take some of the clever features and reimplement them in your
> core. Or just build one i7 core rather than four on a chip. I'd much rather one i7 class core than an
> Atom in a netbook for example.
Sure you can. But all of those things involve engineering, and teams of engineers. That puts you out of the "cheap knock-off" league that was back at the start of this whole subthread. I'm merely contending that there is no such possible thing as a "cheap Core
Re: (Score:2)
Never been, probably never will. I'm dealing with co-workers in China right now, but the company is pretty darned stingy with the travel. I've done this particular function 2 or 3 times before without traveling, so I'll manage it this time, too.
I have a friend who had an anecdote about this that he picked up from a friend years back, when an audio company was moving production to China. His friend "escaped" his handlers and went over to the other end of the building where his products were being produced
Re: (Score:2)
What's the environmental situation in Taiwan wrt semiconductor fabs and the electronics industry in general? I know we've had some legacy problems here in the US, and mainland China is probably a nightmare, and I'll guess that Taiwan's not far behind. I have a friend who goes to Taiwan fairly regularly, but never thought to ask him about this.
Re: (Score:2)
I saw both diff and poly stretching around in there, though I didn't notice if the poly was for underpass or just gate reach. Never did pMOS myself, just nmos and cmos. Never used buried contacts either - someone higher-placed got burned by buried contacts before I arrived, and they were politically taken off of our plate. I did see some other designs that used a combination of diff/poly/buried-contact for a lower resistance underpass.
Re: (Score:1)
Re: (Score:3)
*repeats a mantra* I will not feed the trolls, I will not feed the trolls....
Linux is Software. And Red Hat doesn't sell the software, they sell support contracts for the software. You can get RedHat's distribution for free through CentOS and are only paying for technical support and the nice pretty RedHat-specific graphics when you buy RHEL. Nobody is going to make money giving away modern chip designs for anybody else out there to manufacture, because there's no way for them to get an ROI on the developme
Re: (Score:2)
Sun's opened the UltraSPARC T1 and T2, although nobody's spun an ASIC from that.
Alternately, Gaisler Research has the LEON, which is dual-licensed under the GPL and a closed license. Want to use it non-commercially, it's GPL for that, want to use it commercially without giving up the source, you have to pay. And, there's a few SoCs here and there based on it.
Re: (Score:2)
Apparently you couldn't release the internal workings of a system (Linux) and have someone make money from it (Redhat). I agree with you, that would be absurd.
I noticed out of four targets to apply to, not a single one was Intel.
What works for one company probably won't work for many others, and one could easily say it will Never work with All others.
Re: (Score:1, Offtopic)
See all the _hardware_ errata at Microchip.com.
Pentium Floating Point error, anyone?
There are many issues with CPUs - they are just so well-hidden and/or obscure that you never see them. With the advent of updatable microcode, you'll see fewer flaws that need permanent work-arounds.
Re: (Score:1)
4195835 / 3145727 * 3145727 = 4195579 ?
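That is the classic Pentium FDIV check: divide, then multiply back. A hedged sketch using Python's exact rational arithmetic shows what a correct divider must return; the flawed 1994 Pentium famously came back with roughly 4195579 instead.

```python
# The classic FDIV check: divide then multiply back. Exact rational
# arithmetic (no floating point) shows the value a correct FPU must
# recover; flawed Pentiums returned roughly 4195579, an error of 256.
from fractions import Fraction

exact = Fraction(4195835, 3145727) * 3145727
print(int(exact))            # 4195835 on any correct divider
print(4195835 - 4195579)     # 256: the size of the famous error
```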
Re: (Score:2)
Likewise for me, something like "SAM". It was a nice simple case, but not terribly interesting.
But maybe that's why they do the fake arch - because a real arch would be too complex? At least, that would explain undergraduate classes.
Re: (Score:3, Interesting)
In my computer architectu
Re: (Score:2)
IA32?
Damn kids these days... Back when I was your age we had 8 bits and appreciated it!
All kidding aside, learning the Z80 inside and out ( and designing my own 8 bit machine later ) didn't hurt me one bit.
Re: (Score:2)
Kidding aside, I was a 6502 guy back in the day. There was a book called "How to build a microcomputer and really understand it" (or something along those lines) that took you through what all the control lines did and how the interrupts worked, etc. That, and reading through the OS listing for the Atari 400/800, really gave me a firm grasp on how it all fits together.
I'd really like to get a copy of that book again (loaned it out, never got it back). It had printed
Re: (Score:2)
All professors are heavily into MIPS, and you, like me, and everyone else who's taken Introduction to Computer Architecture, know why.
I had a feeling about it (see all the 16-bit "KIPS" chips designed by college students after having read through Computer Organization and Design). But I wasn't completely sure that MIPS was the ideal teaching model until it was proven pretty much patent-free by Plasma and Loongson.
Re:Awesome! (Score:5, Interesting)
One of the things I hated most about my computer arch class was that we had to learn about a completely made up system design which didn't translate to ANYTHING in the real world. Oh yeah, and it was RISC. *Snoooreeee*
That's only because you dropped out before getting to the FPGA [wikipedia.org] classes!
Any functional CPU design (technically non-functional ones too, for whatever good that would do) can be flashed into an FPGA and become as real as any other silicon chip.
And just like pseudocode, pseudo-chipfab can be translated into any real code/fab language by anyone who knows basic design and the target language. You were supposed to be learning the basic-design part, so that once you got to a real language used in the real world, you would have some clue what to do with it.
Re: (Score:2)
Very true
One of the extra courses I could take was making our 32 bit MIPS design run on FPGAs. In that class the teachers would give us pre-designed modules for memory controller, IO (keyboard) and video to boot a very simple OS on them.
Didn't take that course though.
Re: (Score:2)
Learning about designing your own CPU from scratch? Snore?
I think you may be on the wrong course.
So in 2047... (Score:4, Funny)
When we get the Core i7 details, will it seem as quaint as the 4004 does now?
Re:So in 2047... (Score:4, Interesting)
At that point in time retired Intel employees would say: "It was all binary... You know, ones and zeros on silicon *audience laughs*, which was a bunch of sand, basically. Heh... And at that time we were bumping against the limits of this technology, so we decided to bake a multitude of them on a single die. Haha... dear God... can you imagine? *audience laughs* Programming this was, well, you can imagine, not so pretty. Taking advantage of this technology was still very hard at that time, but OpenCL largely made up for it, so... Any questions?"
-"I worked for a RAM company at that time. And I realised that while the CPU was in fact doing everything in parallel, the RAM was actually serially read out. What was your stand on this?"
Uhm... *audience laughs* That question is for [person sitting next to the speaker]. *audience laughs harder*
I think that the Core i7 is a little bit too complex to understand right away. I mean, with the 4004 everything was really, really basic. It had a design team consisting of four people. Nowadays it takes a whole team to improve it all. So I guess the answer is no.
Re:So in 2047... (Score:5, Funny)
I mean, with the 4004 everything was really, really basic. It had a design team consisting of four people. Nowadays it takes a whole team to improve it all.
Yes, one person for each bit. Nowadays you need 64 or 128 person teams.
Re: (Score:2)
How are you supposed to find out if the chip is working right if you don't have enough people to stand or sit based on their current instruction?
Re: (Score:2)
That's why Intel's HR department has such a high turnover rate. Scheduling vacation time is a massive headache, let alone the unexpected family emergencies. They've tried to automate it, but there's a lot sitting in the inbox to process at any time.
Re: (Score:1, Funny)
And quantum computers require one person per qubit.
The only problem is they're both working on it and not working on it at the same time ... if you know what I mean.
Re: (Score:1)
> The only problem is they're both working on it and not working on it at the same time
I imagine this poses one hell of a problem for middle management when it comes to year-end reviews. I don't know, do they put their developers into boxes containing poison gas flasks linked to Geiger counters in order to determine who's slacking off and who's actually working?
Re: (Score:3, Interesting)
No. (I know the question was rhetorical, but I can't resist answering).
The 4004 had 2,300 transistors. A college student can create and debug a processor more powerful than that in a semester. It is possible to memorize the entire thing. A Core i7 has around 730 million transistors. Unless human intelligence changes significantly, one human could not memorize and understand 730 million transistors.
Re: (Score:1, Insightful)
Suicide: commit it.
Re: (Score:1)
> Unless human intelligence changes significantly...
Ah, so now we get to the meat of the matter!
Re: (Score:2)
Although a great deal of those transistors will be the same thing over and over again - the cache.
A great presentation of it all on YouTube (Score:5, Informative)
Link: http://www.youtube.com/watch?v=j00AULJLCNo [youtube.com]
Italian business (Score:5, Interesting)
If one was produced with a 40nm process... (Score:2)
I wonder what clockspeed it would get. I know it's completely useless/pointless, but I'd be interested to see anyway.
Re: (Score:3, Interesting)
Better question: how would they physically handle a processor that small? The 4004 has 2,300 transistors, http://en.wikipedia.org/wiki/Intel_4004 [wikipedia.org], and the i7 has 731 million transistors at 45nm in 263 mm^2, http://www.legitreviews.com/article/824/1/ [legitreviews.com]. So by those numbers the 4004 on a 45nm process would have an area of 0.00082749 mm^2, or 1/317,826th the physical size of an i7 die. Disclaimer: this is a very rough calculation, but in any case it is more than 5 orders of magnitude smaller than an i7. On the other
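The rough scaling arithmetic in that comment is easy to reproduce. The figures (731 million transistors, 263 mm^2, 2,300 transistors) are from the linked sources, and the linear-density assumption is the same first-order simplification the comment itself makes.

```python
# Rough first-order estimate: 4004 die area if built at Core i7
# transistor density. Same linear-density assumption as the comment;
# real layouts (pads, wiring, SRAM vs logic) don't scale this simply.
i7_transistors = 731_000_000   # Nehalem Core i7, per the legitreviews link
i7_area_mm2 = 263.0
t_4004 = 2_300                 # per the Wikipedia 4004 article

area_4004 = i7_area_mm2 * t_4004 / i7_transistors
ratio = i7_transistors / t_4004

print(f"{area_4004:.8f} mm^2")       # roughly 0.00083 mm^2
print(f"1/{round(ratio)} of an i7")  # about 1/317826
```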
Re: (Score:2)
I thought Intel was already doing something like this? It was going to be somewhat similar to a Cell processor, except with something like 128 Pentium 1 cores on it.
Re: (Score:3, Informative)
Probably the same 740kHz that the original 4004 had.
The manufacturing process used has nothing to do with the maximum clock speed a chip can achieve. It's about energy bleeding (heat loss) and the transistor density. If you manufacture a 4004 using 1950's-era technology, with actual honest-to-goodness 1mm-thick copper wire and large physical transistor switches, it'd be a *lot* bigger, but it'd achieve the same 740kHz that the design allows for.
The reason using a smaller manufacturing process translates int
Re:If one was produced with a 40nm process... (Score:5, Insightful)
This means that you can cram more transistors in to the same area of silicon, allowing you to complete more operations per clock cycle.
This is true, but smaller process nodes also produce faster transistors. When you make things on the chip smaller, you have the practical effect of reducing parasitic capacitance in transistors and interconnect. Lower capacitance means a smaller RC time constant (using a first-order model), so logic will work faster. Intel's 45nm process can create an inverter with a delay of less than 5 ps.
Your statements imply that transistors have a fixed speed, and that the only way to improve performance is parallelism. This is false.
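A first-order sketch of that RC argument, with made-up constants (a 5 kohm driver and 1 fF load) chosen purely for illustration, not taken from any real process data:

```python
# First-order RC model of why smaller nodes switch faster: gate delay
# ~ R * C, and shrinking feature size shrinks parasitic capacitance
# roughly with it. The 5 kohm / 1 fF numbers are invented for
# illustration only, not real process parameters.
def rc_delay(resistance_ohm, capacitance_f):
    return resistance_ohm * capacitance_f   # seconds, first-order model

r_drive = 5_000                  # hypothetical driver resistance (ohms)
c_65nm = 1.0e-15                 # hypothetical 1 fF load at 65 nm
c_45nm = c_65nm * (45 / 65)      # capacitance scales with feature size

print(rc_delay(r_drive, c_65nm)) # about 5e-12 s (5 ps)
print(rc_delay(r_drive, c_45nm)) # about 3.5e-12 s: smaller node, faster
```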
Re: (Score:2)
smaller process nodes also produce faster transistors
I was thinking the same thing. In fact, I'd be inclined to believe that, since the resulting chip would be so small, you could actually get it up to a higher clock speed than a current CPU. However, you wouldn't be able to interface it with anything, because you'd never get I/O signals at that frequency off-chip, without ruining them. You'd need to have at least some memory and some type of I/O controller on the same chip, to make it work.
Re: (Score:1)
And yet, you can't propagate a signal all the way across the chip in one clock cycle.
Re: (Score:2)
For that matter, what if you made a CPU with a hundred million [slashdot.org] of these?
Re: (Score:2)
This would be an interesting homework problem for a digital design class. First, find the single-cycle instruction that will take the longest amount of time. Then, figure out the critical path. Find the logic delay given a particular modern standard cell [wikipedia.org] library.
Re: Intel Allows Release of Full 4004 Chip-Set Det (Score:1, Funny)
Imagine a beowulf cluster of 4004 emulators...
So, will it... (Score:4, Funny)
j/k
This should actually be quite cool. I can see garage-based tinkerers messing with this chip, the registry and even coming up with a retro User Group.
Re: (Score:2, Funny)
Can you compile a Linux kernel into 2048 bytes?
Re: (Score:2)
Yes. Linux will run on anything from a supercomputer to a wristwatch.
Re: (Score:2)
Linux requires at least a 386, always has. I'm not sure there's a UNIX that will run on any 8-bit or earlier processor.
Re: (Score:1)
Linux has been ported to various non-x86 architectures. See: http://www.ibm.com/developerworks/linux/library/l-nonx86.html [ibm.com]
And some folks have even ported a Linux subset to 8086. See: http://elks.sourceforge.net/FAQ-English.html [sourceforge.net]
Still, it would be quite a small subset to fit on the 4004. Being so small, you could consider it a subset of virtually any OS.
Re: (Score:2)
(FWIW, my father-in-law, who works as a magazine editor for a technical magazine, used a Leading Edge XT clone until just a few years ago. They finally switched him to a Macintosh, which is his new primary desktop.)
Re: (Score:2)
Sure, Linux will run on it. Just imagine a whole bunch of blades of these...!!
Re: (Score:2)
Old joke for old hardware... (Score:1, Funny)
Re: (Score:1)
Dude! (Score:1)
Imagine a beowulf cluster of these!
First Post said just that.
I know that most first posts are GNAA trolls, or something else pretty obtuse, but come on! You're waaaay down here and you honestly thought you'd be the first one to post that?!
There's already been a "Does it run Linux?" post and if I dug into the -1s, I'm sure there would be a "In Soviet Russia, 4004 processes you!" or some such thing about Cowboy Neal's something using 4004 in the description.
These are things one learns in the first few days of Slashdotting.
Man, go and read "S
Re: (Score:2)
You are of course aware that slashdot presents posts in different order depending on your settings? So while his post might be "way down there" in other views it will show up in the top.
Perhaps you yourself should go look for that elusive slashdot for beginners...
4004.com = 4004 Web Server? (Score:1, Funny)
Cruising over to 4004.com gives "page cannot be displayed". While I'm sure it's slashdotted, I can't help but wonder if they used one for their web server......
Federico Faggin, intel4004.com (Score:4, Informative)
http://www.intel4004.com/ [intel4004.com] goes into much greater detail about Federico Faggin (primary co-developer and project leader), and the story of his accomplishments before and at Intel, his physical signature on all 4000 series chips, Intel's successful attempt to discredit him and patent his invention (the buried gate) that he invented at Fairchild before coming to Intel, and his departure to found Zilog with some members of his older design team.
Intel has been playing their game their way for a very long time.
Re: (Score:1)
http://www.intel4004.com/ [intel4004.com] goes into much greater detail about Federico Faggin (primary co-developer and project leader), and the story of his accomplishments before and at Intel, his physical signature on all 4000 series chips, Intel's successful attempt to discredit him and patent his invention (the buried gate) that he invented at Fairchild before coming to Intel, and his departure to found Zilog with some members of his older design team.
Intel has been playing their game their way for a very long time.
It's a pity and a big mistake that such a great engineer has not yet received a Nobel Prize. He revolutionized our world.
Re: (Score:2)
Re: (Score:1)
Engineers don't get Nobel Prizes. The prizes are for science not engineering.
In fact I meant the Nobel in physics. Silicon-gate technology, invented by Faggin, was essential to the rise of microprocessor technology, but it is indeed a physics discovery.
Re: (Score:2)
Re: (Score:1)
No, it's an invention. Inventions are engineering, discoveries and theories are science. Process technologies are definitely not scientific discoveries.
It seems that you are right; the Nobel committee emphasizes discoveries over inventions, but this was not the original intention of Alfred Nobel [wikipedia.org]. :-D
So, if Sir Alfred were still alive, he could give him the prize.
Re: (Score:2)
The site doesn't make it clear - was Faggin shafted by Intel while working for him (and left to form Zilog as a consequence), or did Faggin leave and start Zilog, and then Intel tried to discredit him as an act of sour grapes?
Control Systems using 4004 (Score:5, Interesting)
In the very early 70s our engineering group was interested in using the new 4004 to simplify the production of control systems for heavy machinery (windlasses, hydraulic systems, etc). The machinery itself was slightly different from contract to contract and even from item to item within a contract so we had to design a new control system for each unit. When the 4004 came out we were excited to see if we couldn't do it cheaper and faster using a microprocessor.
We had moved from relays and discrete wiring to CMOS components on printed circuit boards and thought that was a big step. CMOS could be run at 15 VDC, which meant that the noise inherent in the environments our machinery worked in would not be quite as big a problem.
Unfortunately we discovered that we had several problems including the limited instruction set and memory capabilities of the 4004 along with the lower voltages needed so we stuck to CMOS until I left a couple of years later.
Still, the 4004 was my introduction to microprocessors and that changed the course of my career from electronics and electronic control systems to digital control systems and computers.
It's been an exciting ride, too. I am grateful to have grown up with the technology.
Re: (Score:2)
By the time I hit the streets, the 80386 was the hotness and the 486 was just around the corner. I love hearing tales of the trenches from the Good Old Days. As exciting as technology remains to me, working in the field and using it constantly, I still miss the simpler times. Maybe it's because what I know now would have made me a Technology God 20 years ago, or maybe it's just because there was something different about a time where things were changing rapidly but the field was still on a scale where s
Non commercial use? (Score:4, Funny)
Available for non-commercial use? Are they even entertaining the possibility that someone might try to profit from the design?
Re: (Score:2)
Given the morbid fascination the geek world often has with retro computing, it's not something I'd rule out myself.
Re: (Score:1)
Yes! I intend to use this documentation as a starting point for my own product line. I hope to learn quickly and make more advanced designs, which will also be smaller, and I will compete directly with Intel. I will call my company Advanced Micro Designs. [wikipedia.org]
That takes me back (Score:2)
We used a timeshare service via a Model 33 teletype with acoustic modem to access a 4004 assembler. It would spit out a paper tape that we wou
Re: (Score:2)
you must surely agree that compared to Visual Studio, paper tape IS a superior option.
Great, now I can make my own calculators... (Score:2)
I just need to build my own chip fab in my garage and acquire the hundreds of hazardous chemicals that are sure to get me on DHS's shit list...
Re:Wow! Imagine a Beowulf Cluster (Score:5, Funny)
Re: (Score:3, Interesting)
Re: (Score:2)
http://hardware.slashdot.org/article.pl?sid=06/11/14/2356255 [slashdot.org]
it's the full specs now (Score:2)