Intel Hardware Technology

Origins of the Modern PC

Homncruse writes "ComputerWorld dispels myths about the history of modern-day computers — or, more appropriately, the invention of the first microprocessor. Contrary to popular belief, 'the [Intel] 8008 was not actually derived from the 4004 — they were separate projects.' In fact, the 8008 concept didn't originate from Intel (though they were eventually granted IP rights). The article goes on to explain the events leading up to the invention and first intended use of the 8008 (a predecessor to the 8086, etc.), and how Intel was initially uneasy about the venture."
This discussion has been archived. No new comments can be posted.

  • From the article,

    The resulting compact enclosure had heat problems

    ... I can't believe they were having problems overclocking back then TOO. You'd think in 40 years, someone would have come up with a better solution than using water..

    • by Chmcginn ( 201645 ) on Saturday August 09, 2008 @09:18AM (#24537083) Journal

      ... I can't believe they were having problems overclocking back then TOO. You'd think in 40 years, someone would have come up with a better solution than using water..

      Fossil fuel engines, refrigeration and air conditioning systems have been around a lot longer than that, and there's still no better way to cool off something hot than running a cool liquid around it.

    • Re: (Score:1, Informative)

      by Anonymous Coward
      They weren't overclocking. They were just trying to put it into a small enclosure. And do you have a better suggestion for cooling?
    • by Kjella ( 173770 ) on Saturday August 09, 2008 @10:29AM (#24537437) Homepage

      Well, energy doesn't just "go away"; you can really only transport it elsewhere, and to transfer heat away you need a medium to carry it, even if that medium is just air. Plain old water has a very high specific heat: per kilo, it takes more energy to heat water one degree than pretty much anything except hydrogen gas, and by volume it can't really be matched. Airflow has less than 1/3000th the heat capacity of the same volume of water flow. Put a bottle of water and a solid copper lump of the same size on a heater and the copper will heat up quicker.

      The other part is getting the heat out of the medium again - in a case with no airflow or a water cooling with no pump, it doesn't matter much how much energy it can store since it's never released. For my machines it's fine to just let it out into the room. In a server room, you need to get that heat out of the room too. Again there's really no magic to be made - you need a large surface area and a large temperature differential. That's why heatsinks have all the fins and fans blow cool intake air over them. A water cooling system can absorb a lot of it, but it too needs to get rid of it.

      In short, there's no easy way to solve the heat problem because it's down to basic physics. If you can find a way to make the heat just go away, you'll get the Nobel Prize in Physics.
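Kjella's volume comparison above is easy to sanity-check with a quick back-of-the-envelope calculation. This is a sketch using standard textbook values for specific heat and density (the figures are assumptions, not measurements):

```python
# Volumetric heat capacity: how much energy a cubic metre of each
# medium stores per degree of temperature rise.
water_cp = 4186       # J/(kg*K), specific heat of liquid water
water_rho = 1000.0    # kg/m^3, density of water
air_cp = 1005         # J/(kg*K), specific heat of air at constant pressure
air_rho = 1.2         # kg/m^3, density of air at sea level

water_vol_cap = water_cp * water_rho   # J/(m^3*K)
air_vol_cap = air_cp * air_rho         # J/(m^3*K)

ratio = water_vol_cap / air_vol_cap
print(f"water stores ~{ratio:.0f}x more heat per unit volume than air")
```

The result lands in the ~3500x range, which squares with the "1/3000th" figure quoted in the comment.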

      • If you can find a way to make the heat just go away, you'll get the Nobel Prize in Physics.

        Somehow I think they wouldn't be too keen on giving out the prize, since making the heat just go away would pretty much destroy the whole of physics.

        • That's the kind of thing that the scientific world revolves around.
          If you make a discovery that renders the laws of thermodynamics entirely invalid, it would almost be a crime not to give you the Nobel prize. =)

          Punishing people who overthrow your perception of reality is reserved for religions, politics and small-minded people.
          There are, of course and unfortunately, lots of small-minded people in the scientific community too.

      • Of being cheap and non-toxic. For whatever other benefits another fluid might have, it just isn't as abundant as water. If a device is water cooled, it is easy for the end user to obtain more coolant when needed. It is also safe for that user to handle the coolant, as it is just water. Thus even if you can make a synthetic with superior energy transfer characteristics to water, it isn't likely to be used in most cases. The benefit of a coolant that, literally, comes from every tap and is literally safe enoug

    • by orasio ( 188021 ) on Saturday August 09, 2008 @10:56AM (#24537599) Homepage

      There are lots of better solutions.
      My favorite is to decrease wattage, it's just simpler.
      Passive heatsinks are good, too.
      Liquid cooling not involving water is used, too.

      The problem comes when you want to define "better".
      We have lots of engines that are better than the internal combustion engine, but in that case, "better" depends on so many things, that nobody agrees to choose a replacement.

  • Curse? (Score:5, Funny)

    by BitterOldGUy ( 1330491 ) on Saturday August 09, 2008 @09:11AM (#24537055)
    Roche died in a car accident in 1975, Ray died in 1987, and Noyce died in 1990. Frassanito left Datapoint to set up his own firm in 1975 and worked on the space shuttle and space station projects, among other things.

    Coincidence? I think not! There's some sort of curse going on with that computer and I predict that, sooner or later, everyone associated with that project will die!

    I know it's hard to believe, but I am clairvoyant.

    • by Tablizer ( 95088 )

      There's some sort of curse going on with that computer and I predict that, sooner or later, everyone associated with that project will die!

      Double chance if they display an image of Tutankhamun's tomb on a Datapoint 2200.
         

  • by jeffb (2.718) ( 1189693 ) on Saturday August 09, 2008 @09:16AM (#24537081)

    I enjoyed the blueprint of the Datapoint 2200 enclosure, showing the crowded interior. I guess the caption writer has never seen the inside of a mechanical calculator. Imagine an object the size of a small desktop PC enclosure, entirely stuffed with mechanical linkages. It's truly astonishing.

    By comparison, a handful of circuit boards stuffed with SSI and MSI chips was delightfully simple. No moving parts! No lubrication! No wear!

    • by urcreepyneighbor ( 1171755 ) on Saturday August 09, 2008 @09:53AM (#24537267)

      Imagine an object the size of a small desktop PC enclosure, entirely stuffed with mechanical linkages. It's truly astonishing.

      Linkage: Extreme example [wikipedia.org]. Cool example [vcalc.net].

      Sometimes, pictures are needed. :)

      • Divisumma was the first thing that came to mind for me, though the MonroeMatic [tfh-berlin.de] could be a contender. You should see the inside of those...

        Though some cash registers [brassregisters.com] give those a run for their money... So to speak.

        • ...that I was thinking of:

          Animated GIF slide show of internals [tfh-berlin.de]

          • Yeah, I think the MM (or 'crasher' as we called them back then) is close to the most complex mechanical calculator ever, save the Difference Engine. Which isn't fair, since the DE is the size of a room.

            When I actually serviced these machines, the engineers in the paper mills called them 'coffee makers'. Input a problem, start, go have a cuppa coffee. Maybe a cigarette. Maybe two. It might be done by then.

            A room of 60 of these going steady from 8 to 5 was deafening.

            Then we delivered Sharp CS-21s or 21As

    • by Anonymous Coward on Saturday August 09, 2008 @06:09PM (#24540587)

      I have worked on the Datapoint 2200, when I was just starting out. A magnificent machine for its time! It was multi-tasking two dual card reader punches and two daisy wheel printers, and was master of a 16-node TTY local network and an SNA controller to boot! This was with a CPU that was, IIRC, mostly 8008 compatible (plus a second register set a la Z80), a maximum load of 16KB of RAM, two true digital cassette decks and a 12 x 80 screen. This was circa 1975. All one had to do was keep the dust off the chips and change the tape drive belts once a year, and it ran for at least the 7 years I was around.

    • Re: (Score:3, Interesting)

      by DerekLyons ( 302214 )

      Imagine an object the size of a small desktop PC enclosure, entirely stuffed with mechanical linkages. It's truly astonishing.

      I wish I could find a picture online of the interior of a MK8 Range Keeper or MK6 Stable Vertical - two elements of a fire control computer for an Iowa class battlewagon - imagine something larger than the desk that desktop PC sits on stuffed full of mechanical and electromechanical calculating equipment.

      When they rebuilt the Iowas in the 1980's they kept the old analog equip

  • by Rik Sweeney ( 471717 ) on Saturday August 09, 2008 @09:19AM (#24537085) Homepage

    I refuse to read the article because it goes against my creationist beliefs.

    How dare you suggest that the x86 evolved from the 8008. Me and my other enlightened brothers believe that the x86 was created by the supreme BG (MBWH*) who resides in MS, a utopia where all processors will eventually return to.

    *Megahertz Be With Him

    • by notgm ( 1069012 )

      just accept that they were created as a result of their predecessors, then. evolution = creation, heads will asplode.

    • by ProKras ( 727865 )
      Wow, your computer must be REALLY old. After all, wasn't it the great BG who commandeth: "640K ought to be enough for anybody."
  • Tenuous connection (Score:5, Interesting)

    by Ancient_Hacker ( 751168 ) on Saturday August 09, 2008 @09:28AM (#24537121)

    Tracing the x86 back to the 8008 is a mighty tenuous connection.

    There are two very weak links.

    First, the 8008 to 8080 transition was a major re-do. Like ten times the speed, an external stack, more. The opcodes were upwardly compatible to a point, but that's about the only similarity.

    Next, the 8080 to 808x transition was just as abrupt. 16 bit registers, segments, and more. Again there was a certain backward compatibility, if you converted all the mnemonics and register names, but that was about all.

    • Re: (Score:1, Informative)

      by Anonymous Coward

      The Z80 was in between as well.
        If I recall, it was object-code compatible with the 8080; although the register names were different, it had an alternate set/copy of its registers which could be swapped between.

      • by dstates ( 629350 ) on Saturday August 09, 2008 @01:40PM (#24538689) Homepage
        The Z80 was an upwardly compatible extension/clone of the 8080A. The Z80 was designed by Federico Faggin at Zilog after he left Intel. Faggin had previously designed the 8080 when he was at Intel. So the Z80 is a derivative, not "in between" any of the Intel CPUs. Interestingly, Zilog licensed the Z80 design royalty-free, creating a robust second-source market. Z80s dominated the 8-bit CPU market in the late 1970s.
        • And just in case anyone wondered where all those Z80s went, they went everywhere. You can find them in the classic Game Boy and on many Adaptec SCSI cards, for example.
    • by lenski ( 96498 ) on Saturday August 09, 2008 @11:07AM (#24537651)

      It was Intel's clear intention to allow simple, fully automatic translation of assembly code between one generation and the next. So the fact that the transition from each generation to the next is expressed in large steps does not make it a mighty tenuous connection. To exemplify:

      (1) The slow speed of the 8008 required hardware acceleration for parity computation, so the 8008 ALU provided a parity bit in the flags register. That bit lasted all the way through the Pentium line. (Could it remain in X86_64? I no longer work in the assembly language world and do not know.)

      (2) The original A,B,C,D,E,H/L register configuration with its byte/word weirdness in the 8008 was still plainly visible in the 16-bit X86 line, and hints of those structures lasted right through IA32, though IA32 does have significant improvements in orthogonality. (This is the genesis of the non-orthogonal register sets that compiler writers complained about all the way through IA32, which are fully rectified only with X86_64.)

      The connection is not only not tenuous, but (I claim, having worked with every CPU they built from the 8008 to my current Core2duo) clearly connected by an intentional, nearly unblemished record of source-level backward compatibility for the 40 years of its history.

      You do have a good point with respect to the way Intel scheduled its generational developments. When my group at AT&T was debating a project based on i486 DX2/66 and i960CA/CF, the Intel FAEs were exceptionally forthcoming with us about the way Intel developed their processor families. One of the more interesting things I learned was that Intel's X86 families were developed using dual teams, each team leapfrogging the other with successive generations. There was constant discussion among the teams, so often ideas from one would slip into the other.

      There is no question that each generation was intended to be as large a leap as possible beyond the last, so you do have a good point about the internal architecture of the processor families.
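The parity flag mentioned in point (1) above is simple to model in a few lines: PF is set when the low byte of a result contains an even number of 1 bits. This is a sketch of the documented behaviour, not of Intel's hardware implementation:

```python
def parity_flag(result: int) -> bool:
    """x86-style parity flag: set (True) when the low 8 bits of the
    result contain an even number of 1 bits. Only the low byte
    counts, even for 16- and 32-bit operations."""
    return bin(result & 0xFF).count("1") % 2 == 0

assert parity_flag(0b10100000)       # two bits set: even, PF = 1
assert not parity_flag(0b00000111)   # three bits set: odd, PF = 0
assert parity_flag(0x1FF)            # low byte 0xFF has eight bits set
```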

      • by postbigbang ( 761081 ) on Saturday August 09, 2008 @04:12PM (#24539781)

        Oh, there were problems.

        Compilers didn't. Microprocessors and external FPUs didn't work together, and there were a raft of famous Intel bugs in their own chips. The i386sx was one of those. How they could release wafers that weren't validated just shows how fast and loose they operated, trying to beat Moto and others.

        Their 'leapfrogging' resulted in recalls, compiler-writer headaches, assembly code mistakes, and limited motherboard makers' designs. True, others like HP, MIPS, and Sun had their own share of mistakes, but Intel multiplied them with popularity.

        In the interim, they made incredibly dubious marketing claims about their CPUs. Clock speed was it, baby. They never did point to bloatware, and the other incumbent problems of systems processing. Like Microsoft, they think very highly of themselves, yet they are more of an accident of history than they would let you believe.

        • Arguably, the FPU in an x86-compatible processor doesn't work "with" the CPU now - since it's a coprocessor, you have to submit work for it to do, and then check back to see when it's done it. This is true whether it's inside the processor or not... Anyway, intel has a long history of pushing the envelope with sometimes disastrous results. The Pentium is famous for melting its test socket. Intel is flying on a wing and a prayer right now though; its architecture is clearly inferior to AMD's and the only thi
      • Re: (Score:2, Insightful)

        by maestroX ( 1061960 )

        (1) (..) That bit lasted all the way through the Pentium line. (Could it remain in X86_64? I no longer work in the assembly language world and do not know.) (2) (...)(This is the genesis of the non-orthogonal register sets that compiler writers complained about all the way through IA32, which are fully rectified only with X86_64.)

        Am I too audacious in claiming that AMD cleaned up the mess?

    • by mysticgoat ( 582871 ) on Saturday August 09, 2008 @11:34AM (#24537803) Homepage Journal

      Upwardly compatible opcodes were the overarching reason why, in that era, the 8086 was considered a true descendant of the 8080, and the 8080 was considered the true descendant of the 8008.

      Remember we are talking about an era when Assembly Language was the highest level of programming abstraction available on the early microcomputers. The compilers that converted AL to binary machine language ran on minicomputers, were state of the art, expensive, hard to acquire, and difficult to use. Developing under these conditions, and attempting to fit working programs into 4, 8, or even a glorious 16 kilobytes of RAM, was an art form that no one has had to practice in more than 30 years.

      There was a tremendous advantage in developing a chip that allowed extension of the existing AL compilers without total rewriting, and allowed the AL programmers of the day to build upon their old skills. That some of the routines developed for the 8008 would also run on the 8086 / 8088 was a fringe benefit.

      Disclaimer: while I was writing my first "HELLO WORLD" programs in Fortran on punch cards at the time the 8008 was put on the market, my first PC was an Apple II+ (about 8 years later) and I learned 6502 Assembly rather than 8086 code. I have since managed to forget all those old skills. Good riddance! It is much better to scratch out new ideas in Perl, and then if there is some reason to optimize, get a code monkey or two to do the low level work.

      • by kamochan ( 883582 ) on Saturday August 09, 2008 @12:55PM (#24538373)

        attempting to fit working programs into 4, 8, or even a glorious 16 kilobytes of RAM, was an art form that no one has had to practice in more than 30 years.

        You know, there are still some of us who routinely develop software for controllers in weather probes, dive computers, GPS chips, and so on... there definitely are times where 16 kilobytes is glorious.

        • Re: (Score:2, Troll)

          by DerekLyons ( 302214 )

          attempting to fit working programs into 4, 8, or even a glorious 16 kilobytes of RAM, was an art form that no one has had to practice in more than 30 years.

          You know, there are still some of us who routinely develop software for controllers in weather probes, dive computers, GPS chips, and so on... there definitely are times where 16 kilobytes is glorious.

          Oh indeed - damn few slashdotters are familiar with anything but PC hardware. Even those 'familiar' with the hardware really aren't except at the 'box mar

        • Re: (Score:3, Informative)

          by mysticgoat ( 582871 )

          Thank you for correcting my oversight. Embedded device controllers and similar applications are a world of their own. Forth is glorious: the first programming language commercially implemented on the 8086 back in the day, and still, when you count up all the cars and trucks, elevators and diesel-electric locomotives, the most commonly used computer language in the world.

      • Advance warning: pipe & slippers alert!

        The compilers that converted AL to binary machine language ran on minicomputers, were state of the art, expensive, hard to acquire, and difficult to use. Developing under these conditions, and attempting to fit working programs into 4, 8, or even a glorious 16 kilobytes of RAM, was an art form that no one has had to practice in more than 30 years.

        Heh, I know in another reply you've already acknowledged those of us who work with embedded systems, but even so, some of those development computers were still running later than you'd think! My first job was writing code in assembler & PL/M for an embedded 8085 system. (The 8085 was basically just a better-integrated 8080A that didn't need so many/any support chips; I'm surprised it hasn't been mentioned in the article or this topic so far.)

        How? Well betw

    • by fm6 ( 162816 ) on Saturday August 09, 2008 @11:24PM (#24542747) Homepage Journal

      the 8080 to 808x transition was just as abrupt. 16 bit registers, segments, and more. Again there was a certain backward compatibility, if you converted all the mnemonics and register names, but that was about all.

      You're basically correct, but the transition wasn't as abrupt as all that. A 16-bit register can be designed so it looks like an 8-bit register to 8-bit code. And one reason the 808x has memory segments instead of a simple flat memory space is to provide a memory model that works with old 8-bit code; your pointers just refer to an address within a 64K segment instead of a flat 64K address space. I seem to recall that it was possible to run a lot of 8080 assembly code on the 808x simply by reassembling it. You could not, however, use your old binaries, because the opcodes were different. That lack of binary compatibility is more to the point, transition-wise, than add-on (but backward-compatible) features like bigger registers and memory segments.

      The 8080/808x transition was certainly more abrupt than subsequent transitions to 80186, 80286, etc., where there was binary compatibility. But it was less abrupt than, say, Motorola's transition from the 6800 to the 68000.
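The segment arithmetic described above is compact enough to sketch in a few lines. The 16-byte "paragraph" shift means many segment:offset pairs alias the same physical byte; the example addresses below (such as the conventional 0xB8000 text buffer) are illustrative:

```python
def real_mode_address(segment: int, offset: int) -> int:
    """8086 real-mode physical address: segment * 16 + offset,
    wrapped to the 20-bit address bus (as on an original 8086)."""
    return ((segment << 4) + offset) & 0xFFFFF

# Many segment:offset pairs name the same physical byte:
assert real_mode_address(0xB800, 0x0000) == 0xB8000
assert real_mode_address(0xB000, 0x8000) == 0xB8000
# ...and the top of the 20-bit space wraps back to 0 on a real 8086:
assert real_mode_address(0xFFFF, 0x0010) == 0x00000
```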

      • Re: (Score:3, Interesting)

        > And one reason the 808x has memory segments instead of a simple flat memory
        > space is to provide a memory model that works with old 8-bit code; your pointers
        > just refer to an address within a 64K segment instead of a flat 64K address space.

        Actually, 64K is 16 bits of address space. Looking back, there was one thing I wish Intel had done differently. Their X86 addressing scheme consisted of two 16-bit registers. To get the absolute address you...
        * took the base register and multiplied by 16
        * ad

        • DOS with XMS memory and a 32 bit extender can access at least 3GB memory on x86 platforms. DOS and the PC BIOS both suck horribly and we should do our best to forget that they ever existed. (And yes, I DID enjoy the simplicity of producing a "working" program when I studied x86 asm using DOS and MASM, but I will never trust the "PC BIOS" simply because it's not the PC BIOS. I feel much better about trusting, say, glibc. Or in this case, something more like uclibc. It seems to me that most places I would thi
          • by fm6 ( 162816 )

            DOS with XMS memory and a 32 bit extender can access at least 3GB memory on x86 platforms.

            It can access it, but it can't address it. Important difference. The real-mode DOS code still has pointers that aren't big enough to point past the 1 MB barrier. The data has to be transferred to and from a buffer within the real-mode memory space.
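The access-versus-address distinction boils down to pointer width: the largest physical address a real-mode 16-bit far pointer can even express is just over 1 MB, so anything higher must be staged through a low buffer. A sketch, assuming the A20-enabled case where top addresses don't wrap:

```python
# Largest address a 16-bit segment:offset far pointer can form,
# with both halves at their maximum (A20 enabled, so no 20-bit wrap).
max_far_pointer = (0xFFFF << 4) + 0xFFFF

print(hex(max_far_pointer))             # 0x10ffef, just past the 1 MB line
print(max_far_pointer < 3 * 1024 ** 3)  # True: far short of 3 GB of XMS data
```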

  • Wang Labs (Score:5, Interesting)

    by Sanat ( 702 ) on Saturday August 09, 2008 @09:36AM (#24537183)

    Wang (now a defunct company) built a PC in the early 70's that was actually called a "PC", but it stood for Professional Computer. It used the 8088 technology. Earlier prototypes utilized the 4004 and the 8008 as well, as did other technology designed by the company's R&D department. Later the computer used the 8086, but for years it was not "IBM" compatible at the microcode level and thus could not run IBM-type programs. The company was inflexible about fixing the problem, as they expected IBM to conform to Wang standards rather than vice versa. Some of the instruction set worked differently in order to save a clock cycle or two.

    Eventually the Wang PC became IBM compatible, but it was too little... too late, and the use of the PC was pretty much restricted to being a terminal rather than a full-fledged processing device.

    Dr. An Wang was the person who designed core memory and started Wang Laboratories in the 50's. What an inspiration he was (and still is, although he died in 1990) to young and old aspiring individuals with creative talents.

    • Re: (Score:1, Funny)

      by Anonymous Coward

      What are you doing with your Wang?

    • Word processors (Score:5, Interesting)

      by Animats ( 122034 ) on Saturday August 09, 2008 @10:38AM (#24537491) Homepage

      In the 1970s and early 1980s, before general-purpose personal computers, there was a whole industry for "word processors". These were special-purpose machines which offered text editing, printing, and storage for documents. They replaced typewriters. For the first time, people could edit documents without retyping. Word processors were not intended to be user-programmable; they ran a built-in application. Wang was a big name in that area, as were Datapoint and IBM. The original IBM PC reused the display from the IBM Displaywriter, IBM's family of word processors.

      The next step was "shared-logic word processors", where several terminals connected to a central unit, with the central unit having a disk and printer. This was a low-end version of time-sharing. Datapoint introduced ArcNet, so the word processors could send documents to each other. But none of this stuff was user-programmable, although the hardware underneath was a general purpose CPU. It wasn't considered reasonable that users in a typical office could program something as complex as a computer. Also, these machines barely had an operating system; they were usually running the application on the bare machine.

      After the IBM PC came out, Wang tried to enter that business. They weren't very successful. I used one of their early 8086 machines, the Wang PIC, which had a scanner. It ran a variant of DOS, which, interestingly, allowed about 800K of user space instead of 640K, because they did the split between RAM and device space at a higher address than IBM did. (The real 8086 limit isn't 640K; it's 1024K minus whatever address space is needed for devices.) It used a completely different (and more rugged) plug-in card design than the IBM PC, and wasn't software-compatible. A nice machine, it just lost out for being incompatible.

      So really, PCs are descended from these word processors.

      • Re:Word processors (Score:4, Insightful)

        by pipingguy ( 566974 ) * on Saturday August 09, 2008 @11:12AM (#24537673)
        special-purpose machines which offered text editing, printing, and storage for documents. They replaced typewriters.

        I find it kind of sad that most people reading this have never typed on anything that doesn't rely on electrons to work.

        +1 Irrelevant Old Fart Comment
        +1 Get Off My Lawn Comment
        • Re:Word processors (Score:4, Insightful)

          by corsec67 ( 627446 ) on Saturday August 09, 2008 @11:59AM (#24537965) Homepage Journal

          Wouldn't the electrons used in the bonds of a mechanical typewriter also be critical for it to work?

          A mechanical typewriter without an electrical circuit would still need electrons to hold it together.

        • Re:Word processors (Score:5, Insightful)

          by Amiga Trombone ( 592952 ) on Saturday August 09, 2008 @01:33PM (#24538637)

          I find it kind of sad that most people reading this have never typed on anything that doesn't rely on electrons to work.

          +1 Irrelevant Old Fart Comment

          +1 Get Off My Lawn Comment

          As somebody who's quite old enough to remember typing on things that didn't rely on electrons or even electricity, I can tell you for sure I don't miss the joys of carbon paper, having to start from scratch if you made a mistake, changing ribbons or unjamming jammed keys at all.

          That's like saying you feel sad because some people have never experienced the joy of taking a crap in an outhouse on a cold winter day.

          Sir, what are you thinking of?

          • by Tablizer ( 95088 )

            As somebody who's quite old enough to remember typing on things that didn't rely on electrons or even electricity, I can tell you for sure I don't miss the joys of carbon paper, having to start from scratch if you made a mistake, changing ribbons or unjamming jammed keys at all.

            Amen! My high-school English papers were probably 5mm thick in some spots with white-out. And I got some points taken off for parts of the white-out that later chipped off, exposing the original mistake.

            I tried to hand in my papers

      • Re: (Score:3, Interesting)

        by rbanffy ( 584143 )

        Just the other day, I was considering how much PC compatibility (and, later, Windows-friendliness) hindered the development of the personal computer.

        While we have faster and cheaper PCs than before, they are, basically, the same design. They employ the same processors, the same I/O architectures (down to the ISA bus buried inside the chipset).

        Free (as in speech) OSs could be the best things that happened in the last couple years to personal computers. It's now possible to build a non-x86 computer that is ab

      • Re:Word processors (Score:5, Interesting)

        by ratboy666 ( 104074 ) <fred_weigelNO@SPAMhotmail.com> on Saturday August 09, 2008 @04:18PM (#24539819) Journal

        "Barely had an operating system" -- my ass.

        Take the Philips P2000/P5000 series.

        Yes, I wrote the OS and DOS. Some features:

        - 64KB of memory (yes, not megabytes, kilobytes)
        - possible bank switching (P5000) for 64KB additional memory
        - fully pre-emptive OS
        - didn't use "p/v semaphores", used event bits for synchronization
        - full interrupt support, able to handle floppy, keyboard, printer i/o and processing concurrently
        - could edit, print, and copy files at the same time.
        - two level directories on floppy (document/page structure)
        - automatic read-after-write checks and re-allocation of bad blocks (floppy media expensive)

        Note that some of these features did NOT appear in common PC systems until 1995 (full preemption). Memory allocation used fixed length blocks -- we couldn't tolerate fragmentation.

        • - two level directories on floppy (document/page structure) - automatic read-after-write checks and re-allocation of bad blocks (floppy media expensive)

          P2000, I remember mini cassettes?

          • Naw... Micom P2000 was 8080 based.

            P5000 (Swift) had 2 Z80s. One for regular use, the second one because the Z80 was less expensive than a full featured disc controller. Ran its own little "micro-program". The Swift could do bank-switching.

            P2000 used 8" floppy, P5000 used 5 1/4".

            I did OS/DOS and "indirect cli" for the Swift (software was done by 3 people in well under a year). Also, I wrote CP/M 80 BIOS (CP/M 2.2 port) for Swift (after writing a proposal for this).

            The software was written in assembler (8080)

            • One of the "high points" in my career - well under budget, and far over expectation.

              :-)

              (i just took a quick glance at your resume)

        • Re: (Score:3, Interesting)

          by JLF65 ( 888379 )

          The Amiga had full preemptive multitasking in 1985. People tend to forget that the Amiga had most of the features considered "modern" long before most other OSes. Windows wasn't even out yet, and wouldn't attain similar features for a full decade. The Mac had only been out a few months and wouldn't have similar features for more than a full decade.

          • And in case you were wondering what other features are being discussed here, they include things like graphics acceleration (the mac didn't even HAVE a text mode, but had NO graphics acceleration - everything was done with the CPU) as well as sound acceleration, pluggable filesystems, autoconfiguration of hardware... However: The Amiga is the Macintosh done right. Commodore is Apple done wrong.
    • I had a Wang 286 once. Actually, I think it was the first intel box I ever owned. That was WAY after it should have been trashed though - all of my friends had 486's and I had my beloved Amiga. The Wang was just a toy I acquired somewhere for the fun of it.
  • In terms of the first microprocessor, you should check out Pico Electronics Ltd; they were working with Sinclair and Monroe and had produced a single-chip processor for calculators that was for sale in early 1971.

    http://www.xnumber.com/xnumber/microprocessor_history.htm [xnumber.com]

    They also created the X10 signaling-over-powerline system for home automation and morphed into the eponymous company that had its short burst of infamy with its pop-up advertising, before later declaring bankruptcy!
  • by ericferris ( 1087061 ) on Saturday August 09, 2008 @11:14AM (#24537683) Homepage

    1. Hire engineers

    2. Do the opposite of what they recommend

    3. ????

    4. Errr... Where is the profit?

    Ye flippin' gods.

    Let me summarize a few salient points of TFA here:

    • CTC management refused to buy the IP rights of the microprocessor for a paltry 50K (about $300K in today's dollars), a ridiculously low sum as far as circuit design is concerned.
    • The same management (maybe not the same persons though) were then caught cooking the books after CTC became Datapoint

    It's very nice that the name of Roche was documented in this article for posterity. But what we really want is to have the names of these managers documented and written down in business textbooks, along with their pictures, the history of their glorious achievements, and maybe a warning such as "Do not hire, consult, play golf with, or even breathe the same air as these morons".

    I'd call this a case of terminal stupidity, but this pun is way too refined for the monstrous cluster-f*ck that these PHBs achieved.

    • by rbanffy ( 584143 )

      Their decision more or less made sense. They did not reject the 8080, but the 8008, which was not that superior to their TTL design. They could shave about US$50 off each device by going with the microprocessor. Since they had a more advanced design under development, they perhaps did not think the savings were worth the cost of retooling. How many thousand 2200s would they need to sell to compensate for that? Intel was late delivering the 8008.

      On the other hand, they should have foreseen that, despite thi

      • CTC management refused to buy the IP rights of the microprocessor for a paltry 50K (about $300K in today's dollars), a ridiculously low sum as far as circuit design is concerned.

      It's very nice that the name of Roche was documented in this article for posterity. But what we really want is to have the name of these managers documented and written down in business textbook, along with their pictures, the history of their glorious achievements, and maybe a warning such as "Do not hire, consult, play golf with,

      • I think that even without the benefit of hindsight, it was pretty obvious that this CPU chip thing had a bright future. One obvious application at the time was replacing the complex TTL boards (or relays and switches) of automation sequencers in production lines. That application alone sold tens of thousands of CPUs a year as soon as microprocessors became available.

        CTC/Datapoint could have become a giant on the automation market alone. And that's not even taking the yet non-existent microcomputer market into

  • 4004 (Score:3, Funny)

    by Anonymous Coward on Saturday August 09, 2008 @11:19AM (#24537713)

    4004: Lineage not found.

  • The second 8080 app (Score:4, Informative)

    by Anonymous Coward on Saturday August 09, 2008 @11:28AM (#24537763)

    or at least one of the first few, was also a PC - a Programmable Controller used for controlling industrial equipment or processes. Eagle Signal's Industrial Controls Division's CP700 Eptak modular system was 8080 based, and some of the early software was developed on Datapoint terminals. They paid $365 each for the first 8080s - an 8080A now goes for $1 or less. Eagle also ran what might have been the first college-level microprocessor course in-house for employees. It was taught by a prof from Iowa State and covered the 8080, 6800 and 6502. The original 8080s also required an external clock, as the two pins across which you were supposed to be able to attach a crystal wouldn't oscillate. Don't recall the clock speed - 1MHz initially I think - but the 4MHz Z80 was considered a major speed advance.

    Eagle Signal also had a Traffic Control Division (you can still see their traffic light control cabinets on street corners) that was one of the first 8008 users, and also used Data General Novas for traffic controls.

    Neither Eagle Signal division exists any longer. Both were owned by Gulf+Western Industries in the early 70s and located in Davenport, IA. Both divisions eventually moved to Austin, TX. Danaher now owns the industrial controls product line, and probably makes more profit selling Eagle's HP5 electro-mechanical timers than its electronic products, which was where the company's profits always came from.

    • The 8080As I used in my first asm class ran at 75kHz. I don't remember exactly, but I'm pretty sure the Z80s we had were clocked considerably under 4MHz, especially considering the 8088-based PC I had only ran at 3MHz.

  • And from the Intel 8008, the 8008135 was created. It was optimized for Internet use.
  • So rather than the x80/x86 being the only Intel CPU to have mass-market success, it's in fact a third-party design that Intel started with, and they're batting 0.000?

    They've done better in the embedded market, with chips like the 8048/8051 and the i960 family.

  • Bean Counters (Score:3, Interesting)

    by Tablizer ( 95088 ) on Saturday August 09, 2008 @03:07PM (#24539257) Journal

    It's interesting how bean-counter thinking almost kept Intel from being the biggest chip company:

    Article quote: Frassanito recalled accompanying Roche to a meeting with Bob Noyce, head of Intel, in early 1970 to try to get Intel -- then a start-up devoted to making memory chips -- to produce the CPU chip. Roche presented the proposed chip as a potentially revolutionary development and suggested that Intel develop the chip at its own expense and then sell it to all comers, including CTC, Frassanito recalled.

    "Noyce [of Intel] said it was an intriguing idea, and that Intel could do it, but it would be a dumb move," said Frassanito. "He said that if you have a computer chip, you can only sell one chip per computer, while with memory [Intel's current focus], you can sell hundreds of chips per computer.

  • by master_p ( 608214 ) on Sunday August 10, 2008 @04:59AM (#24544147)

    The 8086 set the PC technology back 15 years, at least. In 1985, the Amiga could do hi-res multicolor bitmap displays, preemptive multitasking, hardware-accelerated graphics and sound, DMA, auto-configurable peripherals (through Zorro slots), 32-bit programming (although the addressing was 24 bit) without the curse of far pointers, and many other goodies that came much later in the PC world.

    The PC technology was largely retarded: stupid BIOS, stupid VGA register layout, stupid memory addressing, stupid interrupt controller, stupid DMA...all these things were very hard to program. But it dominated the world, because of compatibility...
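    For context on the "stupid memory addressing" and far-pointer complaints above, here is a minimal sketch (my own illustration, not from the parent post) of 8086 real-mode address translation: a 16-bit segment and a 16-bit offset combine into a 20-bit physical address, so many different segment:offset pairs alias the same byte, and pointers that cross segments need the segment carried along (a "far pointer"):

```python
def physical_address(segment: int, offset: int) -> int:
    """Compute the 8086 real-mode physical address: (segment * 16) + offset,
    wrapping at the 20-bit (1 MB) boundary."""
    return ((segment << 4) + offset) & 0xFFFFF

# Two different segment:offset pairs hitting the same physical byte:
assert physical_address(0xB800, 0x0000) == physical_address(0xB000, 0x8000)
```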

    • The Amiga's capabilities had something to do with the 68000 family, which made the wide-scale integration easy (24 bit addressing was, at the time, spacious and roomy, especially compared to the 20-bit hack of the x86 of the day) but had more to do with the fact that it was built like a cross between a general-purpose computer and an arcade machine. The Amiga was astounding in its time but was limited not just by the failure of Commodore but also by the lack of memory protection. Crasho! I loved my Amigas (
