Hardware

Preserving Great Tech For Posterity — the 6502 290

Posted by timothy
from the double-secret-reverse-engineering dept.
trebonian writes "For great old hardware products like the MOS 6502 (used in the Apple II, the C64, the Nintendo NES), the details of the designs have been lost or forgotten. While there have been great efforts to reverse engineer the 6502 from the outside, there has not been the hardware equivalent of the source code — until now. As Russell Cox states: 'A team of three people accumulated a bunch of 6502 chips, applied sulfuric acid to them to strip the casing and expose the actual chips, used a high-resolution photomicroscope to scan the chips, applied computer graphics techniques to build a vector representation of the chip, and finally derived from the vector form what amounts to the circuit diagram of the chip: a list of all 3,510 transistors with inputs, outputs, and what they're connected to. Combining that with a fairly generic (and, as these things go, trivial) "transistor circuit" simulator written in JavaScript and some HTML5 goodness, they created an animated 6502 web page that lets you watch the voltages race around the chip as it executes. For more, see their web site visual6502.org.'"
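The netlist-plus-simulator approach the summary describes can be sketched in a few lines. The following Python is purely illustrative: a one-inverter "netlist" with invented node names, not the real 3,510-transistor visual6502 data, and a single-hop ground check rather than full path tracing.

```python
# Switch-level simulation sketch: nodes are wires, NMOS transistors are
# switches that conduct when their gate node is high. Relax until stable.
# Purely illustrative -- a one-inverter "netlist", not the visual6502 data.

class Transistor:
    def __init__(self, gate, c1, c2):
        self.gate, self.c1, self.c2 = gate, c1, c2  # node names

def simulate(nodes, transistors, pullups, inputs=()):
    """A node is high iff it has a pull-up and no conducting path to ground.
    (Real simulators trace multi-transistor paths; one hop suffices here.)"""
    changed = True
    while changed:
        changed = False
        for name in nodes:
            if name in ("vcc", "gnd") or name in inputs:
                continue  # power rails and external inputs are fixed
            grounded = any(
                nodes[t.gate] and "gnd" in (t.c1, t.c2) and name in (t.c1, t.c2)
                for t in transistors
            )
            new = (name in pullups) and not grounded
            if nodes[name] != new:
                nodes[name], changed = new, True
    return nodes

# One NMOS inverter: when "in" is high, the transistor pulls "out" low.
nodes = {"vcc": True, "gnd": False, "in": True, "out": True}
netlist = [Transistor("in", "out", "gnd")]
simulate(nodes, netlist, pullups={"out"}, inputs={"in"})
print(nodes["out"])  # False: input high, output pulled low
```

The real simulator does the same kind of relaxation over all 3,510 transistors, tracing whole connected groups rather than single hops, which is why the summary can fairly call it generic and, as these things go, trivial.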
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by DancesWithRobots (903395) on Thursday January 06, 2011 @11:53PM (#34787224)
    But wasn't the C64 processor a 6510? I could be wrong.
    • by Amorymeltzer (1213818) on Thursday January 06, 2011 @11:56PM (#34787252)

      Yes [wikipedia.org], but the difference ain't much.

      • by Arthur Grumbine (1086397) on Friday January 07, 2011 @04:20AM (#34788594) Journal

        Yes [wikipedia.org], but the difference ain't much.

        According to my rough calculations the difference is at least two more than the difference between an 8080 and an 8086.

      • by toejam13 (958243)

        That's actually my problem with the MOS 6510 - it was TOO similar to the MOS 6502. There were a bunch of bugs in the old 6502 core that had been around since it was released: weirdness with undefined opcodes, wrapping of the index with direct page ops, unknown state of the D flag after interrupt, inability to BRK after an interrupt, and so on.

        The WDC 65C02 fixed all of these issues and then some. In fact, once you've gotten used to programming on a bug-fixed 6502 variant (WDC 65C02, Rockwell 65C02S, Hudson 6580, WDC 6581
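Two of the well-known NMOS-core quirks of this kind can be shown as plain address arithmetic. This is an illustrative Python sketch of the documented behavior (the memory contents are invented), not code from any of the chips mentioned:

```python
# Two well-known NMOS 6502 quirks, modeled as address arithmetic.

def zp_indexed(base, x):
    """Zero-page,X effective address: the sum wraps within page zero."""
    return (base + x) & 0xFF  # no carry into the high byte

def jmp_indirect_target(mem, ptr):
    """JMP (ptr) bug: if ptr ends in $FF, the high byte of the target is
    fetched from the start of the SAME page, not the next one."""
    lo = mem[ptr]
    hi = mem[(ptr & 0xFF00) | ((ptr + 1) & 0xFF)]
    return (hi << 8) | lo

print(hex(zp_indexed(0xFF, 0x02)))  # 0x1 -- wraps, not 0x101
mem = {0x02FF: 0x34, 0x0300: 0x99, 0x0200: 0x12}
print(hex(jmp_indirect_target(mem, 0x02FF)))  # 0x1234, not the intended 0x9934
```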

  • by Super Dave Osbourne (688888) on Thursday January 06, 2011 @11:55PM (#34787244)
    I learned it while on vacation in Europe in 1981, prior to my even knowing BASIC or FORTH on the Apple ][ line of computers. It was the most important step in my otherwise stellarly mediocre life as a senior software engineer with NeXT and Apple. It's great to see others taking a lasting approach to the chip that made the most impact on me and others in my industry. Thank you! BTW, did you know there is at least one logic bug in the CPU? :) It's fairly well known now; find it and you will have a bit of history on your hands.
    • by sconeu (64226)

      I still have my MOSTEK 6502 manual -- 8.5x11 softcover.

  • The 6502 was what I learned computer design, assembly, and machine code on. First was a Buck Engineering trainer, then a KIM-1 with an add-on board. I learned the value of coffee when coding. Finally I built my own system with an S-100-style architecture. Memories indeed.

  • Just admit it -- someone really spent a lot of time on this idea, and it's cool that it works at any speed at all. I've only got an old P4, and it still ticked along quite nicely. Much faster and you wouldn't see anything but flicker-flashing anyhow.

  • by Petersko (564140) on Friday January 07, 2011 @12:06AM (#34787336)
    It was 1983, my Commodore Vic 20 (bought delivering newspapers) was soon to be replaced by a Commodore 64 (bought the same way), and nobody understood my fascination.

    Except, apparently, for Richard Mansfield, whose book I devoured. I remember trying to figure out how the heck to get the opcodes into memory. I had nobody to teach me what peek and poker were about, so it took a while.

    The 6502 and 6510 were perhaps the very last processors that I understood in real, intricate detail. Once I hit the 286 it might as well have run on magic pixie dust. I can't remember ever masking interrupts on an x86. I've only written in languages at the level of C or higher ever since, and I've never embedded assembler to fine-tune performance.

    My geek level has diminished.
    • I had the same sort of thing with the PDP 11/04, and later, the late and totally unlamented Teledyne 1750a processor. Both of those date me appropriately.

    • by Hylandr (813770)
      /Signed.

      - Dan.
    • Ditto. I read that book backwards and forwards and could do just about anything in ML on the C64. I remember writing floating point math routines because the instruction set had nothing built in.

      Fast forward to my first PC, a 486SX. I learned x86 assembly, but never felt the same kind of complete and utter control over the machine, probably because by that point in my life I didn't really have the time to dedicate to really immerse myself in it.

      However it gave me a great intuitive understanding of
      • by fwarren (579763) on Friday January 07, 2011 @01:32AM (#34787806) Homepage

        Fast forward to my first PC, a 486SX. I learned x86 assembly, but never felt the same kind of complete and utter control over the machine, probably because by that point in my life I didn't really have the time to dedicate to really immerse myself in it.

        No, it was not the level of immersion. A computer like the C64 had 20K of ROM, hence a kernel that never changed. It always had a 6510, a VIC chip, and 64K of RAM. You could learn every byte inside out. Because the platform did not change over time, many volumes of literature were written about the internals of this machine.

        Your 486 had a BIOS made by one of several different BIOS vendors, one of several revisions of that BIOS, for the particular chipset on your board. You had an audio card made by one company, a video card made by another company, an ATA controller made by yet another company.

        Who else really had your setup? No one wrote a complete manual for what you had. By the time you could figure out the BIOS and what it took to POST the hardware you had, it was time to buy a new computer.

        Nowadays there are many levels of abstraction; the one thing they rob us of is the ability to understand exactly what our machines are doing.

    • I too bought my C64 the same way, delivering newspapers and ads :) Had a helluva time with it, too: played a few games on it, but mostly taught myself BASIC and then moved on to assembly. Could spend whole nights up just coding and learning. It really kindled my interest in computers, their internals and programming, and my life would have been quite different without those first contacts with a C64.

      Though, it died then suddenly one day, got an old 286 from my uncle, and that's where my story differs from y

    • I loved the Compute! book series. ML for beginners, Mapping the C64... Good times, good times.
    • I had nobody to teach me what peek and poker were about...

      I know it's rather too late to be useful - but I think I found your problem.

    • by Opportunist (166417) on Friday January 07, 2011 @04:55AM (#34788724)

      You should have moved on to the Amiga and its Motorola 68000. That processor sure was MUCH easier to understand than the segment/offset mess of the x86 architecture.

      The 68000 was the last processor I truly understood. It was pretty straightforward too. Very streamlined and quite powerful.

  • Not so fast... (Score:5, Informative)

    by jimmydevice (699057) on Friday January 07, 2011 @12:06AM (#34787338)
    "the details of the designs have been lost or forgotten. While there have been great efforts to reverse engineer the 6502 from the outside, there has not been the hardware equivalent of the source code — until now"
    The schematic for the 6502 has been available for years on the net. Printed on one sheet of photo paper at 1200 dpi, every transistor is visible. It's quite amazing.
    • Re: (Score:3, Insightful)

      by Anonymous Coward

      You mean Balazs's schematic. That's actually also reverse engineered, and it has quite a few mistakes in it.

    • This schematic also looks like it was reverse-engineered from the chip. Not that I would expect the original schematics to be available to the world, since all that stuff is considered to be quite proprietary by the manufacturers.
      It would be a lot more useful to see the original logic design.
      • I had assumed it was the schematic from a working chip and was most likely from MOS or maybe Western Design Center.
        Color me enlightened.
    • by arivanov (12034)

      Yep.

      And if for whatever reason the schematic cannot be reproduced, the schematics for the East German and Russian "illegal" clones can be reproduced without any trouble (sorry, it's been a while, so I have forgotten the actual chip numbers).

    • by Kazymyr (190114)

      According to the Wikipedia article on the 6502, it is still manufactured for embedded systems. Thus the diffusion masks must still exist, in which case the design hasn't been "lost".

      Either that, or the Wikipedia article is wrong.

      • Re:Not so fast... (Score:4, Informative)

        by ishobo (160209) on Friday January 07, 2011 @09:01AM (#34789752)

        Available from Western Design Center, started by Bill Mensch, the person who co-designed the 6502 with Chuck Peddle. Both also helped design the 6800. WDC has been selling 6502-based products since 1979. Both the Apple IIe and IIc used a WDC product, the 65C02. All the second-source products over the years have been licensed from WDC. WDC has been able to sell products based on the original 6502 design because it co-held the patents with MOS.

        It is still sold in its original 40-pin plastic DIP. Verilog soft cores are available too.

      • They're splitting hairs. The later designs are 65C02 done in a different process with new(-er) masks.
    • by Arlet (29997)

      The guys that are doing the 6502 simulator have been long aware of this. This is just an independent effort, with some overlap, and some novel stuff. For instance, the schematic doesn't show you the exact area of the transistors, which can be important. By comparing the two you can also identify some mistakes that were made in either process, which is never a bad thing.

  • by serutan (259622) <snoopdoug&geekazon,com> on Friday January 07, 2011 @12:13AM (#34787368) Homepage

    An animation with voltages racing around the chip as it executes?

    Where's my light-cycle?? I must save my User from the MCP!!

  • Yep.. Space Invaders, too.. SIGGRAPH talk: http://www.visual6502.org/docs/6502_in_action_14_web.pdf [visual6502.org]
  • This is cool.

    It'll be even better when all the traffic dies down to the point you can load all the images.

  • nontrivial! (Score:5, Informative)

    by sillivalley (411349) <{ten.tsacmoc} {ta} {yellavillis}> on Friday January 07, 2011 @12:48AM (#34787564)
    The 6502 as used in the Apple ][ had some interesting quirks -- such as dummy read cycles that appeared on the bus when executing indexed operations. Woz used these dummy memory cycles in designing the original Apple ][ disk controller to whack the disk controller state machine. Undocumented at the least! Some of the Apple ][ disk copy protection schemes (particularly for games on 5 1/4 inch floppies) also relied upon undocumented behaviors in the processor, such as some of the "unused" opcodes.
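The page-crossing behavior behind those dummy cycles can be modeled roughly. A sketch (illustrative Python, not cycle-exact, addresses invented):

```python
# Sketch of the NMOS 6502 dummy read on absolute,X addressing: the low byte
# is incremented first, and the CPU issues a read at that partially-formed
# address before correcting the high byte when a page boundary is crossed.
def abs_x_bus_reads(base, x):
    lo = (base + x) & 0xFF
    partial = (base & 0xFF00) | lo      # first address seen on the bus
    final = (base + x) & 0xFFFF         # corrected address
    return [partial] if partial == final else [partial, final]

print([hex(a) for a in abs_x_bus_reads(0x12F0, 0x20)])  # ['0x1210', '0x1310']
```

On stores (e.g. STA abs,X) the extra read happens even without a page crossing, which is exactly the kind of predictable side-effect bus activity a peripheral state machine could be "whacked" with.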
    • Re:nontrivial! (Score:4, Informative)

      by NixieBunny (859050) on Friday January 07, 2011 @02:48AM (#34788168) Homepage
      Woz wasn't the only person to abuse the 6502. Don Lancaster took advantage of the address bus behavior of the chip when executing NOPs to make his cheap video display (the TVT 6-5/8) board. He basically used the processor as a 16-bit counter that could be loaded under software control during the horizontal blanking time. So the CPU chip became a video chip.
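The counter trick described above amounts to this: with memory full of one-byte NOPs, the fetch address sequence is just the program counter counting up, so the address bus itself scans video memory. A minimal sketch (illustrative Python, invented addresses):

```python
# With memory full of one-byte NOPs, the 6502's program counter free-runs:
# each fetch puts the next sequential address on the bus, turning the CPU
# into a 16-bit counter. Loading PC (e.g. via JMP during blanking) sets the
# start address of the next scan. Illustrative model only.
def scanline_addresses(start, length):
    """Addresses that appear on the bus while `length` NOPs execute."""
    return [(start + i) & 0xFFFF for i in range(length)]

print([hex(a) for a in scanline_addresses(0x8000, 4)])
# ['0x8000', '0x8001', '0x8002', '0x8003']
```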
  • You mean to tell me that the companies who designed or licensed this chip plus the chip fabs themselves have no information on how they were manufactured?

    • by Mashiki (184564)

      I suppose it's possible. I mean, we did lose the ability to make things like blue glass, green glass, and concrete at least once in history. Losing schematics is also possible. I remember hearing a few years ago that some company was looking for a particular tube that RCA used to make, because RCA lost the diagram on how to manufacture the tube; whether or not it was ever true I have no idea. It was a passing fancy at the time for me, but even I have electronics that I use today that have tubes in them

      • Re:Really lost? (Score:5, Interesting)

        by JWSmythe (446288) <jwsmytheNO@SPAMjwsmythe.com> on Friday January 07, 2011 @04:22AM (#34788602) Homepage Journal

            That isn't all that uncommon. I guess no one recalls the 2009 news stories about the classified "fogbank" material in nuclear warheads. When the gov't wanted to upgrade some nuclear warheads, it was discovered that the material wasn't documented, and no one remembered how it was made.

            I couldn't find anything on the lost vacuum tube design, but I wouldn't be totally surprised. There have been plenty of things over the years that were superseded by something better, and the predecessors were forgotten about. Trade secrets often include keeping very little documentation about them, so they won't be accidentally released (or stolen).

            I've worked with quite a few things over the years where a working copy existed, but you couldn't replace it. Sometimes it took a virtually clean room reverse engineering to replace it. Sometimes it was hardware. Sometimes it was some application (like a server-side app) where it had been compiled once a decade ago, and no one had a clue where the source was. Well, they *did*, it was just "oh, well Bob did it on his computer. He left 8 years ago. We don't know where to find him", and his desktop had been reformatted, handed down through several other people, and finally retired to the trash a few years before.

            For stuff like vacuum tubes, and ancient apps that don't run on modern hardware, it's usually worth doing it fresh. For things like classified materials that keep nukes from spontaneously exploding, that knowledge can be virtually irreplaceable. There are various lost arts that, if you have to fall back to raw materials, would leave us screwed. I like using the loss of civilization and modern technology as an example. If you, I, and a few thousand skilled Slashdotters were dropped on a deserted island (or an alien but Earth-like planet), how long would it take for us to raise technology from nothing to build a working computer? Assuming an abundance of easily identifiable raw materials, like "hey, that's iron ore", maybe we could build a forge, and make some decent hand tools. We may be able to build primitive radios, but I doubt we'd ever get far enough to fab the first solid state chip, much less a CRT or LCD screen, before the first generation died off of old age. The second generation may have a clue, but no true memory of the technology in working form.

            Consider other "high tech" items. I love the car analogies, so we'll go in that direction. :) Could we build a car? I know quite a few of us know a lot, but could we actually design and fabricate one from scratch, even with another car as a template? Sure as hell in such a scenario we'd never build the first ECM. Could we get as far as a primitive carburetor? Who am I kidding, we'd probably get stuck on something like how to make tires. :) That's assuming no meeting-hungry managers or aspiring politicians were in the aforementioned group. If that happened, we'd end up stuck in endless meetings forming committees to discuss every feature, and we wouldn't get the first item put down on paper, because how to make paper and a pencil would be stuck in committee for the rest of our lives.

            The idea of "lives" may be rather short anyways. How many of us are hunter/gatherers and farmers? Not enough to start from scratch.

            Kinda ruins those fun aspirations of colonizing alien planets, doesn't it? 1,000 space nerds fly to and are dropped off on a habitable planet circling one of the closest stars, just to die of starvation before establishing a society capable of sustaining itself, because we don't have the skills to bring our technology up to "modern" levels.

    • by Tumbleweed (3706) * on Friday January 07, 2011 @01:57AM (#34787936)

      You mean to tell me that the companies who designed or licensed this chip plus the chip fabs themselves have no information on how they were manufactured?

      Well, sure, but they had them backed up on their Apple /// machines, and ... well ...

    • Yeah, I find it hard to believe too. The 6502 was heavily licensed to other companies, and was probably produced in more variations than anything short of the ARM. I would find it very difficult to believe that there aren't plenty of photolithographs stashed away in various files. Now, maybe an actual design document indicating what everything does might be missing, but there should be plenty of film available to reverse engineer from.

    • There's a post a few after yours explaining how this chip was designed. A modern chip is designed on computer, and all of the steps from the HDL down to the final masks will be backed up and archived. The 6502 schematic was hand-drawn on a really large piece of paper, then onto Rubylith for production. The schematic may have been lost long before they stopped producing it, because they just needed the mask.

      Once they stop producing them, there's no commercial incentive to keep the old designs around, s

  • Tickled to see this (Score:4, Interesting)

    by bfwebster (90513) on Friday January 07, 2011 @12:50AM (#34787576) Homepage

    I have fond memories of the 6502. I co-designed and did most of the coding for a computer game for the Apple II (Sundog) and so did a lot of 6502 assembly coding. A few years later, I taught assembly language coding to CS students at BYU, and we use 6502-based systems there as well (which was a vast improvement over the IBM 360 assembly + JCL on punch cards that I had to learn on a decade earlier as an undergrad myself).

    • by TheLink (130905)
      I vaguely remember Sundog... My cousin used to play it on his computer decades ago.

      I'm curious: how do you feel about the emulator versions of Sundog being easily downloadable on the Internet?
    • by sunspot42 (455706)

      Hey Bruce! I wasted many hours playing Sundog on the ST. Thanks!

      You should make a version for the iPad!

  • Just imagine .... (Score:5, Interesting)

    by martyb (196687) on Friday January 07, 2011 @01:03AM (#34787660)

    Just imagine a Beowulf cluster of these? No, let's have some fun with math, instead.

    FTFS:

    'A team of three people accumulated a bunch of 6502 chips, applied sulfuric acid to them to strip the casing and expose the actual chips, used a high-resolution photomicroscope to scan the chips, applied computer graphics techniques to build a vector representation of the chip, and finally derived from the vector form what amounts to the circuit diagram of the chip: a list of all 3,510 transistors with inputs, outputs, and what they're connected to.

    Okay, bear with me here:

    • 45 Years Later, Does Moore's Law Still Hold True? [slashdot.org] ("Intel has packed just shy of a billion transistors into the 216 square millimeters of silicon that compose its latest chip [anandtech.com]... which linked article goes on to report: "The quad-core desktop Sandy Bridge die clocks in at 995 million transistors." )
    • Researchers Claim 1,000 Core Chip Created [slashdot.org] (By using FPGAs, Glasgow University researchers have claimed a proof of concept 1,000 core chip that they demonstrated running an MPEG algorithm at a speed of 5Gbps.)
    • Intel Talks 1000-Core Processors [slashdot.org] ("An experimental Intel chip shows the feasibility of building processors with 1,000 cores, an Intel researcher has asserted. The architecture for the Intel 48-core Single Chip Cloud Computer processor is 'arbitrarily scalable,' according to Timothy Mattson. 'This is an architecture that could, in principle, scale to 1,000 cores,' he said. 'I can just keep adding, adding, adding cores.'")

    So let's perform a few calculations, shall we? There are 995,000,000 transistors in the Sandy Bridge Quad Core die. There are 3,510 transistors in the 6502. That means that within the space of the Sandy Bridge chip, there could be, instead, 283,475 complete 6502 cores!!

    But wait, it gets better! Back in its day, IIRC the 6502 ran at what, 1MHz? 2MHz? With today's technology, we could run each of these cores at least one thousand times faster than the original! That's like having another thousand times as many 6502 cores.

    So, finally, in the space of just one Sandy Bridge Quad Core die, we could have the processing equivalent of over 280 million 6502 cores!(*)

    (*) Okay, granted, it would take a not insignificant amount of space on die to connect all these together, along with a metric ton of lines for sending data and address info to/from each 6502 processor. Nevertheless, I'm just boggled to see how far we've come from the chip that was in the first computer I ever bought!
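The arithmetic above can be checked directly, using the 3,510-transistor figure from the summary (interconnect and per-core memory overhead deliberately ignored, as the footnote concedes):

```python
# Back-of-the-envelope check: 6502 cores per Sandy Bridge transistor budget,
# then scaled by a rough 1000x clock advantage (1 MHz -> ~1 GHz).
sandy_bridge_transistors = 995_000_000   # quad-core Sandy Bridge die
transistors_per_6502 = 3_510             # from the reverse-engineered netlist

cores = sandy_bridge_transistors // transistors_per_6502
speedup = 1_000
equivalent_original_6502s = cores * speedup

print(cores)                      # 283475
print(equivalent_original_6502s)  # 283475000
```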

    • Just imagine if the same effort that went into minimizing the 6502's transistor count back then were to be applied to a modern CPU. It would take decades from concept to tapeout!
    • by Tumbleweed (3706) *

      That means that within the space of the Sandy Bridge chip, there could be, instead, 283,475 complete 6502 cores!! ...
      With today's technology, we could run each of these cores at least one thousand times faster than the original! ...
      So, finally, in the space of just one Sandy Bridge Quad Core die, we could have the processing equivalent of over 280 million 6502 cores!(*)

      That would make for one bad-ass game of Lode Runner!

    • Adding 64K of local memory, interface and interconnect would approach 1M transistors / core, and we're back to 1000 processors.
    • Ca. 2000 I did a speed test with a few machines: Pentium pro 200, AMD K6 233, and my old BBC micro, running a program to find all the possible permutations of the upper layer of a rubik's cube with a given number of quarter-turns. I did this already in 1984/1985 but couldn't get very far, I think 11-12 speedwise was the maximum. In the end I got the result that all permutations can be reached in 19 quarter turns. This took about 60 days of CPU power, which I split over a few machines. Then I found a bug in
  • They've been doing stuff like this for a long time: http://guru.mameworld.info/decap/index.html [mameworld.info]
  • by jtara (133429) on Friday January 07, 2011 @01:21AM (#34787752)

    Did you know that the very first 6502 layout had an unused space reserved for an electrical outlet? No, not an electrical outlet on the chip, silly! An electrical outlet on the wall of the designer.

    I was writing the software for a firmware-based gasoline pump based on the 4040 when the Mostek 6502 was announced. The pricing and power were a huge breakthrough - we could now afford to use an 8-bit processor instead of 4-bit, and the chip was way easier to interface. We arranged a visit to Mostek and came back with a prized pre-production chip with the lid soldered on. We met Chuck Peddle, and he showed us a prototype of the KIM development board. We also took back with us a 9-track tape of a 6502 assembler (written in Fortran) for installation on the local university's timesharing system, which at the time sold time to the public.

    We also met the chip designers and saw the original hand-drawn layout for the chip. No automatic routing software - just drawing on a huge sheet of paper on one wall of one of the guy's offices. There was actually an area of silicon that could not be used because there was an electrical outlet on the wall that they needed for something in the office, so they just didn't draw on that part. The finished design was then rendered in Rubylith, and we were shown a "cell library" which consisted of a set of large drawers with various circuit elements pre-cut in Rubylith.

    Since the KIM was not yet available, we built our own development system - first wirewrapped, and later a set of circuit boards, using 44-pin edge connectors, and defined a bus. We let my friend Rene use the circuit board layouts we had, and he did some additional boards himself, laying down black tape on mylar. We were given a monitor program that would allow us to load paper tapes produced on a TTY connected to the timesharing system and do some debugging.

    It was a really fun and easy chip to program, and I worked on several other firmware projects using the 6502 over the next few years. I did some 6800 as well, but always preferred the 6502.

    • by CaptKilljoy (687808) on Friday January 07, 2011 @02:14AM (#34788022)

      > There was actually an area of silicon that could not be used because there was an electrical outlet on the wall that they needed for something in the office, so they just didn't draw on that part.
      Being curious, I just had to look at the actual chip layout in the online simulator [visual6502.org] to see if this was true. You can see a pair of rectangular voids left of the center, near the edge, the lower of which seems to have the right dimensional ratio to have been a space for a wall socket.

      It's fascinating to hear a lost bit of engineering history that explains something that would otherwise have forever remained a mystery. Somebody should forward this anecdote to the visual6502.org guys.

    • Did you know that the very first 6502 layout had an unused space reserved for an electrical outlet? No, not an electrical outlet on the chip, silly! An electrical outlet on the wall of the designer.

      It was (don't know if it still is) common for the layout "artist" to include a graphic of his choice on the die [si.edu].

    • Awesome! (Score:4, Informative)

      by chiark (36404) on Friday January 07, 2011 @07:00AM (#34789186) Homepage Journal
      To me, Chuck Peddle is an absolute inspiration. He's not done the easy thing, or the materialistic thing, but the right thing many many times in his career.

      These were the early days of the computer revolution, and I strongly recommend Brian Bagnall's book, Commodore: A Company on the Edge, to anyone remotely interested in the era... It's a healthy dose of realism and a perfect antidote to the historical revisionism that seems to be coming from a couple of areas in the States...

      The guy is a hero, as were the small teams laying the foundations that, ultimately, mean we all have more interesting jobs. No article on the 6502 should fail to mention Chuck Peddle and the team at MOS Technology, which ultimately became part of Commodore... History tells us that what becomes part of Commodore burns brightly, but briefly...

      Get that book, it's great.

  • by chaoskitty (11449) <john@sixgirYEATSls.org minus poet> on Friday January 07, 2011 @01:40AM (#34787850) Homepage

    I have a Commodore A2232 seven port serial card in my Amiga 4000 in my datacenter which provides serial consoles to a number of other machines. While other multiport serial cards have RISC processors or large buffers, this card is simply a 3.58 MHz 65CE02 which polls each port and puts incoming characters into its 16k of memory, which the Amiga can access directly. It's a beautiful example of simplicity at work.

  • 45 years from now, perhaps some of you "youngsters" reading this will be trying to resurrect this site from a roomful of forgotten, dusty hard drives.

    Nah...
  • Amazing bits of history in this thread...

    We pros still de-lid and polish back circuits for reverse-engineering. These days, instead of an optical microscope, you need an SEM to do it. The principle is still the same.

    Delicate work, but quite amazing results.
    • by Sulphur (1548251)

      Amazing bits of history in this thread...

      We pros still de-lid and polish back circuits for reverse-engineering. These days, instead of an optical microscope, you need an SEM to do it. The principle is still the same.

      Speaking of SEM: TI made a video of an IC running in a SEM.

      It showed the current so to speak.

      I can't remember if it was the SEM video or Making of a Microchip (MMC01).

  • And if you use it on the simulator what happens?

  • ZX Spectrum ULA (Score:4, Interesting)

    by Alioth (221270) <no@spam> on Friday January 07, 2011 @05:06AM (#34788754) Journal

    The same thing has been done to the Sinclair ZX Spectrum ULA.

    Little remains of the original Ferranti technical documentation of the Ferranti ULA (the 5C- and 6C-series, used for the ZX Spectrum; the same ULA technology was used in the BBC Microcomputer and other European machines during the 1980s). At the time Ferranti was the leader in custom logic, until they rested on their laurels and let other companies transition to CMOS and field-programmable logic and eat their lunch.

    The guy who did the reverse engineering (Chris Smith, a friend of mine) has written a book about the Spectrum ULA which details the Ferranti ULA and how it was used in the Spectrum. It has quite a lot on the Ferranti ULAs, including how they were made, the process for making the IC, etc. as well as its particular implementation in the Spectrum. This was also done by de-encapsulation (which I think involves fuming nitric acid, rather than H2SO4).

    Chris's work can be found here (if you want to buy the book, he gets more of the money if you order it through his website rather than Amazon, but it's also available under the GFDL):

    http://www.zxdesign.info/book/ [zxdesign.info]

    I think it's very important that this kind of thing is preserved; the early personal computers are a bit like the early Industrial Revolution textile machinery, but fortunately they can be preserved at an individual level rather than needing a huge rich museum.

  • Wot no BBC Micro? (Score:4, Informative)

    by mustrum_ridcully (311862) on Friday January 07, 2011 @05:40AM (#34788860)
    Thought I should mention the BBC Micro (aka Acorn Proton), manufactured by Acorn Computers (RIP), who gave the world the ARM microprocessor, as it also used the 6502. BBC Micros equipped with a 6502 second processor were actually used to develop the first generation of ARM processors. So yes, the humble 6502 is a pretty important processor; if Acorn had used the 6800 or 8088, then we might not have the ARM processor today.
  • by carlhaagen (1021273) on Friday January 07, 2011 @08:00AM (#34789446)
    Not sure why the original poster claims that the 6502 core design was lost in time. WDC (Western Design Center http://www.westerndesigncenter.com/wdc/ [westerndesigncenter.com] ) bought the core design and the rights to license it many years ago. These are also the guys behind a few variants built on top of the 6502, e.g. the 65816 used in the Super Nintendo.
    • by ishobo (160209)

      That is not correct, they never bought a license. WDC co-held the design with MOS. WDC has been selling 6502 based products since 1979.

Life. Don't talk to me about life. - Marvin the Paranoid Android
