Hardware

Preserving Great Tech For Posterity — the 6502 290

trebonian writes "For great old hardware products like the MOS 6502 (used in the Apple II, the C64, the Nintendo NES), the details of the designs have been lost or forgotten. While there have been great efforts to reverse engineer the 6502 from the outside, there has not been the hardware equivalent of the source code — until now. As Russell Cox states: 'A team of three people accumulated a bunch of 6502 chips, applied sulfuric acid to them to strip the casing and expose the actual chips, used a high-resolution photomicroscope to scan the chips, applied computer graphics techniques to build a vector representation of the chip, and finally derived from the vector form what amounts to the circuit diagram of the chip: a list of all 3,510 transistors with inputs, outputs, and what they're connected to. Combining that with a fairly generic (and, as these things go, trivial) "transistor circuit" simulator written in JavaScript and some HTML5 goodness, they created an animated 6502 web page that lets you watch the voltages race around the chip as it executes. For more, see their web site visual6502.org.'"
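To give a flavor of what such a "transistor circuit" simulator involves: once the chip has been reduced to a netlist, the whole CPU is just data (a list of nodes and a list of transistors), and the simulator's job is to settle node voltages after each input change. The sketch below is not the visual6502 code; the data layout, names, and one-inverter netlist are invented for illustration, and a real switch-level simulator must also propagate values through chains of conducting transistors rather than only direct pulldowns.

    // A minimal switch-level simulator sketch. Each (NMOS) transistor
    // conducts between c1 and c2 while its gate node is high; node values
    // are settled by repeated passes until nothing changes.
    const nodes = {
      gnd: { value: false, pullup: false }, // ground rail, always low
      in:  { value: false, pullup: false }, // externally driven input
      out: { value: false, pullup: true },  // pulled high unless grounded
    };
    // One transistor from 'out' to ground, gated by 'in': an inverter.
    const transistors = [{ gate: 'in', c1: 'out', c2: 'gnd' }];

    function settle() {
      let changed = true;
      while (changed) {
        changed = false;
        for (const name of Object.keys(nodes)) {
          if (name === 'gnd') continue; // the rail never changes
          // A node goes low if any conducting transistor ties it to ground;
          // otherwise its pullup (if present) takes it high.
          const grounded = transistors.some(t =>
            nodes[t.gate].value &&
            ((t.c1 === name && t.c2 === 'gnd') ||
             (t.c2 === name && t.c1 === 'gnd')));
          const next = grounded ? false
                     : nodes[name].pullup ? true : nodes[name].value;
          if (next !== nodes[name].value) {
            nodes[name].value = next;
            changed = true;
          }
        }
      }
    }

    settle();
    console.log(nodes.out.value); // true: input low, the pullup wins
    nodes.in.value = true;
    settle();
    console.log(nodes.out.value); // false: the inverter pulled 'out' low

Scale that same data up to the extracted list of 3,510 transistors, redraw the node values on the chip image after every settle, and you have the animated page described above.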
This discussion has been archived. No new comments can be posted.

  • by Super Dave Osbourne ( 688888 ) on Thursday January 06, 2011 @11:55PM (#34787244)
    I learned it while on vacation in Europe in 1981, before I even knew BASIC or FORTH, on the Apple ][ line of computers. It was the most important step in my otherwise stellarly mediocre life as a senior software engineer with NeXT and Apple. It's great to see others taking a lasting approach to the chip that made the most impact on me and others in my industry. Thank you! BTW, did you know there is at least one logic bug in the CPU? :) It's fairly well known now; find it and you will have a bit of history on your hands.
  • by Petersko ( 564140 ) on Friday January 07, 2011 @12:06AM (#34787336)
    It was 1983; my Commodore VIC-20 (bought delivering newspapers) was soon to be replaced by a Commodore 64 (bought the same way), and nobody understood my fascination.

    Except, apparently, for Richard Mansfield, whose book I devoured. I remember trying to figure out how the heck to get the opcodes into memory. I had nobody to teach me what PEEK and POKE were about, so it took a while.

    The 6502 and 6510 were perhaps the very last processors that I understood in real, intricate detail. Once I hit the 286, it might as well have run on magic pixie dust. I can't remember ever masking interrupts on an x86. I've only written in languages at the level of C or higher ever since, and I've never embedded assembler to fine-tune performance.

    My geek level has diminished.
  • by Super Dave Osbourne ( 688888 ) on Friday January 07, 2011 @12:31AM (#34787456)
    That bug is a bit less well known; I meant the paging flaw.
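    The flaw usually cited here, and presumably the paging flaw meant above, is the 6502's indirect-JMP page-boundary bug: JMP ($xxFF) fetches the high byte of its target from $xx00 instead of from the start of the next page. A minimal JavaScript sketch of the buggy fetch (illustrative code, not taken from any real emulator):

        // The 6502 never carries the pointer increment into the high byte,
        // so an indirect JMP whose vector straddles a page reads the wrong
        // high byte. Sketch of the buggy address fetch:
        function brokenIndirectJmp(mem, ptr) {
          const lo = mem[ptr];
          // Bug: the +1 wraps within the current page.
          const hiAddr = (ptr & 0xff00) | ((ptr + 1) & 0x00ff);
          return (mem[hiAddr] << 8) | lo;
        }

        const mem = new Uint8Array(0x10000);
        mem[0x10ff] = 0x34; // low byte of the intended target $1234
        mem[0x1100] = 0x12; // high byte the programmer expects to be used
        mem[0x1000] = 0x55; // high byte the 6502 actually reads
        // Prints "5534" instead of the intended "1234":
        console.log(brokenIndirectJmp(mem, 0x10ff).toString(16));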
  • Tickled to see this (Score:4, Interesting)

    by bfwebster ( 90513 ) on Friday January 07, 2011 @12:50AM (#34787576) Homepage

    I have fond memories of the 6502. I co-designed and did most of the coding for a computer game for the Apple II (Sundog), and so did a lot of 6502 assembly coding. A few years later, I taught assembly language coding to CS students at BYU, and we used 6502-based systems there as well (which was a vast improvement over the IBM 360 assembly + JCL on punch cards that I had to learn on a decade earlier as an undergrad myself).

  • Just imagine .... (Score:5, Interesting)

    by martyb ( 196687 ) on Friday January 07, 2011 @01:03AM (#34787660)

    Just imagine a Beowulf cluster of these! No, let's have some fun with math instead.

    FTFS:

    'A team of three people accumulated a bunch of 6502 chips, applied sulfuric acid to them to strip the casing and expose the actual chips, used a high-resolution photomicroscope to scan the chips, applied computer graphics techniques to build a vector representation of the chip, and finally derived from the vector form what amounts to the circuit diagram of the chip: a list of all 3,510 transistors with inputs, outputs, and what they're connected to.'

    Okay, bear with me here:

    • 45 Years Later, Does Moore's Law Still Hold True? [slashdot.org] ("Intel has packed just shy of a billion transistors into the 216 square millimeters of silicon that compose its latest chip [anandtech.com]..."; the linked article goes on to report: "The quad-core desktop Sandy Bridge die clocks in at 995 million transistors.")
    • Researchers Claim 1,000 Core Chip Created [slashdot.org] (By using FPGAs, Glasgow University researchers have claimed a proof-of-concept 1,000-core chip that they demonstrated running an MPEG algorithm at a speed of 5Gbps.)
    • Intel Talks 1000-Core Processors [slashdot.org] ("An experimental Intel chip shows the feasibility of building processors with 1,000 cores, an Intel researcher has asserted. The architecture for the Intel 48-core Single Chip Cloud Computer processor is 'arbitrarily scalable,' according to Timothy Mattson. 'This is an architecture that could, in principle, scale to 1,000 cores,' he said. 'I can just keep adding, adding, adding cores.'")

    So let's perform a few calculations, shall we? There are 995,000,000 transistors in the Sandy Bridge Quad Core die. There are 3,510 transistors in the 6502. That means that within the space of the Sandy Bridge chip, there could be, instead, 283,475 complete 6502 cores!!

    But wait, it gets better! Back in its day, IIRC the 6502 ran at what, 1MHz? 2MHz? With today's technology, we could run each of these cores at least one-thousand times faster than the original! That's like having another thousand times as many 6502 cores.

    So, finally, in the space of just one Sandy Bridge Quad Core die, we could have the processing equivalent of over 280 million 6502 cores!(*) (The arithmetic is double-checked just below.)

    (*) Okay, granted, it would take a not insignificant amount of space on die to connect all these together, along with a metric ton of lines for sending data and address info to/from each 6502 processor. Nevertheless, I'm just boggled to see how far we've come from the chip that was in the first computer I ever bought!
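    Redoing the arithmetic above with the 3,510-transistor figure from the summary (a quick JavaScript check; the 1,000x clock multiplier is the poster's own rough guess, not a measured number):

        const sandyBridgeTransistors = 995000000; // from the linked story
        const mos6502Transistors = 3510;          // from the summary
        const cores = Math.floor(sandyBridgeTransistors / mos6502Transistors);
        console.log(cores);        // 283475 whole 6502s' worth of transistors
        console.log(cores * 1000); // 283475000, i.e. ~283 million 1MHz-equivalents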

  • by jtara ( 133429 ) on Friday January 07, 2011 @01:21AM (#34787752)

    Did you know that the very first 6502 layout had an unused space reserved for an electrical outlet? No, not an electrical outlet on the chip, silly! An electrical outlet on the wall of the designer.

    I was writing the software for a firmware-based gasoline pump based on the 4040 when the MOS Technology 6502 was announced. The pricing and power were a huge breakthrough - we could now afford to use an 8-bit processor instead of a 4-bit one, and the chip was way easier to interface. We arranged a visit to MOS Technology and came back with a prized pre-production chip with the lid soldered on. We met Chuck Peddle, and he showed us a prototype of the KIM development board. We also took back with us a 9-track tape of a 6502 assembler (written in Fortran) for installation on the local university's timesharing system, which at the time sold time to the public.

    We also met the chip designers and saw the original hand-drawn layout for the chip. No automatic routing software - just drawing on a huge sheet of paper on one wall of one of the guys' offices. There was actually an area of silicon that could not be used because there was an electrical outlet on the wall that they needed for something in the office, so they just didn't draw on that part. The finished design was then rendered in Rubylith, and we were shown a "cell library" which consisted of a set of large drawers with various circuit elements pre-cut in Rubylith.

    Since the KIM was not yet available, we built our own development system - first wirewrapped, and later a set of circuit boards using 44-pin edge connectors and a bus we defined. We let my friend Rene use the circuit board layouts we had, and he did some additional boards himself, laying down black tape on mylar. We were given a monitor program that would allow us to load paper tapes produced on a TTY connected to the timesharing system and do some debugging.

    It was a really fun and easy chip to program, and I worked on several other firmware projects using the 6502 over the next few years. I did some 6800 as well, but always preferred the 6502.

  • by fwarren ( 579763 ) on Friday January 07, 2011 @01:32AM (#34787806) Homepage

    > Fast forward to my first PC, a 486SX. I learned x86 assembly, but never felt the same kind of complete and utter control over the machine, probably because by that point in my life I didn't really have the time to dedicate to really immerse myself in it.

    No, it was not the level of immersion. A computer like the C64 had 20K of ROM, hence a kernel that never changed. It always had a 6510, a VIC-II chip, and 64K of RAM. You could learn every byte inside out. Because the platform did not change over time, there were many volumes of literature written about the internals of this machine.

    Your 486 had a BIOS made by one of several different BIOS vendors, one of several revisions of that BIOS, for the particular chipset on your board. You had an audio card made by one company, a video card made by another company, an ATA controller made by yet another company.

    Who else really had your setup? No one wrote a complete manual for what you had. By the time you could figure out the BIOS and what it took to POST the hardware you had, it was time to buy a new computer.

    Nowadays there are many levels of abstraction, and the one thing they rob us of is the ability to understand exactly what our machines are doing.

  • by CaptKilljoy ( 687808 ) on Friday January 07, 2011 @02:14AM (#34788022)

    > There was actually an area of silicon that could not be used because there was an electrical outlet on the wall that they needed for something in the office, so they just didn't draw on that part.
    Being curious, I just had to look at the actual chip layout in the online simulator [visual6502.org] to see if this was true. You can see a pair of rectangular voids left of the center, near the edge, the lower of which seems to have the right dimensional ratio to have been a space for a wall socket.

    It's fascinating to hear a lost bit of engineering history that explains something that would otherwise have forever remained a mystery. Somebody should forward this anecdote to the visual6502.org guys.

  • by JWSmythe ( 446288 ) <jwsmytheNO@SPAMjwsmythe.com> on Friday January 07, 2011 @02:51AM (#34788180) Homepage Journal

        You know what's funny: that (the parent's argument) was the song of the Windows fanboys for so long. Now I'm somewhere that I have an abundance of old hardware, and a mix of new and old operating systems. I'm finding it easier to throw Linux on a box than to pray that Vista or Win7 will work. CPU- and memory-wise, sure, it'll work. But dear god, don't try to find drivers for some fairly standard old video card (like a 4MB to 32MB PCI card), sound card, or printer. There are three categories for old systems: "good enough for Win7", "wow, a great Linux machine", and "don't bother". Why bother with a 1GHz machine when I have stacks of 2.0GHz to 2.8GHz machines to use? Oh, and if anyone is interested, I stumbled on a stack of probably a dozen 200MHz Pentium CPUs. I have no idea what to do with them, but they'll probably end up in my own personal museum. :)

        We just played a little game with one of our techs: "Can you get Win2k3 to install?" on some old Dell servers with the original Win2k3 license stuck on the case by Dell at the factory. Linux? 5 minutes from an install CD I made, or 15 minutes from the original distro CDs. Both are current. Win2k3? 3 days of head pounding, calls to Dell support, downloading and running BIOS updates, and some mystery driver emailed over with a "try this". 24 work hours for 1 server, versus 0.25 work hours. Poor guy, he was a huge Windows fan. By the end of it, he looked like he was going to personally go and bomb Redmond. :)

        The best workaround I've found is to run Linux on them, and then cover any pesky Windows needs with a virtual machine under VMware.

        I inherited several old printers, which were wonderful old workhorses of their time. My mom is still on her WinXP machine, because it still runs any app she wants and is fine for CPU and memory (2.8GHz, and I upgraded her memory to 2GB last year). I tried to plug one of the printers into my Win7 Home Premium laptop (a USB-to-parallel converter is required for both machines). Nope, sorry, no Win7 drivers available anywhere. No kludges other than "get a new printer". Yet it's still supported perfectly well under Linux. Heh.

        I actually haven't had a problem with a new run-of-the-mill system, or even most exotic hardware, for years under Linux.

  • Re:Really lost? (Score:5, Interesting)

    by JWSmythe ( 446288 ) <jwsmytheNO@SPAMjwsmythe.com> on Friday January 07, 2011 @04:22AM (#34788602) Homepage Journal

        That isn't all that uncommon. I guess no one recalls the 2009 news stories about the classified "fogbank" material in nuclear warheads. When the gov't wanted to upgrade some nuclear warheads, it was discovered that the material wasn't documented, and no one remembered how it was made.

        I couldn't find anything on the lost vacuum tube design, but I wouldn't be totally surprised. There have been plenty of things over the years that were superseded by something better, and the predecessors were forgotten about. Trade secrets often include keeping very little documentation about them, so they won't be accidentally released (or stolen).

        I've worked with quite a few things over the years where a working copy existed, but you couldn't replace it. Sometimes it took virtually clean-room reverse engineering to replace it. Sometimes it was hardware. Sometimes it was some application (like a server-side app) that had been compiled once a decade ago, and no one had a clue where the source was. Well, they *did*, it was just "oh, well, Bob did it on his computer. He left 8 years ago. We don't know where to find him", and his desktop had been reformatted, handed down through several other people, and finally retired to the trash a few years before.

        For stuff like vacuum tubes, and ancient apps that don't run on modern hardware, it's usually worth doing it fresh. For things like classified materials that keep nukes from spontaneously exploding, that knowledge can be virtually irreplaceable. There are various lost arts where, if we had to fall back to raw materials, we'd be screwed. I like using the loss of civilization and modern technology as an example. If you, I, and a few thousand skilled Slashdotters were dropped on a deserted island (or an alien but Earth-like planet), how long would it take for us to raise technology from nothing to build a working computer? Assuming an abundance of easily identifiable raw materials, like "hey, that's iron ore", maybe we could build a forge and make some decent hand tools. We might be able to build primitive radios, but I doubt we'd ever get far enough to fab the first solid-state chip, much less a CRT or LCD screen, before the first generation died off of old age. The second generation might have a clue, but no true memory of the technology in working form.

        Consider other "high tech" items. I love the car analogies, so we'll go in that direction. :) Could we build a car? I know quite a few of us know a lot, but could we actually design and fabricate one from scratch, even with another car as a template? Sure as hell, in such a scenario we'd never build the first ECM. Could we get as far as a primitive carburetor? Who am I kidding, we'd probably get stuck on something like how to make tires. :) That's assuming no meeting-hungry managers or aspiring politicians were in the aforementioned group. If there were, we'd end up stuck in endless meetings forming committees to discuss every feature, and we wouldn't get the first item put down on paper, because how to make paper and a pencil would be stuck in committee for the rest of our lives.

        Those "lives" might be rather short anyway. How many of us are hunter-gatherers and farmers? Not enough to start from scratch.

        Kinda ruins those fun aspirations of colonizing alien planets, doesn't it? 1,000 space nerds fly to and are dropped off on a habitable planet circling one of the closest stars, just to die of starvation before establishing a society capable of sustaining itself, because we don't have the skills to bring our technology up to "modern" levels.

  • ZX Spectrum ULA (Score:4, Interesting)

    by Alioth ( 221270 ) <no@spam> on Friday January 07, 2011 @05:06AM (#34788754) Journal

    The same thing has been done to the Sinclair ZX Spectrum ULA.

    Little remains of the original technical documentation for the Ferranti ULA (used in the ZX Spectrum as the 5C- and 6C- series; the same ULA technology was used in the BBC Microcomputer and other European machines during the 1980s). At the time, Ferranti was the leader in custom logic, until they rested on their laurels and let other companies transition to CMOS and field-programmable logic and eat their lunch.

    The guy who did the reverse engineering (Chris Smith, a friend of mine) has written a book about the Spectrum ULA, which details the Ferranti ULA and how it was used in the Spectrum. It has quite a lot on the Ferranti ULAs, including how they were made and the process for fabricating the IC, as well as the particular implementation in the Spectrum. This was also done by de-encapsulation (which I think involves fuming nitric acid, rather than H2SO4).

    Chris's work can be found here (if you want to buy the book, he gets more of the money if you order it through his website rather than Amazon, but it's also available under the GFDL):

    http://www.zxdesign.info/book/ [zxdesign.info]

    I think it's very important that this kind of thing is preserved. The early personal computers are a bit like the early Industrial Revolution textile machinery, but fortunately they can be preserved at an individual level rather than needing a huge, rich museum.
