Hardware Hacking Technology

Developer Creates DIY 8-Bit CPU 187

Posted by Soulskill
from the now-that's-impressive dept.
MaizeMan writes "Not for the easily distracted: a Belmont software developer's hand-built CPU was featured in Wired recently. Starting with a $50 wire wrap board, Steve Chamberlin built his CPU with 1253 pieces of wire, each wire wrapped by hand at both ends. Chamberlin salvaged parts from '70s and '80s era computers, and the final result is an 8-bit processor with keyboard input, a USB connection, and VGA graphical output. More details are available on the developer's blog."
This discussion has been archived. No new comments can be posted.

Developer Creates DIY 8-Bit CPU

  • by physicsphairy (720718) on Saturday May 30, 2009 @08:17AM (#28148657) Homepage

    I own a two-bit computer. My dad gave it to me. I know it is two bits because before he gave it to me he would often remark "I hate this ******* two bit computer."

    (Yes, it is also reproductive.)

    • "Windows: A thirty-two bit extension and graphical shell to a sixteen bit patch to an eight bit operating system originally coded for a four bit microprocessor which was written by a two-bit company that can't stand one bit of competition."
  • Sorry, I just had to ask.

    • Crysis, OTOH, is fine
    • Re: (Score:3, Informative)

      by dunkelfalke (91624)

      Not a single version of Windows was designed to run on an 8-bit CPU.

    • by Anonymous Coward on Saturday May 30, 2009 @08:40AM (#28148725)

      Why does this question keep coming up? Of course it does. It's Turing-complete, so it's just a matter of writing the software. Not for impatient users, obviously.

      • by morgan_greywolf (835522) on Saturday May 30, 2009 @09:05AM (#28148825) Homepage Journal

        Well, your response applies slightly more to Linux (one would just have to implement a Linux kernel on the 8-bit CPU, which isn't likely to happen anytime soon) and doesn't really apply to Vista at all. Microsoft would have to implement Vista, and unless there is sufficient market demand for this 8-bit CPU, they'll never do it, since the incentive for them to write an 8-bit Vista is approximately zero.

        While it may be possible to write a Linux kernel for an 8-bit processor, this, too, is not likely, at least not a complete Linux kernel. Linux was pretty much designed and written from the ground up on a 32-bit processor with built-in low-level support for multitasking.

        So, IOW, while you are theoretically correct, from a practical standpoint implementing Vista or Linux or any other modern OS, with the exception of FreeDOS, is virtually impossible. Hence, the GP's joke retains its original humor.

        • Re: (Score:3, Insightful)

          by Anonymous Coward

          Any Turing machine can emulate any other. An 8-bit CPU can do a 128-bit calculation; it just has to do it in small steps. You would probably need a lot of disk space, though.

          For example, you store on disk the input and output of every transistor in a Core 2 Duo, then iterate through each transistor, setting its output according to its inputs. It might take a billion clock cycles to emulate one Core 2 Duo clock cycle, but it's possible.
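          A scaled-down sketch of that brute-force approach: hold every gate's state in a table and re-evaluate until nothing changes. The netlist here is a toy NAND-only half-adder, invented for illustration (nothing like a real Core 2 Duo netlist):

```python
# Minimal gate-level simulator: the "store every transistor's state and
# iterate" idea, scaled down to a NAND-only half-adder netlist.
# The netlist format and net names are made up for this sketch.

def nand(a, b):
    return 1 - (a & b)

# Each entry: output net <- NAND(input net, input net)
HALF_ADDER = {
    "n1":    ("a", "b"),
    "n2":    ("a", "n1"),
    "n3":    ("b", "n1"),
    "sum":   ("n2", "n3"),
    "carry": ("n1", "n1"),   # NOT(n1) == a AND b
}

def settle(nets, netlist):
    """Re-evaluate every gate until no net changes (one 'big CPU' clock tick)."""
    changed = True
    while changed:
        changed = False
        for out, (i1, i2) in netlist.items():
            v = nand(nets.get(i1, 0), nets.get(i2, 0))
            if nets.get(out) != v:
                nets[out] = v
                changed = True
    return nets

nets = settle({"a": 1, "b": 1}, HALF_ADDER)
print(nets["sum"], nets["carry"])  # 1 + 1 -> sum 0, carry 1
```

          Scaled up to hundreds of millions of gates, each emulated clock tick means many full passes over the table, which is where the "billion cycles per cycle" estimate comes from.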

      • Well, that and another lifetime or so hand-crafting 1GB+ of memory chips, yeah.

      • by thethibs (882667)

        Turing-complete is not enough if you have a user interface.

        • by jlarocco (851450)

          Sure it is. Input and output is nothing more than reading and writing to the proper memory locations.
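          A toy sketch of that idea, memory-mapped I/O: storing to one special address drives a device instead of RAM. The UART address and the device model here are invented for illustration:

```python
# Toy memory-mapped I/O: a "UART" transmit register lives at a fixed
# address, and writing to that address produces console output instead
# of a RAM store. Address and device model are hypothetical.

UART_TX = 0xFF00  # invented transmit-register address

class Bus:
    def __init__(self, size=0x10000):
        self.ram = bytearray(size)
        self.console = []          # what the "terminal" has printed

    def write(self, addr, value):
        if addr == UART_TX:        # a store to this address drives the device
            self.console.append(chr(value))
        else:
            self.ram[addr] = value

    def read(self, addr):
        return self.ram[addr]

bus = Bus()
for ch in b"HI":
    bus.write(UART_TX, ch)
print("".join(bus.console))  # HI
```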

      • I tried writing, in Z-80 assembler, a virtualization of a 32-bit system meeting Microsoft's specifications for Vista.

        Unfortunately, it couldn't actually run Vista, because Microsoft lowballed the requirements.

  • After all, it has to be better than MS-DOS!

  • by Bentov (993323)
    Not one "but does it run Linux?" I'm sooooo disappointed...
  • If only I had a time machine and a highly extended lifespan, so I could spare a couple thousand hours on it...

    -jcr

  • by Dachannien (617929) on Saturday May 30, 2009 @08:46AM (#28148761)

    built his CPU with 1253 pieces of wire

    Farnsworth: Let me show you around. That's my lab table, and this is my work stool, and over there is my intergalactic spaceship. And here is where I keep assorted lengths of wire.
    Fry: Whoa! A real live space ship!
    Farnsworth: I designed it myself. Let me show you some of the different lengths of wire I used.

  • by thomasdz (178114) on Saturday May 30, 2009 @08:49AM (#28148769)

    Yeah, baby...
    Back before the days of the 4004, 8008, and 8080, when we built computers, we REALLY built computers.
    None of this: take a pre-built-motherboard, add a pre-built-power-supply, add a pre-built graphics card...

    oblig: get off my lawn

  • by mgblst (80109) on Saturday May 30, 2009 @08:58AM (#28148795) Homepage

    It's about time Intel had some competition.

  • by Sponge Bath (413667) on Saturday May 30, 2009 @09:11AM (#28148847)

    All the board-level products I designed in the early '90s had to be prototyped with wire wrapping. Even if you are careful, by the time you do hundreds of connections it is almost inevitable that there is some flaw. You might miscount a row of pins and attach to the wrong pin. The process of layering multiple connections onto a single pin might damage a wire at the bottom. Wires might break, or make a shaky connection that comes and goes.

    I would never want to go back to that, but it did two useful things. The plodding physical process of "I'm now connecting this to that" forced a slow, comprehensive walk-through of your design, which can reveal design mistakes. The other was honing your debugging skills on intermittent problems: "Is this a design flaw, or a wire making poor contact?"

    • by Animats (122034)

      Wire wrapping isn't that big a deal. I still have a wire wrap gun, although I haven't used it in years. You buy precut wire in standard lengths, and it comes with the right amount of insulation pre-stripped at each end. You have a wire list (A-15 to C-42, etc.), and you just follow it. It's usual to sort the wire list by length, so that you do the longest wires first. It's time-consuming, but not difficult. Far easier than ordinary hand-wiring.

      Fully automatic wire wrap machines

  • by Elledan (582730) on Saturday May 30, 2009 @09:23AM (#28148889) Homepage
    Magic-1 [homebrewcpu.com], a 16-bit TTL-based, wire-wrap PCB computer.

    Slashdot posted an article on Magic-1 when it was completed years ago as well.
  • I remember a grad class that went into circuit design and register-transfer language. I was amazed at how opcodes could be made to twiddle bits. We just had a software simulator to try our designs -- I admire this guy for building it. Cool!
  • Fantastic! (Score:5, Interesting)

    by adosch (1397357) on Saturday May 30, 2009 @09:37AM (#28148953)
    There need to be more Steve Chamberlins in the world. Personal (or enterprise, for that matter) computing hardware has hit a mass-exploitation mark; computers today have such an abundance of resources, storage, and processing power that any developer I've worked with in the last half-decade sees the computer, much as Steve mentioned in TFA, as "...like black boxes... and understand what they do, but not how they do it," which leads to blatant disregard for anything, really sloppy ways of coding and development, zero ideology or best practice on how to truly harness and control resources efficiently. I don't expect anyone to have a physics background or be some die-hard electrical engineer, but there's definitely something to be said for growing up and working with early computer models where you had to give two shakes about that stuff. This is very cool, indeed.
    • Re:Fantastic! (Score:4, Insightful)

      by Vellmont (569020) on Saturday May 30, 2009 @11:05AM (#28149481)


      which leads to blatant disregard for anything, really sloppy ways of coding and development, zero ideology or best practice on how to truly harness and control resources efficiently.

      I see lots of sloppy coding practices, but they have almost nothing to do with hardware efficiency (think bugginess and maintainability). Unless you're writing for specialized environments, the days of worrying about a few cycles of CPU or a few kilobytes of memory are over (and good riddance).

      The world has changed and the challenges have changed. It's great to design your own CPU, and I'm sure he learned an enormous amount. But to pretend that we should all be thinking like it's still 1979 is absurd.

    • One where you had to give a rat's arse about the ambient temperature to set the fuel mix, where you had to hand-crank to get an idea of what compression really means. Real manly brakes: none of this ABS crap, brakes that required you to anticipate, as it would take a while to bring several tons of metal to a full stop. Etc., etc. Personally, I love the modern age, where we can concentrate on what we want to achieve without being limited by the tools.
      • by smoker2 (750216)
        While all you said is fair enough, sometimes not knowing the basics can get you in trouble. When steering wasn't so positive, or your brakes weren't as efficient, you learned how to handle several tons of metal. So when you threw it into a corner, you could feel the car lean into the corner, and you knew instinctively when it was likely to break away from you. These days, you just point and shoot, and the result is lots of cars hitting each other or leaving the road.

        Plus the ease of use of modern vehicles
  • newsworthy? (Score:4, Informative)

    by SolusSD (680489) on Saturday May 30, 2009 @10:11AM (#28149141) Homepage
    This sounds like the kind of project any computer engineering undergrad curriculum would cover. I myself had to design/build four different processors of varying complexity (basic MIPS, pipelined, superscalar, etc.) during my years as an undergrad. It's cool nonetheless, and by no means "easy".
    • Did you really build them in real, physical wire wrap, or just in software? Neither one is easy, but I think this story is about the lengths the guy went to in actually doing it with wire wrap.
  • by WindBourne (631190) on Saturday May 30, 2009 @10:22AM (#28149191) Journal
    Sounds like the right mag for this project, though Make is the more appropriate one.
  • Back in the '80s there was a rumor of a Russian version of the Apple II with the CPU implemented in wire-wrap, because they couldn't get actual 6502s in Russia.

  • Just look at all the embroidery [4004.com] in a 4-bit CPU.

  • Some time ago (nearly 10 years - wow!) I made a microcontroller-based homebrew MP3 player: http://codepuppies.com/~ben/sens/pic/mp3 [codepuppies.com] . My big mess of wires was a tiny fraction of the size of his, and it caused me enough headaches - tracking down signal noise, random glitches, etc... Hats off to this guy.

    I'd also recommend this book: http://en.wikipedia.org/wiki/The_Soul_of_a_New_Machine [wikipedia.org] to anyone who finds the Wired article of interest. It doesn't get too technical, but it describes the trials and tribulations.

  • I can't seem to find a parts list, but I wonder if he used a 74181 arithmetic logic unit and a 74182 carry lookahead generator. That would provide quite a few of the arithmetic and logical instructions one might expect in the 8-bit CPUs common in the 1980s, in just three packages (the 74181 is a 4-bit ALU, but it can be cascaded, and the 74182 can provide carry lookahead for up to four 74181s). However, you'll also need an instruction decoder, and that takes quite a few more chips unless you use programmable logic.
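    The cascading idea can be sketched in miniature: two 4-bit slices chained by carry to form an 8-bit add. This models only the 74181's ADD function with simple ripple carry; a real 74181 implements many more functions, and a 74182 would replace the ripple with lookahead:

```python
# Two 4-bit ALU slices cascaded into an 8-bit adder, the way two 74181s
# can be chained. Only the ADD function is modeled, with ripple carry.

def alu4_add(a, b, carry_in):
    """One 4-bit slice: returns (4-bit sum, carry out)."""
    total = (a & 0xF) + (b & 0xF) + carry_in
    return total & 0xF, total >> 4

def add8(a, b):
    """Chain two slices: low nibble's carry out feeds the high nibble."""
    lo, c = alu4_add(a & 0xF, b & 0xF, 0)
    hi, c = alu4_add(a >> 4, b >> 4, c)
    return (hi << 4) | lo, c

print(hex(add8(0x7F, 0x01)[0]))  # 0x80
```

    With ripple carry, the high slice must wait for the low slice's carry; the 74182's trick is to compute that carry directly from both slices' generate/propagate outputs instead.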

    • Yep, the 181 shows up in the layout diagram on his site [stevechamberlin.com]. I'm a little surprised, actually -- if you compromise enough to use GALs (programmable logic), it's kind of pointless to brag about being "TTL-based". If I build something out of a fistful of PICs, and hook them together with TTL-level signals, can I claim the same thing?

      I built a "video card" for a TRS-80 as a hobby project about 25 years ago -- 50 or so chips, 8 or 10 solderless breadboards, and a baking sheet for support/ground-plane. 30 rows, 1

  • The LCD panel on the front of the case [stevechamberlin.com] probably has more processing power in its internal controller than the rest of this project, heh.
    • The LCD panel on the front of the case probably has more processing power in its internal controller than the rest of this project, heh.

      Not to mention that despite the back lighting, the LCD panel probably uses but 1/100th of the power consumed by all the discrete logic.

  • Out of curiosity, how many FLOPS (floating-point operations per second) does this processor do? I know it's probably not a lot, but I am still really curious. If it doesn't do floating-point operations, then I am still wondering how many integer operations it does. In fact, any rough indicator of speed would be nice!
