
The Real Story of Hacking Together the Commodore C128 179

szczys writes "Bil Herd was the designer and hardware lead for the Commodore C128. He reminisces about the herculean effort his team took on in order to bring the hardware to market in just five months. At the time the company had the resources to roll their own silicon (that's right, custom chips!) but this also meant that for three of those five months they didn't actually have the integrated circuits the computer was based on."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Mind blowing (Score:5, Informative)

    by 50000BTU_barbecue ( 588132 ) on Monday December 09, 2013 @04:00PM (#45642635) Journal
    It's really cool to hear about this stuff. It's just sad to realize that the 128 was a terrible idea and Commodore spread itself too thin making all kinds of bizarre 8-bit computers around that time instead of making a true successor to the C64. The C65 should have been what made it to market, not the weird 128 with its obsolete-the-day-it-left-the-factory CP/M mode running at half the speed of its competitors.

    The people I knew with 128s back then all used the 64 mode but used the 128 as an excuse to buy a better monitor. I never knew anyone using the CP/M mode.

  • One grain of salt (Score:2, Informative)

    by Anonymous Coward on Monday December 09, 2013 @04:14PM (#45642781)

    From the Article: "Commodore C-128, the last mass production 8 bit computer and first home computer with 40 and 80 column displays"

    C-128 was in 1985, the Acorn BBC had 20, 40 & 80 column modes (and a teletext mode) in 1981.

  • by Anonymous Coward on Monday December 09, 2013 @04:14PM (#45642783)

    Bil will be teaching a class at the Vintage Computer Festival East next spring. He also lectured about the 128 and Commodore repair at the same event in 2012. Details are on

  • Re:Mistake (Score:5, Informative)

    by nedlohs ( 1335013 ) on Monday December 09, 2013 @04:48PM (#45643149)

    Want to do a crazy program you can't write on modern computers?


    Yeah, can't is a blatant lie.

    Yeah, that's trivial to do on a modern computer too. A trivial loadable kernel module in Linux could do it, for example.

    Simply loop, poking a random value at a random address each time, and incrementing a number that you print.


    That is what it says: write a random value to a random memory location in a loop.

    Every time, the system will do different things.

    What ?

    Of course it will. Sometimes your random memory location will be the memory mapped to the screen and a character will show up. Sometimes you'll change a return address on the stack and run some random code.

    If you did this on a modern computer, eventually it'd corrupt system files and the thing wouldn't boot.


    That's true, eventually you'll write over some file data just before it is flushed to disk and trash a file required for booting. Or you'll screw with memory the file system is using and mess it up on the next write (though given the use of checksums that's pretty unlikely). The key word is eventually, since you'll have to run it a *lot* of times before it does something like that rather than just crashing itself.

    And of course not when running as a normal user process.

    It makes you wonder why modern OSes aren't hardened with the theory: No matter what the user does, allow the computer to boot up safely next time.

    You're an idiot.

    Yes he is.

    Computers that have the OS in ROM unsurprisingly aren't susceptible to making the system unbootable by screwing with boot files. The same is true of a modern computer hardwired to boot off of ROM, though. And of course that makes upgrading the base OS essentially impossible (short of replacing the ROM, or actually using an EEPROM -- and if software can do the upgrade, then the random memory writes could also trigger it and screw up booting).
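The random-poke loop being argued about above can be sketched in a few lines. This is a toy model, not anything from the article: the flat 64 KB address space and the screen window at $0400-$07E7 (the C64's default screen matrix) are the only details borrowed from the real machine, and the `random_pokes` helper is invented for illustration.

```python
import random

# Toy model of an 8-bit machine with a flat 64 KB address space.
# On a real C64 the default screen matrix sits at $0400-$07E7.
MEM_SIZE = 64 * 1024
SCREEN_START, SCREEN_END = 0x0400, 0x07E8

def random_pokes(n, seed=None):
    """POKE a random value into a random address n times, counting
    how many of those writes happen to land in 'screen' memory."""
    rng = random.Random(seed)
    memory = bytearray(MEM_SIZE)
    screen_hits = 0
    for _ in range(n):
        addr = rng.randrange(MEM_SIZE)
        memory[addr] = rng.randrange(256)
        if SCREEN_START <= addr < SCREEN_END:
            screen_hits += 1
    return screen_hits
```

With a 1000-byte screen window in a 64 KB space, roughly 1.5% of random pokes land on screen memory, which is why characters "show up" almost immediately, long before the loop hits anything fatal.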

  • by goombah99 ( 560566 ) on Monday December 09, 2013 @05:08PM (#45643357)

    The 6502 was an amazing processor. The Apple II was also 6502-based. Unlike its near contemporaries, the 8080 and Z80 (and 6800), its instruction set was reduced. It had only one data register (the accumulator A), two 8-bit index registers (X and Y), and fewer complicated ways to branch. Instead it effectively memory-mapped the registers: instructions could take a zero-page address, offset it by X or Y, treat that as an address, and fetch the byte at that location. Because zero-page access took only a couple of cycles, this effectively gave it 256 memory-mapped registers. It also didn't have separate I/O instructions for peripherals, and instead memory-mapped those.

    Nearly every instruction took a microsecond or two. Thus while the clock rate was 1 MHz, it held its own against a 4 MHz 8080-series chip, since those could take many cycles to do one instruction. Few memory chips (mainly static memory) could keep pace with that clock rate, so the memory would inject wait states that further slowed the instruction time. The 6502's leisurely microsecond timing was well matched to memory speeds. Moreover, on the 6502 only half the clock cycle was used for the memory fetch. This left the other half free for other things to access memory on a regular basis.

    The regularity of that free memory-access period was super important. It meant you could do two things. First, you could piggyback the video memory access onto that period. On 8080-series machines that used main memory for video you could often see glitches on the display that would happen when the video access was overridden by the CPU access at irregular clock cycles. As a result most 8080-series video systems used a dedicated video card like a CGA or EGA. Hence we had all these ugly character-based graphics with slow video access by I/O in the Intel computer world. In the 6502 world, we had main-memory-mapped graphics. This is why the C64/Amiga/Apple were so much better at games.

    This regular clock rate on main memory had a wonderful side effect. It meant you could use dynamic memory, which was cheaper, denser, and MUCH MUCH lower power than static memory. With the irregular access patterns of the 8080, refreshing a page of dynamic memory required all sorts of tricky circuitry that tried to opportunistically find bus idle times to increment the dynamic refresh address, occasionally having to halt the CPU to do an emergency refresh cycle before the millisecond window of memory lifetime expired. As a result, the 8080-series computers like Cromemco, IMSAI, Altair and North Star all had whopping power supplies and big boxes to supply the cooling and current that static memory needed.

    So the C64s and Apples were much nicer machines. However, they had a reputation of being gaming machines. At the time that didn't mean "high end" like it does now. It meant toys. The big-iron micros were perceived as business machines.

    Oddly, that was exactly backwards. But until VisiCalc, business software tended to be written for the 8080 series.

    I think it was this memory-mapping style, rather than formal I/O lines to dedicated peripheral cards (keyboard decoders, video, etc.), that led Apple to strive for replacing chips with software. They decoded the serial lines in software (rather than using USART chips), they soft-sectored the floppy drives rather than using dedicated controller chips, and so on. And that was what made the Macintosh possible: less hardware to fit in the box, a lower chip count and cost, and lower-power, more efficient power supplies.

    Eventually, however, the megahertz myth made PCs seem like more powerful machines than the 68000 and PowerPC.
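The clock-rate comparison in the parent can be made concrete with some back-of-the-envelope arithmetic. The cycles-per-instruction and wait-state figures below are rough, workload-dependent assumptions of mine, not measurements from the article:

```python
def instructions_per_second(clock_hz, avg_cpi, wait_cycles=0.0):
    """Effective instruction throughput, given an average
    cycles-per-instruction figure and extra wait cycles per
    instruction injected by slow memory."""
    return clock_hz / (avg_cpi + wait_cycles)

# 1 MHz 6502: assume ~3 cycles/instruction and no wait states,
# since its memory timing was matched to the DRAM of the day.
rate_6502 = instructions_per_second(1_000_000, 3)

# 4 MHz Z80 (8080 series): assume ~7 cycles/instruction plus a
# couple of wait cycles per instruction from slower memory.
rate_z80 = instructions_per_second(4_000_000, 7, wait_cycles=2)
```

Under these assumed numbers the Z80's nominal 4x clock advantage shrinks to well under 2x in delivered instructions per second; the exact ratio depends heavily on the workload and the memory fitted.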

  • Re:Mind blowing (Score:5, Informative)

    by 50000BTU_barbecue ( 588132 ) on Monday December 09, 2013 @05:24PM (#45643521) Journal
    Commodore didn't design the Amiga, they bought it.
  • by wcrowe ( 94389 ) on Monday December 09, 2013 @05:39PM (#45643717)

    Just wanted to say that's a great story about your dad.

  • Re:U.S. Navy? (Score:4, Informative)

    by wcrowe ( 94389 ) on Monday December 09, 2013 @06:04PM (#45644105)

    Well, you know, there was other stuff going on, as her father and step mother were out of town that winter, but this is Slashdot, not Penthouse forum, so...

    Hey, I eventually married the girl. ;-)

  • by Wookie Monster ( 605020 ) on Tuesday December 10, 2013 @12:10AM (#45647417)
    Several times in the article, he mentions that the C128 was the last 8-bit computer to be designed. This isn't true -- a year later, Tandy announced the CoCo3, also with 128KB and capable of 80 column text display. It didn't run CP/M, but instead it ran Microware OS-9.
