Hardware

The Real Story of Hacking Together the Commodore C128

szczys writes "Bil Herd was the designer and hardware lead for the Commodore C128. He reminisces about the Herculean effort his team took on to bring the hardware to market in just five months. At the time, the company had the resources to roll their own silicon (that's right, custom chips!), but this also meant that for three of those five months they didn't actually have the integrated circuits the computer was based on."

  • Mind blowing (Score:5, Informative)

    by 50000BTU_barbecue ( 588132 ) on Monday December 09, 2013 @03:00PM (#45642635) Journal
    It's really cool to hear about this stuff. It's just sad to realize that the 128 was a terrible idea and Commodore spread itself too thin making all kinds of bizarre 8-bit computers around that time instead of making a true successor to the C64. The C65 should have been what made it to market, not the weird 128 with its obsolete-the-day-it-left-the-factory CP/M mode running at half the speed of its competitors.

    The people I knew with 128s back then all used the 64 mode but used the 128 as an excuse to buy a better monitor. I never knew anyone using the CP/M mode.

    • Re:Mind blowing (Score:5, Interesting)

      by Webcommando ( 755831 ) on Monday December 09, 2013 @03:19PM (#45642839) Homepage Journal

      I went from a VIC-20 to a C128 instead of a C64. I was amazed that I could use CP/M and a very advanced BASIC. The power of this machine enabled me and a good friend to build a robot in college made of nothing but old car parts, DC motors, relays, and plates with holes drilled in them for encoders. That directly led to my first job as an automation engineer.

      The C128 also was the last computer that fueled my dreams. I went to college to become a computer engineer so I could build what I called the "compatibility machine". This machine could execute all the major 8-bit computer software (they all had Z80s or 6502s) without the user intervening or worrying about what version of software they purchased. The C128 showed me it could be possible!

      By the time I left school the writing was on the wall that Mac / IBM style PCs would rule the world. It didn't stop me from getting an Amiga, but it was pretty clear that CBM was on the way out.

      • Ah good point, BASIC V7 was far better than 2.0. Did you use the user port or make a custom expansion cartridge? The closest I got to robotics back then was the Radio Electronics interface board to the Armatron... I never got the Armatron...
        • The closest I got to robotics back then was the Radio Electronics interface board to the Armatron... I never got the Armatron...

          We used the user port to drive a board with 5-volt relays that, in turn, were used to turn the DC motors (re-purposed windshield wiper motors) on and off. For input, I used the joystick ports since BASIC 7 had features to react to button presses, etc., and all the I/O was essentially just switches. I could POKE the gripper motor on and have the system react when the gripper-closed "fire button" was hit before turning it off.

          Reading and reacting to the encoders required a machine language routine to keep u
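
          (For illustration, a minimal BASIC 7.0 sketch of the user-port/relay/joystick scheme described above. Locations 56577 and 56579 are CIA #2's port B data and data-direction registers, which carry the user port data lines; putting the gripper relay on bit 0, and wiring the gripper-closed switch as the port 1 fire button, are purely illustrative assumptions.)

          10 REM ILLUSTRATIVE ONLY: GRIPPER RELAY ASSUMED ON USER PORT BIT 0
          20 POKE 56579,255 : REM CIA#2 DDR B - SET ALL USER PORT DATA LINES TO OUTPUT
          30 POKE 56577,1   : REM RAISE BIT 0 - ENERGIZE RELAY, GRIPPER MOTOR ON
          40 IF JOY(1)<128 THEN 40 : REM WAIT FOR "FIRE" (GRIPPER-CLOSED SWITCH ON PORT 1)
          50 POKE 56577,0   : REM DROP THE LINE - MOTOR OFF
          60 REM ON BASIC 2.0 YOU'D PEEK(56321) AND TEST BIT 4 INSTEAD OF JOY(1)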

          • Hey that does sound pretty cool. Reading joysticks in BASIC V2 was crappy and not very fast. You don't have any pictures?
            • You don't have any pictures?

              Someplace in a deep dark place with the rest of my college material there lives a picture. This thread makes me want to go dig around again! Probably sitting next to a banner saying "Commodore 128" made with Print Shop on a Citizen 120D and some really cool drawings I made on GEOS geoPaint.

              I remember we named it MAXX but I cannot remember why anymore.

              I have a soft spot for the 8-bit machines and actually have a small collection: Osborne Executive, Atari 800, Atari 400, Commodore Plus/4, several C64, C128, VI

        • OMG. You just reminded me about my first (sort of) "robot" -- I connected an Erector Set motor's power lugs to the switched power traces on the cassette interface of my c64 using alligator clips, and attached a weak rubber band to pull it back. It was utterly useless, and did nothing besides pivot a rod back and forth, but it WAS technically a crude robot capable of moving atoms via software ;-)

          Thank ${deity} I didn't fry the cassette port. That would have really sucked, and it's the kind of thing that does
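
          (For anyone curious how that software-switched cassette line can be driven: on a stock C64, bit 5 of location 1 controls the cassette motor line, and the KERNAL's interrupt handler will switch it back off unless the motor interlock byte at 192 is set. A rough, from-memory BASIC sketch, so treat the details as approximate:)

          10 POKE 192,1             : REM MOTOR INTERLOCK SO THE KERNAL IRQ LEAVES IT ALONE
          20 POKE 1,PEEK(1) AND 223 : REM CLEAR BIT 5 OF LOCATION 1 - CASSETTE MOTOR LINE ON
          30 FOR T=1 TO 2000:NEXT   : REM CRUDE DELAY WHILE THE ERECTOR-SET ARM PIVOTS
          40 POKE 1,PEEK(1) OR 32   : REM SET BIT 5 - MOTOR OFF, RUBBER BAND PULLS IT BACK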

    • by goombah99 ( 560566 ) on Monday December 09, 2013 @04:08PM (#45643357)

      The 6502 was an amazing processor. The Apple II was also a 6502. Unlike its near contemporaries, the 8086 and Z-80 (and 6800), the instruction set was reduced. It had only 2 data registers (A, B) and two 8-bit address registers (X, Y) and fewer complicated ways to branch. Instead it effectively memory-mapped the registers by using instructions like: offset Y by A, treat that as an address, and get the byte at that location. Because it could do all that in one clock cycle, this effectively gave it 256 memory-mapped registers. It also didn't have separate input lines for peripherals, and instead memory-mapped those.

      Nearly every instruction took a microsecond. Thus while the clock rate was 1 MHz, it was much faster than a 4 MHz 8080-series chip, since those could take multiple cycles to do one instruction. Few memory chips (mainly static memory) could keep pace with that clock rate, so the memory would inject wait states that further slowed the instruction time. The 6502's leisurely microsecond time was well matched to memory speeds. Moreover, on the 6502 only half the clock cycle was used for the memory fetch. This left the other half free for other things to access memory on a regular basis.

      The regularity of that free memory access period was super important. It meant you could do two things. First, you could piggyback the video memory access onto that period. On the 8080s using main memory you could often see glitches on video displays that would happen when the video access was overridden by the CPU access at irregular clock cycles. As a result most 8080-series-based video systems used a dedicated video card like a CGA or EGA. Hence we had all these ugly character-based graphics with slow video access by I/O in the Intel computer world. In the 6502 world, we had main-memory-mapped graphics. This is why the C64/Amiga/Apple were so much better at games.

      This regular clock rate on the main memory had a wonderful side effect. It meant you could use dynamic memory, which was faster, cheaper, denser, and MUCH MUCH lower power than static memory. With the irregular access rates of the 8080, refreshing a page of dynamic memory required all sorts of tricky circuitry that tried to opportunistically find bus idle times to increment the dynamic refresh address, occasionally having to halt the CPU to do an emergency refresh cycle before the millisecond window of memory lifetime expired. As a result, the 8080-series computers like Cromemco, IMSAI, Altair and NorthStar all had whopper power supplies and big boxes to supply the cooling and current the static memory needed.

      So the C64s and Apples were much nicer machines. However they had a reputation of being gaming machines. At the time that didn't mean "high end" like it does now. It meant toys. The big-iron micros were perceived as business machines.

      Oddly that was exactly backwards. But until VisiCalc, the business software tended to be written for the 8080 series.

      I think it was this memory mapping style, rather than formal I/O lines to dedicated cards for peripherals (keyboard decoders, video, etc.), that led Apple to strive for replacing chips with software. They software-decoded the serial lines (rather than using USART chips), they soft-sectored the floppy drives rather than using dedicated controller chips, etc. And that was what made the Macintosh possible: less hardware to fit in the box, a lower-cost chip count, and lower-power, more efficient power supplies.

      Eventually, however, the megahertz myth made the PCs seem like more powerful machines than the 68000 and PowerPC.

      • by PhantomHarlock ( 189617 ) on Monday December 09, 2013 @04:16PM (#45643455)

        And as a descendant of that, it was amazing what the Amiga did with the 68000 and its custom graphics and sound chips, as you mention at the very end. You never saw smooth scrolling and sprite movement on a PC. The Amiga and the C=64 both had arcade-quality graphics locked to the 60 Hz interlaced or 1/2-vertical-res (single field) refresh rate of a standard NTSC television signal. Since the whole thing was timed to that frequency, you never got tearing. The only downside was interlace flicker without a frame doubler, but not a lot of applications used interlaced mode.

        • you never saw smooth scrolling and sprite movement on a PC.

          This was true of CGA, but after EGA and VGA became popular, John Carmack figured out how to use these newer cards' scroll registers and built Commander Keen in 1990.

      • by tlhIngan ( 30335 )

        The 6502 was an amazing processor. The Apple II was also a 6502. Unlike its near contemporaries, the 8086 and Z-80 (and 6800), the instruction set was reduced.

        The reason for the popularity of the 6502 came down to one factor - cost. An 8086, 68000, Z80, etc., would've run you about $200 or so, while MOS was selling the 6502 for... $20. And you got a databook too.

        The 6800 from Motorola was supposed to be the "cheap" chip (compared to the 68000), but it was still pricey - enough so that a bunch of Motorola

      • The 6502 was an amazing processor. The Apple II was also a 6502. Unlike its near contemporaries, the 8086 and Z-80 (and 6800), the instruction set was reduced. It had only 2 data registers (A, B) and two 8-bit address registers (X, Y) and fewer complicated ways to branch. Instead it effectively memory-mapped the registers by using instructions like: offset Y by A, treat that as an address, and get the byte at that location. Because it could do all that in one clock cycle, this effectively gave it 256 memory-mapped registers. It also didn't have separate input lines for peripherals, and instead memory-mapped those.

        Actually the 6502 only had one accumulator, the A register. The 6809 had A and B. It is correct that the 6502 had very nice addressing modes. Zero page addresses acted more like machine registers. One commonly used addressing mode was z-page indirect indexed by Y. Two consecutive locations on z-page acted like a 16-bit pointer register. Either that could be incremented OR Y could be incremented. So a block move of a 256-byte page was easy.
        I don't think I *ever* used ($23,X) where X selects the z-page l
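
        (To make that concrete, here is a hypothetical C64-style BASIC loader for the ten-byte copy loop being described: LDY #0 / LDA ($FB),Y / STA ($FD),Y / INY / BNE / RTS, parked in the cassette buffer at 828. The source and destination pages are arbitrary choices for illustration.)

        10 FOR I=0 TO 9:READ B:POKE 828+I,B:NEXT : REM POKE THE ML COPY LOOP INTO 828 ($033C)
        20 POKE 251,0:POKE 252,4   : REM $FB/$FC POINT AT THE SOURCE PAGE ($0400, THE SCREEN)
        30 POKE 253,0:POKE 254,192 : REM $FD/$FE POINT AT THE DESTINATION PAGE ($C000, FREE RAM)
        40 SYS 828                 : REM COPY ALL 256 BYTES OF THE PAGE
        50 DATA 160,0,177,251,145,253,200,208,249,96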

      • What's kind of sad is that technically, VGA *did* have some of the same low-level capabilities as the C64 (besides sprites, obviously). At least, if you had a VRAM-based card like the ET4000. They just weren't supported by the BIOS, so they were (almost) never used in commercial software. You had to know how the video subsystem was wired together, where the various control registers were mapped, and bitbang them directly by hijacking system timers and dead reckoning.

        One of the more hardcore examples I remem

      • by ewhac ( 5844 ) on Monday December 09, 2013 @07:37PM (#45645857) Homepage Journal

        I don't have time to correct all the errors in the parent post. So very briefly:

        • The 6502 had three 8-bit registers: A, X, and Y. A was the accumulator, and received the result of all arithmetic operations. X and Y could hold temporary data, arithmetic operands, and be used as index registers for memory load/store. There was also an 8-bit stack pointer register, SP; the stack itself was hard-mapped to the range 0x0100 - 0x01FF.
        • The 8080 had the 8-bit registers A, B, C, D, E, H, L, and a 16-bit stack pointer. In addition, the registers B & C, D & E, and H & L could be used to hold 16-bit quantities for some instructions.
        • The Z80 had all the registers of the 8080, plus a shadow copy of the registers for quick use by interrupt service routines.
        • The 6502's zero page (0x0000 - 0x00FF) got special treatment by the CPU, using only a single byte to address a location. As such, zero page usually got treated by software as a pile of "slow registers."
        • No instruction on the 6502 executed in fewer than two clock cycles. The fastest 6502 I ever saw was 2 MHz.
        • By contrast, 4 MHz Z80 chips were widespread.
        • The Z80 helped popularize dynamic RAMs by containing a very basic DRAM refresh counter. The 6502 had no such thing; DRAM refresh was usually provided by custom logic, usually part of the video controller.
        • S-100 machines had huge power supplies because they had huge numbers of slots (eight or more being common), and had to have enough reserve power for all of them.
        • There was nothing special about the 6502's memory access patterns, and 6502 would get starved out like any other CPU if another device held the bus. On the C-64 in particular, every eight video lines, the VIC would grab the bus for 40 uSecs to fetch the next row of character cells, holding off the 6502 the whole time. This led to all kinds of problems with timing-sensitive operations, and was directly responsible for transfers to/from the 1541 floppy drive to be glacially slow.

        Schwab

    • Re:Mind blowing (Score:5, Interesting)

      by Dogtanian ( 588974 ) on Monday December 09, 2013 @04:24PM (#45643519) Homepage

      The C65 should have been what made it to market, not the weird 128 with its obsolete-the-day-it-left-the-factory CP/M mode running at half the speed of its competitors.

      Whatever the merits or demerits of the two machines is irrelevant; the C128 came out in 1985, whereas the C65 [wikipedia.org] wasn't developed circa 1990-91.

      C64 diehards have an obsession with the C65 and Commodore's perceived mistake in abandoning it, but despite the latter's numerous crap decisions, I'm sorry to say that in this case they were absolutely right.

      The C64 was still selling as a budget option circa 1991 (*), kept viable by sheer momentum. The 16/32-bit Amiga was not only established as the successor, it had already taken over (in Europe, at least) and was already nearing *its* own commercial peak(!)

      Trying to release a (sort of) new 8-bit format by that point, even a very good one, would have made absolutely no sense, flopped horribly and stood on the low-end Amiga models' toes, muddying the waters pointlessly.

      They could have sold it as cheaply as the C64 (i.e. a new machine with high manufacturing costs selling at the same price as a "wring the last profit from an established cash cow" model), but what would the point of that have been?

      The C128 at least came to market when there was still *possibly* a gap in the market for a high-end 8-bit machine between the C64 and the new (but still very expensive) Amiga.

      (*) Apparently C= were still making them when they went bankrupt circa mid-1994(!)

      • Edit: sorry, should read "The C65 wasn't developed until circa 1990-91".
      • Yes, Commodore should have started work on the C65 much earlier instead of spreading out into bizarre orphan architectures like the C16, C116, Plus/4, B128, C264 and all the other useless cruft they came up with.

        Commodore was right to abandon the C65 by 1991. Yes. I think we agree there, I'm just saying C= should have focused earlier, and the C65 would have made more sense in the marketplace in 1986. Granted, it wouldn't have been the 1991 C65, sure.

        But if C= had taken its engineers away from all the useless c

        • Spreading out into bizarre orphan architectures like the C16, C116, Plus/4, B128, C264 and all the other useless cruft they came up with.

          While they (like Tramiel's Atari Corp. did later on) probably did too many overlapping things at once, it's only fair to point out that the apparently pointless introduction of a new, C64-incompatible architecture for the C16, C116 and Plus/4 family did supposedly start out for sensible reasons. According to the WP article [wikipedia.org], Jack Tramiel was paranoid that (as they'd done in many other industries), the Japanese would swoop in and undercut everyone with ultra-cheap consumer-oriented machines. That's why the ch

          • Comment removed based on user account deletion
            • The C16 family was a good idea gone bad. Ideally, they should have released the C16 as a compatible successor to the VIC20

              AFAIK, in the US, the C64 itself had become the de facto successor to the Vic 20 anyway, purely because it was being sold so cheaply there.

              I also understand that this meant C= weren't actually making much money on them, and this is why Tramiel was forced out (i.e. he won the 8-bit computer market, but it was mostly a pyrrhic victory.) But that wasn't the end-buyer's problem...

              At any rate, I think that by the time the C16 came out in late-1984, compatibility with the Vic 20 wouldn't have been that big a

          • (*) Ironically, the Japanese took over the US market another way, by launching the NES, with everyone buying those for gaming instead of home computers.

            The C64's good years are quite noticeable, '84 to '87. From the crash of '84, when people who still wanted to do electronic gaming almost had to jump to 8-bit computers that cost more than a game console, till the ascendancy of the NES. 'Twas Zelda that put the nail in the coffin. It didn't hurt that the NES was cheaper than a C64 system, without the load times, and with mostly better graphics.

            But also, many of those who had C64's during that period only used them for games and only knew enough Commodore BASI

              The C64's good years are quite noticeable, '84 to '87. From the crash of '84, when people who still wanted to do electronic gaming almost had to jump to 8-bit computers that cost more than a game console, till the ascendancy of the NES.

              It lasted quite a long time in the UK - albeit having to share the market with the massive-selling ZX Spectrum. Over here, the NES wasn't particularly successful (at least not compared to the US).

              In fact, the NES was outsold here by the Sega Master System - possibly because that was well-marketed, whereas AFAICT Nintendo didn't much care about Europe - but neither console dominated the UK gaming market, which remained mainly home-computer based during the late 8-bit and early 16-bit eras. It wasn't until the

      • Comment removed based on user account deletion
        • Having said that, had the C128 been a better successor to the C64, then things could have turned out much different. A successful 8-bit series through the late '80s might have eliminated the need to keep a budget entry model in the Amiga lineup. If we had a C256 and C512, the A500+ and A600 might never have been released.

          Honestly? I think that would have been a major mistake.

          Having a vastly improved higher-end 8-bit machine would have been good in the mid-80s, but it would still have been utterly misguided to rely on it as a replacement for a mass-market 16/32-bit machine; they'd have been hammered at the end of the decade as people moved towards true 16-bit models.

          To have an 8-bit machine remotely competitive with what the Amiga 500 was- or even the Atari ST- they'd have ended up having to redesign the whole thing anyw

    • by ackthpt ( 218170 )

      I had a C64 for years and at one time was slaving it to an Apple ][ with a nifty little interface, which I still have in a box somewhere. It was a dream to hack and play games on, despite having a mainframe at work which could do things I could only dream of at home (such as load/save from/to a HDD). My brother bought a 128 but never did anything with it as he wasn't a coder and had no idea what I was doing. Eventually I'd move to an Amiga 500 and then to a 2000 (which I still have.)

    • WordStar! The 1571 floppy can read/write Kaypro-formatted discs, as well as some other CP/M formats, and Commodore's own GCR'd CP/M format. With software the 1571 can read/write practically any 5.25" format out there, including DOS.

      IIRC I've read that CP/M on the 128 was popular with BBS sysops since it was inexpensive.

    • Comment removed based on user account deletion
    • Well, apparently you never used the Plus4, in comparison to which the C128 looked like a nicely crafted supercomputer.

  • One grain of salt (Score:2, Informative)

    by Anonymous Coward

    From the Article: "Commodore C-128, the last mass production 8 bit computer and first home computer with 40 and 80 column displays"

    C-128 was in 1985, the Acorn BBC had 20, 40 & 80 column modes (and a teletext mode) in 1981.

    • C-128 was in 1985, the Acorn BBC had 20, 40 & 80 column modes (and a teletext mode) in 1981.

      Yes, this is correct. Technically, I guess it could depend on how one interprets

      [The] first home computer with 40 and 80 column displays, dual processors, three operating systems, 128k memory via MMU and one heck of a door stop.

      Was the BBC truly a "home computer"? I'd say yes, though it overlapped the educational market too, but one could argue the point.

      And perhaps it could have meant "(40 and 80 column displays) BOOLEAN-AND (dual processors) AND (three operating systems) AND (128k memory via MMU)".

      That said, this is probably overanalysing. The BBC Micro wasn't that successful outside the UK, and the US tech industry (well, the US in general!) tends

    • The thing with the C128 is that you can use both displays at the same time, meaning you can have a 40-column display hooked up AND an 80-column display. Most people used dual-mode monitors, but there was some software where you did some things in 40-column mode and the program displayed special output in 80, or vice versa.
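
      (A tiny BASIC 7.0 illustration of that: GRAPHIC 0 sends text output to the VIC-II 40-column screen and GRAPHIC 5 to the VDC 80-column screen, so with both monitors connected a program can keep a menu on one and its output on the other. The PRINT contents are made up, of course.)

      10 GRAPHIC 0 : REM TEXT OUTPUT TO THE 40-COLUMN (VIC-II) SCREEN
      20 PRINT "MENU / STATUS ON THE 40-COLUMN MONITOR"
      30 GRAPHIC 5 : REM SWITCH TEXT OUTPUT TO THE 80-COLUMN (VDC) SCREEN
      40 PRINT "REPORT OUTPUT HERE, 80 COLUMNS WIDE"
      50 GRAPHIC 0 : REM BACK TO THE 40-COLUMN SIDE - BOTH SCREENS STAY LIVE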

  • by Anonymous Coward on Monday December 09, 2013 @03:14PM (#45642783)

    Bil will be teaching a class at the Vintage Computer Festival East [vintage.org] next spring. He also lectured about the 128 and Commodore repair at the same event in 2012. Details are on c128.com.

  • Hrmmmmm (Score:3, Interesting)

    by Anonymous Coward on Monday December 09, 2013 @03:19PM (#45642841)

    He is claiming a lot of "firsts" that I would swear were in my Apple ][e prior to Winter '85...

    • Yup, as I read it more and more he's claiming some historically dubious things, but now you know how it feels to have history re-written.
    • Which ones? While 80 columns and 128K were options on the //e, the //e didn't come with 128K as default till 1987 with the Platinum //e. That was also the first //e with a numeric keypad by default. The 1571 also has a higher capacity than Apple's 5.25" drives.

      So yes, the C128 did have some features as standard before the //e.

      • The Apple IIc had 128K in 1984. Was the 1571 any faster than the 1541? Most Apple IIe systems were equipped with 2x5.25" drives, so it wasn't that big a deal. Apple did offer the UniDisk 3.5" + interface card for the Apple IIe for 800K worth of storage. The first revision of the Apple IIc had built-in support for the drive as well.
  • A lot of early personal computers have a similar story. Software is often written with breadboarded or nonexistent hardware.

    What is unique about the idea of custom silicon LSI chips for a 1980s PC?

    The original Atari 800 (a design later copied by Commodore for the VIC-20 and Commodore-64 computers) had three custom chips (ANTIC, CTIA, POKEY) which made up the majority of the machine's circuitry when designed in 1978. And the OS and other early programs were written without the benefits of that completed

  • by PhantomHarlock ( 189617 ) on Monday December 09, 2013 @04:00PM (#45643275)

    ...to play Ultima V in dual SID mode.

    After several C=64s and the 128, I moved to the Amiga, which got me into the VFX business thanks to the Video Toaster and Lightwave.

    Looking forward to reading this article. If it's good I'll stash a copy next to my "Rise and Fall of Commodore" book.

  • by phantomfive ( 622387 ) on Monday December 09, 2013 @04:05PM (#45643321) Journal
    You never know what marketing will do to you as an engineer.

    a couple of weeks later the marketing department in a state of delusional denial put out a press release guaranteeing 100% compatibility with the C64. We debated asking them how they (the Marketing Department) were going to accomplish such a lofty goal but instead settled for getting down to work ourselves.

    • It was actually a pretty important selling point of the C128. Keep in mind that I (and many others) had a collection of *hundreds* of C64 games before we bought the 128 (thank you, early DRM crackers). I probably wouldn't have bought one if all it could play was C128 software (what little there was of it).

      • One theory behind the lack of C128 software was that the machine could run C64 software and that developers didn't bother writing software that most people couldn't run. Why write C128 software when you can write C64 software that can run on both new and old machines? The Atari STe line had the same problem with games; very few took advantage of the improved graphics and digital sound available on the newer machine.
        • The Atari STe line had the same problem with games; very few took advantage of the improved graphics and digital sound available on the newer machine.

          The STe was clearly designed to close the gap between the "vanilla" ST (and STFM) and the Amiga, which had come down in price by that point. It might have worked... had Atari directly replaced the STFM with the STe at the same price when it launched.

          Problem was that - almost certainly due to Jack Tramiel's penny-pinching short-sightedness - they charged more for the STe and continued to sell it alongside the STFM. So anyone buying an ST because it was cheap would get the STFM, and anyone who had a bit more

        • Comment removed based on user account deletion
  • by JoshDM ( 741866 ) on Monday December 09, 2013 @04:05PM (#45643323) Homepage Journal

    I was in 5th or 6th grade, and I woke up to a new computer in my room. The printer immediately broke and I noticed the desk was half upside down. My dad had assembled it and the desk in the dark, during the night, while I was asleep (I'm a heavy sleeper). He was no technician, but I appreciated the effort. I traded C64 games with kids at school and stacks of 5.25 floppies via mail. Commodore games were fantastic; much better than NES. Junior year of high school, I finally had the initiative to figure out what my dad had done to the printer, and it turned out to be a simple problem that I fixed. I used 80 column mode to type and print essays for school for the next two years. Much praise to my old man. Granted, my first year of college he helped me acquire a 386 with Windows 3.0, which I had for three years, then built my own. I'll never forget my C=128. Thanks, dad!

  • Too little too late (Score:4, Interesting)

    by RedMage ( 136286 ) on Monday December 09, 2013 @04:05PM (#45643333) Homepage

    I was a big fan, and a game developer for the C64. Those were the days when a machine could be fully understood by an untrained person with a knack for programming. When the C128 came out, I was interested, especially in the 80 column screen and CP/M software compilers. But there were too many limits on the machine (no hard drive easily added, no real OS, etc.) and it didn't feel like enough of an advancement over the C64. My grandfather did buy one, and I had some time with his, but that never really sparked much either. My next machine would be the Amiga, and as soon as that became somewhat affordable for a college student (the A500), I never looked back.

    RM

  • About time (Score:4, Funny)

    by Tablizer ( 95088 ) on Monday December 09, 2013 @05:15PM (#45644273) Journal

    Excellent, my wife's been on me to upgrade my C64

  • The software only took three people to write: Fred Bowen, Terry Ryan, and Von Ertwine. Contrast that with some other projects ...
  • by Wookie Monster ( 605020 ) on Monday December 09, 2013 @11:10PM (#45647417)
    Several times in the article, he mentions that the C128 was the last 8-bit computer to be designed. This isn't true -- a year later, Tandy announced the CoCo3, also with 128KB and capable of 80 column text display. It didn't run CP/M, but instead it ran Microware OS-9.
  • by kriston ( 7886 ) on Monday December 09, 2013 @11:57PM (#45647677) Homepage Journal

    While not originally designed under the auspices of Commodore, the Amiga was also built around custom VLSI chips. Its prototype did not have the chips available either; instead, the larger-scale prototypes were the size of a small room. The booth they used at a trade show to demonstrate the Amiga was arranged so that the walls of the booth itself were the prototypes of the custom chips, hidden behind curtains.

    It would be cool if we could find a photograph of it.

    RIP Jay Miner.

    • by kriston ( 7886 )

      More on the period: at the time I was a die-hard TRS-80 Color Computer aficionado, but I did appreciate the advances the Commodore camp had achieved. I eventually embraced, for better or worse, the Commodore Amiga line, from 16- to 32-bit, both in AmigaOS and Amiga Unix.
