Hardware

Fifteen Classic PC Design Mistakes 806

Harry writes "Once upon a time, it wasn't a given that PC owners should be able to format their own floppy disks. Or that ports should be standard, not proprietary. Or that it was a lousy idea to hardwire a PC's AC adapter, or to put the power supply in the printer so that a printer failure rendered the PC unusable, too. Over at Technologizer, Benj Edwards has taken a look at some of the worst design decisions from personal computing's early years — including ones involving famous flops such as the PCjr, obscure failures such as Mattel's Aquarius, and machines that succeeded despite flaws, like the first Mac. In most instances — but not all — their bad decisions taught the rest of the industry not to make the same errors again."
  • by Dystopian Rebel ( 714995 ) * on Monday June 15, 2009 @09:48AM (#28335195) Journal

    Patents and proprietary, closed standards -- Open standards lead to innovation and better hardware for consumers. Look at some of the junk in that article... Engineers need the challenge of having other people improve upon their ideas. Open standards and open-source *will* win because people work best together. Capitalism certainly won't die, but it needs to learn this lesson.

    Honourable Mention: Keyboards -- Most computer keyboards are designed for some other lifeform -- one with a single arm bearing 10 or more fingers. Consumers accept the familiar "conventional" keyboard because it's familiar and conventional. The keyboards that are best for human beings have a "split" or curve in the centre. There are many horrible keyboards, so I'd like to mention some excellent ones:
    GoldTouch
    Adesso Ergonomic
    original Microsoft Natural (not the later rubbish that claimed to be "ergonomic" just because it had a fake leather wrist support -- while maintaining the straight-row key configuration that is bad for wrists)

    • by UnknownSoldier ( 67820 ) on Monday June 15, 2009 @10:14AM (#28335501)

      > original Microsoft Natural

      That was a great keyboard back in '96! I would demonstrate a simple proof to others to show the benefit of its ergonomics:

      * Stand up. Put your hands by your sides. Notice the angle of your hands.
      * Now raise your hands up, keeping your biceps in place, and making an L, as if you were shaking hands.
      * Now roll both of your hands inward, as if you were about to play a wide piano. See how comfortable that is?
      * Now slide your hands together so your thumbs are touching. Notice how awkward that is?

      Took me a little while to get used to it, but it was good. My only problem was that the Y, H, and N keys (quite logically) were put on the right side. I'm a pretty hard-core gamer who uses most of the left side + part of the right side of the keyboard, and I found those keys "missing." (I used the right hand on the mouse.)

      I wish someone would bring it back, duplicating the TY, GH, NM keys on both the left and right side.

      --
      "Necessity is the mother of invention,
      but Curiosity is the Father."
        -- Michaelangel007

      • by Weaselmancer ( 533834 ) on Monday June 15, 2009 @12:26PM (#28337361)

        Here is an article with a picture of one. [pctechguide.com]

        I'm a touch typist, took a class in it in high school. Fingers on the home keys. Left hand rests on ASDF. Right hand on JKL;.

        If you move up a row from ASDF, you get QWER. My left pinky is A, move up 1 to Q. My right pointer is on F, move up 1 row to R.

        Move up to the next row for numbers. ASDF becomes 1234. Now here's where we get to the mistake. We were taught that your left pointer goes up 2 and over 1 toward the middle to get to 5. Likewise, your right pointer goes up 2 and over 1 toward the middle to get to 6.

        Notice how the 6 is on the wrong side? When my brain thinks "6", my right pointer wants to see it right next to the 7. It's now the responsibility of my left pointer to be in charge of 456, and my right pointer is now only in charge of 7.

        I can't tell you how frustrating this keyboard is to a touch typing programmer. It's as if nobody at Microsoft knows how to touch type.

      • by BobMcD ( 601576 ) on Monday June 15, 2009 @12:28PM (#28337395)

        I wish someone would bring it back, duplicating the TY, GH, NM keys on both the left and right side.

        This. Very, very, very THIS. Please. And hurry...

    • by JCSoRocks ( 1142053 ) on Monday June 15, 2009 @10:15AM (#28335519)
      You forgot Apple :P *ducks fanboys*. Seriously though, I just bought a Mac Mini and I was extremely disappointed to find that it uses a proprietary Mini DisplayPort connector. If you want to use dual-link DVI to power a 30" monitor, you have to buy a $100 adapter that doesn't even work. Standards are standard for a reason, Apple!
      • by jedidiah ( 1196 ) on Monday June 15, 2009 @10:25AM (#28335665) Homepage

        Never mind the display port.

        They are back to the "no user serviceable parts" mantra.

        Sure, you can upgrade a mini if you are sufficiently stubborn. However, it's a process where you will find yourself applying a putty knife to your pretty little Mac. Frankly, I don't think most Apple users are up to that sort of thing.

        The thing is a glorified headless laptop anyway. Why didn't they just take that idea to its logical conclusion and have expansion panels like real laptops do?

        This is especially problematic since minis have historically come with too little memory, as Macs in general have. This is why I personally know the joys of upgrading a mini.

        • Re: (Score:3, Interesting)

          by JCSoRocks ( 1142053 )
          Sadly, I went through the exact same process. RAM from Newegg is less than half the price of RAM from Apple. The installation process is frustrating, to say the least. Like you said, using a putty knife on your brand-new toy (almost inevitably marring the surface in the process) is not fun.

          My new Mini is actually my first Apple ever. So far, I have not been impressed.
          • Re: (Score:3, Informative)

            by AKAImBatman ( 238306 ) *

            My new Mini is actually my first Apple ever. So far, I have not been impressed.

            Just about anyone who's posting on Slashdot is not going to be well-served by a Mac Mini. At least not as a primary machine. The Mini is a scaled-down computer intended for non-power users who need a relatively inexpensive machine that can be tethered to a desk.

            If you want to be happy with your Mac purchase, get a MacBook. It will do everything you need of it and more. Plus, getting it equipped out-of-the-box with sufficient memo

            • Comment removed (Score:5, Insightful)

              by account_deleted ( 4530225 ) on Monday June 15, 2009 @12:21PM (#28337309)
              Comment removed based on user account deletion
              • by cyn1c77 ( 928549 ) on Monday June 15, 2009 @01:38PM (#28338301)

                ...or the total overkill that is the Mac pro line...

                As someone who also got bit by Apple's non-user serviceable part philosophy, I agree with you 100%.

                I've got a Mac Pro. I'm not an Apple fanboi, I just hate them less than other computer manufacturers. My computer works great. But I didn't get the wireless card installed when I purchased it because I didn't need it. Later on, I needed the wireless capability, so I tried to buy the Airport Extreme card from Apple. The fuckers (yes, they are fuckers) wouldn't sell it to me because "it was not a user installable part." I had to make an appointment at my "local" Apple store that is 60 miles away to let some teenage "genius" install it for me. Yeah, OK, I'll get right on that, because I really want to drive my expensive 90-lb machine 120 miles on my day off so some 13-year-old-looking smartass can paw at it.

                Instead, I bought it off a third-party vendor and worked out how to install it myself, since the only instruction it came with said "This is not a user installable part, please refer to the Mac Pro service manual for installation." It worked fine and I now have wireless capability, but I found Apple's actions with that upgrade really insulting.

                If I am willing to pony up $4000 for a computer, chances are I have the necessary intellect and experience to screw a wireless card to my motherboard and plug in two antennas. Or I am willing to accept the consequences of my actions if I screw up. Why would a company make it hard for a consumer to use their product?

                Apple's increasingly common philosophy of non-user-serviceable parts, lack of mid-range user-upgradable towers, and forcing weird connectors down our throats without including the adapters for free are annoying and, I think, ultimately holding them back in the PC market. Windows' recent suckage has been working to Apple's advantage, but I feel they could have capitalized on it more effectively. Of course, I am sure that Steve and his financial analysts have determined otherwise.

          • It's not just Apple that charges such a large amount for better parts. Dell (whose computers you can easily upgrade on your own) has prices on upgrade parts that are much higher than retail.

            For example, a base model Vostro desktop lists the Core 2 Duo E8600 as an upgrade (over the Celeron 450) for $330; the E8600 can be bought [mwave.com] for $267.99 with free shipping. Dell lists their 21.5" HD monitors for $260; I recently bought two Samsung 21.5" HD monitors for $189.99 each [newegg.com] (with free shipping, and there are rebates available). Dell will upgrade your baseline Vostro from 1GB to 4GB of 800MHz DDR2 for $112; it's not hard [mwave.com] to find 4GB kits for anywhere between $40.99 and $76.99, depending on what brand you prefer. On the same machine Dell will upgrade your 80GB hard drive to a 1TB 7200RPM hard drive for $330; Seagate 1TB drives can be had [mwave.com] for as little as $89.99.

            (Those aren't affiliate links, don't worry :P)

            If you were to get those upgrades, Dell's markup over retail prices is as much as $400, and they pay OEM price, not retail. (To be fair, the hard drive I linked to above is OEM, not retail.)

            These days, I see very little reason to buy a desktop from Dell (or Apple or whoever) at all; laptops are another matter - and even then, you shouldn't have the vendor upgrade your RAM. I bought 4GB of RAM for my laptop for $20 (after rebate), where Dell would have charged me $200. (Ironically, the RAM was marketed as "for Macs", despite being standard DDR2 SODIMM.)

            As a humorous side note, if you want Dell to preconfigure RAID on a pair of 1TB drives, they'll do RAID-0 for $350 or RAID-1 for $250... same hardware, different price. Fun fun fun.

      • by Darkness404 ( 1287218 ) on Monday June 15, 2009 @10:26AM (#28335681)
        Actually, Mini DisplayPort is rather open, and while not a standard (yet), you can get the specs from Apple for nothing.
      • by Anonymous Coward on Monday June 15, 2009 @10:58AM (#28336185)

        Actually, DisplayPort isn't proprietary; it's the successor to DVI. Mini DisplayPort is part of the VESA specification and is entirely royalty-free.

      • I'm not sure you can call Mini DisplayPort proprietary when Apple has published the specs so that anyone can use them. The cost is because nobody uses DisplayPort yet. Lenovo has one freakin' monitor with a DisplayPort plug, and it's about $700.
    • by Shivetya ( 243324 ) on Monday June 15, 2009 @10:15AM (#28335523) Homepage Journal

      Having to press a key on the keyboard and click has got to be the most entertaining solution I have seen billed as 'good' in a long time.

      I think it is funny that the Genius Bar people practically tell people to get a Microsoft mouse.

      Multiple-cable speaker systems: it's about time we had a single-cable solution for attached speakers that provides easy-to-implement channel separation. USB for everything please, or something similar.

      • Re: (Score:3, Informative)

        by jht ( 5006 )

        Not for nothing, but that's just not true anymore. Hasn't been for quite a few years. Ever since the Mighty Mouse (the one with the trackball) became standard, it's been a 2-button mouse. It's just that the Mac UI was designed to be single-button friendly, and the mouse operates in single-button mode by default. If you are clever enough a user to want to right-click, it's simple enough to go to the System Preferences pane for your mouse and turn it on.

        Mice with two hardware buttons Just Work as well.

        • Re: (Score:3, Informative)

          by macshome ( 818789 )

          Not for nothing, but that's just not true anymore. Hasn't been for quite a few years. Ever since the Mighty Mouse (the one with the trackball) became standard, it's been a 2-button mouse.

          More than that, it's a 4-button with a scroll ball.

    • by Sensible Clod ( 771142 ) on Monday June 15, 2009 @10:33AM (#28335809) Homepage
      I have a Microsoft Natural. I got it from a computer repair/migration client. Despite having other keyboards with nicer features or quieter mechanisms, I use it exclusively. It and my Microsoft Sound System 80 are two of the nicest pieces of hardware I own.

      Why doesn't Microsoft just forget software and go into hardware?
    • by jo42 ( 227475 ) on Monday June 15, 2009 @10:39AM (#28335897) Homepage

      I must be that "some other lifeform". I can't stand or use curved or "Ergonomic" keyboards such as the Microsoft un-Natural keyboard.

      I'd rather have my wrists rest flat on the table, since I find that far more comfortable than having my hands rotated slightly, thus resting my wrists at an angle (which starts to hurt after a while).

  • by Quiet_Desperation ( 858215 ) on Monday June 15, 2009 @09:50AM (#28335233)
    Well... at least it wasn't spread out over 15 pages.
  • Comment removed (Score:5, Informative)

    by account_deleted ( 4530225 ) on Monday June 15, 2009 @09:51AM (#28335243)
    Comment removed based on user account deletion
    • Re:The 15 problems (Score:5, Interesting)

      by TheRaven64 ( 641858 ) on Monday June 15, 2009 @10:07AM (#28335417) Journal
      The best one from the Mac was putting the power button right next to the floppy drive. Removing the eject button was a good idea; it prevented you from ejecting a disk without unmounting it and ending up with corrupted data. Unfortunately, when the Mac came out, most users were accustomed to manual floppy drives with a mechanical eject button underneath. The natural way of getting a disk back was to press the button under the floppy drive, which turned off the machine (typically losing data). Putting the power button on the other side, and a soft eject button under the floppy drive, would have saved a lot of data.
      • Re:The 15 problems (Score:5, Informative)

        by jrumney ( 197329 ) on Monday June 15, 2009 @10:55AM (#28336137)

        Unfortunately, when the Mac came out,

        That design flaw wasn't introduced when the Mac came out, it was when they first moved from 68000 to PowerPC [wikipedia.org]. Older Macs from the XL through to the Classic II had the power button on the keyboard or tucked away somewhere out of sight on the monitor/base.

      • Re: (Score:3, Insightful)

        by clone53421 ( 1310749 )

        Removing the eject button was a good idea

        No, it was a bad idea. A "good" idea would have been one of two things, as I see it:
        – Soft eject, emergency hard eject (e.g. like a CD-ROM drive). If it's off, I don't want to turn it on to get my disk.
        – Hard eject with soft disable (e.g. like CD-R/RW drives which physically lock closed while burning). Ensure that it unlocks when the power goes off!

      • Re:The 15 problems (Score:4, Informative)

        by jht ( 5006 ) on Monday June 15, 2009 @11:02AM (#28336253) Homepage Journal

        Huh? Which Mac was that? All the original Macs (the 128, 512, and Plus, along with the SE series and Classic) had the power switch on the back. The NuBus Macs (Mac II onwards up until the middle of the PPC era) powered up via their ADB keyboards.

        There was an optional reset switch you could attach to the lower side of the computer - I guess that could look like a power button - but that was originally a user-installed option (most didn't install it, that I recall), and the later Macs had a slightly recessed reset button.

      • Re: (Score:3, Informative)

        Worse was the Apple II Reset button - originally it was (if I remember right) more or less right above the Return key. Back in the days when saving was a matter of screwing around with cassette tapes and luck, it was incredibly frustrating to accidentally brush the Reset button. Fortunately it was possible to re-wire it so it required you to press CTRL + Reset to reset, and then we also got a floppy drive so that it was MUCH less obnoxious to save stuff.

        I wonder how many hours of lost work that reset button

      • Re:The 15 problems (Score:5, Insightful)

        by demonbug ( 309515 ) on Monday June 15, 2009 @12:17PM (#28337233) Journal

        Removing the eject button was a good idea; it prevented you from ejecting a disk without unmounting it and ending up with corrupted data.

        Removing the eject button was an idiotic idea, and it illustrates one of the great failures of personal computer design philosophy - the idea that the system builder/designer knows better than the user how the user should use the system. If I want to eject a disk in the middle of an operation then I should be able to - maybe the possibility of corruption is preferable to the alternative of letting an operation continue. Maybe an electrical fire just started in the system power supply, and I want to get my floppy out NOW. Maybe a million things that the designer didn't think of. The assumption that the user is an idiot and doesn't know what they are doing, and that their control over the system must be severely limited for their own protection, is the single worst PC design mistake.

    • Re: (Score:3, Insightful)

      by Hatta ( 162192 )

      This looks to me like 10 actual problems, with multiple examples of crappy keyboards and bulky shit stuck to your computer.

      • Re:The 15 problems (Score:5, Insightful)

        by Chyeld ( 713439 ) <chyeld@g m a i l . com> on Monday June 15, 2009 @10:39AM (#28335895)

        And to be honest, were those bulky expansions really design mistakes or do they just seem that way now that we have the benefit of a couple of decades of experience and design put into the problems they were meant to address?

        I'd have a hard time seeing USB coming out back in the era being described, and not just because every company was doing its best to lock people into its own platform.

    • Re: (Score:3, Informative)

      by falconwolf ( 725481 )

      Where's #9?
      Oh, instead of releasing their own GUI-based PC, Xerox PARC [wikipedia.org] had Apple do it.

      Falcon

    • And a summary (Score:4, Informative)

      by Chas ( 5144 ) on Monday June 15, 2009 @10:32AM (#28335785) Homepage Journal

      A: No PSU fan (leading to thermal warping of internal components)
      B: Limited Apple II compatibility
      C: No way to format disks
      D: EM pulse erases tapes (unreliable media)
      E: Printer required
      F: Lousy keyboard (#6 and #8)
      G: Non-detachable AC adapter
      H: Ridiculous external expansion options (#10, #13, and technically #14)
      I: No user expandability
      J: Slow BASIC
      K: Unreliable disk drives

  • Big ISA bus flaw (Score:5, Interesting)

    by FranTaylor ( 164577 ) on Monday June 15, 2009 @09:59AM (#28335323)

    The IOCHRDY signal is active high instead of active low. Causes no end of problems.

  • by SlashDotDotDot ( 1356809 ) on Monday June 15, 2009 @10:03AM (#28335375) Journal

    Problem #16: Blindingly intense blue LED on my new Dell that blinks when the computer is asleep.

    All night long the computer constantly warns me: "I'm asleep. I'm asleep. I'm asleep." It's like Homer Simpson's "everything is OK" alarm.

  • CapsLock (Score:4, Insightful)

    by jameson ( 54982 ) on Monday June 15, 2009 @10:04AM (#28335391) Homepage

    Sun got it right on their keyboard design, but everyone else kept the CapsLock key. I've been using computers for 21 years, and I use Ctrl constantly. I do not recall ever having used the CapsLock key (except out of curiosity, to see whether it actually does anything).

    (Well, that's a bit of a lie. Of course I use it, after reassigning it to Ctrl. But the point is, having to take that step is a waste of time.)

    CapsLock was useful once upon a time, when there was no \section{} or \textbf{}, and when pressing `shift' actually required strength. But those days are gone.
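
    If you're on Windows, that remap doesn't even need a program: the NT-family kernel honors a binary "Scancode Map" value under HKLM\SYSTEM\CurrentControlSet\Control\Keyboard Layout. A minimal C sketch of the bytes for CapsLock -> Left Ctrl (0x3A and 0x1D are the standard Set-1 scancodes; treat this as illustrative and double-check before touching your registry):

        /* Scancode Map layout: two reserved DWORDs, a DWORD entry count
         * (mappings + terminator), then one DWORD per mapping with the
         * NEW scancode in the low word and the OLD one in the high word,
         * all little-endian, ending in a zero DWORD. */
        #include <stdio.h>

        static const unsigned char scancode_map[] = {
            0x00,0x00,0x00,0x00,  /* header: version                     */
            0x00,0x00,0x00,0x00,  /* header: flags                       */
            0x02,0x00,0x00,0x00,  /* one mapping + terminator            */
            0x1D,0x00,0x3A,0x00,  /* CapsLock (0x3A) -> Left Ctrl (0x1D) */
            0x00,0x00,0x00,0x00,  /* terminator                          */
        };

        int main(void) {
            /* dump as hex, for pasting into a .reg file by hand */
            for (unsigned i = 0; i < sizeof scancode_map; i++)
                printf("%02x%c", scancode_map[i],
                       i + 1 < sizeof scancode_map ? ',' : '\n');
            return 0;
        }

    (On X11, xmodmap gets you the same thing in a couple of lines; the registry route just survives reboots and login screens.)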

    • Re:CapsLock (Score:4, Insightful)

      by Jason Levine ( 196982 ) on Monday June 15, 2009 @10:24AM (#28335643) Homepage

      There are still limited instances when CapsLock is useful. I work in a hospital and our MediTech program requires all caps. (Don't ask me why.) Like you mentioned, you can get keyboard remapping programs to turn CapsLock into another key. Still, I can see your point and it would be nicer if the CapsLock functionality was incorporated without needing a whole key. Say, for example, by pressing the Shift key twice or three times in rapid succession.

      And while we're on the subject, does anyone use Num Lock or Pause anymore?

      • Re:CapsLock (Score:4, Informative)

        by Spatial ( 1235392 ) on Monday June 15, 2009 @11:17AM (#28336465)

        And while we're on the subject, does anyone use Num Lock or Pause anymore?

        Not Numlock, but 'pause' is used all the time in Windows. Off the top of my head: it pauses most games, pauses the command line, and Winkey+Pause opens the System Properties dialogue.

  • by UnknownSoldier ( 67820 ) on Monday June 15, 2009 @10:05AM (#28335399)

    My personal list...

    - 10 to 15 years ago, you had to be careful when installing drives or RAM. You could almost slice your hand on a cheap case that had unfinished, sharp edges.

    - Beige Only. You can pick any color, as long as it is beige. Why did it take so bloody long to offer any color other than beige? Critical mass?

    - LOUD systems. Have to thank George for showing me just how nice a quiet system is.

    - Power-hungry systems. Two Molex connections for a GPU?!

    - Crap 3D Video cards in laptops, and almost no benchmarks from the "classic" hardware review sites so you know how bad it sucks compared to a "real" GPU. (Thankfully the S3 Virge is gone from desktops, but laptops are still stuck with poor performance unless you pay an arm and a leg.)

    --
    "World of Warcraft (TM) is the McDonalds (TM) of MMOs."
        -- Michaelangel007

  • by PotatoFiend ( 1330299 ) on Monday June 15, 2009 @10:09AM (#28335437)
    has always been the cup holder. That shit always snaps under the strain of my 48-oz. coffee.
  • #1 failure... (Score:5, Insightful)

    by master_p ( 608214 ) on Monday June 15, 2009 @10:10AM (#28335445)

    the choice of IBM to use the 8086 CPU. It set back the computer industry several years. The PC would now be at least two generations ahead if IBM had not used the retarded 8086 design.

    Obviously, IBM did not believe in personal computers and thought they were gimmicks.

    • Re:#1 failure... (Score:4, Interesting)

      by KermodeBear ( 738243 ) on Monday June 15, 2009 @10:15AM (#28335521) Homepage

      Why? What other processor(s) should have been used, and what would have been the benefits? No, not trolling. Just interested in what you said and would like more information.

      • Re: (Score:3, Insightful)

        by jedidiah ( 1196 )

        While something like the 68000 could have been used, I don't think it was necessary. Ultimately, the problem with the PC was the system software. It was thrown together without any real thought or consideration for the future. It was the essence of how things were NOT done at IBM at the time.

        The problem wasn't so much that the 8086 sucked but that the OS was tied to it so much.

        That clone with the problem serial port would have been in a better position if something resembling a real OS was created for the PC to

      • Re:#1 failure... (Score:5, Insightful)

        by vonhammer ( 992352 ) on Monday June 15, 2009 @10:40AM (#28335913)
        Read the Motorola 68000 assembly language manual and marvel at its simplicity and elegance. I believe they had an 8-bit and 16-bit equivalent back then. That would be my choice. Advantages are the simple addressing scheme, many general purpose data registers, brilliantly simple assembly language.
      • Re: (Score:3, Interesting)

        by stevied ( 169 ) *

        680x0 was, IIRC, around at the time, and had a much more elegant, though still CISC, instruction set. Plus it was 32-bit internally, though the 68000/10 only had 24 external address lines.

        I seem to recall that writing (GUI) apps in assembler for the (68000-based) Amiga was, although time consuming, perfectly possible. I'd have hated to do it on the register-starved 80x86.

      • Re: (Score:3, Insightful)

        by Megane ( 129182 )
        What other processor should have been used? Anything without those damn segment registers. The 8088's 64k segments were the legacy that set back the industry for so long. The 80286 was no help, either, since it still had that basic 64k limitation. It just added a couple more years to the setback.
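
        For anyone who never had the pleasure: a small C sketch of the real-mode address arithmetic behind that complaint (this is the documented 8086 behaviour, nothing hypothetical):

            #include <stdint.h>
            #include <stdio.h>

            /* Real-mode 8086/8088: physical = segment * 16 + offset.
             * Offsets are 16 bits, so a single segment value can only
             * see a 64K window; anything larger means juggling segment
             * registers, "near" vs "far" pointers, memory models, etc. */
            static uint32_t phys(uint16_t seg, uint16_t off) {
                return ((uint32_t)seg << 4) + off;
            }

            int main(void) {
                /* many seg:off pairs alias the same physical byte...   */
                printf("1234:0005 -> %05X\n", phys(0x1234, 0x0005)); /* 12345  */
                printf("1000:2345 -> %05X\n", phys(0x1000, 0x2345)); /* 12345  */
                /* ...and the whole reachable space tops out just past 1MB */
                printf("FFFF:FFFF -> %05X\n", phys(0xFFFF, 0xFFFF)); /* 10FFEF */
                return 0;
            }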
      • Re:#1 failure... (Score:5, Informative)

        by pz ( 113803 ) on Monday June 15, 2009 @11:18AM (#28336483) Journal

        Why? What other processor(s) should have been used, and what would have been the benefits? No, not trolling. Just interested in what you said and would like more information.

        The fundamental problem with Intel's instruction set architecture for the 8088/8086 line was that it was complex and intricate. To perform some instructions, the arguments had to be in very specific registers. Every register was, in some way or another, special purpose. The contemporary Motorola architecture, based on the 6800 and extended into the 68000 line, was completely the opposite: every register was, more-or-less, general purpose.

        Writing a compiler for the Intel architecture is an exercise in masochism. Writing one for the Motorola architecture is an exercise in simplicity and elegance. The Motorola instruction set documentation of the era was simple, clean, and definitive: it molded the way instruction sets were documented for generations afterward. The Intel documentation was difficult to understand at best.

        One of the stark differences between the two instruction sets was instruction length variability. Intel instructions could be almost arbitrarily long. Motorola instructions were one or two 16-bit words in the common cases, with the one-word instructions being the ones most frequently used (inspired brilliance, that was). Also, for very related reasons, the number of cycles to execute an instruction was highly variable for Intel architectures, and more-or-less fixed for Motorola architectures.

        I wrote assembly code for both architectures, back in the day. I hated, hated, hated writing for Intel chips, and breathed a sigh of relief whenever writing for Motorola chips. The inherent beauty in the Motorola instruction set created a certain kind of transparency making it possible --- seriously --- to see programmer intent when reading assembly code. With Intel chips, that was just not possible. With Motorola chips, you could reverse engineer code pretty easily; with Intel chips, it was painful.

        The world would be a better place if IBM had selected Motorola.

    • Re: (Score:3, Interesting)

      I think you're talking about the 8088. The 8086 was a true 16-bit chip; the 8088 had an 8-bit bus. The chief reason was, as I understand it, that 16-bit hardware was extremely expensive at the time, so IBM went with the 8088 to keep the price of the unit lower, and to make it less expensive for expansion hardware to be built.

      And that's the real secret of the success of the PCs and PC clones. They were never as good as a number of competitors; Apple had the better GUI, Amigas had the better graphics, the var

    • The Mac started out on the 68K. OK, it was more advanced than the PC at the start, but I think it's fair to say that the only thing (arguably) slightly more advanced about Macs these days (and certainly not two generations ahead) is the OS. The hardware is commodity PC.

      As for Commodore and Atari, well, we know how well using the 68K panned out for them. Just proves that ultimately marketing wins and technological ingenuity comes a poor second.

      • Re: (Score:3, Interesting)

        I honestly don't think the PC's success had much to do with marketing at all. I remember that Commodore, Apple and even Radio Shack poured a lot of money into marketing, and even in the mid-1980s, PCs were still very much considered unsexy "business" machines with inferior graphics. And yet, IBM won because it was simply easier to expand than Amigas or Macs. Sure, the basic graphics sucked, but you could always go out and buy a CGA or Hercules video card, and later on EGA and VGA and so forth, and it

  • by Bagels ( 676159 ) on Monday June 15, 2009 @10:15AM (#28335511)
    Our family once owned an old Sony VAIO desktop. It came with a floppy drive, but as it was the year 2000, floppies were quickly becoming unfashionable. Because of this, Sony hid the floppy drive behind a small plastic hatch. The problem? The hatch attached to the case with a small but fairly powerful magnet... which corrupted every single disk inserted into the drive. To this day I'm wary of Sony products (and VAIOs in particular) because of that little screw-up.
  • The Amiga (Score:4, Interesting)

    by bl8n8r ( 649187 ) on Monday June 15, 2009 @10:16AM (#28335525)

    It amazes me how advanced this system* was for its time and that it didn't catch on better than it did. The graphics and sound (just for starters) were many years ahead of their time; x86 was still stuck with EGA and speaker beeps at the time.

    [*] - http://en.wikipedia.org/wiki/Amiga#Graphics [wikipedia.org]

    • Re: (Score:3, Insightful)

      by Mordaximus ( 566304 )

      As was the Atari ST. Not trying to draw comparisons between the two systems; each had strengths and weaknesses. The point is there were a few very advanced and powerful systems around back in the day, and they likely only died out because EGA and speaker beeps were in offices everywhere.

    • Re: (Score:3, Insightful)

      by jandrese ( 485 )
      As was the case with many systems that were ahead of their time, they were competing with an established player that already had tons of lock-in from software vendors, peripheral manufacturers, and the like. Worse, calling a system "ahead of its time" often means forgetting that it was considerably more expensive than the competition and quite possibly outside the price range of most consumers. Good engineering isn't only about being the "best"; it's also about knowing what to cut to keep the pric
  • Apple Lisa (Score:5, Interesting)

    by camperdave ( 969942 ) on Monday June 15, 2009 @10:20AM (#28335575) Journal
    From the Fancy Article:

    Still, Lisa OS sported a unique document management metaphor that has yet to be replicated in a mainstream OS. Had the Lisa been cheaper and faster, it might have set a new standard in computing.

    Does anybody know what the "unique document management metaphor that has yet to be replicated in a mainstream OS" is, and why it might have set a new standard in computing? It sounds terribly intriguing. Might this be something that could/should be added to Linux?

    • Re:Apple Lisa (Score:4, Informative)

      by copponex ( 13876 ) on Monday June 15, 2009 @10:52AM (#28336087) Homepage

      http://www.youtube.com/watch?v=a4BlmsN4q2I [youtube.com]

      Start at minute 6:45.

      Seems that you would pick a stack of paper - word processing, spreadsheet, graphing, etc. - and it would "tear off" a new page for a new document that you could put elsewhere.

      It may be worth creating a newbie shell that hides many options with an option to go into "advanced" mode. The real endgame will be context sensitive interfaces that allow the computer to guess what you want to do, with an override for people who prefer to keep menus in the same place.

      I think a good design is to have all features across the top via pulldowns, and contextual options at the bottom that you can just turn off if you like.

    • Re:Apple Lisa (Score:5, Informative)

      by stevied ( 169 ) * on Monday June 15, 2009 @11:07AM (#28336335)

      As mentioned by others, document-centric computing:

      http://en.wikipedia.org/wiki/Apple_Lisa#Historical_importance [wikipedia.org]
      http://en.wikipedia.org/wiki/Xerox_Star#User_interface [wikipedia.org]

      People keep having stabs at it, and to give MS their due, they did try pretty hard with Win95 and OLE/COM, and got rid of MDI [wikipedia.org] in later versions of Office .. but somehow it never seems to have been perfected on mass-market machines. The tab-view that we have in browsers now seems to be actively moving away from it (this is your application .. with your documents as child objects to it .. - though at least Chrome has the decency to put the tabs at the top of the window.)

      It'll probably get leap-frogged as an idea by all this Web 2.0 stuff and in-browser apps (which again is a regression: you still have to think about which SaaS-providing site you have to go to to get a particular job done.)

  • by peter303 ( 12292 ) on Monday June 15, 2009 @10:20AM (#28335585)
    Smartphones are the current decade's generation of personal computing, like PDAs were in the 90s and PCs in the 80s. We see some of the same trade-offs we saw in the 80s: proprietary vs. open, short-cutting essential hardware features, clunky GUIs, etc. Will Apple's clean but proprietary SDK win over the more portable but clunky Android? Does a dark-horse OS like the new Pre, Windows Mobile, or micro-Java stand a chance? Will non-keyboard phones win over keyboard phones? And so on. Some of these debates have clear answers; for others we are waiting for the market to decide.
  • by gr3y ( 549124 ) on Monday June 15, 2009 @10:32AM (#28335783)

    That was one of the most serious design mistakes of the last thirty years, but it's only really interesting because it's symptomatic of Apple's design philosophy, which is: "Do as I wilt".

    The one-button mouse spanned multiple generations of Apple computers and underscored Apple's stubborn unwillingness to produce computers that do what their users want, and not what Jobs or Apple's HID team think they should do.

    Really. Apple refuses to correct UI annoyances that should not exist. Why doesn't OS X have a maximize window button? Why does clicking on "one hour before event" for an iCal event reset the clock to one hour before the time you click the button, and not one hour before the event? Why doesn't Finder support AFP connections over SSH?

    None of those things seem to be complex, every one of them is a failure of the UI, and yet none of them have been corrected.

    • by EMB Numbers ( 934125 ) on Monday June 15, 2009 @11:03AM (#28336261)

      When the Mac came out, every software user's manual had to explain how to use a mouse. I witnessed early Mac users who couldn't grasp the idea that the pointer on screen was controlled by their hand on the mouse. People would watch their hand moving instead of watching the mouse pointer on screen. A single button was the right choice in 1984. Nothing stops you from connecting a multi-button mouse to your Mac, and all of the buttons and the scroll wheel work swimmingly.

      People still don't understand double-click vs. single click. My father is brilliant, but he double clicks everything out of habit.

      And what is "maximize" good for? Isn't it ironic for someone who derides a one-button mouse to want a one-window GUI?

  • I believe the biggest mistake was IBM using an Intel 8088 instead of a Motorola 68000.

    Imagine for a moment what would have happened if IBM had chosen a 32-bit processor in the early 1980s for the first successful personal computer!

    • no infamous 640k memory limit
    • probably no MSDOS (or QDOS), and a real operating system instead
    • 32-bit computing would have become mainstream at least a decade earlier!
    • much less software written in assembly
    • Re: (Score:3, Informative)

      The cost of 16-bit hardware was considerably higher. This was not a mistake, but a very practical decision to allow the IBM PC to use existing hardware with little modification. The other reason for not using the 68000 was that part of the point of using a member of the 808x family was so that CP/M could be run on the PC (that's not the direction they went in the end, but CP/M was still the king of business systems at the time).

      IBM was very specifically making a business decision. There wasn't a lot of s

    • by the_humeister ( 922869 ) on Monday June 15, 2009 @11:01AM (#28336227)

      Hindsight is usually 20/20. But in this case, maybe not. Here's what an M68000 would have brought to the table:

      1) Higher costs - the reason IBM went with the 8088 is because it was less expensive.

      2) No 640k memory limit - okay, but then we'd have hit a 16MB memory limit a few years later, since the M68k only had a 24-bit address bus. BTW, the 640k limit was particular to IBM's implementation, not necessarily a limitation of the 8088; because everyone else copied that implementation, we have the 640k limit. (A sketch of the map IBM actually chose follows this list.)

      3) 8-bit and 16-bit mainstream computing - why do I say that? Because memory cost a lot of money back then. Even though the M68k can use 32-bit code, the first computers would have come with minuscule amounts of memory. 32-bit code would not have been a good idea then.

      4) continuation of CISC architecture - I personally don't think this is much of an issue, but some people do, contending that the current CISC-to-RISC translation still takes up significant silicon real estate.
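
      On point 2: the 640k number wasn't baked into the 8088 at all - it's just where IBM drew the line in the 1MB real-mode space. A C sketch of the conventional layout (the well-known map; exact boundaries varied a bit between machines and adapters):

          #include <stdio.h>

          /* The classic IBM PC memory map: the 8088 addresses 1MB, and IBM
           * reserved the top 384K for hardware, which is where the famous
           * 640K "conventional memory" ceiling comes from. */
          struct region { unsigned start, end; const char *use; };

          int main(void) {
              static const struct region map[] = {
                  { 0x00000, 0x9FFFF, "conventional RAM (the 640K)"        },
                  { 0xA0000, 0xBFFFF, "video memory (MDA/CGA/EGA windows)" },
                  { 0xC0000, 0xEFFFF, "adapter ROMs, later EMS page frames"},
                  { 0xF0000, 0xFFFFF, "system BIOS ROM"                    },
              };
              for (unsigned i = 0; i < sizeof map / sizeof map[0]; i++)
                  printf("%05X-%05X  %s\n", map[i].start, map[i].end, map[i].use);
              return 0;
          }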

    • by nojayuk ( 567177 ) on Monday June 15, 2009 @11:18AM (#28336487)
      The MC68000 was not available in production quantities at the time the IBM PC design was being finalised. The chip was late and buggy -- I used a dev board with a pre-production version of the chip clocked at half-speed, 4MHz, in 1982. Attempts to run it at 8MHz (the datasheet spec speed) were a failure.

      There were other reasons for IBM to go with the 8086-family chipsets:

      1) the 8086/8088's bus could easily drive the 8080-family support chips such as the 8251, 8255, 8259 etc. to build a complete system. The MC68k family support chips were even later than the release of the CPU itself (in some cases like the MMU several years late) and the MC68k bus could not be easily interfaced with the Intel family chips which were cheap and in plentiful supply.

      2) the 8086 family's internal data registers and addressing modes were designed to simplify conversion of existing 8080 code to run on the new 16-bit CPUs. The 68k, although a superior CPU in all respects to the 8086 family, had no tools available to make code conversion from the 6800 or other sibling CPU family (6809, 6502 etc.) simple -- all 68k code had to be written from scratch.

      3) the 68k was an expensive chip, not surprising as it was complex and required a large die, necessitating a 0.6"-wide 68-pin DIL ceramic package. Motorola's target market for the chip was $10,000 workstations, not "toy" desktop computers costing only $2,000. By comparison, the 8088 was cheap as chips.

  • PCjr (Score:5, Interesting)

    by cdrguru ( 88047 ) on Monday June 15, 2009 @10:33AM (#28335803) Homepage

    The biggest single problem with the PCjr was that it was late. In 1983 it was supposed to be on the shelf in the fall - October is the usual month when things are supposed to be shipped so they are stocked and on the shelf in November.

    Didn't happen. Macy's had received $50,000 to hold shelf space for the PCjr, and they left it empty.

    The PCjr came out in February. A little late for Christmas. Everyone had created products specifically for the PCjr for Christmas '83, but there wasn't anything to run them on. The January 1984 CES was pretty dead - lots of PCjr games that nobody cared about. Parker Brothers closed down their electronic games division, as did lots of other companies right about then. It was a year or so later that the Nintendo finally started making inroads into the home game market, but between the PCjr and Nintendo things were very, very dead.

    You can say all you want about the poor design of the keyboard and the limitations of the hardware. But a machine is even more difficult to use when it doesn't exist and cannot be purchased. Not having it in stores in time killed it, not any stupid design decisions.

  • TI Sidecars (Score:3, Interesting)

    by orb_nsc ( 819661 ) on Monday June 15, 2009 @10:48AM (#28336029)
    I'm glad they mentioned the TI 99/4A sidecars. I had a couple of these before getting the P-box. With all the engineers working at Texas Instruments, had none of them heard of "cables"? With a memory expansion and a floppy drive (which still needed its own sidecar for the controller), your TI was already taking up the entire desk. And God forbid you nudged anything accidentally and caused the whole thing to crash.
  • by Sj0 ( 472011 ) on Monday June 15, 2009 @10:58AM (#28336187) Journal

    It isn't really a 'classic' mistake, but the biggest PC design problem today, from where I'm standing, is over-reliance on fans. High-volume fans result in fuzzy lint growing on the devices that can least afford a layer of fuzzy lint.

    In the past year, I've revived dozens of computers, and nearly every failure can be directly attributed to lint induced by fans.

  • by Linker3000 ( 626634 ) on Monday June 15, 2009 @10:58AM (#28336193) Journal
    • Olivetti/AT&T: The M24-M280 series used a 9-pin D connector for the keyboard. If you plugged the keyboard into your EGA port, you blew a diode and lost (ISTR) green.
    • Olivetti/AT&T: (See above). M290 model - putting the EGA and keyboard connectors NEXT TO EACH OTHER! (WTF).
    • Olivetti/AT&T: (See above). If you killed your keyboard (coffee spill etc.), a new one was £160 ('no discount') and nothing else fitted. We actually used to repair these keyboards as they cost so much.
    • Olivetti/AT&T: Low-cost (M200?) series - no cover on the PSU and an integrated power switch on the left side of the case - when you slid off the case top without unplugging, there was a better-than-even chance one of your fingers would touch the live switch contacts. Saw an engineer do this and then proceed to throw the system unit across the workshop while yelping in pain.
    • Olivetti/AT&T: 'Integrated' UPS that slid into the bottom of some of their servers. NO covering on the bottom circuit board, so if you didn't get the unit into its rails properly, the board would touch the bottom inside of the case and short out the batteries/weld itself to the case, leaving you tugging with all your might to break the contact before the batteries (or something else) exploded.
    • IBM: Micro Channel Architecture's lousy licensing terms.
    • Tulip: 'Fault tolerant' server with the active pull-up on the SCSI bus powered from ONE of the 'redundant' PSUs - so if *that* PSU blew, you lost your disk data and command channels even though the other PSU kept everything else running.
    • General: Plastic clips on early SIMM sockets that snapped if you sneezed near them.
    • General: The USB socket is the same width as an RJ45, so you can slide a USB plug into the network port and it feels 'right' but gets you nowhere until you look and check!

    I could go on...!

  • by e9th ( 652576 ) <e9th@NoSPAm.tupodex.com> on Monday June 15, 2009 @11:04AM (#28336287)
    It would go something like this:

    Well sonny, I remember it was back in the '80s. There were these guys who loved their Apple IIIs so much that, despite their faults, they kept them running for years beyond their useful lifetimes. They did this by filling their offices with industrial-strength fans pointed at those Apple IIIs. Ever since then, we've called people who continue to support obviously flawed products "fanboys."
  • by fm6 ( 162816 ) on Monday June 15, 2009 @11:36AM (#28336723) Homepage Journal

    The PCjr's serial port, monitor port, joystick ports, keyboard port, and others used different connectors from the IBM PC. In fact they were not only non-standard connectors, but completely proprietary connectors that couldn't be found on any other computer.

    People, this is 1983. All connectors were "non-standard". Nowadays we're used to a standard connector and pinout for RS-232 [lammertbies.nl] and parallel [diyha.co.uk] ports on the back of PCs. But in 1983, exactly one model of computer used them: the IBM PC. It didn't take more than a couple of years for people to realize that the only way to compete with the IBM PC was to be extremely compatible with it. But when the PCjr came out, everybody (especially IBM) used business and sales models that paid no attention to the idea that computers and their components could be commodified.

    Small qualification: the use of 25-pin D-shaped connectors with specific pinouts was part of the RS-232 standard. But 25-conductor, straight-through cables cost money, and you didn't actually need most of those signals for typical applications. So making cables that would connect some random computer to some random modem or serial printer was a serious black art. There was even a book [amazon.com] on the subject.
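
    For the curious, here is one common DB-25 null-modem recipe of the era, written out as a C table (the pin assignments are the standard RS-232 ones; the "black art" part was that real equipment routinely ignored or repurposed these, so take it as a sketch rather than gospel):

        #include <stdio.h>

        /* A classic null-modem cable: cross the data and handshake
         * lines so each end sees the other as a "modem". */
        struct wire { int a, b; const char *note; };

        int main(void) {
            static const struct wire cable[] = {
                {  2,  3, "TxD to RxD, crossed in both directions"        },
                {  4,  5, "RTS to CTS, crossed in both directions"        },
                { 20,  6, "DTR to DSR (often strapped to DCD, pin 8, too)"},
                {  7,  7, "signal ground, straight through"               },
            };
            for (unsigned i = 0; i < sizeof cable / sizeof cable[0]; i++)
                printf("pin %2d <-> pin %2d : %s\n",
                       cable[i].a, cable[i].b, cable[i].note);
            return 0;
        }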

    (Jerry Pournelle once wrote that he used internal modems because he could never remember the pinouts he needed to make cables. But by the time he wrote this, RS-232 pinouts had been standardized and cheap pre-made modem cables were in all the stores. Pournelle is the original know-it-all ignoramus computer pundit.)

    Parallel printer cables were even worse. They all used the Centronics [pinouts.ru] de facto standard on the printer side. But to save money, everybody used 25-pin D connectors on the computer side, and the way the 36 Centronics signals mapped to those 25 computer pins was different for every manufacturer. It took IBM to standardize the pinout, and also to make the printer connector female so you didn't accidentally plug a modem into it.

  • PCjr (Score:3, Insightful)

    by MaWeiTao ( 908546 ) on Monday June 15, 2009 @11:48AM (#28336875)

    I don't understand why the PCjr is bashed so much. We had one and I thought it was pretty damn good. Granted, I was quite young, but we did put that machine to good use for quite a few years. We did get the chiclet keyboard, but by that point IBM was already including a similar keyboard with conventional keys, so it was a moot point. I actually thought the keyboard was pretty cool. It wasn't the best for typing, but I think that was more a consequence of the technology available at the time and the size of the buttons than anything else. I'd like to think that current Apple keyboards are a spiritual successor and show that the concept wasn't necessarily a bad one. As for the IR, certainly you had to be careful about anything getting in between the keyboard and the machine, but generally it was excellent and we never ran into problems. I much preferred that to having to deal with a cable.

    As for the sidecars, it's not like people at the time were upgrading machines anywhere near as frequently as they do now. And there were tons of clumsy upgrade solutions for many computers at the time. When a 128K memory card was as large as, if not larger than, most video cards today, there weren't many options for efficient packaging. Actually, the upgrade we got was from a company called Legacy, and it pretty much was a whole other case the size of the PCjr, which added 512K of RAM and a second floppy drive. It doubled the size of the machine, but that's just how things were back then; it never bothered us.

    The PCjr was a better machine than pretty much anything else I encountered through much of elementary school. It was far superior to the crappy Apple IIs we had in school. It offered better resolution and 16 colors. What did suck, however, was that it was somewhat less powerful than the IBM PCs available then and later on. While it supported CGA, its 16-color format was proprietary and not compatible at all with EGA. But regardless, for $1000 it was a great deal and generally compatible with most IBM PC applications.

    I haven't gone through all the "mistakes", but it seems like this article is written from a modern-day perspective, which is inappropriate given the era when these machines were designed and manufactured.

  • Real mistakes (Score:5, Informative)

    by Animats ( 122034 ) on Monday June 15, 2009 @12:08PM (#28337127) Homepage

    Those are mistakes an end user would see. Here are some deeper mistakes from an engineering standpoint.

    • Bus-type peripheral architecture. The IBM PC was a spinoff of the IBM Displaywriter, a dedicated word processor with no expandability. It inherited some design decisions from the Displaywriter that were reasonable for a word processor but terrible for an expandable machine. Most notably, the IBM PC had the peripherals on the memory bus. That meant any DMA had to be on the I/O card, and thus any card could blither all over memory. Peripherals were thus trusted devices, and, in turn, drivers had to be trusted. IBM knew the right answer - channels, as on mainframes - and in the PS/2 they used a "Micro Channel" architecture. But it was too late - the industry had already standardized on "ISA cards". This is the fundamental cause of most operating system crashes: the I/O architecture gives drivers too much power. (See the sketch after this list.)
    • The Motorola MMU debacle. The Motorola 68000 first appeared in 1978, and it was a very good machine. Almost. There was a flaw: instruction backout didn't quite work, and thus a paged MMU couldn't be added. So Motorola didn't ship an MMU with the 68000. The early UNIX workstations all used the 68000, and painful hacks were used to kludge together some kind of MMU to make it work. Apollo used two CPUs, one for the OS and one for the user, only one running at a time, to get around this. The Apple Lisa used one CPU with an Apple MMU built from many parts, and the compiler avoided generating any instructions with incrementation so that backout would work. Motorola came out with the M68010 in 1982, which fixed the bugs, but there was still no MMU. When Motorola finally shipped the 68451 MMU, it was a segmented MMU, and worse, it slowed down the machine by one clock cycle per memory access. If Motorola had gotten it right by 1979 or so, the whole history of personal computing might have been Motorola-based, using protected-mode UNIX.
    • The Intel 286 CPU. Not enough memory management for a protected mode OS, too much segmentation machinery for an unprotected OS. That powered the IBM PC/AT and a whole generation of machines with the addressing system from hell. It could run a version of UNIX, but no process could exceed 64K in protected mode, although you could put a few megabytes on the machine.
    • Baseband Ethernet. Coax-based Ethernet had some serious electrical problems. The thing really was unbalanced baseband, so you couldn't use capacitive coupling. The coax shield could only be grounded at one point, or you'd get ground loops. That created an electrical safety issue with the outside of coax connectors, and running coax between buildings was iffy. It was just bad electrical design. 10baseT, which is balanced, was far better from an electronics standpoint.
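
    On the first bullet: to make "any card could blither all over memory" concrete, here is roughly what programming the PC's 8237 DMA controller for a floppy read looked like. The port numbers and mode byte are the standard PC/XT ones, but treat the whole thing as an illustrative sketch (it assumes x86 port I/O and ring-0 privileges, so no modern OS will let you run it as-is) - the point is that nothing anywhere validates the target address:

        #include <stdint.h>

        /* x86 port write; GCC-style inline asm, privileged. */
        static inline void outb(uint16_t port, uint8_t val) {
            __asm__ volatile ("outb %0, %1" : : "a"(val), "Nd"(port));
        }

        /* Set up DMA channel 2 (the floppy channel) to write `len` bytes
         * arriving from the device into physical memory at `dst`. The
         * 8237 takes the address on faith: point it at your kernel and
         * it will cheerfully overwrite your kernel. */
        static void dma2_floppy_read(uint32_t dst, uint16_t len) {
            outb(0x0A, 0x06);                       /* mask channel 2            */
            outb(0x0C, 0x00);                       /* clear address flip-flop   */
            outb(0x0B, 0x46);                       /* single mode, write to mem */
            outb(0x04, dst & 0xFF);                 /* address bits 0-7          */
            outb(0x04, (dst >> 8) & 0xFF);          /* address bits 8-15         */
            outb(0x81, (dst >> 16) & 0x0F);         /* page register, bits 16-19 */
            outb(0x05, (len - 1) & 0xFF);           /* count low (counts N-1)    */
            outb(0x05, ((len - 1) >> 8) & 0xFF);    /* count high                */
            outb(0x0A, 0x02);                       /* unmask: off it goes       */
        }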
