Fifteen Classic PC Design Mistakes
Harry writes "Once upon a time, it wasn't a given that PC owners should be able to format their own floppy disks. Or that ports should be standard, not proprietary. Or that it was a lousy idea to hardwire a PC's AC adapter, or to put the power supply in the printer so that a printer failure rendered the PC unusable, too. Over at Technologizer, Benj Edwards has taken a look at some of the worst design decisions from personal computing's early years — including ones involving famous flops such as the PCjr, obscure failures such as Mattel's Aquarius, and machines that succeeded despite flaws, like the first Mac. In most instances — but not all — their bad decisions taught the rest of the industry not to make the same errors again."
Worst Mistake That Still Needs Fixing (Score:5, Insightful)
Patents and proprietary, closed standards -- Open standards lead to innovation and better hardware for consumers. Look at some of the junk in that article... Engineers need the challenge of having other people improve upon their ideas. Open standards and open source *will* win because people work best together. Capitalism certainly won't die, but it needs to learn this lesson.
Honourable Mention: Keyboards -- Most computer keyboards are designed for some other lifeform -- one with a single arm bearing 10 or more fingers. Consumers accept the familiar "conventional" keyboard because it's familiar and conventional. The keyboards that are best for human beings have a "split" or curve in the centre. There are many horrible keyboards, so I'd like to mention some excellent ones:
GoldTouch
Adesso Ergonomic
original Microsoft Natural (not the later rubbish that claimed to be "ergonomic" just because it had a fake leather wrist support -- while maintaining the straight-row key configuration that is bad for wrists)
Re:Worst Mistake That Still Needs Fixing (Score:5, Insightful)
> original Microsoft Natural
That was a great keyboard back in '96! I would demonstrate a simple proof to others to show the benefit of its ergonomics:
* Stand up. Put your hands by your sides. Notice the angle of your hands.
* Now raise your hands up, keeping your biceps in place, and making an L, as if you were shaking hands.
* Now roll both of your hands inward, as if you were about to play a wide piano. See how comfortable that is?
* Now slide your hands together so your thumbs are touching. Notice how awkward that is?
Took me a little while to get used to it, but it was good. My only problem was that the Y, H, and N keys (quite logically) were put on the right side. I'm a pretty hard-core gamer who uses most of the left side plus part of the right side of the keyboard, and found those keys "missing." (I used my right hand on the mouse.)
I wish someone would bring it back, duplicating the TY, GH, NM keys on both the left and right side.
--
"Necessity is the mother of invention,
but Curiosity is the Father."
-- Michaelangel007
It too, has a single tragic design flaw (Score:5, Interesting)
Here is an article with a picture of one. [pctechguide.com]
I'm a touch typist, took a class in it in high school. Fingers on the home keys. Left hand rests on ASDF. Right hand on JKL;.
If you move up a row from ASDF, you get QWER. My left pinky is A, move up 1 to Q. My right pointer is on F, move up 1 row to R.
Move up to the next row for the numbers. ASDF becomes 1234. Now here's where we get to the mistake. We were taught that your left pointer goes up 2 and over 1 toward the middle to get to 5. Likewise, your right pointer goes up 2 and over 1 toward the middle to get to 6.
Notice how the 6 is on the wrong side? When my brain thinks "6", my right pointer wants to see it right next to the 7. It's now the responsibility of my left pointer to be in charge of 456, and my right pointer is now only in charge of 7.
I can't tell you how frustrating this keyboard is to a touch typing programmer. It's as if nobody at Microsoft knows how to touch type.
Re:Worst Mistake That Still Needs Fixing (Score:5, Interesting)
I wish someone would bring it back, duplicating the TY, GH, NM keys on both the left and right side.
This. Very, very, very THIS. Please. And hurry...
Re:Worst Mistake That Still Needs Fixing (Score:5, Insightful)
Re:Worst Mistake That Still Needs Fixing (Score:5, Insightful)
Never mind the DisplayPort. They are back to the "no user-serviceable parts" mantra.
Sure, you can upgrade a Mini if you are sufficiently stubborn. However, it's a process where you will find yourself applying a putty knife to your pretty little Mac. Frankly, I don't think most Apple users are up to that sort of thing.
The thing is a glorified headless laptop anyway. Why didn't they just take that idea to its logical conclusion and have expansion panels like real laptops do?
This is especially problematic since Minis have historically come with too little memory, as Macs in general have. That is how I personally came to know the joys of upgrading a Mini.
Re: (Score:3, Interesting)
My new Mini is actually my first Apple ever. So far, I have not been impressed.
Re: (Score:3, Informative)
Just about anyone who's posting on Slashdot is not going to be well-served by a Mac Mini. At least not as a primary machine. The Mini is a scaled-down computer intended for non-power users who need a relatively inexpensive machine that can be tethered to a desk.
If you want to be happy with your Mac purchase, get a MacBook. It will do everything you need of it and more. Plus, getting it equipped out-of-the-box with sufficient memo
Comment removed (Score:5, Insightful)
Re:Worst Mistake That Still Needs Fixing (Score:5, Interesting)
...or the total overkill that is the Mac pro line...
As someone who also got bit by Apple's non-user serviceable part philosophy, I agree with you 100%.
I've got a Mac Pro. I'm not an Apple fanboi, I just hate them less than other computer manufacturers. My computer works great. But I didn't get the wireless card installed when I purchased it because I didn't need it. Later on, I needed the wireless capability, so I tried to buy the Airport Extreme card from Apple. The fuckers (yes, they are fuckers) wouldn't sell it to me because "it was not a user installable part." I had to make an appointment at my "local" Apple store that is 60 miles away to let some teenage "genius" install it for me. Yeah, OK, I'll get right on that, because I really want to drive my expensive 90-lb machine 120 miles on my day off so some 13-year-old-looking smartass can paw at it.
Instead, I bought it off a third-party vendor and worked out how to install it myself, since the only instruction it came with said "This is not a user installable part, please refer to the Mac Pro service manual for installation." It worked fine and I now have wireless capability, but I found Apple's actions with that upgrade really insulting.
If I am willing to pony up $4000 for a computer, chances are I have the necessary intellect and experience to screw a wireless card to my motherboard and plug in two antennas. Or I am willing to accept the consequences of my actions if I screw up. Why would a company make it hard for a consumer to use their product?
Apple's increasingly common philosophy of non-user-serviceable parts, lack of mid-range user-upgradable towers, and forcing weird connectors down our throats without including the adapters for free is annoying and, I think, ultimately holding them back in the PC market. Windows' recent suckage has been working to Apple's advantage, but I feel they could have capitalized on it more effectively. Of course, I am sure that Steve and his financial analysts have determined otherwise.
Re:Worst Mistake That Still Needs Fixing (Score:5, Interesting)
It's not just Apple that charges such a large amount for better parts. Dell (whose computers you can easily upgrade on your own) has prices on upgrade parts that are much higher than retail.
For example, a base model Vostro desktop lists the Core 2 Duo E8600 as an upgrade (over the Celeron 450) for $330; the E8600 can be bought [mwave.com] for $267.99 with free shipping. Dell lists their 21.5" HD monitors for $260; I recently bought two Samsung 21.5" HD monitors for $189.99 each [newegg.com] (with free shipping, and there are rebates available). Dell will upgrade your baseline Vostro from 1GB to 4GB of 800MHz DDR2 for $112; it's not hard [mwave.com] to find 4GB kits for anywhere between $40.99 and $76.99, depending on what brand you prefer. On the same machine Dell will upgrade your 80GB hard drive to a 1TB 7200RPM hard drive for $330; Seagate 1TB drives can be had [mwave.com] for as little as $89.99.
(Those aren't affiliate links, don't worry :P)
If you were to get those upgrades, Dell's markup over retail prices adds up to more than $400, and they pay OEM prices, not retail. (To be fair, the hard drive I linked to above is OEM, not retail.)
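Totting up the figures quoted above gives a quick sanity check on that claim (prices taken straight from the post; this is a sketch, not a price survey):

```python
# Quick markup tally using the prices quoted in the post above:
# (Dell upgrade price, cheapest retail price found)
upgrades = {
    "Core 2 Duo E8600":   (330.00, 267.99),
    '21.5" monitor':      (260.00, 189.99),
    "4GB DDR2 kit":       (112.00, 40.99),
    "1TB 7200 RPM drive": (330.00, 89.99),
}
total_markup = sum(dell - retail for dell, retail in upgrades.values())
# Adds up to a bit over $440 across the four upgrades.
```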
These days, I see very little reason to buy from Dell (or Apple or whoever) unless you're buying a laptop - and even then, you shouldn't have the vendor upgrade your RAM. I bought 4GB of RAM for my laptop for $20 (after rebate), where Dell would have charged me $200. (Ironically, the RAM was marketed as "for Macs", despite being standard DDR2 SODIMM.)
As a humorous side note, if you want Dell to preconfigure RAID on a pair of 1TB drives, they'll do RAID-0 for $350 or RAID-1 for $250... same hardware, different price. Fun fun fun.
Re:Worst Mistake That Still Needs Fixing (Score:4, Informative)
Re: (Score:3, Insightful)
http://www.monoprice.com/products/search.asp?keyword=displayport&x=0&y=0 [monoprice.com]
What's ridiculous is HDMI.
Re:Worst Mistake That Still Needs Fixing (Score:5, Insightful)
If the spec is open, isn't it, by definition, not proprietary?
It's like claiming Linux is proprietary because you don't have GCC. The spec is open. No patents or licenses are preventing you from making your own DisplayPort. You just don't have the means necessary.
Heck, by that 'definition', VGA, DVI, etc. are all "proprietary" too. Just because you can't make it or buy it at Best Buy doesn't mean that it's proprietary.
Re:Worst Mistake That Still Needs Fixing (Score:5, Informative)
DisplayPort IS an open standard [wikipedia.org]. Mini DisplayPort was added in the 1.2 specification. You can look up all the wiring for the pins, making IT NOT PROPRIETARY.
Apple was literally the first company to put these out. So for a short time there was only 1 place to buy them.
You can get cables from Monoprice and any of a dozen online retailers. Right now you can get DisplayPort connectors from DigiKey, and I imagine once 1.2 is fully adopted you'll probably have no problem finding Mini DisplayPort connectors at DigiKey, too.
Again, how is (Mini) DisplayPort any more proprietary than VGA, DVI, or HDMI?
Re:Worst Mistake That Still Needs Fixing (Score:4, Informative)
So until any other company used it, the first USB port on a computer was proprietary?
So until any other company used it, the first PCI port on a computer was proprietary?
So until any other company used it, the first firewire port on a computer was proprietary?
So until any other company used it, the first X port on a computer was proprietary?
http://en.wikipedia.org/wiki/Proprietary [wikipedia.org]
The word proprietary indicates that a party, or proprietor, exercises private ownership, control or use over an item of property.
http://dictionary.reference.com/browse/proprietary [reference.com]
1. belonging to a proprietor.
2. being a proprietor; holding property: the proprietary class.
3. pertaining to property or ownership: proprietary wealth.
4. belonging or controlled as property.
5. manufactured and sold only by the owner of the patent, formula, brand name, or trademark associated with the product: proprietary medicine.
6. privately owned and operated for profit: proprietary hospitals.
(Mini) DisplayPort is NOT proprietary. Dell uses DisplayPort. Other laptop or netbook companies may well adopt Mini DisplayPort, since it is smaller than VGA. Only time will tell.
Re:Worst Mistake That Still Needs Fixing (Score:5, Insightful)
I was never discussing open versus closed standards. This is about proprietary versus standard.
That's why he argued with you the whole time. You're using 'proprietary' to mean 'uncommon'.
Your point's valid, you're just using the wrong term.
Re:Worst Mistake That Still Needs Fixing (Score:5, Informative)
Actually, DisplayPort isn't proprietary; it's the successor to DVI. Mini-DisplayPort is part of the VESA specification and is entirely royalty-free.
Re: (Score:3)
Re:Worst Mistake That Still Needs Fixing (Score:4, Informative)
All new dell upper end monitors have Displayport too, the Dell 2408 has one for sure.
Apple's fascination with single button mice (Score:4, Insightful)
Having to press a key on the keyboard and click has got to be the most entertaining solution I have seen passed off as 'good' in a long time.
I think it is funny that the Genius Bar people practically tell people to get a Microsoft mouse.
Multi-cable speaker systems: it's about time we had a single-cable solution for attached speakers that provided easy-to-implement separation of channels. USB for everything, please, or something similar.
Re: (Score:3, Informative)
Not for nothing, but that's just not true anymore. Hasn't been for quite a few years. Ever since the Mighty Mouse (the one with the trackball) became standard, it's been a 2-button mouse. It's just that the Mac UI was designed to be single-button friendly and the mouse operates in single button mode by default. If you are clever enough a user to want to right-click, it's simple enough to just go to the System Preference pane for your mouse and turn it on.
Mice with two hardware buttons Just Work as well.
Re: (Score:3, Informative)
Not for nothing, but that's just not true anymore. Hasn't been for quite a few years. Ever since the Mighty Mouse (the one with the trackball) became standard, it's been a 2-button mouse.
More than that, it's a 4-button with a scroll ball.
Re: (Score:3, Informative)
I think he's commenting on the fact that many Windows programs put "essential" commands in the context menu, which is "invisible" until a user right-clicks to bring it up.
Re:Apple's fascination with single button mice (Score:4, Informative)
Oh, that menu. How many applications don't have the same options in the primary menus, though?
Not to mention that it also MAKES SENSE.
If I want the menu for Application X, I'd naturally expect it to be attached to the window itself.
If I want the menu for an object within Application X's window, I'd expect it to be attached to the object itself.
That said, they've started an alarming trend of hiding the primary menu, too, until you press the Alt key or click some icon. Whoever thought the primary menus for applications should be hidden should be shot, IMHO...
Re:Worst Mistake That Still Needs Fixing (Score:4, Interesting)
Why doesn't Microsoft just forget software and go into hardware?
Re:Worst Mistake That Still Needs Fixing (Score:4, Insightful)
I must be that "some other lifeform". I can't stand or use curved or "Ergonomic" keyboards such as the Microsoft un-Natural keyboard.
I'd rather have my wrists rest flat on the table, since I find that far more comfortable than having my hands rotated slightly, resting my wrists at an angle (which starts to hurt after a while).
Re:Worst Mistake That Still Needs Fixing (Score:4, Insightful)
Re: (Score:3, Funny)
one hand typing (while mousing)
Sure...'mousing' is what causes you to type with one hand.
Re:Worst Mistake That Still Needs Fixing (Score:4, Interesting)
Your "oversized" Enter keys are standard for non-US keyboards. I don't like the US-style ones, because I am used to a larger Enter key.
One classic web design mistake (Score:5, Funny)
Comment removed (Score:5, Informative)
Re:The 15 problems (Score:5, Interesting)
Re:The 15 problems (Score:5, Informative)
That design flaw wasn't introduced when the Mac came out, it was when they first moved from 68000 to PowerPC [wikipedia.org]. Older Macs from the XL through to the Classic II had the power button on the keyboard or tucked away somewhere out of sight on the monitor/base.
Re: (Score:3, Insightful)
Removing the eject button was a good idea
No, it was a bad idea. A "good" idea would have been one of two things, as I see it:
– Soft eject, emergency hard eject (e.g. like a CD-ROM drive). If it's off, I don't want to turn it on to get my disk.
– Hard eject with soft disable (e.g. like CD-R/RW drives which physically lock closed while burning). Ensure that it unlocks when the power goes off!
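The second option is simple enough to sketch as a toy state machine (a hypothetical model for illustration, not any real drive's firmware):

```python
# Toy model of a hard-eject drive with a software lock: the tray is
# mechanically ejectable by default, software can lock it during a burn,
# and the lock is held by power, so it releases when power is lost.
class DriveLock:
    def __init__(self):
        self.powered = True
        self.locked = False

    def start_burn(self):
        self.locked = True      # physically lock the tray while burning

    def finish_burn(self):
        self.locked = False

    def power_off(self):
        self.powered = False
        self.locked = False     # the solenoid can't hold without power

    def can_eject(self):
        return not self.locked  # mechanical eject works even unpowered

d = DriveLock()
d.start_burn()
assert not d.can_eject()        # locked mid-burn
d.power_off()
assert d.can_eject()            # fail-safe: disk retrievable after power loss
```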
Re:The 15 problems (Score:4, Informative)
Huh? Which Mac was that? All the original Macs (the 128, 512, and Plus, along with the SE series and Classic) had the power switch on the back. The NuBus Macs (Mac II onwards up until the middle of the PPC era) powered up via their ADB keyboards.
There was an optional reset switch you could attach to the lower side of the computer; I guess that could look like a power button - but that was originally a user-installed option (most didn't install it, as I recall), and the later Macs had a slightly recessed reset button.
Re: (Score:3, Informative)
Worse was the Apple II Reset button - originally it was (if I remember right) more or less right above the Return key. Back in the days when saving was a matter of screwing around with cassette tapes and luck, it was incredibly frustrating to accidentally brush the Reset button. Fortunately it was possible to re-wire it so it required you to press CTRL + Reset to reset, and then we also got a floppy drive so that it was MUCH less obnoxious to save stuff.
I wonder how many hours of lost work that reset button
Re:The 15 problems (Score:5, Insightful)
Removing the eject button was a good idea; it prevented you from ejecting a disk without unmounting it and ending up with corrupted data.
Removing the eject button was an idiotic idea, and it illustrates one of the great failures of personal computer design philosophy - the idea that the system builder/designer knows better than the user how the user should use the system. If I want to eject a disk in the middle of an operation then I should be able to - maybe the possibility of corruption is preferable to the alternative of letting an operation continue. Maybe an electrical fire just started in the system power supply, and I want to get my floppy out NOW. Maybe a million things that the designer didn't think of. The assumption that the user is an idiot and doesn't know what they are doing, and that their control over the system must be severely limited for their own protection, is the single worst PC design mistake.
Re:The 15 problems (Score:5, Interesting)
Re: (Score:3, Insightful)
This looks to me like 10 actual problems, with multiple examples of crappy keyboards and bulky shit stuck to your computer.
Re:The 15 problems (Score:5, Insightful)
And to be honest, were those bulky expansions really design mistakes or do they just seem that way now that we have the benefit of a couple of decades of experience and design put into the problems they were meant to address?
I'd have a hard time seeing USB coming out back in the era being described, and not just because every company was doing its best to lock people into its own platform.
Re: (Score:3, Informative)
Where's #9?
Oh, instead of releasing their own GUI based PC, Xerox PARC [wikipedia.org] had Apple do it.
Falcon
And a summary (Score:4, Informative)
A: No PSU fan (leading to thermal warping of internal components)
B: Limited Apple II compatibility
C: No way to format disks
D: EM pulse erases tapes (unreliable media)
E: Printer required
F: Lousy keyboard (#6 and #8)
G: Non-detachable AC adapter
H: Ridiculous external expansion options (#10, #13, and technically #14)
I: No user expandability
J: Slow BASIC
K: Unreliable disk drives
Re: (Score:3, Insightful)
It was a design mistake because the system's own power supply generated the EMP when the switch was flipped. More testing could have caught the issue, but it was a critical flaw in the component choices and board layout of the system.
The iPod battery is lightweight and generally easy to forget about. The power bricks were heavier and bulk
Big ISA bus flaw (Score:5, Interesting)
IOCHRDY signal is active high instead of active low. Causes no end of problems.
Re:Big ISA bus flaw (Score:4, Interesting)
Depending on the bus design, active-low signals can be asserted by any device by turning on a transistor to ground. They are allowed to "float" high via pull-up resistors, so you get a poor-man's OR gate.
And depending on the circuitry you're using, "drive down, float up" may be much, much, much simpler than "drive up, float down". In the pre-CMOS days, for example, only N-channel FETs were available. So a transistor to drive a bus line low is cheap, but it's not pleasant driving something high. (Fortunately, I've forgotten most of the details.)
It's one of those places where reality and theory diverge.
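The wired-OR trick the parent describes can be modeled in a couple of lines (a logic-level toy, obviously, not an electrical simulation):

```python
# Model of an active-low, open-collector bus line: any device can pull
# the shared line to ground; a single pull-up resistor floats it high
# only when no device is driving it. The result is a free OR gate.
def bus_level(devices_pulling_low):
    """Logic level of the shared line: 0 if ANY device asserts, else 1."""
    return 0 if any(devices_pulling_low) else 1

assert bus_level([False, False, False]) == 1  # idle: pull-up wins
assert bus_level([False, True, False]) == 0   # one device asserts
assert bus_level([True, True, False]) == 0    # several at once: still fine
```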
Re:Big ISA bus flaw (Score:4, Interesting)
It's been decades since I've worked on ISA bus stuff, but IIRC, IOCHRDY is essentially active-low. Any card can pull it down to add wait states to the current cycle, then let it float back up when it's ready.
The main problem with the ISA bus is that it was never engineered in the first place. The people in the skunk-works PC project at IBM slapped it together by tacking a few TTL kludges onto off-the-shelf Intel I/O parts, probably without doing any formal timing analysis. That probably worked OK at the original 4.77 MHz, but within a few years the bus had been overclocked throughout the industry to 8MHz. (I think that Dell, then known as PC's Limited, tried pushing the ISA bus to 12MHz, but that bad idea was quickly dropped.)
One project task I had in the 1980s was to sit down and do a complete timing analysis of the IBM PC/AT bus (which added yet more kludges to the original PC bus to go from 8 to 16 bits) based on the circuit diagrams in their technical reference. Some of the timings just can't work under the worst-case specifications. The computers usually worked mainly because the odds of getting actual worst-case behavior out of several chips at once are rather low. However, there was no shortage of incompatibilities and crashes with a lot of third-party ISA adapters.
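The flavor of that analysis is easy to sketch: sum the datasheet worst-case propagation delays along a signal path and compare the total against the bus cycle budget. (The delay figures below are invented for illustration; they are not real ISA component numbers.)

```python
# Toy worst-case timing check: total path delay vs. bus cycle budget.
def cycle_budget_ns(clock_mhz):
    return 1000.0 / clock_mhz

# Assumed max propagation delays along one signal path, in nanoseconds
# (buffers, address decode, assorted TTL kludges) -- made-up numbers.
path_delays_ns = [45, 40, 30, 25]
worst_case = sum(path_delays_ns)            # 140 ns total

assert worst_case < cycle_budget_ns(4.77)   # ~210 ns budget: fits at 4.77 MHz
assert worst_case > cycle_budget_ns(8.0)    # 125 ns budget: violated at 8 MHz
```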
Re: The 15 problems (Score:5, Insightful)
Problem #16: Blindingly intense blue LED on my new Dell that blinks when the computer is asleep.
All night long the computer constantly warns me: "I'm asleep. I'm asleep. I'm asleep." It's like Homer Simpson's "everything is OK" alarm.
Low-tech solution (Score:5, Insightful)
1 square inch of Scotch brand #33 electrical tape.
Re: (Score:3, Insightful)
More like blindingly intense blue LEDs period on all current hardware. Give me back my soft red LEDs...
Re: The 15 problems (Score:5, Funny)
I have two DVD players that have a helpful little red LED that lets me know the device is off.
Seriously. When I turn the player on, the LED goes off.
CapsLock (Score:4, Insightful)
Sun got it right on their keyboard design, but everyone else kept the CapsLock key. I've been using computers for 21 years, and I use Ctrl constantly. I do not recall ever having used the CapsLock key (except out of curiosity, to see whether it actually does anything).
(Well, that's a bit of a lie. Of course I use it, after reassigning it to Ctrl. But the point is, having to take that step is a waste of time.)
CapsLock was useful once upon a time, when there was no \section{} or \textbf{}, and when pressing Shift actually required strength. But those days are gone.
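For the record, the reassignment is only a few lines of X11 configuration (assuming xmodmap is available; other systems have their own knobs):

```
! ~/.Xmodmap -- turn CapsLock into another Ctrl key
remove lock = Caps_Lock
keysym Caps_Lock = Control_L
add control = Control_L
```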
Re:CapsLock (Score:4, Insightful)
There are still limited instances when CapsLock is useful. I work in a hospital, and our MediTech program requires all caps. (Don't ask me why.) As you mentioned, you can get keyboard-remapping programs to turn CapsLock into another key. Still, I can see your point, and it would be nicer if the CapsLock functionality were incorporated without needing a whole key - say, by pressing the Shift key two or three times in rapid succession.
And while we're on the subject, does anyone use Num Lock or Pause anymore?
Re:CapsLock (Score:4, Informative)
And while we're on the subject, does anyone use Num Lock or Pause anymore?
Not Numlock, but 'pause' is used all the time in Windows. Off the top of my head: it pauses most games, pauses the command line, and Winkey+Pause opens the System Properties dialogue.
worst: sharp unfinished inside edges in cheap case (Score:5, Interesting)
My personal list...
- 15 to 10 years ago, you had to be careful when installing drives or RAM. You could almost slice your hand open on a cheap case that had unfinished, sharp edges.
- Beige only. You can pick any color, as long as it is beige. Why did it take so bloody long to offer any color other than beige? Critical mass?
- LOUD systems. Have to thank George for showing me just how nice a quiet system is.
- Power-hungry systems. Two Molex connectors for a GPU?!
- Crap 3D video cards in laptops, and almost no benchmarks from the "classic" hardware review sites, so you know how badly they suck compared to a "real" GPU. (Thankfully the S3 ViRGE is gone from desktops, but laptops are still stuck with poor performance unless you pay an arm and a leg.)
--
"World of Warcraft (TM) is the McDonalds (TM) of MMOs."
-- Michaelangel007
The worst-designed case component... (Score:5, Funny)
Re:The worst-designed case component... (Score:5, Funny)
#1 failure... (Score:5, Insightful)
the choice by IBM to use the 8086 CPU. It set the computer industry back several years. The PC would now be at least two generations ahead if IBM had not used the brain-dead 8086 design.
Obviously, IBM did not believe in personal computers and thought they were gimmicks.
Re:#1 failure... (Score:4, Interesting)
Why? What other processor(s) should have been used, and what would have been the benefits? No, not trolling. Just interested in what you said and would like more information.
Re: (Score:3, Insightful)
While something like the 68000 could have been used, I don't think it was necessary. Ultimately, the problem with the PC was the system software. It was thrown together without any real thought or consideration for the future. It was the essence of how things were NOT done at IBM at the time.
The problem wasn't so much that the 8086 sucked but that the OS was tied to it so tightly.
That clone with the problem serial port would have been in a better position if something resembling a real OS was created for the PC to
Re:#1 failure... (Score:5, Insightful)
Re: (Score:3, Interesting)
And much more expensive to purchase and assemble than the Intel chipset. The Slashdot Uber Tech Society often forgets that computers are designed and priced for the end user and the mass market, not the programmer and the Uber techie.
Re:#1 failure... (Score:5, Funny)
Slashdot Uber Tech Society
You mean SLUTS? Sorry - couldn't help myself...
Re: (Score:3, Interesting)
The 680x0 was, IIRC, around at the time, and had a much more elegant, though still CISC, instruction set. Plus, it was 32-bit internally, though the 68000/68010 only had 24 external address lines.
I seem to recall that writing (GUI) apps in assembler for the (68000-based) Amiga was, although time consuming, perfectly possible. I'd have hated to do it on the register-starved 80x86.
Re: (Score:3, Insightful)
Re:#1 failure... (Score:5, Informative)
Why? What other processor(s) should have been used, and what would have been the benefits? No, not trolling. Just interested in what you said and would like more information.
The fundamental problem with Intel's instruction set architecture for the 8088/8086 line was that it was complex and intricate. To perform some instructions, the arguments had to be in very specific registers. Every register was, in some way or another, special purpose. The contemporary Motorola architecture, based on the 6800 and extended into the 68000 line, was completely the opposite: every register was, more-or-less, general purpose.
Writing a compiler for the Intel architecture is an exercise in masochism. Writing one for the Motorola architecture is one of simplicity and elegance. The Motorola instruction set documentation of the era was simple, clean, and definitive: it molded the way instruction sets were documented for generations afterward. The Intel documentation was difficult to understand at best.
One of the stark differences between the two instruction sets was instruction-length variability. Intel instructions could be almost arbitrarily long. Motorola instructions were built from 16-bit words, with the most frequently used operations fitting in a single word (inspired brilliance, that was). Also, for very related reasons, the number of cycles to execute an instruction was highly variable on Intel architectures, and more-or-less fixed on Motorola architectures.
I wrote assembly code for both architectures, back in the day. I hated, hated, hated writing for Intel chips, and breathed a sigh of relief whenever writing for Motorola chips. The inherent beauty in the Motorola instruction set created a certain kind of transparency making it possible --- seriously --- to see programmer intent when reading assembly code. With Intel chips, that was just not possible. With Motorola chips, you could reverse engineer code pretty easily; with Intel chips, it was painful.
The world would be a better place if IBM had selected Motorola.
Re: (Score:3, Interesting)
I think you're talking about the 8088. The 8086 was a true 16-bit chip; the 8088 had an 8-bit bus. The chief reason was, as I understand it, that 16-bit hardware was extremely expensive at the time, so IBM went with the 8088 to keep the price of the unit lower, and to make it less expensive for expansion hardware to be built.
And that's the real secret here of the success of the PCs and PC clones. They were never as good as a number of competitors; Apple had the better GUI, Amigas had the better graphics, the var
What , you mean like the Mac , Amiga, Atari ST? (Score:3, Insightful)
The Mac started out on the 68K. OK, it was more advanced than the PC to start, but I think it's fair to say that the only thing (arguably) slightly more advanced about Macs these days (and certainly not two generations ahead) is the OS. The hardware is commodity PC.
As for Commodore and Atari, well, we know how well using the 68K panned out for them. It just proves that ultimately marketing wins and technological ingenuity comes a poor second.
Re: (Score:3, Interesting)
I honestly don't think the PC's success had much to do with marketing at all. I remember that Commodore, Apple, and even Radio Shack poured a lot of money into marketing, and even in the mid-1980s, PCs were still very much considered an unsexy "business" machine with inferior graphics. And yet IBM won because it was simply easier to expand than Amigas or Macs. Sure, the basic graphics sucked, but you could always go out and buy a CGA or Hercules video card, and later on EGA and VGA and so forth, and it
BS! Re:I don't agree (Score:5, Interesting)
I was programming in x86 assembler (by necessity, not choice) at the time, and the x86 instruction set sucks big time. The 68000 was far easier. No programmer worth his salt would choose x86.
The x86 still used 32 bits to specify an address, but the two 16-bit pieces (segment and offset) overlapped to form only a 20-bit address, so there were many ways to form the same address. It was INSANE!
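The overlap is easy to demonstrate: real mode forms a 20-bit linear address as (segment << 4) + offset, so distinct segment:offset pairs alias the same byte. A quick sketch:

```python
# Real-mode x86 addressing: 20-bit linear address = (segment << 4) + offset.
def linear(segment, offset):
    """Linear address for a real-mode segment:offset pair (wraps at 1 MB)."""
    return ((segment << 4) + offset) & 0xFFFFF

# Two different segment:offset pairs, one and the same physical byte:
assert linear(0x1234, 0x0005) == 0x12345
assert linear(0x1000, 0x2345) == 0x12345
# Most 20-bit addresses have 4096 distinct segment:offset spellings.
```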
IBM missed the boat, created a major competitor in the process, and shot themselves in the foot many times as a result. About all that saved IBM's PC bacon back then was that they had a lot of feet to shoot at.
IMHO, the article is great. It shows how the rush to market can put a company out of business real quick.
BTW, I looked at the Lisa. I didn't buy it. I looked at a lot of the other computers in the list. I didn't buy them. Apple has not EVER sold me a computer. Funny. IBM has not EVER sold me a computer.
I have been running clones since 1986.
I'll predict that Microsoft's days are numbered as well, though the number might be large given their cash reserves. Still, I keep hearing people tell me they are sick and tired of shoddy Windows code and the malware problem. I think a lot of it stems from the x86 days and Windows 3.11.
The way I see it... the general population in many ways is like a school of fish. They tend to clump together for safety reasons. However, few have much in the way of any enduring investment and just like a school of fish they can all change direction rather quickly. If/when this happens then we may see the fortunes of a company like Microsoft turn sour about as fast as we saw the fortunes of GM and Chrysler turn sour.
If this happens then people will not go back. These paths tend to be traveled but once.
Sony VAIO desktop problem... (Score:5, Interesting)
The Amiga (Score:4, Interesting)
It amazes me how advanced this system* was for its time and that it didn't catch on better than it did. The graphics and sound (just for starters) were many years ahead of their time; x86 was still at EGA and speaker beeps back then.
[*] - http://en.wikipedia.org/wiki/Amiga#Graphics [wikipedia.org]
Re: (Score:3, Insightful)
As was the Atari ST. Not trying to draw comparisons between the two systems; each had strengths and weaknesses. The point is there were a few very advanced and powerful systems around back in the day, and they likely only died out because EGA and speaker beeps were in offices everywhere.
Re: (Score:3, Insightful)
Apple Lisa (Score:5, Interesting)
Does anybody know what the "unique document management metaphor that has yet to be replicated in a mainstream OS" is, and why it might have set a new standard in computing? It sounds terribly intriguing. Might this be something that could/should be added to Linux?
Re:Apple Lisa (Score:4, Informative)
http://www.youtube.com/watch?v=a4BlmsN4q2I [youtube.com]
Start at minute 6:45.
Seems that you would pick a stack of paper - word processing, spreadsheet, graphing, etc. - and it would "tear off" a new page for a new document that you could put elsewhere.
It may be worth creating a newbie shell that hides many options with an option to go into "advanced" mode. The real endgame will be context sensitive interfaces that allow the computer to guess what you want to do, with an override for people who prefer to keep menus in the same place.
I think a good design is to have all features across the top via pulldowns, and contextual options at the bottom that you can just turn off if you like.
Re:Apple Lisa (Score:5, Informative)
As mentioned by others, document-centric computing:
http://en.wikipedia.org/wiki/Apple_Lisa#Historical_importance [wikipedia.org]
http://en.wikipedia.org/wiki/Xerox_Star#User_interface [wikipedia.org]
People keep having stabs at it, and to give MS their due they did try pretty hard with Win95 and OLE/COM, and got rid of MDI [wikipedia.org] in later versions of Office... but somehow it never seems to have been perfected on mass-market machines. The tab view that we have in browsers now seems to be actively moving away from it (this is your application, with your documents as child objects to it -- though at least Chrome has the decency to put the tabs at the top of the window.)
It'll probably get leapfrogged as an idea by all this Web 2.0 stuff and in-browser apps (which again is a regression: you still have to think about which SaaS-providing site you have to go to to get a particular job done.)
Déjà vu all over again in smartphones (Score:4, Insightful)
So, the one-button mouse didn't make the list? (Score:4, Interesting)
That was one of the most serious design mistakes of the last thirty years, but it's only really interesting because it's symptomatic of Apple's design philosophy, which is: "Do as I wilt".
The one-button mouse spanned multiple generations of Apple computers and underscored Apple's stubborn unwillingness to produce computers that do what their users want, and not what Jobs or Apple's HID team think they should do.
Really. Apple refuses to correct UI annoyances that should not exist. Why doesn't OS X have a maximize window button? Why does clicking "one hour before event" for an iCal event set the alarm to one hour before the time you click the button, and not one hour before the event? Why doesn't Finder support AFP connections over SSH?
None of those things seem to be complex, every one of them is a failure of the UI, and yet none of them have been corrected.
You never had to explain how to use a mouse (Score:4, Insightful)
When the Mac came out, every software user's manual had to explain how to use a mouse. I witnessed early Mac users who couldn't grasp the idea that the pointer on screen was controlled by their hand on the mouse. People would watch their hand moving instead of watching the pointer on screen. A single button was the right choice in 1984. Nothing stops you from connecting a multi-button mouse to your Mac, and all of the buttons and scroll wheel work swimmingly.
People still don't understand double-click vs. single click. My father is brilliant, but he double clicks everything out of habit.
And what is "maximize" good for? Isn't it ironic for someone who derides a one-button mouse to want a one-window GUI?
biggest mistake: PC = 8088 not M68000!!! (Score:4, Insightful)
I believe the biggest mistake was IBM using an Intel 8088 instead of a Motorola 68000.
Imagine for a moment what would have happened if IBM had chosen a 32-bit processor for the first successful personal computer in the early 1980s!
Re: (Score:3, Informative)
The cost for 16-bit hardware was considerably higher. This was not a mistake, but a very practical decision to allow the IBM PC to use existing hardware with little modification. The other reason for not using the 68000 was that part of the point of using a member of the 808x family was so that CP/M could be run on the PC (that's not the direction they went in the end, but still, CP/M was the king of business systems at the time).
IBM was very specifically making a business decision. There wasn't a lot of s
Re:biggest mistake: PC = 8088 not M68000!!! (Score:4, Informative)
Hindsight is usually 20/20. But in this case maybe not. Here's what an M68000 would have brought to the table:
1) Higher costs - the reason IBM went with the 8088 is because it was less expensive.
2) No 640K memory limit -- okay, but then we'd have hit a 16MB memory limit a few years later, since the 68k only brought out 24 address lines. BTW, the 640K limit was particular to IBM's implementation, not necessarily a limitation of the 8088. Because everyone else copied that implementation, we have the 640K limit.
3) 8-bit and 16-bit mainstream computing -- why do I say that? Because memory cost a lot of money back then. Even though the 68k could run 32-bit code, the first computers would have come with minuscule amounts of memory, so 32-bit code would not have been a good idea then.
4) continuation of CISC architecture -- I personally don't think this is much of an issue, but some people do, contending that the current CISC-to-RISC translation still takes up significant silicon real estate.
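On point 2 above: the 640K ceiling fell out of where IBM mapped video memory and ROM within the 8088's 1MB real-mode address space, not out of the CPU itself. A rough sketch of that arithmetic, using the well-known IBM PC memory-map boundary at 0xA0000:

```python
MB = 1 << 20                 # 8088: 20 address lines -> 1 MB of addressable memory
CONVENTIONAL_TOP = 0xA0000   # IBM mapped video memory starting here

print(CONVENTIONAL_TOP // 1024)         # 640 -> the famous 640K of "conventional" RAM
print((MB - CONVENTIONAL_TOP) // 1024)  # 384 -> the 384K reserved for video, ROM, adapters
```

A different reservation scheme on the same 8088 could have left more (or less) conventional memory, which is the poster's point that the limit was IBM's implementation choice.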
Re:biggest mistake: PC = 8088 not M68000!!! (Score:5, Interesting)
There were other reasons for IBM to go with the 8086-family chipsets:
1) the 8086/8088's bus could easily drive the 8080-family support chips such as the 8251, 8255, 8259 etc. to build a complete system. The MC68k family support chips were even later than the release of the CPU itself (in some cases like the MMU several years late) and the MC68k bus could not be easily interfaced with the Intel family chips which were cheap and in plentiful supply.
2) the 8086 family's internal data registers and addressing modes were designed to simplify conversion of existing 8080 code to run on the new 16-bit CPUs. The 68k, although a superior CPU in all respects to the 8086 family, had no tools available to make code conversion from the 6800 or other sibling CPU family (6809, 6502 etc.) simple -- all 68k code had to be written from scratch.
3) the 68k was an expensive chip, not surprising as it was complex and required a large die, necessitating a 0.6-inch-wide 64-pin ceramic DIP package. Motorola's target market for the chip was $10,000 workstations, not "toy" desktop computers only costing $2,000. By comparison the 8088 was cheap as chips.
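To illustrate point 2 above: Intel published an 8080-to-8086 register correspondence (used by translation tools such as CONV86) so that existing 8080 assembly could be mapped mechanically onto the new chip. A sketch of that mapping, to the best of my recollection of the published correspondence:

```python
# Intel's 8080 -> 8086 register correspondence: 8-bit registers map to 8086
# half-registers, and the 8080's register pairs map to full 16-bit registers.
REG_8080_TO_8086 = {
    "A": "AL", "B": "CH", "C": "CL", "D": "DH", "E": "DL",
    "H": "BH", "L": "BL",
    "BC": "CX", "DE": "DX", "HL": "BX",
    "SP": "SP", "PC": "IP",
}

# e.g. 8080 "MOV A,M" (load A via HL) becomes 8086 "MOV AL,[BX]"
print(REG_8080_TO_8086["A"], REG_8080_TO_8086["HL"])
```

The 68k had no such bridge from the 6800 family, so, as the parent says, everything had to be rewritten from scratch.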
PCjr (Score:5, Interesting)
The biggest single problem with the PCjr was that it was late. In 1984 it was supposed to be on the shelf in the fall - October is the usual month when things are supposed to be shipped so they are stocked and on the shelf in November.
Didn't happen. Macy's had received $50,000 to hold shelf space for the PCjr and they left them empty.
The PCjr came out in February. A little late for Christmas. Everyone had created products specifically for the PCjr for Christmas '84, but there wasn't anything to run them on. The January 1985 CES was pretty dead: lots of PCjr games that nobody cared about. Parker Brothers closed down their electronic games division, as did lots of other companies right about then. It was a year or so later that the Nintendo finally started making inroads into the home game market, but between the PCjr and the Nintendo things were very, very dead.
You can say all you want about a poor design of the keyboard and limitations of the hardware. But it is even more difficult to use when it doesn't exist and cannot be purchased. Not having it in time killed it, not any stupid design decisions.
TI Sidecars (Score:3, Interesting)
Not 'classic', but still... (Score:4, Interesting)
It isn't really a 'classic' mistake, but the biggest PC design problem today from where I'm standing is over-reliance on fans. High volume fans will result in fuzzy lint growing on the devices which can least afford a layer of fuzzy lint.
In the past year, I've revived dozens of computers, and nearly every failure can be directly attributed to lint induced by fans.
A few of my favourite things - from the workshop (Score:5, Interesting)
I could go on...!
This is a good opportunity for a new myth (Score:5, Funny)
Well sonny, I remember it was back in the '80s. There were these guys who loved their Apple IIIs so much that, despite its faults, they kept them running for years beyond their useful lifetimes. They did this by filling their offices with industrial-strength fans pointed at those Apple IIIs. Ever since then, we've called people who continue to support obviously flawed products "fanboys"
No standard connectors in 1983 (Score:4, Interesting)
The PCjr's serial port, monitor port, joystick ports, keyboard port, and others used different connectors from the IBM PC. In fact they were not only non-standard connectors, but completely proprietary connectors that couldn't be found on any other computer.
People, this is 1983. All connectors were "non-standard". Nowadays we're used to a standard connector and pinout for RS-232 [lammertbies.nl] and parallel [diyha.co.uk] ports on the back of PCs. But in 1983, exactly one model of computer used them: the IBM PC. It didn't take more than a couple of years for people to realize that the only way to compete with the IBM PC was to be extremely compatible with it. But when the PCjr came out, everybody (especially IBM) used business and sales models that paid no attention to the idea that computers and their components could be commodified.
Small qualification: the use of 25-pin D-shaped connectors with specific pinouts was part of the RS-232 standard. But 25-conductor, straight-through cables cost money, and you actually didn't need most of those signals for typical applications. So making cables that would connect some random computer to some random modem or serial printer was a serious black art. There was even a book [amazon.com] on the subject.
(Jerry Pournelle once wrote that he used internal modems because he could never remember the pinouts he needed to make cables. But by the time he wrote this, RS-232 pinouts had been standardized and cheap pre-made modem cables were in all the stores. Pournelle is the original know-it-all ignoramus computer pundit.)
Parallel printer cables were even worse. They all used the Centronics [pinouts.ru] de facto standard on the printer side. But to save money, everybody used 25-pin D connectors on the computer side, and the way the 36 Centronics signals mapped to those 25 computer pins was different for every manufacturer. It took IBM to standardize the pinout, and also to standardize making the printer connector female so you didn't accidentally plug a modem into it.
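The DB-25 mapping IBM standardized is the one everybody still documents today. As a rough sketch (pin assignments per the commonly documented IBM PC parallel-port pinout; treat the exact signal names as illustrative):

```python
# IBM PC parallel port: DB-25 pin -> signal ("#" marks active-low lines).
DB25_PARALLEL = {
    1: "STROBE#", 10: "ACK#", 11: "BUSY", 12: "PAPER_OUT",
    13: "SELECT", 14: "AUTOFEED#", 15: "ERROR#", 16: "INIT#",
    17: "SELECT_IN#",
    **{pin: f"D{pin - 2}" for pin in range(2, 10)},   # pins 2-9 carry data bits 0-7
    **{pin: "GND" for pin in range(18, 26)},          # pins 18-25 are ground returns
}

print(DB25_PARALLEL[2], DB25_PARALLEL[9])  # D0 D7
```

Collapsing 36 Centronics signals onto 25 pins meant dropping most of the per-signal ground returns, which is why pre-IBM vendors each picked a different subset and cables were incompatible.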
PCjr (Score:3, Insightful)
I don't understand why the PCjr is bashed so much. We had one and I thought it was pretty damn good. Granted I was quite young, but we put that machine to good use for quite a few years. We did get the chiclet keyboard, but by that point IBM was already including a similar keyboard with conventional keys, so it was a moot point. I actually thought the keyboard was pretty cool. It wasn't the best for typing, but I think that was more a consequence of the technology available at the time and the size of the buttons than anything else. I'd like to think that current Apple keyboards are a spiritual successor and show that the concept wasn't necessarily a bad one. As for the IR, certainly you had to be careful with anything getting between the keyboard and the machine, but generally it was excellent and we never ran into problems. I much preferred that to having to deal with a cable.
As for the sidecars, it's not like people at the time were upgrading machines anywhere near as frequently as they do now. And there were tons of clumsy upgrade solutions for many computers at the time. When a 128K memory card was as large as, if not larger than, most video cards today, there weren't many options for efficient packaging. Actually, the upgrade we got was from a company called Legacy, and it pretty much was a whole other case the size of the PCjr, which added 512K of RAM and a second floppy drive. It doubled the size of the machine, but that's just how things were back then; it never bothered us.
The PCjr was a better machine than pretty much anything else I encountered through much of elementary school. It was far superior to the crappy Apple IIs we had in school. It offered better resolution and 16 colors. What did suck, however, was that it was somewhat less powerful than the IBM PCs available then and later on. While it supported CGA, its 16-color format was proprietary and not compatible at all with EGA. But regardless, for $1000 it was a great deal and generally compatible with most IBM PC applications.
I haven't gone through all the "mistakes", but it seems like this article is written from a modern-day perspective which is inappropriate given the era when these machines were designed and manufactured.
Real mistakes (Score:5, Informative)
Those are mistakes an end user would see. Here are some deeper mistakes from an engineering standpoint.
Re:Keyboard layout... (Score:4, Insightful)
Nothing irks me more than having to go hunting for oft-used keys such as end, delete, etc. on every different laptop. I've seen them below shift, above enter, buried as an Fn-key... *continues on for another few minutes*.
Re:Keyboard layout... (Score:5, Interesting)
Re: (Score:3, Interesting)
Clearly you're not thinking differently enough!
Re: (Score:3, Insightful)
I get no end of issues with " not being above 2, # being a \, and other non-UK keyboard layouts screwing up user experiences.
Re:General trend (Score:5, Insightful)
And even though it's not classic, I think the "underpowered" Vista machines deserve at least a mention.
Can we stop with the knee-jerk Microsoft bashing? The article is literally titled "Fifteen _Classic_ PC Design Mistakes." There's nothing in the article that would make a Vista reference even relevant. Posting as AC to avoid karma whoring like the parent.
Not really (Score:5, Insightful)
Well, way I see it, not really. At _least_ half the mistakes there are about cutting corners (e.g., the crappy cheap keyboards, an ultra-expensive computer shoved out the door with an unreliable floppy drive, etc), and most of the rest are about blatantly trying to nickel-and-dime the users (e.g., the lack of a format command so they have to buy their floppies from you only, or all the connectors on the PC Jr being incompatible with the standard PC ones, etc.)
Unfortunately both types of failures are standard staples of capitalism, so don't expect them to go away any time soon. Even though those particular 15 manifestations of them might not happen again, we're just seeing new and innovative ways to do the same two things. E.g., when EA cuts costs on testing their new game, _and_ launches a new game with over half the content sold separately (check out The Sims 3: from day 1 there was more virtual furniture for sale for real money on their site than included with the game)... I'm sure you can see the same two things at work.
E.g., for hardware: when, as you correctly mention, a system that's way underpowered for Vista is sold as Vista-ready, you have the first failure mode in action: they wanted to sell a system as Vista-ready without including the expensive hardware needed to actually be ready. It's just cutting corners.
E.g., nickel-and-diming... well, let's just say HP's whole printer ink business is based on that. It recently even reached such absurdity as including chips to make the ink or toner cartridge artificially "expire" after a while, even if there's actually plenty of ink left inside. For some users that was already the straw that broke the camel's back, but I expect some bright MBA to try something even more ham-fisted soon.
Re: (Score:3, Funny)
Sounds more like the last generation G5 iMac.
Re: (Score:3, Funny)
Steve Jobs is that you???
Honestly, your post sounds a bit short-sighted. Why do we need uniquely identified machines? Why no user-upgradeable parts? Why do you want everyone to have the iMac style?