Fire In the Valley: The Making of the Personal Computer

Fire In the Valley is not about the computer or software industry in toto -- this book is about the evolution of the PC, including well before it was called the PC. Forget being able to pick up the phone and order a speedy machine with plenty of RAM (for overnight delivery, no less!) from any of several vendors, for less than a month of the minimum wage: this was a time when the very idea of a machine for casual or home use, or even of a full-featured computer on one's work desktop, was radical, even laughable. A personal computer? There was no such animal, until the people in this book invented it. (Read more below.)

Fire In the Valley
author Freiberger, Paul and Swaine, Michael
pages 448
publisher McGraw-Hill
rating 8.0
reviewer timothy
ISBN 0071358951
summary A look into the personal computer revolution; lots of history straight from the mouths of the revolutionaries.


In that light, it's not as wide-ranging a book as Steven Levy's Hackers, but Fire does a good job of getting into the details of the sometimes bizarre culture of programmers and hardware gurus who decided that -- however impractical it seemed at the time -- eventually people were going to want computers to communicate, to track their bills, to play games. They knew there was money in that line of thought, even if a few of them thought that the real benefit of personal computers would be in a new and lasting sense of community. And some of them wanted machines just to play with, too. In short, it illustrates the wildcat beginnings of the industry that brings you the screen in front of you.

Positive vibrations

Many of the names in the pages of this book -- of people, machines and organizations -- are ones that computer history buffs will recognize, whether from a stack of dusty documents in the basement, from old copies of BYTE, or from the covers of Time and Newsweek. Some of them, in fact, are names that you would be hard pressed not to know as a modern computer user -- Gates, Jobs, Ellison. Others are obscure outside of computer-geek circles -- people like John Draper ("Cap'n Crunch"), Lee Felsenstein, David Bunnell (founder of PC Magazine and a host of other computer publications), and Steve Dompier.

After a whirlwind historical introduction to the context in which the personal computer became possible -- including appropriate obligatory mentions of Babbage, ENIAC, and the invention of the integrated circuit -- Freiberger and Swaine land us in a chapter called "Critical Mass" in early 1969. That's when a fateful piece of business landed at then-memory manufacturer Intel: a piece of contract work for a Japanese calculator manufacturer led to the development of the 4004 microprocessor, then the 8008, which gave way in turn to ever more complex and capable microprocessors. While Intel did not remain alone in this field for long, the introduction of cheap integrated chips is what set the stage for everything else that happened in the years afterward.

But what did happen afterward? The authors concentrate on the first successful home and hobbyist computers, like the MITS Altair, the Osborne and the Apple I, as well as the generation of corporate-fueled projects which emerged once it was clear that people would indeed pay to have a computer on their desk or on the fiddle-with workbench in the basement. They also get into quite a few of the ships that sank as fast as they were launched; some of these spectacular failures make interesting reading even if you're not thinking about starting a business, and sobering reading if you are. (Radio Shack, TI, HP, and others probably all regretted not listening sooner to the scruffy hobbyists who swore that people would buy them.)

Unlike certain other techno-hagiographies, most of the action in Fire takes place west of the Mississippi (the Valley in the title is the one you'd think -- Silicon Valley), with only passing references to much of the East Coast action going on at the time at universities and at corporations like DEC. That bias is not accidental, and really doesn't constitute a sin of omission. The boards of big corporations like HP, DEC, and IBM saw little profit in putting many of their precious dollars or engineers into low-margin personal machines: they liked to make big iron (or at least medium iron) that they could sell in hefty chunks to tech-savvy companies to help do large corporate tasks. Selling to individuals, in a world where the necessity of a personal or household computer was not yet established, would have been an odd move. All of those companies later saw the error of their ways and did enter the field of PCs, of course -- ironically (and past the time frame this book primarily addresses), the once-giant DEC was eventually subsumed by personal-computer maker Compaq.

But out there in the heady Wild West, when computers were still mostly room-sized beasts which needed constant tending by a full-time staff, grew the makings of The Homebrew Computer Club, which drew people whose names would eventually adorn magazine covers and company letterheads. Right from its start in 1975, the Club started attracting cantankerous circuit designers, switch-flippers and other hackers who wanted to have their very own computer room.

With surprising ease, the story moves between Silicon Valley, Washington State (home of Bill Gates and, eventually, Microsoft -- a whole story in itself, well covered here) and Ed Roberts' operation MITS, creator of the Altair hobbyist computer.

It's a rush to read, and for most of the ride you won't want to stop.

A few sour notes

For most of the ride, I said. The storytelling is crisp and enjoyable for the most part, but in a book this size there are bound to be a few dips. I could have done with a lot less information, for instance, about the role of est at once-promising computer maker IMSAI. While it's certainly an interesting influence (hey, this was California in the 70s, what do you expect? Puritanism?), it didn't turn me on for as many pages as it seems to have the authors. Perhaps in an effort to make each one readable as an independent section, certain of the chapters read almost as if written for separate publication; how many times do we need to be reminded how Bill Gates and Paul Allen started working together, and that Gates went to Harvard?

Perhaps because I grew up mostly in the 80s and have spent most of the 90s with at least one personal computer at all times, the story of Jobs' split from Apple, the development of Windows, and many of the other later developments covered in the last third of the book were all much less enjoyable to me than the "ancient history" portion. While it's interesting to see these things put in the context of the personal computer revolution as a whole, the book is much less informative here simply because the history is so close.

Likewise, I could have skipped the brief treatment of the World Wide Web, and in particular the conflicts (still ongoing) outlined in the chapter called "The Browser Wars," and enjoyed the book just as much. It's in this chapter that most of the book's discussion of Open Source and Free Software appears, and it may be a decent introduction to the topic for someone unfamiliar with it, but I suspect most Slashdot readers will find themselves skimming this portion for information they didn't already know.

What it's not

If you're looking for hard-core information on early circuit designs, or code snippets from the programs which launched the rise of the PC, this is probably the wrong book -- though there is a satisfyingly large reproduction of the circuit board of the Apple I. The infectious spirit of invention and a strange sort of aggressive computer-based fun comes through clearly, because Freiberger and Swaine concentrate on the personalities and business realities of the early days of the PC more than they do the technological advances which made it possible. Whether you find this more engaging or annoying is of course up to you; I found the stories and interactions of the early PC pioneers fascinating, less so the business machinations of the 80s and 90s.

Bonus Play

My favorite aspect of the book is probably the history of the Altair and of the West Coast Computer Faire; I consider these a bonus even though they're a normal part of the book.

This book is labelled a "collector's edition," though (the book was first published in 1984; all the information about the late 80s and 90s has obviously been added since then), and it does come with a few nice extras to justify that: for one thing, it's a nice, hefty book (nearly 450 pages, not counting the massive index); there's a great selection of photos in the middle of the book, too, with some shots that will make you either smile or cringe to see how young these people really were.

A timeline at the beginning of the book is great reading all by itself, and an included CD-ROM contains audio clips, pictures and more history as well. It's well suited to your coffee table or mantel for pleasant evening reading.


You can purchase this book at ThinkGeek.


Comments Filter:
  • I'm curious - has anyone written a book similar to this about computer/video game evolution? I remember reading a translation of John Dvorak's book on gaming that was published in the early 90s, but it did not cover non-PC platforms.
  • by Anonymous Coward
    I don't recall the author, but a great take on Xerox PARC can be found in 'Fumbling the Future'.

    It does a great job of contrasting the immensity of the research facility's work against the utter failure of the parent company to produce viable products from this work. (Although Xerox did reap a few billion from laser printing, more than enough to justify funding PARC.)
  • by Anonymous Coward
    Look for 'The Home Computer Wars' by Michael Tomczyk. The author was Jack Tramiel's assistant for five years, including the nascent PET period up to the 64. The book trails off around the time Tramiel took over Atari.

    A volume that did a good job of covering the native UK computer industry would be of interest to me. I used to be fascinated by imported copies of mags like What Micro? that had all of these machines that were virtually unknown in the US.
  • Accidental Empires (AE) was the name of Cringely's book and was turned into Triumph of the Nerds on PBS.

    AE looked more at the business side of SV in the early-mid 80s and 90s. Great focus on why companies prospered, why they died, and how so much of what is now in SV is a result of... well... accidents.

    Fire In The Valley (FITV) is a much older work, by almost a decade. Its focus is also older, following the development of the PC itself rather than the entire technology scene that existed in the SV area. FITV starts its timeline in the early '70s, and treats its subject in a much more technical sense than AE does.

    In short, give AE to your parents, read both yourself.
  • By PC do you mean Personal Computer or "compatible PC"? Because the Lisa was made by Apple...
    My first computer was a Sinclair ZX81, my first PC was a 386SX16 1Mb/40Mb :o)
    --
  • Dealers of Lightning: Xerox Parc and the Dawn of the Computer Age by Michael A. Hiltzik

    This covers the history of the hugely influential Xerox PARC research center from its founding through the glory days of the 70s when they invented modern desktop computing. It's an important part of computer history that was typically only recorded as "Steve Jobs steals the GUI from a visit to Xerox PARC" until this book was published.


  • I think I've had it since 1984...
  • "Fire In The Valley" was the basis of last year's rather fun geek movie "Pirates of Silicon Valley" [imdb.com].

    Well worth watching. If only for the first scene at the filming of the 1984 commercial...

    S.
  • OK - you got me on the PCjr! But I still don't think IBM used the term "PC" in any of its marketing materials for the original beast.

    On the other hand, you have Apple's "Welcome" ad when IBM entered the market. It pretty clearly was trying to decapitalize "personal computer".

    Another data point was "PC Magazine", which I think debuted with a picture of the XT on the cover. So by then the term "PC" was already pretty common for "IBM clone".
    --
  • Apple used the term "personal computer" heavily in its marketing long before IBM was selling its "Personal Computer". (IBM never called it a "PC".)

    The idea that "PC" == "IBM Personal Computer Compatible" wasn't universal until a few years later.
    --
  • It would have been funny if the review had 8086 bytes in the body.
  • and we need new innovation and new, original ideas to speed up and destabilize the computer industry.

    That's what Gates and Allchin have been saying! But the DOJ and those pesky "freedom of information" people are fighting against the American Way, trying to stem the tide of innovation, increased disk-space requirements and neato-keen animations for deleting files. Microsoft's innovation, as far as Bill Gates has been able to recall, has been getting everybody to buy at least one license for every Microsoft software package upgrade every few years.

    --
  • by Pope ( 17780 )
    No such beast as the Atari 8600.
    I think you mean 400 or 800.

    Pope

    Freedom is Slavery! Ignorance is Strength! Monopolies offer Choice!
  • Death and suffering are commonplace. Success is not. People like to read about new and exciting things.
  • The Computer Revolution was over in the mid 90's. We're more in sort of the Computer Middle Age now. Let's just hope we can avoid a 'Dark Age' by not making religious text out of historical fact.

    Time will tell. To say that the "Computer Revolution was over in the mid 90's", when we are only now barely out of the 90's is, IMHO, grossly premature. Of course, you don't define "Computer Revolution", so it's hard to tell what you're really saying here.

    So... before I go farther with this... what do you mean by "Computer Revolution", and what happened that the mid 90's marked its end?
  • I would agree that anything fitting the definition of "revolution" in the personal computer industry has passed. One of the reasons that Dell and Compaq stock are getting slammed by Wall Street is because very few people are now buying a first computer. People that don't own any kind of PC today are probably not going to own one in the foreseeable future. That very important market *growth* just isn't there anymore. Sure, the population is increasing to some degree and people are replacing old computers with new ones, but that's not where the big money is. I would say the PC "revolution" ended a lot closer to 2000 than 1995, but it was definitely in the 90s.

    -B
    • Accidental Empires by Robert X. Cringely
    • Big Blues, The unmaking of IBM by Paul Carroll

    Although both are focused on specific events and companies, they give good background information on the industry.

  • The real danger here is inspiring new, talented creators and inventors to continue to think like the people who originally developed the first PCs. These people were innovative, but the technology is aging... badly... and we need new innovation and new, original ideas to speed up and destabilize the computer industry.

    The Computer Revolution was over in the mid 90's. We're more in sort of the Computer Middle Age now. Let's just hope we can avoid a 'Dark Age' by not making religious text out of historical fact.


    I think the introduction of personal computers and the proliferation of competition was far more radical and destabilizing than anything that is currently going on!

    Think about it - we have gone from a time when the notion of individuals (let alone non-engineers) having any use for computers was absurd, to a time when they are ubiquitous and outselling TV sets!

    In the early days there was a mass of creative innovation and competition. Apple I/II vs the Commodore PET, Ratshack TRS-80, Acorn computers in the UK, etc, etc. Compare that to today, where it would seem futile to try to compete with the Wintel x86/Windows monopoly or wherever they want to lead us.

    In a way the open source movement, and the mind-blowingly fast adoption rate of Linux, is a return to the good old days - a return to the joyous celebration of computer technology without regard to viable business models, and an implicit faith that if you build stuff people find interesting, the market will come. A field of dreams. This is what it was like in the early 80's when I worked for Acorn - computers were designed as much by engineers who defined what was possible or cool as by marketing groups who decided how to one-up the competition with spec-sheet features.

    Reminding people of the time of revolution isn't making it into a religious text, but rather reminding people that computers and personal electronics can be as much entertainment as business, and perhaps that's not so bad as a business model going forwards.
  • ... old issues of BYTE ...

    Am I the only one with fond memories of Creative Computing? I never liked BYTE much... :)


  • My father actually bought a PET; it was an earlier version though, with no cassette. It was cool, it ran BASIC as an interface... kinda strange, but I didn't know the difference. I wrote little scripts on it. That's when I fell in love with programming. 15 years later and we're still going strong.
  • What I'd like to see is a book detailing the history of hobbyist computing before 1970. In other words, there had to be hobbyists building their own computers out of junked telephone relays, vacuum tubes, homemade punched cards - maybe even a few fortunate ones managed to salvage some transistors, or even a thrown-out teletype (or maybe hooked up an electric typewriter, or something).

    I tend to pick up nearly any book related to computer history. My favorite books that I own are actually historical in nature, in that they were written as "up-to-date" cutting-edge "about computers"-type books - but are from the 1950's and 60's. I managed to find one book on building the TV Typewriter (though this is from the 1970's). I also have a strange "homebrew" book (I hesitate to call it this - it is stapled along one edge, typewritten, 30-odd sheets of 8.5 x 11 inch paper - very homebrew) detailing building computers from scratch. I can't remember the publication date, but I believe it was around the early 70's.

    Anyhow - I am sure that there were hackers building computers in their garages - not anything like we would call PCs - they weren't portable by any means - but more like homebrew minis or mainframes. Like we have hackers today homebrewing supercomputers from thrown-away old Pentium boxes.

    Does anyone have any ideas, info, or anecdotes about this? I would be especially interested in the latter. Any "old-timers" with strange friends? More importantly - do any of these old machines still exist? (Side note - it is like tracking down old homebrew robots - nearly impossible to find. I know of one still in existence - AROK - and one that I have pics of, but nothing else, that I would love to find out if it still exists, called CHARLIE - about 6 feet tall, white, boxlike humanoid construction)...

    Worldcom [worldcom.com] - Generation Duh!
  • This book (I've read it) is about the revolution in Silicon Valley (thus the name) and in Washington.
  • and 'Pirates of Silicon Valley' type stories

    IIRC Swaine, who writes for Dr. Dobb's, said that Pirates was partly based on this book.

  • When I was in college I did sysadmin work on a VAX. Learned Pascal and C on a VAX. Nice machine.
  • Great books to read for the history of computers/hacking in the latter half of the 20th century:

    Hackers by Steve Levy

    The Soul of a New Machine by Tracy Kidder

    Fire in the Valley by Freiberger and Swaine

    The Hacker Crackdown by Bruce Sterling

    Any more?

    The Soul of a New Machine chronicles the design and construction of the first 32-bit minicomputer by Data General, and especially the hardware hackers who did it. Within a few years minis were completely dead, killed by the PC.

    The Hacker Crackdown is about the Secret Service and the telcos going after BBSs and phone phreaks. And, of course, Steve Jackson Games. It came out in 94, I think, and the technology described, BBSs, was completely replaced by the internet within a few years. The issues are still important, though.

  • We are approaching the end game of software development in almost all arenas except AI and games.

    I couldn't disagree more. We may find that we run out of steam with hardware, but as I look around at the software in use I notice that the interfaces are hard to grasp, that work flow is not well supported, that programs don't always interact nicely. Programs still crash far too often. They consume far more memory and processor than they should. Software projects are over budget and late far too often, and by wide margins. Programs fail when presented with unfortunate but all too common badly formed input, or an uncooperative environment.

    Then there is this business of demonstrating that programs are robust, and do what the spec says they will -- we still don't do that well at all.

    I think we still have lots to learn about software construction. I'll accept that we have reached the endgame when building a program carries the same kinds of risks as getting out of bed in the morning ;-)
  • Then how do you explain the PCjr [ibm.com] in my basement?

    Or this page [ibm.com] from IBM's history timeline?

    I don't remember who originated the term. My first experience with computing was with an NCR 9600, which was an early (1977) desktop model made for office use that used a cassette tape drive (and later an 8" floppy disk).

  • The only reason I became interested in computers was the C64!

    I, personally, got interested with the Sinclair ZX Spectrum - a stunning machine to program for, with a lot of games.

    I think the Spectrum was a better machine at the time; it had many more games, and it was possible to get assemblers, C compilers and many fun things for it.

    I always thought that the Commodore only took off in a big way in the time of the 16 bit machines - where there was the great computer holy war: Atari vs. Amiga

    (Hmm, Macintosh vs. PC, sounds familiar ;)

    I think it's fair to say that the most popular 8-bit machine in the UK was the Spectrum, partly because it was very cheap, and partly because it was well made - except for the bizarre way it had of entering BASIC keywords (e.g. press 'P' to type "PRINT"), but hey, I was young then, and didn't know any better.


    Steve
    ---
  • I used to sell Osborne Durangos, Trash-80's and Ohio Scientific. We even sold the Xerox microcomputers with the icons (forerunner of Windows). The Ohio Scientific had a 10 meg hard drive that was the size of a boot box. We never got it to work. We all swore the IBM PC was a piece of crap that would never go anywhere. Z-80 was the way to go.

    God, was I an idiot!
  • For what it's worth, "Pirates of Silicon Valley" was based on this book.
  • The program you're thinking about, I think, was called Triumph of the Nerds.

    Anyway, this book predates that program by about a decade, so if anything Triumph of the Nerds is just a rehash of this book.
  • Fucking hear, hear for the note about Commodore. More people bought C-64s for home use than any other computer, and yet the Mac- (and PC-) friendly writers won't admit it. I took Cringely to task several times and he would never reply. JLW
  • Yeah, I remember the Homebrew Computer club. Lots of the engineers at the mainframe computer company I worked for in Silicon Valley went. I went a couple of times. Couldn't stand the emphasis on hardware and bit twiddling, even though I was a bit twiddler at work. I also worked for MITS, the Altair folks. I sold my Altair for $1,200, including the teletype with reader/punch and cassette player, in 1982. I agree, Silicon Valley was only one of a multitude of places (most of which were west of the Mississippi) where the PC was invented. MITS, for example, was in Albuquerque, New Mexico. Gates and company also started in Albuquerque. I can't imagine why they wanted to move someplace where the sun doesn't shine. Peace
  • The only reason I became interested in computers was the C64! I think many people my age (30) had a C64 or Vic-20 because of the games and maybe some word processing... however, I eventually became interested in computer programming (remember those games in COMPUTE! you could type in?) and communicating with a computer (with my 300 baud modem and some great BBSs in my town) since I had a C64.

    IBM machines were just too expensive at the time, and when I did see one that my friend's parents bought, I was not impressed (where's the sound and color!?!) I believe that there would not be as many computer users today if it had not been for the Commodore computers...I'd love to read about their history and relationship with the rest of the computer industry at that time.
  • The description above makes this book sound a lot like the Robert X. Cringely [pbs.org] show on PBS a few years back (the name escapes me now). I wonder if this book is just a rehash of that show or if there is a lot of new information.
  • For anybody who doesn't already know, this book was also made into a movie at one point, "Pirates of Silicon Valley". It's a good watch for somebody too lazy to read the book (or who doesn't have time, etc.).
  • Hey, ahh, could someone tell me where I can get a speedy PC with plenty of RAM shipped overnight for less than a month of minimum wage? I want 2 of them!
  • by n3m6 ( 101260 )
    While you are reading the book you may as well get the movie, "Pirates of Silicon Valley", based mostly on this book.

    Early to rise and early to bed makes a male healthy and wealthy and dead.
  • I could have done with a lot less information, for instance, about the role of est at once-promising computer maker IMSAI. While it's certainly an interesting influence (hey, this was California in the 70s, what do you expect? Puritanism?)
    What is est? I'm assuming that's only part of the word, but what word? I'd assume incest but that'd make for interesting reading and the author of this review didn't enjoy it.

    Molest?
    Forest?
    Digest?
    Pot-Fest?
    Maybe he meant EST (Eastern Standard Time)?
    What does it all mean??!

    Mordred
  • If you want to read The Hacker Crackdown for free, check out Project Gutenberg.

    I happen to know that the Hacker Crackdown came out around the summer of '92, because that's when I picked Bruce Sterling up in my '81 Caprice Classic at the Montreal airport, nearly scaring him to death. It was a real pleasure to meet him, and he even autographed my (used - doh!) copy of some of his short stories.

    It's a good book too - but you'll be amused at how much has changed since then and all the references to internet "nodes".

    Ken

  • Gates by Stephen Manes

    Hate the man or despise him, after reading it you'll respect him.

  • The single most important event of the personal computer revolution was when Compaq, and later Phoenix, reverse-engineered the IBM BIOS, making inexpensive IBM PC clones possible.

    Were it not for that event, the PC market might still be divided between Commodore, Apple, IBM, and a few other bit-players.

    So put that in your DMCA and smoke it!

  • Still, the expert user could do much the same with them as what we do with computers today. Sure, today we have various bells and whistles, but the central uses are the same.

    Buh? I think the "expert users" you refer to would almost universally agree that modern computers allow them to do more stuff (several orders of magnitude more stuff) than they ever could with earlier computers. And they can do it faster, cheaper, and smaller, too!

  • I still have a few copies of Creative Computing from 1975 and 1976. One with the original ad for the $666 Apple motherboard.
  • The more interesting thing with this is that we ALL grew up with these machines - anyone working in software now, chances are they started programming on a C-64, Spectrum, whatever. With these machines, it was all pretty stripped down, and one person really _could_ make a difference - some of the greatest games then, like Paradroid, really were just written by one person. The limitations of the hardware meant that you had to learn about programming the sound chip, handling sprites and stuff like that, usually in assembler. But the hardware was all pretty simple then, so it wasn't too hard to figure it out. And if you knew that one person, working on their own, could do that, there was the incentive to see if you could too - you knew it was possible.

    Fast-forward 20 years. The hardware is so immensely complex that no one person can figure it all out (although Tom Pabst does a good job! :-) and all applications, games and other programs are so complex, it's difficult for a newbie to get started. And there are no really successful single-person programs (AFAIK :-) so where's the incentive to try? If you know you can't do something on your own, because all programs (even the open-source ones) require vast project teams to do it, where's the spark to make you try?

    I'm not advocating that all 16-bit processors and upwards should be ritually slaughtered! :-) The point really is that we've been lucky; our learning curve started low, with fairly simple computers, so as the technology involved in those computers has stepped up, it's been easy for us to make the jump to the next computer technology, using what we learnt before. But today, you have to START on your Pentium, figure out Windows programming, DirectX or whatever, and that learning curve is more like a cliff-face!

    I'd be interested to see how many software people we've still got in maybe 10 years' time, when the new people have only ever known hugely-fast machines. Also, I'd be interested to see what the software's like. Starting on slow machines, we naturally learnt how to optimise for speed when we needed it. But if you've got some serious hardware, where's the incentive? I work on embedded stuff, which is much closer to the older machines and often needs to run fast to control the car engine or whatever - if we get people used to writing bloatware, it's going to be difficult to get them out of the habit.

    Grab.
  • When Bill Gates was just a millionaire
    --
  • Well, some of the people working as janitors, gardeners, maintenance people and construction workers might be descendants of the original inhabitants, but a good number of them are probably recent immigrants. Not that it really matters. And why do they stay in the valley? Because that's where the money is.

    As for gentrification, that's old news. With the dot-com slowdown the process is starting to reverse. Changing economics is just a fact of life. Some people welcomed the wealth that was brought into their neighborhoods.
  • Both the PET and the 64 are mentioned in this book along with the inventor of the microprocessor (MOS?) that went in the 64 and Apple ][.
  • One problem I had with the book was that it has too much white space and too many blank pages. The type was fairly large and the margins were big. Reminded me of those papers in college where you make adjustments with your word processor until you just make it to the 10-page length set by the professor. Although these authors were trying to get it to 450 pages.
    I read both the original and the updated version. The original had some interesting pictures of a Macintosh with a 5.25 inch floppy drive. This picture doesn't show up in the new edition.
    The only other problem I had with this book is the redundancy. Some of the same facts are repeated in different chapters.
  • an apple II, and two S-100's running CP/M.

    i have an altair that i was given lying around in my shed, someday i'll put it online so it can get ceremoniously slashdotted.

    maybe even a 68000 based system or two. one runs pascal natively, one runs CP/M 68K.

    before all the computers we were getting hand-me-down TI calculators that had the little mag strips you could store in the back.

    memories....light the corners (of what used to be) my mind...

  • est is the entire word. It mostly consisted of seminars conducted by Werner Erhard to "elevate human potential". The Skeptics Dictionary [skepdic.com] has a nice article on it. They also have a ton of additional links which I won't reproduce here. The first est seminar was in 1971, the last was in 1991, and about 700,000 people took the seminars.

  • No, but my dad cleaned out his basement and made me take back all my old relics, including an Atari ST, Commodore 64 (with 300 baud modem and actual 5.25" disk, the tape drive was lost long ago)... Actually, I was flipping through my old disks and got a wave of nostalgia. I may fire up those machines one of these days!

    Yep, hacking C64 Basic .. those were the days!

    ---

  • The next big step will surely be development tools. Right now creating software is so complicated that very few people can do it well. This may be a good thing for us /.'ers, but very bad for everyone else. I think this is why VB is popular... The slightly-above-average user can create macros and stuff in Excel and MS Access, and create applications that only a skilled coder could create using standard tools. I can see that this trend will continue, and in 10 years anyone with half a brain will be able to create software by linking together little software units... kinda like the way OOP is headed already.

    ---

  • Fadein to porch. Two men sit on rockers.

    Bill: Well, Andy, I think we have pushed office and internet applications about as far as they need to go. From now on, we'll just fix bugs and work on operating systems stability.

    Andy: Ok, Bill. Then we'll stop developing faster processors. Instead, we will work on pushing down the prices of machines with the same performance.

    Bill: Yup. That seems the thing to do.

    Andy: (Rocking slowly.) Yup.

    Bill: Andy, what about games?

    Andy: Just a few kids play those. No money in it.

    Andy: Bill, do you think there might still be another "next big thing" out there? Do you think you might be able to invent the next Visicalc?

    Bill: Huh? Invent? You let me know if someone does, and then I'll embrace it...

    Andy: Oh, ok. (Sighs.)

    Fadeout.

  • The software of the future will do the same things that it does today, in the same way. It will just be bug free, faster and a tiny little bit easier to use.

    Perhaps not bug free...but close, right?

    I really don't see the internet as the end of the road. We still have a lot further to go with integrating computers into every aspect of our lives. Not only will everyone have a computer terminal in their house connected to the internet, but their refrigerator, microwave, car, oven, phone, etc. will be connected. One day, we'll be able to walk into a mall and download information about the product that we're standing in front of directly to our mobile computing platform.

    I think we have PLENTY of room to expand software and hardware.


    --
  • I would add another book to the list...
    • Stan Viet's History of the Personal Computer
    It covers some of the earlier stuff, and a lot of the more obscure systems like the Sphere, Cromemco, North Star, and Processor Technology. You can read the first chapter here [geocities.com]

    The first computer I programmed on was in high school in 1977, on a Processor Technology SOL-20 ... a cool machine with a whopping 32KB of RAM and dual 8" floppy drives the likes of which I have seen nowhere else ... really weird motorized eject & insertion that quite often jammed. I learned BASIC, FOCAL and 8080 assembler on that box.

    My first home system was an Ohio Scientific C1P that I bought in 1978 with money from a part-time job, instead of a car like the rest of my friends. Learned 6502 assembler and FORTH on that box, and programmed several games and a "word processor". I also wrote an AI program that was like ELIZA, but on steroids; it actually learned words and phrases and understood grammar.

    For information on the SOL-20 and an emulator check here [thebattles.net], and for information on the C1P check here [obsoleteco...museum.org] or here [niagarac.on.ca].

    GOD I FEEL OLD!!!

    - subsolar

  • When I was your age, we didn't have 32-bit processors with GUI interfaces and mice. We programmed in binary with toggle switches and LEDs, and we were thankful for them! 4KB was a lot of memory in those days, yessirree... ;^D

    - subsolar

  • My guess is Erhardt (sp?) Seminar Training. Some wacky mind-control thing where you spend from 9 am until midnight locked in a room and degraded by their wacky mind-control programmers, and not even allowed to go to the toilet alone, for two whole days until you are A Better Person.
    Complete Bollocks! And more than a little sinister. Offshoots of this cult include Life Training which works in exactly the same way, and each inductee will try to recruit everyone they know or cut them off entirely and never speak to them again...
    Not just complete bollocks, but evil bollocks!


    Hacker: A criminal who breaks into computer systems
  • You seem to be lost... This forum is for people who share an interest in the topics covered by this book. Here's a review of a book that covers your interests: http://shopping.yahoo.com/Books/Atlantic_Monthly/Review/Feed/70055099/ Now, Go Away!
  • I just purchased this book (the CD-ROM-less edition) from the Amazon.com Book Outlet for $1.99... somehow I felt it was the right price.
  • What did he get arrested for? I'm assuming some sort of theft :P~
  • I remember reading Fire in the Valley before I went to college for my CS degree. I thought that it gave an excellent foundation for understanding what people were thinking when computers were developed. Lots of names and history that no one recognizes anymore, but that doesn't mean those people didn't play a role equal to the current dotcom set's.
  • And I'm certain that books regarding those conflicts would sell well to them. But to those of us reading Slashdot, this book documents what interests us. If you want to argue that we should not concentrate on technology but on social conflict, fine. But Slashdot is not about social conflict (unless in some regard it coincides with technological advances) and should not be an advertising forum for books on such topics.
  • Anybody remember the Heathkit H89? I don't see it mentioned very often in the histories but it was used by thousands of people.

    I bought one (SN 844) in 1979 since it was superior to the Apple I (its only real competitor). 2 MHz Z80 with 48K of RAM and a full 80-character x 25-line display (everyone else was 40 characters/line then) and a whoppin' 100K floppy drive for only $1500! The Z80 was a screamer. Mine still works. I used it as my primary machine until 1989.

    This machine had superior performance for its day. Heath DOS seems to have had many precursors to "modern" day DOS, and only later did I get CP/M. Of course I had to write my own programs, since nothing was compatible with it :-( I learned FORTRAN 77 on this puppy. Yes, it was SLOW, even then!

    The whole idea at the time was to figure out what these microprocessor thingies were. (My EE degree is in RF hardware engineering which I haven't done for 25 years.)

    You were supposed to build and create back then. Since there was no support, you HAD to. Ever write a spreadsheet program in Benton Harbor Basic? After writing the program, you packed it into a single line to save memory.

    Sigh, but I show my age.

    Rick
    WA3VTF (yeah, I still do RF, just not at work)

  • I was making a point by stretching it a bit. The idea is that GUI was not original with Apple. Apple borrowed all the basic concepts.

    My memory isn't what it used to be, so thanks for your update on SCP. Essentially MSDOS was a piece of junk, far outclassed by CP/M, but Kildall lost his chance the day IBM called - and he was at the golf course. I think I recall a lawsuit between Kildall and Gates over the SCP stuff, but if there was a monetary settlement, Kildall still lost, because CP/M wasn't the OS that came with the box, and few people will spend extra to get something better when what's provided is "good enough."

    I'm surprised no one has commented on the evils of Motorola's intellectual property, which their former employees used to make the 6502. Their mistake, which allowed Motorola to acquire ALL rights to the 6502, was that they took drawings etc., so they had a jumpstart on the artwork. And Motorola's work could be identified on the 6502 silicon!
  • by octalman ( 169480 ) on Thursday March 01, 2001 @11:45AM (#392193)
    As another poster has observed in the post, West Coast Bias - Rewite History why don't you?, this book misrepresents much of the earliest days of what became personal computers.

    The Intel 4004 was not a computer by any stretch of the imagination, but a special purpose controller, and lived on a handful of separate, hard-to-integrate parts. The 8008 was, IIRC, Intel's first computer chip, with what became the infamous x86 instruction set, and slower than molasses in January, even for the time. Motorola (Austin, Texas) developed the first computer-on-a-chip, but marketing believed they would never sell more than 10,000 (or somewhere in that neighborhood) of the things, so the 6800 was put into cold storage. The Commodore, Atari and Apple computers were all based on the 6502, an inexpensive (for the time), stripped-down 8-bit-only 6800, but with two index registers, produced by some malcontents who left Motorola and went to Philadelphia. Motorola eventually won a massive intellectual property suit, but had lost the initiative because of lack of insight into the future of computing.

    There is a lot of hype about how the Altair was so innovative also, but it was produced in New Mexico, I believe, at least at first. Neither Altair nor any of the other early Intel-based computers could be cold booted. The bootstrap program had to be toggled in from front panel switches, just as mainframes had for years. Southwest Technical Products, San Antonio, introduced the first boot ROM (Gary Kildall and the Intel-centric community insisted on calling it the BIOS) with their 6800-based computers, and until 1982, had sold more computers, based on the 6800 and 68000 processor families, over 110,000 in all, than all the Intel-based boxes together to that time.

    Zilog darned near killed Intel by bringing out a better 8085 than the 8085. The Zilog Z80 had index registers! And some neat extensions to the instruction set. And ate Intel's lunch, technologically speaking, as well as in market share. If you read publications from the 1981-1983 period, you will see a lot of hype about how the 4 MHz Z80 was so much superior to Motorola's 2 MHz 6809, just like you see people moaning about G4s being outdated by PIIIs with twice the rated clock speed. It ain't how fast the clock ticks, li'l buddies, it's what you do each time the clock ticks. A 2 MHz 6809 is about 20% faster than a 4 MHz Z80 because the 6809 used an internal clock doubler to minimize problems with having the processor clock on the motherboard, which only needed the half-speed memory clock. Altair put the processor clock, and a number of dumb Intel hand-shaking signals, on the S100 bus, and early S100 motherboard designs had massive signal problems as a result.

    The bottom line is that all the Silicon Valley stuff was based on stolen intellectual property (the 6502) and a piece of crap (the Intel 8080), but succeeded because they were affordable by the public. Hell, even the infamous GUI is stolen property. Apple stole it from Xerox PARC and Microsoft stole it from Apple. At least Gates had the insight (and ethics) to buy DOS from Seattle Software and a Unix license from Bell Labs.
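    The clocks-versus-work argument above can be put into a quick sketch. The cycles-per-instruction figures below are illustrative assumptions chosen for the example, not measured values for either chip:

    ```python
    # Effective throughput = clock rate / average cycles per instruction (CPI).
    # The CPI numbers here are illustrative assumptions, not datasheet figures.

    def effective_mips(clock_mhz: float, avg_cpi: float) -> float:
        """Millions of instructions per second for a given clock and CPI."""
        return clock_mhz / avg_cpi

    z80 = effective_mips(4.0, 5.0)    # 4 MHz Z80, assumed ~5 cycles/instruction
    m6809 = effective_mips(2.0, 2.0)  # 2 MHz 6809, assumed ~2 cycles/instruction

    print(f"Z80:  {z80:.2f} MIPS")                  # 0.80 MIPS
    print(f"6809: {m6809:.2f} MIPS")                # 1.00 MIPS
    print(f"6809 is {m6809 / z80 - 1:.0%} faster")  # 25% with these assumed CPIs
    ```

    With these made-up CPIs the slower-clocked 6809 comes out roughly a quarter faster, in the same ballpark as the 20% the comment claims.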
  • ...this was a time when the very idea of a machine for casual or home use, or even have a full-featured computer on one's work desktop was radical, even laughable.

    My father began working for IBM's Cottle Road facility in San Jose in 1970. In 1978 (I was just two years old), he told my mother that we would have a computer in our home within five years.

    She laughed.

    We bought our first computer in late 1980.

  • I read it over a year ago and loved it. Actually I bought it for my dad for Xmas, but read a few pages before I wrapped it up. Immediately after giving him his present, I went down to the store where I bought his copy, and bought one for me.
  • by Jon Erikson ( 198204 ) on Thursday March 01, 2001 @08:25AM (#392196)
    This is my view. Back in the 70's and early 80's, computers had very simple operating systems, utterly primitive compared to today. Still, the expert user could do much the same with them as we do with computers today. Sure, today we have various bells and whistles, but the central uses are the same.

    The development of computing over the last 20 years has been all about widening the franchise of users. Now the most computer-illiterate of people can do much the same as an expert could some 20 years ago, and with considerably more ease.

    But what will happen now? The franchise is as big as is needed, and we can already do all the major tasks we could want to - word processing, spreadsheets, etc. These are the meat and drink of computing, and they aren't going to get any better or any easier to use now.

    So what will be the driving force of computer development over the next 20 years? I think that computers will cease to increase in power in about 5 years time. Instead terminals will become common, and everyone will have an internet connected terminal in the home. We are approaching the end game of software development in almost all arenas except AI and games. The software of the future will do the same things that it does today, in the same way. It will just be bug free, faster and a tiny little bit easier to use.

  • "Pirates of Silicon Valley" was a fantastic film; I loved it. The technical side was fascinating, but that has been covered by many texts. What I found fascinating about the movie (and I suspect that it is also in the book, though I have not read it) is the portrayal of the Gates/Jobs relationship, and the way that these men behaved inside and outside of their respective companies. There is no "Hero" as such. Very informative

    Trav

  • The one sad thing is, that most of us 30 or younger missed out on the computer hobbyist scene... The best we can manage is a little hardware modification, or overclocking... Gone are the days when you could take a conventional CPU, and with a massive project board and a whole lot of patience, build a completely unique computer from the wirewraps up...

    You cannot even design a CPU on your own nowadays; there's too much to invest in fabrication. And forget about building custom boards either; the materials and tooling required cost several thousand dollars, a bit much on a low budget (which technically Apple was founded on)...

    It's a pity, really, because of the fact that so many good ideas in hardware development are stillborn, without a chance to exist in hardware form (unless you're willing to sell out that idea to a major corporation that will not only own that idea, but you as well)...

    So really, while software remains open, hardware has essentially been put under lock and key for eternity...
  • The Cuckoo's Egg by Cliff Stoll.

    A great story about an everyday guy who tracks down a spy who cracked his way into dozens of US military computers in the mid-80s. It was the case that woke up the then-close-knit Internet to the need for security. Filled with Berkeley color and the author's fun personality.

  • I wonder if they mention the commodore 64. I remember when I was 7 years old

  • I'm so *freaking* tired of books that rewrite history to say that the entire PC revolution happened on the West Coast.

    Does anybody remember Commodore? The CBM PET was the first home machine with an integrated screen and keyboard, AND could save your programs on a cassette. All in one machine.

    During the height of the Commodore 64, CBM owned over 33% of the home computer market, a percentage that to this day, no other single box-maker has ever had.

    And then there's the Amiga, a machine so far ahead of its time that a decade later, some of the features it had still aren't available on modern equipment.

    There was a point where Commodore was kicking Apple's butt, but everyone forgets that and just talks about the Two Steves and Bill, like they alone invented personal computing. Don't get me wrong, I think WOZ is a god, but sheesh, can't someone write a book that has facts in it, and tells the history truthfully, without cutting out huge chunks of history for the sake of drama?
  • by Bonker ( 243350 ) on Thursday March 01, 2001 @08:27AM (#392202)
    While I cannot personally comment on this book's contents as I have not read it, it seems that the market is being glutted with various biographies, histories, and 'Pirates of Silicon Valley' type stories.

    Think about it, folks: the 1970's, when most of the initial development of the PC took place, was 30 years ago! These events are becoming a lot less relevant to modern PC design and more like... well... History.

    The real danger here is inspiring new, talented creators and inventors to continue to think like the people who originally developed the first PCs. Those people were innovative, but the technology is aging... badly... and we need new innovation and new, original ideas to speed up and destabilize the computer industry.

    The Computer Revolution was over in the mid-90's. We're more in sort of a Computer Middle Age now. Let's just hope we can avoid a 'Dark Age' by not making religious texts out of historical fact.
  • Another popular science compu-worship tome.

    What it's not
    If you're looking for hard-core information on early circuit designs, or code snippets from the programs which launched the rise of the PC, this is probably the wrong book -- though there is a satisfyingly large reproduction of the circuit board of the Apple I. The infectious spirit of invention and a strange sort of aggressive computer-based fun comes through clearly, because Freiberger and Swaine concentrate on the personalities and business realities of the early days of the PC more than they do the technological advances which made it possible. Whether you find this more engaging or annoying is of course up to you; I found the stories and interactions of the early PC pioneers fascinating, less so the business machinations of the 80s and 90s.


    I think that the stories of the 1950s, when science was very important to nearly everybody in the government and intellectual life for the purpose of competing with the Soviets, will only be tantalizing and painful to modern inventors who battle in today's world of tanking dot-coms.

    I do however think that it would be instructive to see how people wrote code on machines with the limitations these machines had... what kind of code was written for machines which were, for all their great size and expense, orders of magnitude less powerful than today's machines?

    Furthermore, any real tale of Silicon Valley should be as much a tale of poverty as a tale of success and prosperity. The place changed from a place of farms to a place of silicon factories to a place of dot-coms, and the descendants of the "original" inhabitants of Silicon Valley (well, the original inhabitants were Indians, but never mind that), Mexican-American families that settled there in the 16th and 17th centuries, now have jobs as janitors, gardeners, maintenance people and construction workers in the new-economy buildings, and they live in converted U-Stor garages 'cause they can't afford the rent.

    In fact, the dot com economy is now beginning to displace people in San Francisco, as the high rent spiral they created further south forces growth north, where dot-coms have begun to gentrify [usatoday.com] the Mission District.
  • While I don't have a problem with a book focusing on the West Coast, I wonder if there's a good resource for the history of Commodore? I had a C64, then an Amiga 2000 before resorting to a 486 in the early 90s, and was sad to see Commodore lose more and more market share. I lived through that history, cheering for the underdog all the way, but I'd love to read somebody else's account of it.

  • There are few books that I let out of my house. This was one. Not because it wasn't informative, or funny, or sad, but because I want more people to read it. That's what was great about the book. I was born at about the point the book reaches by pg. 130. But the previous pages were not the only information I did not know. I got to see Woz pull the ultimate gag against the Altair (Zaltair), and possibly the worst picture of Bill Gates in existence. I encourage everyone who loves computers to buy this book, if only for the nostalgia. Also check out Apple: The Inside Story of Intrigue, Egomania, and Business Blunders isbn:0887309658
    Mark
  • $672? I dunno, I just built a gigahertz Athlon (T-Bird) with an Asus A7V board, 256MB of RAM, a 32MB TNT2 video card, and a 20gig Maxtor hard drive for a little less than that. Of course, you'd have to add a monitor, but those are about $150 for a 17".

    Now I guess it could be debated whether a gigahertz T-Bird is a 'speedy PC', but it's sure not pokey.
  • Timothy, I'm sure you're a great guy and everything, but I swear... sometimes you write the most opaque intros. I understand that that's part of your schtick and all, but please don't undervalue the importance of a clear, precise presentation in some cases. I know this is a fairly off-topic nitpick, but it's a point I felt deserves to be made. Spend a little more time in the submission queue, a little less time with the thesaurus, maybe?

    Please don't take this as an attack. I mean no offense whatsoever; it's just my point of view... Please don't take it the wrong way, like michael is wont to do (at least in my personal experience) when constructive advice is offered.

    -tf
  • It's relative, my friend. My Celeron 333 is in no way worth one month's wage @ minimum wage, but it plays Quake 3 just fine, and it's WAY nicer than the 286 Tandy that I bought in 1989 for about $2500.00. That machine didn't have a hard drive, and I paid extra for a nice model that did 16 colors and had a better-than-usual soundboard. In my humble opinion, the computer you could buy today for that $600 of pay is fully capable of running modern software (I run Win2000, WinME and Mandrake on my C333) and will perform average user tasks speedily and without too much pain.

    Here is a price quoted from http://www.pricewatch.com:

    AMD DURON 650 MHz COMPLETE SYS WIN98 2nd edition & LIC CD(customizing availible) 64MB RAM, 10GB HDD, Socket A MB, 8MB (share), 52xCD, 56K Modem, NIC, Sound,FDD KB/MS/Spk MED ATX Case W/300W
    $399.00 insured (shipping = $29-$45)

    17inch SVGA monitor, 0.28 dpi, 1-year manufacturer's warranty: $116 (shipping $40)

    $399 + $45 + 116 + 40 = $600 , and that was including next day air freight charges within U.S.

    That, to me, is an okay machine for the AVERAGE user, speedy and with plenty of RAM (assuming the average user is running Win98 and some crappy office suite and AOL, or even broadband). Of course, by forgoing Win98, buying it clean and putting Linux on it, you would spend even less than $600.

    Anyway, that's my two cents.
  • So, you interact bif the BAX using BM.

  • Of course, that's how I can locate New Jersey. Somehow I felt compelled to always copy the lines "Creative Computing, Morristown, New Jersey" at the beginning of my adaptations of their BASIC programs for my CASIO pocket computer (PB100, like in "Ghostbusters"), then on my C64. Their version of Eliza, Lunar Lander or Wumpus was especially great. In a car racing program I had found, I still do not understand how it is physically possible to change the compression ratio of an engine (that's how one was supposed to accelerate).
  • Personally, my first PC was a Lisa, but I'm sure some Amiga weirdo will yell at me for saying that. What is with those people anyhow?
  • And so is the movie.
    I especially liked the bulldozer racing scene, but I was disappointed at not seeing Bill Gates' actual mugshot.
    I'll assume most of you already know where to get that ^_^
  • I meant PC simply in that context, not IBM clone, though that is ironic, really. In honesty, IBM, much like HP, didn't think computers would make it to the private market at that time.... Well, that was until Apple showed us the way.
    I'm both an Apple and Linux zealot.... Well, more zealot for Apple; I'm told I have achieved "guru status," though I see myself more as a missionary, converting one user at a time. This year alone, I've converted four. Not bad for someone with just ICQ, email, and two chat rooms.
    Now, who's up for volleyball?
  • He was arrested for reckless driving & OWI.
    You can find a pic at www.enemy.org
    You'll also find a delightful "Microsoft World Domination 99" CD in there as well.
  • >This is my view. Back in the 70's and early 80's, computers had very simple operating systems, utterly primitive compared to today.

    Assuming that you're talking about home computers:

    Well, I guess you did not play with a C-64. The sprite graphics system at that time was tops, and I don't believe that anybody had anything equal to it until 1983.

    The OS of a C-64 was also very good, plus the openness of the C-64 was such that a firm in the midwest made it the controller & graphics output for some medical devices.

    Tandy (Radio Shack) had some great stuff; you just had to really work at it. (Anybody remember buying the preformatted blue grid paper so your output to the screen was correct?)

    I recall, but don't remember the name of, a company that sold PC kits. Those kits (1977) were a lot of fun, and you could control your ham operation and Morse code transmissions with them (right on your TV screen).

    OSes have made some improvements, but not a lot. Linux is the next true wave of massive redevelopment of an OS. 30,000 people offering up ideas, improving and speeding up code, adding features of their own (and making them public) is going to win in the end. All we need is that one "killer" package and POOF, there goes MS marketshare.

    ONEPOINT



    spambait e-mail
    my web site artistcorner.tv hip-hop news
    please help me make it better
  • Too bad. What did it talk about then? Nibbles and Moppyranger?
    "PC" and "gaming" [oldskool.org] didn't get along too well at that time. If you had a clue you got either a console or Amiga [amiga.de].
  • Most early PC users were jealous because their $3,000 machine couldn't match the specs of the C64, let alone the Amiga, so now they're trying to hide the fact that the system which finally conquered the market was the most dull and uninspired of them all.
    While people thank the PC for standardizing computing and bringing it to the masses, I feel that the computing world has become more boring than ever. All you can expect today is the latest incarnation of Windows, whose only innovation is taking double the space on your HD and eating double the RAM.
    The era of computer hacking is long gone, with O.S. manufacturers trying to hide the inner workings of their software in order to appeal to the masses of illiterate users who couldn't care less about how it all works.
    Windows is going the way of MacOS; it's extremely easy to write a Word doc, but try configuring Windows for your needs, changing system paths, or putting some order in the amazingly convoluted mess that is the Windows directory. All they want is to have absolute control of what we do and turn our PCs into "black boxes". Don't mess with our O.S., unless you've paid for an absurdly expensive developer license.
    I'm happy I still possess the last real computer [amiga.de] made.
  • Once and for all, "MS-DOS" was written by Tim Paterson of Seattle Computer Products, not M$. It was written to emulate the API of CP/M, and the UI was a sort of unix-ified CP/M UI (which was a clone of DEC's RSX-11).

    Also the book points out how the deal with DOS was the second time SCP got screwed over by M$. First time was the Z-80 card for the Apple II.

    I did see a demonstration of IBM's first personal computer - the 5100 - the "PC" was the 5150. The 5100 went for $10,000 in '74-'75. There were a few people who bought Data General Novas in the early 70's.

    I probably should buy the book - I was at UCB in the mid 70's and knew someone who worked for IMSAI (and had stories about the EST crap). BTW, for a while IMSAI was Intel's largest customer for 8080's.

  • I didn't like BYTE much at first, didn't start reading it regularly until 1982. My fave for the early years was Interface Age.

    Pop Tronics was another history source. The Jan 1975 issue is a keeper (intro'd the Altair), also the March 1972 issue which intro'd the HP-35.

    Another mag with early personal computer stories was Analog; the science fact section had a few stories about the wonders of computers back when the idea of a personal computer was - well - science fiction.

  • One of the first build-your-own-computer articles was in Electronics Illustrated, late '66 or early '67. It was an adding machine using neon bulbs as the logic elements and display.

    There were a couple of articles using potentiometers in sort of an electronic slide rule.

    Life Magazine had an article about the guy who bought the Whirlwind from MIT.

  • It's a shame that they only make a small mention of the Commodore 64, but unfortunately it, along with a few other systems, was forgotten for a number of reasons. At the time it came out, in 1982, the Commodore 64 was superior in graphics and sound capability. The 16-color graphics display beat anything that any of the other major systems had out at the time. There were three channels for sound, which sounds like nothing now, but at the time it was a big deal, well ahead of any other systems in those areas. Sure, there was more memory on other systems such as the PC, but Commodore offered their system at much lower cost.

    Perhaps the 64k limitation is one area where it was beat, but it wasn't much of a limitation, apparently, because of all the games that were created for it. And a well-written program could fit in 64k, and whatever disk space was necessary. I challenge anyone to write some of those games in C for an x86 (DOS, Windows, whatever) and have the exe be less than 64k. Or even write it in assembly; I don't think it can easily be done. And if you know about the architecture of the Commodore 64, with the ROM in place, there's 48k available (8k BASIC, and 8k Kernel), though BASIC wasn't necessary for writing stuff in assembly. Also, the high-resolution graphics screen took another 8k. Even if a game flipped out both BASIC and the Kernel, it still leaves 56k, and not all of that can be used. You still have the zero page, and the i/o locations, which further reduce the available RAM. The point being that a lot of good games were made with less available RAM, because programmers wrote games in low-level languages and optimized them well rather than in high-level languages which are optimized by the compiler. Sure, the Commodore 64 was slow by the standards of the past decade, but any of the good games for it were well optimized and very well written. A lot can be done, and was done, that wasn't done on other platforms, because the Commodore 64 had a much more versatile architecture and because the programmers had good skill.

    In my computer science course last year in high school, we watched a video on the history of computers, from Babbage's mechanical machines through ENIAC and its successors, the systems produced by IBM, and Xerox PARC. When it came to the 1980s, there was no mention at all of Commodore or the effect any of its computers had on the industry. It's rather disappointing that PBS would produce something so historically inaccurate.

    The fact is, IBM backed an inferior product in the 1960s and won out because of a better marketing staff, which the video portrayed accurately. They won out in the 1980s for the same reason, which the video also described accurately, except for a few details: it covered only their dominance over Apple, with no mention of the many other systems available at the time, such as the PET, VIC-20, and Commodore 64/128 (all produced by Commodore), the TRS-80, the Atari machines, etc.

    The effect of Commodore on computers today is arguable, simply because the company has been gone for so long (the 64 stopped being produced in 1992), but it was a major player in the 1980s, and it's sad that it's been so quickly forgotten.
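The back-of-the-envelope memory figures in the comment above can be sketched as follows. This is a simplified illustration in Python that follows the comment's own numbers rather than an exact C64 memory map; the constants and layout are deliberately rough.

```python
# Rough sketch of the C64 memory arithmetic described above
# (sizes in KB; simplified, not an exact memory map).
TOTAL_RAM = 64
BASIC_ROM = 8      # BASIC ROM banked over RAM
KERNAL_ROM = 8     # Kernal ROM banked over RAM
HIRES_SCREEN = 8   # high-resolution bitmap screen, if used

# With both ROMs banked in, RAM visible to a program:
visible = TOTAL_RAM - BASIC_ROM - KERNAL_ROM
print(visible)  # 48

# Using a hi-res bitmap screen eats another 8K of that:
print(visible - HIRES_SCREEN)  # 40
```

In practice the zero page, I/O area, and screen memory carve further chunks out of these figures, which is exactly why the hand-optimized assembly the comment describes mattered so much.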

  • Apple did not steal the GUI from PARC; the term "windowing" had already come into use, and Raskin had already led the development of the Mac ideas at Apple. Steve Jobs may not have gotten the clue until he visited PARC, however. I believe the name of the QDOS company was Seattle Computer Products; they had purchased CP/M from DR for internal use, then started selling another version of it (they labeled the hard drive as C instead of A) for the new Intel CPUs. Beyond that, the IBM, M$, and DR versions of the story all differ from one another. Apparently, Microsoft purchased stolen intellectual property under false pretenses.
  • "A timeline at the beginning of the book is great reading all by itself, and an included CD-ROM contains more audio clips, pictures and history as well. "

    What? No punchcards included?? WTF!



  • Ah, you are correct sir! I guess half my brain was thinking about the 2600 (the gaming half), and the other was day-dreaming.

    Yes, I have 2 of those things in the attic.



  • Sounds like a laxative right?? Well, that's what I thought it was the first time I heard its name.

    Once, in college, I was assigned the task of creating (in homage to the old days) a virtual VAX machine written in ASM for MIPS chips. What a great combination eh?
    Needless to say, the variable-length instructions made me want to scream and hurt people.

    Does anyone still have their Atari 8600 with cassette tape drive? What a great machine =)
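The variable-length decoding the comment above complains about can be sketched with a toy decoder: the emulator can't know an instruction's length until it has walked every operand specifier. The opcode table and addressing modes below are invented for illustration, not real VAX encodings.

```python
# Toy sketch of variable-length instruction decoding, the part of a
# VAX-style emulator that gets tedious: total length isn't known until
# each operand specifier has been examined. Encodings here are made up.

# opcode byte -> number of operand specifiers (hypothetical table)
OPERAND_COUNT = {0x01: 0, 0x02: 1, 0x03: 2}

# addressing-mode nibble -> extra bytes after the specifier byte
# (also invented: 0 = register, 1 = byte literal, 2 = longword)
EXTRA_BYTES = {0: 0, 1: 1, 2: 4}

def decode_length(mem, pc):
    """Walk one instruction and return its total byte length."""
    length = 1                            # the opcode byte itself
    for _ in range(OPERAND_COUNT[mem[pc]]):
        mode = mem[pc + length] >> 4      # high nibble selects the mode
        length += 1 + EXTRA_BYTES[mode]   # specifier byte + its payload
    return length

# two-operand instruction: a register operand, then a longword operand
code = bytes([0x03, 0x00, 0x20, 0x01, 0x02, 0x03, 0x04])
print(decode_length(code, 0))  # 7
```

On a fixed-width ISA like MIPS every instruction is 4 bytes and this loop disappears, which is why writing the emulator in MIPS assembly made the mismatch so painful.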

  • Reading this review, I am reminded of Russ Walter's observation that a computer hobbyist is someone who likes to tinker with computers, but a computer hobbiest is even more hobbier.
