Hardware

Computer History Museum

nickynicky9doors writes: "New Scientist has an interview with computer historian Michael Williams. Mr. Williams has undertaken to set up a world-class computer museum. My favourite was always the Cray 2 which used artificial human blood plasma as a coolant, but the article talks of the 1965 Honeywell kitchen computer which was built for the Neiman Marcus department store. At a cost of $10,500 it came with 2 programming manuals and a cookbook. Garbage In was by way of flickering binary switches and Garbage Out was by a row of blinking lights. There's more at www.computerhistory.org."
This discussion has been archived. No new comments can be posted.

Computer History Museum

Comments Filter:
  • by phraktyl ( 92649 ) <wyattNO@SPAMdraggoo.com> on Saturday February 09, 2002 @11:47PM (#2981111) Homepage Journal
    I always wondered who was willing to pay $75 a week for my plasma. At least I know it was going to a good use!
    • Yeah... that sounds like the computer you'd expect Dracula to use... :] But what about the one that comes with a *cookbook*? After that cooked Palm Pilot just a bit ago, it kinda makes you wonder, doesn't it? Maybe some people aren't getting enough silicon in their diet...? :]
  • by ChristianBaekkelund ( 99069 ) <draco&mit,edu> on Saturday February 09, 2002 @11:48PM (#2981112) Homepage
    I still miss the Computer Museum [mos.org]. :(
    • why? huge trackballs and other kid-oriented stuff?
    • I used to work at the computer museum in 1991 (I built parts of 'computers and technology' and helped build the giant monitor and punchcard exhibits). It's sad to know that it's gone, because I just found out (thanks for the post), and I know that some of my software is now lost to the big museum graveyard at Moffett Field.
  • for coolant? Are we cold blooded?
  • by ekrout ( 139379 ) on Saturday February 09, 2002 @11:49PM (#2981115) Journal
    Museums like these help to illustrate just how complex modern computers truly are.

    You get to journey back in time and put yourself in the shoes of researchers who were trying to figure out how to solve the most complex problems of the day, all while having a newfangled electronic appliance the size of a room do all of the work for them.

    In times such as ours, when computing for the everyday person involves little more than pointing, clicking, and writing IMs or emails, we should all learn to appreciate and marvel at computers.

    There's no better way to learn about the current information technology field than by studying the past.

    I applaud people like Michael Williams.
    • All of this is true. That's why I think it's important to learn ASM and how the Turing machine worked. ASM is a nice language; it gives you a good perspective on how the computer hardware works inside the system. It's sad to see many young programmers saying it's boring, useless, superseded, and old enough to be ignored.
    • Take a look at the Vintage Computer Festival [vintage.org]. There are a lot of people who are working hard to preserve the history of the computer industry.

      Keep in mind that it's also important to show people that what they think is new [microsoft.com] just might not be [spies.com].

      P.S., You can check out my collection [sinasohn.com] too.

  • A tour (Score:2, Funny)

    Tour Guide: 'On your left here is ENIAC. Say hello to ENIAC.'
    'Hello'
    ENIAC: '010100'
    Tour Guide: 'It says it's happy to see you, children.'
    'Next on your right is an Apple //e....'
    • Actually (assuming two leading zeros, to make it an 8-bit word, '00010100') it says: ^T - see the sketch at the end of this thread.

      That's also assuming it's ASCII
      • Re:A tour (Score:1, Funny)

        by Anonymous Coward
        It's not ascii, it's a 6-bit encoding scheme where 010100 is "I'm happy to see you, children".

        Can't you read?
      • Do you think this machine could use 8-bit words? I don't think so, but I don't really know; it's more likely to use a non-standard character table for the alphabet, to get rid of the many useless characters. (And as far as I know, ASCII dates to IBM's glory days.)
        • ENIAC actually worked in decimal, not binary. IIRC, it was more of a glorified calculator than a general-purpose computer.

          Remember that in those early days every computer was a unique, one-of-a-kind creation. There were no standards for anything -- not even keyboards.

      • I wasn't in the mood to do binary last night :)
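
    For anyone who wants to check the decoding above, here's a minimal Python sketch (assuming ASCII and the two padding zeros discussed upthread):

        # Decode the joke greeting, assuming ASCII and two leading pad bits.
        bits = "00010100"
        code = int(bits, 2)           # 0b00010100 == 20 decimal
        print(repr(chr(code)))        # '\x14' -- the DC4 control character
        # Control characters map to Ctrl+letter; adding 64 names the key.
        print("^" + chr(code + 64))   # ^T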
  • Memorabilia (Score:2, Insightful)

    by Evanrude ( 21624 )
    This will be a good educational tool and also serve as a way for us younger geeks to take a look at the way things used to be a few decades ago. Very cool idea.
  • Love the questions (Score:2, Interesting)

    by oregon ( 554165 )
    The non-techie:

    What about this huge contraption with all the vacuum tubes?

    The realist:

    Is it possible the general public doesn't care about these machines?

    The pitiful:

    Isn't there something rather sad about collections of old computers?

    It's nice to see some humanity in articles ... instead of just leading questions so the interviewee can press his agenda.
  • 1965 (Score:3, Interesting)

    by blair1q ( 305137 ) on Sunday February 10, 2002 @12:00AM (#2981143) Journal
    35 lousy years.

    That's how long ago it was that the 7-segment display seemed like Space:1999 technology.

    --Blair
  • already done (Score:3, Informative)

    by Khopesh ( 112447 ) on Sunday February 10, 2002 @12:08AM (#2981160) Homepage Journal
    There's a Computer Museum in Boston, MA, on Congress Street by the Children's Museum and the Boston Tea Party Ship & Museum. I believe it was founded in 1982. ...there's a possibility it went out of business recently. It doesn't have a website that I can find, but Yahoo! Yellow Pages [yahoo.com] has a listing [yahoo.com] for it.
  • artificial what?? (Score:5, Informative)

    by foobar104 ( 206452 ) on Sunday February 10, 2002 @12:10AM (#2981165) Journal
    My favourite was always the Cray 2 which used artificial human blood plasma as a coolant....

    Oh, man, do I even need to take the time to correct this?

    The Cray 2, like several successors including the C90 and T90, used liquid fluorocarbon as a coolant. This is true.

    To say that liquid fluorocarbon is artificial blood plasma is simply false. There are several commercial products that can be used as blood substitutes-- such as Oxycyte-- but these are oxygen-carrying perfluorocarbon (PFC) emulsions. These products are about as similar to the liquid coolant used in the Cray 2 as scrambled eggs are to mayonnaise.

    The coolants used in the various Crays, plus lots of other electronic systems, were all pure perfluorocarbon liquids, like Fluorinert, which is a commercial product produced and sold by 3M. They're good choices for immersion cooling because they're chemically inert. Ironically, most of them (like FC-87, fer instance) have boiling points well below that of water; FC-87's is around 30C. They're useful anyway because they're dense, even though their specific heats are only about a quarter of water's.

    The various artificial blood products, though, are PFC emulsions, in which microscopic droplets of PFC are suspended in a saline solution. These liquids can be used because gases like oxygen and CO2 are highly soluble in PFC. Since the droplets of PFC are about 1/70th the size of red blood cells, it's very easy for them to act like RBCs in the bloodstream for gas transfer.

    The fact that Crays are liquid cooled is neat enough without messing it up with misinformation.
    • Lesssse, right aboot now it late Shaturday night, ssso my blood would be a *hic* great coolant.

      Alcahol.

      :#)

      Soko
    • This is not a troll or flamebait, I just want to know...

      They're good choices for immersion cooling because they're chemically inert

      then you say

      These liquids can be used because gases like oxygen and CO2 are highly soluble in PFC.

      If they are inert then how are O2 & CO2 soluble in it?

      • A solution is just...well, it's a solution. If you dissolve salt in water, you don't end up with a chemical reaction between the salt and the water; you just get Na+ and Cl- ions floating around between the water molecules.

        So you can have CO2 and O2 dissolved in a chemically inert liquid just fine. No chemical reaction is necessary to get a solution. In fact, if there is a chemical reaction, you don't have a solution; you have a compound.
      • If they are inert then how are O2 & CO2 soluble in it?

        You know... that's a good question. I'm not sure. Maybe calling PFC "inert" is an overstatement. Maybe it's better to call it nonconductive and noncorrosive. Or maybe gases are only soluble in emulsions of PFC.

        You caught me. Good question. ;-)

    • They're useful anyway because they're dense, even though their specific heats are only about a quarter of water's.

      Most organic liquids have a specific heat capacity much less than that of water; 0.2 to 0.25 cal/g·°C is typical (water's is 1.0). That is not good. High density offsets this when the comparison is made on a volumetric basis. Also, most organics have poor heat transfer coefficients compared to water. Coolers are usually designed to boil the heat transfer medium, because the heat transfer coefficient is much higher when the heated medium boils; a smaller cooling coil can be used.
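
      To put rough numbers on the density-versus-specific-heat tradeoff, here's a back-of-the-envelope Python sketch; the PFC figures are representative assumptions (roughly Fluorinert-class), not vendor specs:

          # Volumetric heat capacity: water vs. a Fluorinert-type PFC.
          water_cp, water_rho = 4.18, 1.00   # J/(g*K), g/cm^3
          pfc_cp, pfc_rho     = 1.05, 1.78   # assumed: ~1/4 water's cp, but dense

          print(f"water: {water_cp * water_rho:.2f} J/(cm^3*K)")  # ~4.18
          print(f"PFC:   {pfc_cp * pfc_rho:.2f} J/(cm^3*K)")      # ~1.87
          # Density claws back some of the deficit, but per unit volume the
          # PFC still absorbs roughly half as much heat per degree as water.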
    • There go my dreams of building Vampire Supercomputers of Doom (TM).

      On the other hand, I have a newfound curiosity about what happens when you use FC-87 as a blood substitute... 30C, isn't it?

    • ...cause then the Fluorinert turns into mustard gas [google.com]
      • the Fluorinert turns into mustard gas

        Riiiight... a Google search for fluorinert "mustard gas" turns up a total of 10 hits; the ones that mention Fluorinert turning into mustard gas are from some thread on Beowulf clusters. And that claim was only made by one person... so I'm supposed to believe it?

        Well, I don't... Mustard gas contains sulfur and chlorine... in particular, it's C4H8Cl2S. On the other hand, Fluorinert is a fluorocarbon--it contains carbon, fluorine, and maybe hydrogen. There's no way you're going to get mustard gas from burning it (in air at least). In fact, Fluorinert is non-flammable. Good luck setting it on fire in the first place!

        That said, it does decompose into hydrofluoric acid and other stuff if you heat it too much, so I guess that would be a bad thing :) But it's not mustard gas.

        Here's the MSDS for FC-71: http://www.sisweb.com/referenc/articles/fc71.htm [sisweb.com] and FC-5311: http://www.sisweb.com/referenc/articles/fc5311.htm [sisweb.com]

        Thanks for the FUD, Bob Glamm [beowulf.org]!

  • by geewiz45 ( 310903 ) <geewiz45@ySTRAWahoo.com minus berry> on Sunday February 10, 2002 @12:11AM (#2981167) Homepage
    Who needs that? You should see where I work. There's dust on some of the PCs that is older than me.
    • Interesting... the only PC older than me is the one that IBM released in August 1981. =)
      • Interesting... the only PC older than me is the one that IBM released in August 1981. =)

        Hehe, I have one of them. Got it new, in September '81, when I was five. Every time I look at it, I wonder how cool it would be to drop an Athlon in it.

        It would look cool as hell, but I would have to butcher it....
  • I went to the Museum of American History, and they have an Information Science section in the basement. I saw a book there that I used as an undergrad: McCracken's _Fortran IV Programming_. Is it too early to relegate that to history, or am I getting old?
    • Don't feel bad, that book is still sitting on my bookshelf. It was the first computer programming book that I bought.
    • Aaargh! I have that book! I had that as a college text! You say it's in a museum! Aaaargh!

      I'm going to go now and see if there's any openings at the retirement home. There at least I can gum my jello and scribble in my Fortran Coloring Book and read all about Grandma's Drawers...
    • The Smithsonian made at least one mistake which I pointed out here [vrml3d.com] and later backed up with photos. [vrml3d.com]

      I visited the museum several months after the pictures were taken, and the mistake was still there. There was not even a piece of tape over it or anything. Now, I realize nobody likes to see tape over a sign in a museum, but are facts important or are they only concerned about appearances?

      Then of course there is the possibility that Intel donated a lot of money to the museum. The Smithsonian has had some corporate entanglements lately. I'm as much for small government as anybody, but don't do it halfway. Either fund the museum fully so that it doesn't have to have corporate logos all over it, or totally defund it. If you visit the museum, you will see heavy involvement from the Discovery Channel and some other sponsors.

      Then of course there is that other bit where they had a poll in the museum that was supposed to track approval ratings for politicians among museum visitors. I visited that a few times during the Clinton administration, and the numbers for Democratic leaders were all suspiciously pegged at 54%, as if they had throttled negative responses.

      The fall in prestige and other management issues at the Smithsonian are well documented. The bottom line? I don't trust the Smithsonian. Sad but true.

  • The core memory thing was really interesting. The computer I used to maintain in the Navy ran on core memory. It weighed a ton and ran slower than molasses uphill in a blizzard, but it NEVER failed. It was a great technology for the time.
    • I used to work with some computers that had been designed for the U.S. Navy. One neat feature was a front panel switch labelled "Battle Short". Supposedly this disabled all of the circuit breakers, for situations where a working fire control computer was more important than some fried wiring.
  • For anyone wondering about the 'dept' line of this story, here's the missing [?] [everything2.com]
  • by maggard ( 5579 ) <michael@michaelmaggard.com> on Sunday February 10, 2002 @12:43AM (#2981237) Homepage Journal
    Is it possible the general public doesn't care about these machines?

    Is it possible the public doesn't care? Yes.

    Witness "The Computer Museum". Originally Gordon Bell's private collection (as in the VAX guy then Nat'l Science Foundation) it was formalized into "The Digital Computer Museum" - Digital as in DEC.

    Later, when folks at other companies became leery of donating gifts to their competitor's in-house museum, it was spun off into the space built for a transportation museum and became "The Computer Museum". Gordon's wife Gwen took a leadership role, DEC donated lots of support, and the place went... nowhere.

    It was a good try. Gwen's vision had always been halls of gray boxes through which the movers and shakers of the industry would make pilgrimages, fund through philanthropic impulse, perhaps hold power lunches at a cafe. Instead the bread-and-butter reality of school groups and tourists finally prevailed, and more "friendly" exhibits (e.g. the Walk-Through Computer) were installed as budgets permitted.

    However, it was too little, too late. The costs of running a museum were high, lots of local computer groups were themselves failing (many of them burning the TCM along the way), and the place never really found its feet. The great hope, the sexy Java-based virtual fish tank teaching all sorts of interesting theories, took a few million bucks in donations and produced a pretty but mostly incomprehensible exhibit that, frankly, tanked the place.

    The programs went to Boston's Museum of Science, the collections out to the west coast branch where they were reincarnated as what we see today.

    So - do folks care? I doubt it.

    • There's a limited set of folks that are willing to pay to see collections of old boxes.
    • There are a limited number of folks willing to come and see tomorrow's-technologies-today, even in Silly Valley, and these are by definition short-lived exhibits requiring constant rotation.
    • There is a limited amount of corporate sponsorship possible before the place loses credibility and appears to (or does) sell out and become a big sales gimmick.
    • There's a limited amount of government monies out there for museums, historical objects, and research.

    Museums cost. Yes, I know that to folks on the outside they look nice and simple, but they're not. What you see on the floor is usually far less than 1/10th of a collection: a collection that requires high-quality (museum-quality) storage if you're to treat it right for the ages. It requires research and documentation and maintenance. It requires insurance and access and continual expansion if it is to keep up. The public facilities themselves need to be maintained and insured and secured and managed. The exhibits must be maintained and updated and replaced on a regular basis if you want folks to ever come back. Grants need to be applied for, and marketing has to happen to get the word out and keep folks coming. Staff need to be paid, as does support for research programs and visiting scholars. Then there are the daily school groups and tour groups and regular private rentals and public special events, etc.

    This all takes a lot of money and widespread support, particularly if you're to do a good job and respect the trust your collection represents.

    Yes, hope springs eternal, but we've already been here once. Yes, it's Silly Valley, and there's a dearth of tourist facilities and lots of folks who did get rich on dotcoms and are looking for something they can sponsor. But Boston had its own computer folks and more students than you can shake a stick at. Six of one, half a dozen of the other, and it still doesn't add up to enough. I've fond memories of TCM and wish TMHC the best, but feel ultimately this sort of project is too big for a stand-alone institution while serving too small a niche; other institutions like the Smithsonian, with larger budgets, stronger research programs, and their own collections, with more facilities & more services, would do better.

    But hey, the second time is always better.

    The author was involved in the setting up of Computer Place at Boston's Museum of Science. Later I became a manager at The Computer Museum. Afterwards I went back to school and from there into industry, though I've always kept up with much of The Computer Museum news from a distance. If we met there, drop me a line.

    • ...back when it still existed, and the staff you saw on the floor was only a small fraction of the total staff: tons of marketers, sales, HR, office support, all sorts of people who probably didn't add much to the museum but took home a salary.
      Also, most of the museum's exhibits never advanced past the '80s. One of the interactive displays informed people that most home computers ran at 2-4 MHz, and the "virtual reality" section featured a "Hard Drivin'" arcade game, made by Atari, which was quite hard... and the only visitor to ever beat the first race was a little girl who came as part of a school group.
      That's my biggest memory of the computer museum.

    • (* Museums cost. Yes, I know that to folks on the outside they look nice and simple, but they're not. What you see on the floor is usually far less than 1/10th of a collection: a collection that requires high-quality.... *)

      I realize dealing with old equipment can be a handful, but why not have a good "Software Museum" *on* the web?

      I tried to find some info about pre-relational database query languages (besides IMS stuff), but came up short beyond a few acronyms mentioned.

      You don't need buildings for a software museum. (Well, maybe some of the manuals are paper, of course, but find a cheap warehouse in Wyoming or something. Otherwise, scan them in and put them online.)
      • by Anonymous Coward
        Actually I'm horrified to see I mis-typed: 1/10th of a collection is "on the floor" - usually it's anywhere from 1/100th to 1/1000th or even less. Furthermore, typically there will be more than a single copy of a whatsit in storage if possible, with the best one out on display if one is out on display, permanent or rotating.

        As to a software museum, actually there is a bit of a collection of that. I recall at TCM loading up SpaceWar from paper tape (one of my job skills was being able to read paper tape - it's not difficult once one gets the hang of it and has the symbols committed to memory) onto the PDP-1; there were other choice bits like that.

        However, there are more issues than just putting the code out there for folks to look at. First off, much of it is encumbered by license/copyright, so that would need to be resolved. Next is simply getting it off of whatever medium it is on - the loss of readers is well known in the field. Then of course there are the format and OS version questions ("what *is* this string of binary?"). Documentation is typically either a storage/indexing issue or simply non-existent, or in its own difficult-to-decipher format.

        It's a great idea, and it would be wonderful to approach many of the pioneers in the field still extant and ask them for material regarding their discoveries and first applications, along with examples. However, to simply put out for the world material of unclear provenance would be unethical, and the sort of tarpit any museum would steer well clear of (the nightmare of war treasure and stolen artifacts is a well-recognized one).

        Posting "orphanware" Atari-cartridges on semi-legit websites is far different then a scholarly presentation of code evolution and diversity.

        However, it is more than likely TCHM or others are doing something of the sort, or would welcome support in such a project. While I listed a number of issues, they're no more than museums of, say, Television or Rock & Roll have faced; certainly they're surmountable, and I agree, they're going to be important.

        • (* However, there are more issues than just putting the code out there for folks to look at. First off, much of it is encumbered by license/copyright, so that would need to be resolved. *)

          It is odd that companies would be protective of manuals and programming languages from the 1950s.

          However, I guess it is the polite thing to do to ask.

          BTW, I saw nothing on their website about E. F. Codd.
    • I also worked at TCM in the early '90s (and helped bring the above author over from the MOS). It was a wonderful, eclectic collection of artifacts (at one point, I could do the historical tour with Whirlwind and SAGE in my sleep). I also worked on the Walk-Through Computer, and I helped behind the scenes for the first two Computer Bowls (seen now on Stewart Cheifet's 'Computer Chronicles'). I mourn the loss of the old place on Congress Street.
  • The site does not describe computing before the 1900s. But there were ancient computing devices that deserve recognition. Many ancient computing devices had no moving parts, so they could not be easily identified as machines. This shows how advanced they were for their time. Stonehenge [sbc.edu] is a great example.

    Some links

    Two timelines, here [digitalcentury.com] and here [cyberstreet.com], which date well back into B.C.

    There is even an ancient Greek clocklike machine over two thousand years old that can be found here. [giant.net.au]

    For those who want links to every type of computing, [warbaby.com] even modern. [thinkquest.org]
    • I was noticing this too. Blaise Pascal and Ada Lovelace did all their work before 1900.

      Weren't there also French looms in the early 1800s that ran from punch cards?

      The timeline they have up doesn't even start until 1945. Maybe they should rename it the "Digital" Computing Museum?

      You gave some great links there! Thanks!
  • sigh (Score:4, Insightful)

    by Bastian ( 66383 ) on Sunday February 10, 2002 @12:56AM (#2981257)
    Looking at this stuff, most of which was created before I was born, I can't help but feel a twinge of remorse. It seems the Golden Age of computer geekdom died with the Information Age.

    Things were really hopping in computers a few decades ago. High-level languages, GUIs, timesharing systems, networking... it was all new and exciting. It seems, though, that the wave has broken. Nothing new seems to have happened since the early '90s, when the WWW was first envisioned.

    I catch myself sitting in classes with other CS majors who have never really learned DOS and are awed by and afraid of OpenGL, thinking back to high school when I spent my free time programming cute little VGA hacks in x86 assembly language, and I can't help but feel a twinge of superiority based on some unfounded feeling that I have touched the machine itself and they have not.

    Then I go to a museum like this, take a look at what my elders were working on, and realize that I am the small fish. The magi have played their part, and I have a feeling of dread that the field has more or less reached its plateau point.
    • load "*",8,1

      I totally understand this article's sentiment, and I'm not even that old. I remember when I was in third grade I went to a form of summer school, and learned to program on the Apple IIe on the side. I still remember the first real program I wrote, that is, the first one with more than 10 lines - a simple three-frame animation created by manually turning individual pixels on and off. I was really proud of that (remember, I was very young, and working on my own, so even though it wasn't much, it was hard for me).
      I still have it on a totally useless 5 1/4" disk, though I can't use it, and it's totally worthless and pathetic compared to today's stuff.

      ..Sigh..

      I don't even think I use most of the knowledge I learned from back then.

      Flowcharts, goto loops, and even BASIC have all been declared bad because they lead to obscure code, and even the interfaces (both command-line and within the major programs - sort of GUI-ish) of my first two computers (school used the Apple IIe, while I used the Commodore 64 at home) have completely gone the way of the dinosaurs.
    • I completely agree with this, and I'm not even old either (20).

      The first means of Internet access my family had was via this textmode online service called Delphi. This was before there were any local ISPs in my area, at least that I knew about, and on top of that, our 2400bps modem was insufficient for anything other than textmode.

      Anyway, I was a moderate video game enthusiast, and one of the first websites I ever visited, at around 11:30pm, was www.nintendo.com. On Nintendo's site, which had to have been quite new at the time, there was a link to some kind of "About" page. Nothing special, except for the way it "hit" me. There was something about knowing that the characters on my screen had originated from Seattle, thousands of miles away, that just made me go "wow." On the page was some kind of random quip like, "the food at Café Mario is top notch," and it just made me stop and think: What did Café Mario look like? Would the lights be on at this time of night? What kind of food did they serve?

      I suppose, in short, the web, or the WWW as it was usually called then, served as the ultimate antithesis to the relative isolation I was so used to. (Or something.)

      I remember my first 28.8 modem, with my first PPP link to an ISP (this was right after SLIP fell out of favor, around the time when 8MB of RAM cost $300). My ISP gave me Mosaic to use as my web browser. Shortly thereafter, I went to download a new web browser... I think it was called "Netscape" or something. This was back when Netscape was considered really cool. Remember how the N used to animate differently than it does now? It was cooler then. Remember how... okay, nevermind.

      I could go on even more about the first time I actually chatted with people over the Net ("WOW!! REAL... PEOPLE!! They can hear me! They're TALKING to me! Ahhhh!"), and how now I don't even think twice about it. As could others, I'm sure.

      Alex

      • Hah! My first modem was a 300 baud coupler. First game I ever bought was Incunabula. First computer that was *mine*, as opposed to a roommate's or school's, was an 8086. I learned to program under 4.2BSD. I remember the days before spam.

        So you can see I'm really not that old... I'm a youngster compared to most geeks.
    • Re:sigh (Score:2, Insightful)

      The Magi are still there; the results of their creations are just physically smaller. The kind of engineering that goes into the high-end boxes from all the major vendors today, and the boutique ones too, is on par with what went into the high-end boxes of yesteryear.
    • I personally think the best time to learn computers was the '70s; there seems to be so much folklore based on that era, from the likes of Seymour Cray, PDPs, IBM, et al. These days we have (generally) anonymous beige boxes with commodity parts, which doesn't leave much room for the folklore engendered in those days.

      Despite missing these things, I'm like you in some ways: my first computing experiences predated the WWW, and I've done some assembly (on a Speccy, though). I worked in the computing department of a university, and some of the things I saw were worrying, not least the fact that the only language many of them were taught was Java. I'm waiting for a new era of buffer overruns and memory leaks to happen, simply because they've never been taught how memory works. Hell, I learned C/C++ at uni, and despite my being fairly determined not to let them through, one of my programs had a small (5 bytes every second or so, IIRC, but it adds up) memory leak that went unnoticed for several months.

    • ...to have used some of the stuff in the museum, including two separate machines with iron core RAM, one of the first acoustically-coupled 300 baud modems ever offered for retail sale, paper tape as a storage medium, and Teletype as user interface. And the author is right. Things suck now.

      In the '70s and early '80s, anybody with the time to spare could write a really fine application, comparable to anything offered by the SW industry, and a lot of them did, and their wares were superb. The C64, Apple II, TRS-80, and even the early PC often hosted apps written by little guys and marketed by small shops.

      Ironically, it was the Mac that killed this. The GUI was so complicated that it could be programmed only with an expensive kit and a lot of experience. (Oddly, the Amiga overcame this but Apple killed it in court.) When Micro$oft followed Apple's lead with Windoze, the door was closed on small, clever apps. Nobody would understand or know how to use them any more unless a huge overhead of (usually useless) functionality was supported.

      In the 1980s some of the most popular games ever written were written by individuals, either working for companies like Atari or on their own. Even the big companies would have individuals or teams of 3-4 people writing a project. Now, writing a game or office utility is like directing a movie. There are far too many details for an individual to take care of, so a massive hierarchically organized team is needed just to finish the project; and there is no central vision, no personal sense of pride in the people who actually pound out the code. And we wonder why software breaks nowadays.

      It may be quixotic, but you can still go back: stella@biglist.com is the mailing list for retrogamers programming the Atari 2600. Check 'em out. Read the archive and links and marvel at how our ancestors managed with 128 bytes (no K or M there, 0.125K) of RAM and 4K of ROM.

  • ...and this won't be considered offtopic...

    The History of the Slashdot World
    From a mailing list written by Seth


    2.5 million B.C.: OOG the Open Source Caveman develops the axe and releases it under the GPL. The axe quickly gains popularity as a means of crushing moderators' heads.

    100,000 B.C.: Man domesticates the AIBO.

    10,000 B.C.: Civilization begins when early farmers first learn to cultivate hot grits.

    3000 B.C.: Sumerians develop a primitive cuneiform perl script.

    2920 B.C.: A legendary flood sweeps Slashdot, filling up a Borland / Inprise story with hundreds of offtopic posts.

    1750 B.C.: Hammurabi, a Mesopotamian king, codifies the first EULA.

    490 B.C.: Greek city-states unite to defeat the Persians. ESR triumphantly proclaims that the Greeks "get it".

    399 B.C.: Socrates is convicted of impiety. Despite the efforts of freesocrates.com, he is forced to kill himself by drinking hemlock.

    336 B.C.: Fat-Time Charlie becomes King of Macedonia and conquers Persia.

    4 B.C.: Following the Star (as in hot young actress) of Bethlehem, wise men travel from far away to troll for baby Jesus.

    A.D. 476: The Roman Empire BSODs.

    A.D. 610: The Glorious MEEPT!! founds Islam after receiving a revelation from God. Following his disappearance from Slashdot in 632, a succession dispute results in the emergence of two troll factions: the Pythonni and the Perliites.

    A.D. 800: Charlemagne conquers nearly all of Germany, only to be acquired by andover.net.

    A.D. 874: Linus the Red discovers Iceland.

    A.D. 1000: The epic of the Beowulf Cluster is written down. It is the first English epic poem.

    A.D. 1095: Pope Bruce II calls for a crusade against the Turks when it is revealed they are violating the GPL. Later investigation reveals that Pope Bruce II had not yet contacted the Turks before calling for the crusade.

    A.D. 1215: Bowing to pressure to open-source the British government, King John signs the Magna Carta, limiting the British monarchy's power. ESR triumphantly proclaims that the British monarchy "gets it".

    A.D. 1348: The ILOVEYOU virus kills over half the population of Europe. (The other half was not using Outlook.)

    A.D. 1420: Johann Gutenberg invents the printing press. He is immediately sued by monks claiming that the technology will promote the copying of hand-transcribed books, thus violating the church's intellectual property.

    A.D. 1429: Natalie Portman of Arc gathers an army of Slashdot trolls to do battle with the moderators. She is eventually tried as a heretic and stoned (as in petrified).

    A.D. 1478: The Catholic Church partners with doubleclick.net to launch the Spanish Inquisition.

    A.D. 1492: Christopher Columbus arrives in what he believes to be "India", but which RMS informs him is actually "GNU/India".

    A.D. 1508-12: Michelangelo attempts to paint the Sistine Chapel ceiling with ASCII art, only to have his plan thwarted by the "Lameness Filter."

    A.D. 1517: Martin Luther nails his 95 Theses to the church door and is promptly moderated down to (-1, Flamebait).

    A.D. 1553: "Bloody" Mary ascends the throne of England and begins an infamous crusade against Protestants. ESR eats his words.

    A.D. 1588: The "IF I EVER MEET YOU, I WILL KICK YOUR ASS" guy meets the Spanish Armada.

    A.D. 1603: Tokugawa Ieyasu unites the feuding pancake-eating ninjas of Japan.

    A.D. 1611: Mattel adds Galileo Galilei to its CyberPatrol block list for proposing that the Earth revolves around the sun.

    A.D. 1688: In the so-called "Glorious Revolution", King James II is bloodlessly forced out of power and flees to France. ESR again triumphantly proclaims that the British monarchy "gets it".

    A.D. 1692: Anti-GIF hysteria in the New World comes to a head in the infamous "Salem GIF Trials", in which 20 alleged GIFs are burned at the stake. Later investigation reveals that many of the supposed GIFs were actually PNGs.

    A.D. 1769: James Watt patents the one-click steam engine.

    A.D. 1776: Trolls, angered by CmdrTaco's passage of the Moderation Act, rebel. After a several-year flame war, the trolls succeed in seceding from Slashdot and forming the United Coalition of Trolls.

    A.D. 1789: The French Revolution begins with a distributed denial of service (DDoS) attack on the Bastille.

    A.D. 1799: Attempts at discovering Egyptian hieroglyphs receive a major boost when Napoleon's troops discover the Rosetta stone. Sadly, the stone is quickly outlawed under the DMCA as an illegal means of circumventing encryption.

    A.D. 1844: Samuel Morse invents Morse code. Cryptography export restrictions prevent the telegraph's use outside the U.S. and Canada.

    A.D. 1853: United States Commodore Matthew C. Perry arrives in Japan and forces the xenophobic nation to open its doors to foreign trade. ESR triumphantly proclaims that Japan finally "gets it".

    A.D. 1865: President Lincoln is 'bitchslapped.' The nation mourns.

    A.D. 1901: Italian inventor Guglielmo Marconi first demonstrates the radio. Metallica drummer Lars Ulrich immediately delivers to Marconi a list of 335,435 suspected radio users.

    A.D. 1911: Facing a break-up by the United States Supreme Court, Standard Oil Co. defends its "freedom to innovate" and proposes numerous rejected settlements. Slashbots mock the company as "Standa~1" and depict John D. Rockefeller as a member of the Borg.

    A.D. 1929: V.A. Linux's stock drops over 200 dollars on "Black Tuesday", October 29th.

    A.D. 1945: In the secret Manhattan Project, scientists working in Los Alamos, New Mexico, construct a nuclear bomb from Star Wars Legos.

    A.D. 1948: Slashdot runs the infamous headline "DEWEY DEFEATS TRUMAN." Shamefaced, the site quickly retracts the story when numerous readers point out that it is not news for nerds, stuff that matters.

    A.D. 1965: Jon Katz delivers his famous "I Have A Post-Hellmouth Dream" speech, which stated: "I have a dream that one day on the red hills of Georgia the geeks of former slaves and the geeks of former slave geeks will be able to sit down together at the table of geeks... I have a dream that my geek little geeks will one geek live in a nation where they will not be geeked by the geek of their geek but by the geek of their geek."

    A.D. 1969: Neil Armstrong becomes the first man to set foot on the moon. His immortal words: "FIRST MOONWALK!!!"

    A.D. 1970: Ohio National Guardsmen shoot four students at Kent State University for "Internet theft".

    A.D. 1989: The United States invades Panama to capture renowned "hacker" Manuel Noriega, who is suspected of writing the DeCSS utility.

    A.D. 1990: West Germany and East Germany reunite after 45 years of separation. ESR triumphantly proclaims that Germany "gets it".

    A.D. 1994: As years of apartheid rule finally end, Nelson Mandela is elected president of South Africa. ESR is sick, and sadly misses his chance to triumphantly proclaim that South Africa "gets it".

    A.D. 1997: Slashdot reports that Scottish scientists have succeeded in cloning a female sheep named Dolly. Numerous readers complain that if they had wanted information on the latest sheep releases, they would have just gone to freshsheep.net

    A.D. 1999: Miramax announces Don Knotts to play hacker Emmanuel Goldstein in upcoming movie "Takedown"
  • by garcia ( 6573 ) on Sunday February 10, 2002 @01:01AM (#2981266)
    History degree w/a CS minor?

    How about a dork that collects stupid old computers b/c he has no money to buy up-to-date fast ones?

    Please?
  • The original Computer Museum was in Boston. It merged with the Boston Museum of Science, and since then the collection of items has moved around the country.

    See details here [mos.org]

    Which is really sad, since it was a really great place. The huge collection of old stuff they had is now out at Moffett Field, as seen at http://www.computerhistory.org [computerhistory.org]

    I was sad to see that stuff go.

    • I had no idea that the Computer Museum in Boston was gone. That's sad. I'd been through there years ago, and assumed it was still there. I suppose that when the Route 128 tech corridor tanked, their source of funding dried up.

      They had most of the early robots, ones of major historical importance like Shakey and the Hopkins Beast. I hope they've been preserved.

  • Hey! Where's the flood [computerhistory.org]?????????

    Black wingtips, white socks and hairy ankles. Dad?
  • Tom Carlson has been running this place for a few years. Fun stuff, looking around his old systems. http://www.obsoletecomputermuseum.org/

    The obsolete Computer Museum! [obsoleteco...museum.org]
  • by Charles Dodgeson ( 248492 ) <jeffrey@goldmark.org> on Sunday February 10, 2002 @01:20AM (#2981298) Homepage Journal
    The timeline starts only at 1945. That misses things like Colossus [codesandciphers.org.uk], which is a decent candidate for the first electronic programmable (UTM) computer.
  • Colossus (Score:3, Informative)

    by Decimal ( 154606 ) on Sunday February 10, 2002 @01:24AM (#2981306) Homepage Journal
    It's a shame that some of the first 'real' computers, used in WWII to decrypt German communications, were destroyed after the war ended. They were known as Colossus, and there were at least two of them.
  • From episode [3F20] Much Apu About Nothing [snpp.com]

    [in the late '70s]
    [Frink stands in front of a huge mainframe]
    Frink: Well, sure, the Frinkiac-7 looks impressive. [to student] Don't touch it! [back to class] But I predict that within 100 years computers will be twice as powerful, 10,000 times larger, and so expensive that only the five richest kings in Europe will own them.
    Apu: Could it be used for dating?
    Frink: Well, technically, yes, but the computer matches would be so perfect as to eliminate the thrill of romantic conquest. Ha-ho-ha-hey-hoo.
  • I've been enjoying the Steven Levy book "Hackers," which is, of course, about the early days of hacking and the "Hacker Ethic": software hacking courtesy of MIT and hardware hacking courtesy of Stanford/Berkeley. Excellent book to go along with the computing history thread here. I've been reading about the Altair, which sounds a lot like the Honeywell computer, only 10 years after (1975 or so). Same idea: switches for input, blinkity-blinking lights for output. Makes one long for the simple days of the Tech Model Railroad Club, Community Memory, and the People's Computer Company.

    An appropriate quote from Les Solomon, who was editor of Popular Electronics and provided the world with the first glimpse of the Altair:

    The computer is a magic box. It's a tool. It's an art form. It's the ultimate martial art...There's no bullshit in there. Without truth, the computer won't work. You can't bullshit a computer, God damn it, the bit is there or the bit ain't there.
  • There's also The Tech [thetech.com] museum in San José, California. It has many exhibits, not all computer related, but all fascinating. Last I was there, they had a huge exhibit on IC fabrication, some things on robotics, and a whole mess of interesting stuff in between. If you're in the area, and dig computer history stuff like this, it's a cool place to spend an afternoon.
  • Tinkertoy computer (Score:3, Interesting)

    by sunhou ( 238795 ) on Sunday February 10, 2002 @02:01AM (#2981370)
    One interesting exhibit which I believe used to be in the Computer Museum in Boston was the Tinkertoy Computer, built by Danny Hillis (of Thinking Machines Corp.). It was a computer made of Tinkertoys, capable of playing tic-tac-toe. I suppose some Slashdot folks will say he should have used Lego blocks...

    I couldn't find any photos of it; just a tiny little blurb [computerhistory.org] here. I think A.K. Dewdney's Scientific American column in October 1989 also talked about it. Dewdney also used it as the title of one of his books collecting his columns.
  • by cr0sh ( 43134 ) on Sunday February 10, 2002 @02:15AM (#2981389) Homepage
    Today I went out "garage saleing" - and managed to find a couple of "interesting" books:

    The Secret Guide to Computers - Vol 1

    and

    The Secret Guide to Computers - Vol 2: Deep Secrets

    Both are of the 11th edition, written and published by Russ Walter - no ISBN, because in Russ' own words: "Ha ha ha! You think this book is standard?"

    These books are weird, and wonderful at the same time - they have strange "rainbow" colored covers, and the introduction in the first volume starts out with the line "Computers are like drugs: you begin by spending just a little money on them, but then you get so excited by the experience-and so hooked-that you wind up spending more and more money, to feed your habit."

    It takes the reader through an introduction to programming, microcomputers, a bit of computer history, language history (listing some languages and origins I didn't even know about - and I collect this kind of info!) - you name the topic, and if it is from the early '80s and prior, it is in there. There is a wonderful section on computer "art", with crude black-and-white "photos" of early computer line drawings - including a series of Ivan Sutherland's "Aircraft Carrier Landing Simulator" - 3D graphics from the late '60s and early '70s!!!

    What is even more strange about the books is the amount of background info they give on the histories of various companies involved in microcomputers - plus info on the micros themselves (once again, if it existed, it is in the book - CP/M even features pretty prominently). It gets even more strange - vague and not-so-vague references to sex, etc., scattered throughout the book: In the section on Russ' version of assembly language (his own creation), the opening section title is "SEXY ASS" - I kid you NOT (watch the lameness filter catch that). That section details what he terms "Simple EXcellent-for-You ASSembler" - then goes on to "teach" how to use this variant of assembler...

    He has another language called "EASY"...

    How rare (or common?) was this set of books? I have never seen another copy (as opposed to David Ahl's BASIC Games series, of which I have seen numerous copies). Has anyone else come across it?

    I couldn't resist buying it - and at a quarter per book (oooh, a whole 50 cents!) - it was MINE...
    • These are great books, btw. I've got a copy called "The Secret Guide to Computers" - no volume number mentioned, but it is the 14th edition apparently...

      The ISBN for it is 0-939151-14-6, but it's not printed inside the book anywhere.
      Apparently, they're still being printed too; the 27th (!) edition just came out. -- http://www.angelfire.com/nh/secret/ [angelfire.com]

      • For a book so old (for personal computing, it is pretty damn old - its first printing, edition 0, was in 1972) to still be in print is impressive - and it's still cheap as well, almost as cheap as before, only a few dollars more.

        Thanks for the update - I have got to email this guy...
    • Probably about 1990 or so, perhaps earlier...

      I think by the time I had this book I was already interested in computers, but it definitely helped me choose programming as a career. I really liked the great overview it had of various languages, like COBOL.

    • "Russy-poo" (as he bills himself) has just published the 27th edition of The Secret Guide to Computers, available at amazon.com and elsewhere. His web site is at http://www.angelfire.com/nh/secret/ and he still answers questions if you phone him at home.
      BTW, he will sell you a copy of the original 11th edition of TSGTC (probably the same one you bought) for 30 cents plus a dollar shipping, so you saved $1.05 by picking it up at a garage sale! :)

  • I think my favourite part of the computer museum's website is their inclusion of music performed by an old IBM 1403 printer.

    Apparently, some engineers managed to find the right lines of characters to produce printer noise of a known pitch. By feeding in punch cards appropriately, they were able to produce output which would form certain tunes. They have some recordings available for download here [computerhistory.org].

  • by buckrogers ( 136562 ) on Sunday February 10, 2002 @06:35AM (#2981721) Homepage
    It would be interesting to have a CD-ROM with a complete history of every computing device, and a simulation or emulation of each one, complete with pictures of the machines, the design team, and full technical specifications.

    The people who did all the design work would all be talked about too. And any publicly available writings would be there too.

    How cool would it be to browse a history book on computers and actually be able to bring up an emulator of the machine in question? Aren't there emulators for most of these things anyway?

    Use a very general-purpose emulator that reads a specification file fully describing the complete machine. The specification file could be in XML so that other programs can mine the config files for data - like a program that will tell you the code for the letter A on every computer ever made, or tell you every computer ever made that used ASCII.
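
    As a rough Python sketch of that mining idea - the XML layout and field names below are invented for illustration, not any real emulator's schema:

        # Query hypothetical XML machine specs for the code of the letter 'A'.
        import xml.etree.ElementTree as ET

        SPECS = """
        <machines>
          <machine name="IBM System/360" charset="EBCDIC">
            <char glyph="A" code="193"/>
          </machine>
          <machine name="PDP-11" charset="ASCII">
            <char glyph="A" code="65"/>
          </machine>
        </machines>
        """

        root = ET.fromstring(SPECS)
        for m in root.findall("machine"):
            a = m.find("char[@glyph='A']")
            print(f"{m.get('name')} ({m.get('charset')}): A = {a.get('code')}")

        # Every machine that used ASCII is just another query:
        print([m.get("name") for m in root.findall("machine")
               if m.get("charset") == "ASCII"])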
  • From http://www.computerhistory.org/timeline/topics/computers.page [computerhistory.org]

    I found "John von Neumann" twice...

    Is he related to Sven Neumann of GIMP?
  • I have a copy of Microsoft Windows 1.0 on 5.25" Floppies! It's fully functional, with manual!
    • > I have a copy of Microsoft Windows 1.0 on 5.25" Floppies! It's fully functional, with manual!

      Wasn't it called "Presentation Manager for DOS" back then? I seem to remember it was PMD then Windows 286, then Windows 3.0, after which everyone knows the numbering.
