65 Years Ago, Manchester's 'Baby' Ran Electronically Stored Program
hypnosec writes that the first ever practical implementation of the stored program concept took place 65 years ago, "as the Manchester Small Scale Experimental Machine aka 'Baby' became the world's first computer to run an electronically stored program on June 21, 1948. The 'Baby' was developed by Frederic C. Williams, Tom Kilburn and Geoff Tootill at the University of Manchester. 'Baby' served as a testbed for the experimental Williams-Kilburn tube – a cathode ray tube that was used to store binary digits, aka bits. The reason this became a milestone in computing history was that up until 'Baby' ran the first electronically stored program, there was no means of storing and accessing this information in a cost-effective and flexible way."
ObSwayze... (Score:1)
Nobody programs baby in the corner!
Opportunity missed (Score:2)
Re:Opportunity missed (Score:5, Interesting)
In one sense it wasn't missed. Machines like EDSAC and LEO followed shortly afterwards but the US had a booming economy by comparison and it was a lot easier for US businesses (with the much larger internal market as well) to grow big on the back of that.
Re: (Score:3)
I suspect the UK also didn't go as far because of the sensitivities surrounding anything derived from or connected to the Bletchley Park Enigma work (which is where many of the early British computing pioneers, and much of the early work, came from).
Re:Opportunity missed (Score:5, Informative)
Computing in the UK really had a head start on the US in many ways, but in usual form it was underfunded and lacked vision; in many ways it suffered from the 50's post-war glow that "Britain Will Always Be Great". Once the Americans got in on the act they of course wiped the floor with everyone, and then socialist government meddling in the 60's just about finished off any hope of the UK companies being able to fight back.
Re: (Score:2)
in many ways it suffered from the 50's post-war glow that "Britain Will Always Be Great". Once the Americans got in on the act they of course wiped the floor with everyone,
Isn't this pretty much the story of all great nations? They get into the habit of acting like #1 and before you know it, they're nothing but #2.
America, fuck yeah!
What do they say in China?
Re: (Score:1)
Being #2 keeps you out of gun-sights. Don't knock it.
Re: (Score:2)
The same thing is happening with graphene now. We always fail to capitalize on our inventions.
Re: (Score:2)
Are allotropes, strictly speaking, inventions?
Re:Opportunity missed (Score:4, Informative)
Computing in the UK really had a head start on the US in many ways, but in usual form it was underfunded and lacked vision;
There was a considerable amount of important computer work done in the UK in the early years. For example, when considering Manchester's contributions one shouldn't overlook the pioneering work done with Atlas [wikipedia.org]. But there is far more than that. In some cases you can trace the path of key developments we rely upon today, or that most people have probably at least heard of, to things developed in Britain through some familiar names.
A notable example is the computer language, "BCPL [wikipedia.org]", developed by Dr. Martin Richards [cam.ac.uk] at Cambridge in 1966. Dennis Ritchie ported BCPL to Multics. Ken Thompson and Dennis Ritchie used BCPL on Multics and from it derived the language "B" [wikipedia.org]. Some early Unix utilities were written in the BCPL derivative B. After additional rework of B, it became C, the heart of the Unix system. And of course C has led to the widely used derivatives C++ and Objective C.
BCPL was also used by Dr. Richards to develop the portable Tripos [wikipedia.org] operating system, which was used on a variety of minicomputers. As microprocessors became ever more powerful and started forming the basis for capable personal computers, Tripos was eventually selected to become the heart of the Amiga's AmigaDOS [wikipedia.org] operating system.
BCPL has been available on many systems with familiar names, including (reportedly) the Raspberry Pi.
Classic BCPL [nordier.com]
To anyone interested in the whys and wherefores of C, a passing acquaintance with BCPL is worthwhile. Viewed forwards through BCPL, rather than backwards through Java and C++, many C constructs, and idiomatic C ways of doing things, just make a lot more sense.
Beyond its historical importance, BCPL had intrinsic merits. In retrospect, what particularly impresses, is the elegant simplicity of its compiler. This is well documented in the book BCPL: the language and its compiler by Martin Richards and Colin Whitby-Stevens (Cambridge: Cambridge University Press, 1979). -- more [nordier.com]
BCPL: A tool for compiler writing and system programming [computer.org]
THE PROGRAMMING LANGUAGE B [bell-labs.com]
The Development of the C Language [bell-labs.com]
Re: (Score:2)
The 50th Anniversary of the Atlas computer [man.ac.uk]
Re: (Score:3)
Not at all. These were not classified; they were developed in open research environments. The plans for Colossus were destroyed.
EDSAC was inspired by a trip to the US and a lot of what was developed came from the US originally.
Re: (Score:3)
Not exactly the whole truth: during the war, computing ideas were shared between Bletchley Park, whose interest was in language-related work, and Los Alamos, whose interest was numerical computing. There were many transatlantic trips, and knowledge was shared.
After the war, the UK hid all its knowledge for security reasons. In the US, the knowledge was used for commercial profit. It's a cultural difference.
Re: (Score:2)
Not true. The greater decline in manufacturing came under Blair and Brown. Over Mrs Thatcher's time as PM, manufacturing fell from 25.8 per cent of the economy to 22.5 per cent. Under Blair/Brown, manufacturing accounted for more than 20 per cent of the economy in 1997, the year Labour came to power; by 2007 that share had declined to 12.4 per cent.
Those are ONS figures.
Re: (Score:2)
Re: (Score:2)
A classic example of that being the invention of the RSA cipher by a guy at GCHQ. It was locked in a drawer for fear that it would be more use to the Russkies.
Re:Opportunity missed (Score:4, Informative)
Leo was developed by Lyons, a food manufacturer/wholesaler/retailer. There's a very nice book about it, A Computer called Leo [amazon.co.uk].
Re:Opportunity missed (Score:5, Insightful)
The UK had a thriving computer industry even into the '80s. Companies like Sinclair did well in the home computer market and Acorn was selling desktops that ran a multitasking GUI very cheaply, with a lot of success in the home and schools markets. The decline started as the IBM PC gained prominence. The UK tech companies found it hard to export to the US, and didn't have as large a domestic market. Selling to mainland Europe required translations, so US companies were able to ramp up economies of scale that the UK firms couldn't match. The ones that were successful, such as ARM (an Acorn spin-off) and Symbian (a Psion spin-off), did so by selling through existing large companies that had an established supply chain.
One of the big problems with getting large multinational companies in the UK is that it's much harder for tech companies to do well on the LSE. A startup in the US wants to get to be worth a few hundred million and then IPO and continue to grow. A startup in the UK wants to get to be worth a few hundred million and then sell out to a big company. There are a lot of startups in the UK that make it to a market cap of a few million, but almost none that make it past a billion. A lot of this is due to different investor culture, rather than anything related to the people running the companies.
Re: (Score:2)
The UK tech companies found it hard to export to the US
Why?
Selling to mainland Europe required translations
Is that a big deal? Especially if you went for a few major languages, like German, first. I would think that European manufacturers would have been more used to the need for translations than American companies.
P.S. Wish I had mod points to bump up your post.
Re: (Score:2)
Early computers were large and delicate. Not a good combination if it needs to be shipped across the Atlantic.
Though the "not invented here" factor probably had more to do with it.
Re: (Score:2)
Early computers were large and delicate. Not a good combination if it needs to be shipped across the Atlantic.
If you can ship it by road or rail without problems, you can ship it by sea.
Though the "not invented here" factor probably had more to do with it.
Evidence? Or are you just indulging your prejudices again?
Re: (Score:2)
Pity these guys [wikipedia.org] weren't as smart as you, eh? I guess you know exactly which transatlantic railways go up and down with a 50 foot amplitude and which ocean freeways are more prone to sinking.
The mirror is over there ===>.
You need to stand back a bit (a mile should do) or it won't fit you in.
Re: (Score:1)
The problem with a lot of British computer companies was, as usual, lack of vision. LEO Computers was an offshoot from a bakers; the engineers themselves a
Re: (Score:2)
Delicate? Which of the three things Ike credited meets that description?
Re: (Score:2)
It may be that the cost of shipping to America made it uneconomic to sell into a country that had a native computer industry of its own.
In the 60's and 70's, selling from one country to another was not a widespread activity. People just did not expect to do it.
Re: (Score:2)
The UK tech companies found it hard to export to the US
Why?
Because, at the time, the US government would only buy from US tech companies, and most big businesses had their purchasing decisions strongly influenced by what government bought (often for interoperability reasons), which influenced small businesses (for the same reason). Marketing in the USA required a big budget to get national penetration and there wasn't an obvious place to start.
In contrast, a tech company in California could start selling locally and then just expand slowly into more states. Thei
Re: (Score:2)
The UK tech companies found it hard to export to the US
Why?
IBM
Selling to mainland Europe required translations
Is that a big deal? Especially if you went for a few major languages, like German, first. I would think that European manufacturers would have been more used to the need for translations than American companies.
P.S. Wish I had mod points to bump up your post.
It isn't a big deal. The dissolution of proprietary architectures is a natural process. It even occurred back then.
Re: (Score:2)
There's a difference between putting R&D on a back burner while your economy recovers and taking an axe to what you've got over manufactured concerns of national security, while the USA goes ahead and builds commercial versions of your work.
And then there's Canada (nobody starving up there in the 1950s) who had to chop up their prototype supersonic interceptor at Boeing's request.
Was it really only 65 years ago? (Score:5, Interesting)
Wow. It's easy to forget that the entire industry of programmable computers is younger than a lot of ordinary people walking around today. It makes me wonder what entirely new industry I might see develop from nothing over my lifetime.
Re:Was it really only 65 years ago? (Score:5, Funny)
Re: (Score:1)
Well, it won't be @&!$ flying cars
Re: (Score:1)
Why would anybody use Perl in a flying car?
Re: (Score:2)
Actually no, computers have been around far longer.
It's only been 65 years since the first ELECTRONICALLY STORED PROGRAM computer appeared.
Prior to this, computers existed, but the program they ran had to be set up beforehand by moving jumpers and other such things aro
Re: (Score:2)
Yeah? Well, when you've done it with gears (that you had to make yourself using a rusty tin-can lid and a blunt file) you can stand on my lawn, mmmkay?
Re: (Score:2)
Wikipedia is a source of information but it's not necessarily the definitive source of information.
Re: (Score:2)
...this machine isn't even mentioned in the Wikipedia computer entry, then? http://en.wikipedia.org/wiki/Computer [wikipedia.org]
Yes it is. It's in the section on "Stored-program architecture":
A number of projects to develop computers based on the stored-program architecture commenced around this time, the first of which was completed in 1948 at the University of Manchester in England, the Manchester Small-Scale Experimental Machine (SSEM or “Baby”).
Re:How Come.... (Score:4, Insightful)
...this machine isn't even mentioned in the Wikipedia computer entry, then? http://en.wikipedia.org/wiki/Computer [wikipedia.org]
According to the wiki, the Germans were first with a calculator, followed by the Americans. The Brits are given a sentence, saying that they built the Colossus, which had 'limited programmability', but that the US machine ENIAC was really the first proper computer....
Wikipedia is a fiefdom, not an encyclopedia. Editors' persistence beats facts a lot of the time.
You should treat Wikipedia like the smart guy down the pub who seems to know what he is talking about but might just be making everything up.
Re: (Score:1)
Nice anti-Wikipedia rant, but I already pointed out that the OP was simply mistaken and the article does mention Baby. Don't let that get in the way of a good tirade though. Speaking of facts, do you have any that demonstrate that Wikipedia is less accurate than highly regarded encyclopedias like Britannica?
Re: (Score:2)
Read a Wikipedia article about any subject you really know about. Depending on whose fiefdom that article falls under, it might be well written, or it may be trash. Either way it's unlikely you will be able to improve it without facing protected edit wars, editors using sockpuppet accounts, flames, accusations of bias, power-tripping administrators, and all other imaginative kinds of abuse. There are exceptions, but only on articles that are abandoned fiefdoms.
Try adding a well formatted and well written arti
Re: (Score:2)
Yes, go and pat your buddies' backs and laugh at them dumbass limeys while drinking beer we wouldn't wash glasses in. Still, I guess we need stronger beer to distract us from each other's teeth, or we'd have died out years ago, right?
I'm sure that if there was an Encyclopedia Americanica it would be 100% correct. Especially about the age of the Earth and the plural of "toma
Re: (Score:2)
No. The moditors (or wp:whateverTheEmoAspieFuckersCallThemselves) like to blow their own trumpet. A flute is way too quiet and subtle.
What "computer"? (Score:2)
In days of old ... (Score:2)
and punch-cards weren't invented
we drank our joe
by the warm tube glow
and went on quite contented.
Re: (Score:2)
Someone with poetical talent should write the history of early computing as a proper epic in dactylic hexameter.
Re: (Score:2)
Not to be a Doubting Thomas, but... (Score:1)
Is it just me, or does it seem with each passing year, the earliest date in which something is claimed to have happened for the first time gets pushed back a year? Just about this time last year, it seems that it was 64 years ago that the first electronically stored program was run by a computer, and now they're claiming it was 65. Way to revise history, guys. Next you'll be claiming that everyone is a year older now than they were before. Where will it end?
Turing's Cathedral (Score:1)
When I first started in this industry... (Score:5, Interesting)
When I first started in this industry, I worked with Chris Burton who'd worked on Baby (and later led the team which rebuilt it [digital60.org]); he had known Turing, as had another man I worked with later. Our team was led by Charlie Portman, who gets a credit in The Mythical Man Month [wikipedia.org]. It's pretty amazing how close we are - only two generations - to the legendary figures who founded our industry, who built the first computers.
Chris was famous in our team because we had some new Mannesmann Tally inkjet printers, which could only print ASCII, and we needed them to print bitmaps. The processor in the printers was one that no-one in the team had any experience of. So Chris took the datasheet for the printer, the datasheet for the processor, a dump of the printer ROM, and a square-ruled pad home with him on the train, and came back in the morning on the train with code for a new ROM for the printer, written not in assembler but in the actual opcodes (hexadecimal), in pencil on the pad. We blew them into the ROM and it worked first time, printing perfect bitmaps, no errors, no bugs to fix.
That's how good the first generation programmers were. I am still in awe of that. And he was a very modest man, very generous with his experience. I'm proud to have learned from him.
Re: (Score:2)
I love stories like this.
Personally, I think such people should have appellations akin to those of ancient Greek Heroes. I could just be weird, though.
Re: (Score:2)
I think such people should have appellations akin to those of ancient Greek Heroes
Given ancient Greek tastes in intimacy, that would be especially appropriate for Turing. Shame the British government didn't see it that way.
Re: (Score:2)
Heh, some of MY friends were students when Tom Kilburn was head of CS at Manchester. We went to a lot of the Manchester 50 events, with our baby son. I doubt he remembers.
Re: (Score:2)
Technically, things were also a lot simpler - the ROM was probably only a few K in size, so manual disassembly was a very doable thing. In addition, a programmer had to work at the machine code level - assemblers were often quite hard to come by (or expensive), so code being hand-assembled was common. Which isn't too bad a thing
Blathering about Manchester University. (Score:2)
A billion years ago, when I was studying for my Computer Science degree at Manchester University, the design of the Mark 1 and its test machine was certainly on the curriculum. I remember an exam where I had to describe the evolution of ALUs from Mark I to Cray I. Kids these days just get a bunch of Java and Hadoop.
I don't know where 'Baby' came from; I never heard it referred to as that by the staff who worked on it. I graduated in 1990. I don't think I heard it referred to as 'Baby' until I was living in the '
First example of DRAM (Score:2)
Using a CRT to write data onto high-persistence phosphor, and then using a pickup plate over the tube face to sense the stored charge and feed it back to the electron gun, created the first dynamic storage system. This was not only the first machine with electronic storage, it was also the first machine to exercise an example of Dynamic Random Access Memory.
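The key idea is the refresh cycle: the stored charge fades on its own, so the machine has to keep reading each spot and rewriting it before it decays past the read threshold, much as a modern DRAM controller does. Below is a minimal toy sketch of that principle in Python; the decay rate, threshold and refresh interval are invented purely for illustration and have nothing to do with the Baby's real hardware.

# Toy model of a leaky storage spot with periodic refresh (illustrative
# numbers only, not the Williams-Kilburn tube's actual behaviour).
DECAY_PER_TICK = 0.2   # fraction of charge lost each time step (made up)
READ_THRESHOLD = 0.5   # charge above this reads back as a 1

class LeakyCell:
    def __init__(self, bit):
        self.charge = 1.0 if bit else 0.0

    def tick(self):
        # Charge leaks away whether or not anyone looks at the cell.
        self.charge *= (1.0 - DECAY_PER_TICK)

    def read(self):
        return 1 if self.charge > READ_THRESHOLD else 0

    def refresh(self):
        # Read the (possibly weakened) value and rewrite it at full strength.
        self.charge = 1.0 if self.read() else 0.0

def run(ticks, refresh_every=None):
    cell = LeakyCell(1)            # store a single 1 bit
    for t in range(1, ticks + 1):
        cell.tick()
        if refresh_every and t % refresh_every == 0:
            cell.refresh()
    return cell.read()

print("with refresh:   ", run(100, refresh_every=2))  # bit survives -> 1
print("without refresh:", run(100))                   # bit fades away -> 0

Scanning the whole screen full of spots in a loop, the way the tube's beam did, is just this refresh applied to every cell in turn.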
Re: (Score:2)
Re: (Score:1)
Something posh, I'll wager. After all, he used to be Queen Elizabeth's brother in law.
Re: (Score:1)
How is this relevant to geeks and nerds?
Surrender your geek card immediately. How is the anniversary of the first run of an electronically stored program *not* relevant to geeks and nerds?
Re: (Score:2)
And it didn't happen in the USA.
Re: (Score:3)
And it didn't happen in the USA.
Give it a rest. The idea that Americans think Americans pioneered everything is even more of a shopworn generalization than Americans who actually think Americans pioneered everything.
Re: (Score:3)
Two words: Al Gore.
You'll be claiming you aren't all fat next.
Re: (Score:2)
Two words: Al Gore.
Two words: urban legend.
Re: (Score:2)
"I took the initiative on creating the internet".
http://www.youtube.com/watch?v=BnFJ8cHAlco [youtube.com]
Re: (Score:2)
The urban legend is that Gore claimed he invented the Internet. What he actually said was poorly phrased, and typical of a politician, but no different from Eisenhower saying he took the initiative on creating the Interstate Highway system. Even Vint Cerf and Newt Gingrich have said that the urban legend is silly. [wikipedia.org]
It's not that I'm a great Gore defender, or even that I mind people using the urban legend as a joke, but it's going too far for the aptly pseudonymous Hognoxious to use it as support for his resent
Re: (Score:2)
Well if only I had a capacious vocabulary like you I could have chosen a different one.
Of course that assumes as a prerequisite that I'm capable of donating an airborne copulatory event.
Re: (Score:3)
Two words: Al Gore.
Two words: urban legend.
I'm reasonably certain that Al Gore isn't an urban legend. But if anyone could prove it, there might be 10 quid in it for charity.
Re: (Score:3)
And it didn't happen in the USA.
Give it a rest. The idea that Americans think Americans pioneered everything is even more of a shopworn generalization than Americans who actually think Americans pioneered everything.
Yet many Americans do believe that the US invented everything and can often recall names and dates to back this up, while having no knowledge of the many times the same thing was invented before. People only know what they are taught, so I blame the American education system for that one.
Amusingly Indians (from India, not native Americans) believe the exact same thing.
Re: (Score:2)
Yet many Americans do believe that the US invented everything and can often recall names and dates to back this up.
Concrete examples?
Better yet, stats or studies. You can always cite anecdotes of a few people with an absurd misunderstanding of something, but inferring too much from that may be a matter of confirmation bias, or worse yet, over-generalization.
Re: (Score:3)
Amusingly Indians (from India, not native Americans) believe the exact same thing.
Proof here [youtube.com].
Re: (Score:2)
And it didn't happen in the USA.
Then--like the Roman Empire and the birth of Barack Obama--it never happened!
Re: (Score:2)
http://en.wikipedia.org/wiki/Z3_(computer) [wikipedia.org]
http://en.wikipedia.org/wiki/Z4_(computer) [wikipedia.org]
and the life of http://en.wikipedia.org/wiki/Konrad_Zuse [wikipedia.org]
Re:How (Score:5, Informative)
Re: (Score:1)
Zuse's early machines used a mechanical data memory and a tape for the program. This is about the first computer using electronically stored memory and supporting stored-memory programs - a Z4 would most probably be more useful in comparison, but that doesn't change the fact that the SSEM made two very important contributions to computing machines.
Re:How (Score:4, Funny)
There's a standard template to apply to any debate about the history of computing:
The first computer with $GIVEN_FEATURE was actually invented by $GENIUS_LONER who worked for $SOME_INSTITUTION in $CENTRAL_EUROPEAN_COUNTRY a full $N_GREATER_THAN_10 years before $GIVEN_DATE. Sadly, his invention was ignored because of $INSTITUTION_POLITICS, the inventor's $PERSONAL_FAILINGS, and meddling by the $OPPRESSIVE_REGIME. Only a single example of the system was built, and it languished in $DISUSED_BASEMENT, until it was unfortunately destroyed during $WARTIME_EVENT.
Re: (Score:2)
Re:How (Score:5, Informative)
it lays to rest the myth that Americans invented the computer
It does, but it's been many years since the "ENIAC was the first electronic computer" myth was prevalent anyway.
The post is right that Baby was tremendously important for being the first computer with an electronically stored program. However if you want to debate who invented the modern computer, it's absurd to say that any one person or group did so. Histories are right to trace it back at least as far as Babbage. In the 1930's and 1940's there were numerous people and groups in the UK, US and even Germany (Zuse) that all made important contributions.
Re: (Score:2)
This was the first electronically stored program. Other early computers used things like tubes of mercury with vibrations travelling down them [wikipedia.org] to do the same thing.
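As a rough mental model, and not a description of any particular machine, a delay line works like a fixed-length loop: bits exist only as pulses in transit down the mercury, and whatever arrives at the far end has to be re-injected at the near end or it is lost. A toy sketch in Python:

from collections import deque

# Toy recirculating delay line: the "mercury column" is just a fixed-length
# queue of bits; one bit emerges per step and is fed back in (recirculated)
# unless new data is being written at that instant.
class DelayLine:
    def __init__(self, length_bits):
        self.line = deque([0] * length_bits)

    def step(self, write_bit=None):
        out = self.line.popleft()                  # pulse reaches the far end
        self.line.append(out if write_bit is None else write_bit)
        return out

dl = DelayLine(4)
for b in (1, 0, 1, 1):        # write the pattern 1,0,1,1 into the line
    dl.step(write_bit=b)
print([dl.step() for _ in range(8)])   # recirculates: [1, 0, 1, 1, 1, 0, 1, 1]

Reading a particular bit means waiting for it to come round again, which is why access to such storage was serial rather than random, unlike the CRT store, where the beam could jump straight to any spot.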
Re: (Score:2)
It depends how you define 'modern computer'.
If you mean 'programmable machine', Babbage's Analytical Engine is usually credited as the first.
If you mean 'electronic general-purpose computer', it was ENIAC
If you mean 'stored-program computer' (which all modern PCs are), then it was the 'Baby'.
Re: (Score:2)
Let's hope it also lays to rest the myth that we only have computers because of NASA.
Sounds like you're creating myths about myths.
Re: (Score:2)
'Hello my dear canine friend, since you have expressed a liking of myths, I have endeavoured to place a myth within your myth...'
And I can't think of a good ending :(
Re: (Score:2)
... in order to facilitate recursive and/or parallel mythologising?
Re: (Score:2)
Re: (Score:2)
Nobody thinks we only have computers because of NASA. It's velcro. Or do I mean post-it notes? I always get them confused because they're next to each other in the dictionary.
Re: (Score:2)
Depends on definitions. The first electronic binary computer was invented by John Vincent Atanasoff, but it was not a general-purpose machine.
Re: (Score:2)
Of course it was; look up the definition of computer sometime.