What If Babbage Had Succeeded? 212
mikejuk writes "It was on this day 220 years ago (December 26 1791) that Charles Babbage was born. The calculating machines he invented in the 19th century, although never fully realized in his lifetime, are rightly seen as the forerunners of modern programmable computers. What if he had succeeded? Babbage already had plans for game arcades, chess playing machines, sound generators and desktop publishing. A Victorian computer revolution was entirely possible."
We'd all be programming in Ada right now (Score:5, Funny)
Re:We'd all be programming in Ada right now (Score:5, Informative)
Stephen Stirling's "The Peshawar Lancers" has the British Empire move to India after a catastrophe, and they had an analytical engine as well. Eric Flint's alternate history might make better reading if you're postulating "what if." Flint covers "gearing down," because in order to make advanced technology, there's a logical progression. Many of the things that we take for granted are the result of incremental improvements and discoveries.
Simply put, there's no way to make the leap from a mechanical "analytical engine" OR a mechanical "difference calculator" even to the original IBM PC. (Or for that matter, the first Z80-based 8 bit computers.)
There's no doubt that Babbage might have moved technology forward a few decades. But what you and I know of as "computers" nowadays are based on a number of discoveries, from physics (Quantum Theory, in particular) to electromagnetism to advanced fab technologies for silicon to you name it.
I love reading alternative history, but I prefer those that are realistic. If you and I were to find ourselves as the "Yankee in King Arthur's Court," we'd actually be frustrated more than anything else. There's so much technology that even our grandparents took for granted that wouldn't be available.
Just the ability to measure down to microns (and smaller) is vital when making a great deal of modern technology.
Re: (Score:2)
And by the way, I also ought to add ... if Babbage HAD started a revolution that moved technology forward even just a few decades, WWI quite possibly wouldn't have been survivable for the species. There would have been pockets of civilization that survived with a hunter-gatherer or farming level of technology, but it would have been bad. VERY bad.
Think about it. Given the attitudes and mores of the time (and that's something else that most of us don't think about, by the way), if either side had had nukes (
Re:We'd all be programming in Ada right now (Score:4, Insightful)
Just being able to refine ballistic tables could have made WWI much more lethal. It might have made longer-ranged artillery practical, and of course better weapons get used more.
Re: (Score:2)
> Just being able to refine ballistic tables could have made WWI much more lethal.
Exactly. Like I said, though, the big problem was the attitudes back then. We easily make the mistake of assuming that people back then thought like we do nowadays. That's NOT the case. Look up that famous image of a young Adolf Hitler standing in the square when WWI was announced, hat waving in the air and cheering. Then look at that equally-famous image of Americans equally as thrilled when the US entered the war, cheering
Re: (Score:2)
As opposed to Americans cheering over entering Iraq and casting bitter recriminations and epithets at those who disagreed?
Re: (Score:2)
See my comment [slashdot.org], above, for more information concerning WW1 long range artillery. As in, "75 miles (120km) to target" long-range projectile weapons.
Re: (Score:2)
"Given the attitudes and mores of the time"
Well, for one, it was not the attitude of the time to focus on civilian targets; that was more the WWII attitude.
Re: (Score:3)
Well, for one, it was not the attitude of the time to focus on civilian targets; that was more the WWII attitude.
Industrial production was very labor intensive. So servicing it required large populations. In fact, servicing industrial production is the reason why most modern cities appeared.
Re:We'd all be programming in Ada right now (Score:4, Interesting)
I'm no historian, but I seem to recall that in WWII, when the Allies bombed Germany, the targets generally were the factories; and the Germans deliberately put their factories in the middle of populated areas. (Were they relying on us being less willing to bomb them for fear of collateral damage?)
The allies carpet-bombed Hamburg, and fire-bombed Dresden. Between the two campaigns, both cities were almost completely destroyed, with about 100,000 victims. Only about 40,000 civilians were killed during the German bombing of England. It wasn't because the Germans deliberately put their factories in civilian areas (the English did the same); it was because the allies were bombing from very high altitude and there was a great deal of luck involved in actually hitting your target from that altitude.
And in preparation for D-Day, they completely levelled some cities in northern France. (seriously... the ground is now more than 1m higher in Caen than it was before WWII, and the cathedral was the only building left standing in that city following the allied bombardment... and even the cathedral was partly destroyed... they even destroyed the citadel of William the Conqueror, because of the risk that it could be used by German soldiers to hide out... and Caen had no industrial complex to speak of, it was just a military stronghold).
Bombs, at least the kind of bombs they were using in WWII, were not a precision instrument. The primary targets were factories, railroads, things like that. But the standing order for most aircrews was "drop any remaining ordnance wherever you want before coming home".
Re: (Score:3)
But the Germans *did* have long range targeting, and weapons capable of using that data (indeed, the "Paris Gun" [wikipedia.org] was the reason for discovering how the Coriolis Effect [wikipedia.org] affected their targeting - at ranges of roughly 75 miles (120km), the rotation of the earth was enough to affect the projected 3-minute trajectory of the weapon's explosive projectiles).
In other words, your conclusion is based on a false premise. More information is always a good thing, when asking questions about possibilities.
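For a feel of the scale involved, here is a rough back-of-the-envelope sketch in Python. The flight time comes from the "3-minute trajectory" quoted above; the latitude, average-speed assumption and the first-order formula are mine, not figures from the linked articles.

import math

# Crude first-order Coriolis deflection estimate for a long-range shell.
# Assumed/illustrative inputs: ~120 km range, ~3 minute flight,
# firing from roughly the latitude of northern France.
OMEGA = 7.292e-5           # Earth's rotation rate, rad/s
range_m = 120_000.0        # downrange distance, metres
flight_time_s = 180.0      # the "3-minute trajectory" quoted above
latitude_deg = 49.0        # assumed firing latitude

v_avg = range_m / flight_time_s
# First-order horizontal deflection: omega * v * t^2 * sin(latitude)
deflection_m = OMEGA * v_avg * flight_time_s ** 2 * math.sin(math.radians(latitude_deg))

print(f"average horizontal speed ~{v_avg:.0f} m/s")
print(f"first-order lateral deflection ~{deflection_m:.0f} m")

This toy model ignores the firing direction, the changing speed along the arc and the vertical motion, so it overstates the real correction, but it makes the point: at 120 km you miss by several hundred metres or more if you pretend the Earth isn't rotating.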
Re: (Score:2)
This is why I love conspiracy theories involving aliens in 1949. Literally the technology to understand one quarter of a crashed alien spaceship wouldn't get invented for another 30+ years.
Then.. (Score:5, Funny)
1800 would have been the year of Linux on the Desktop.
Re:Then.. (Score:5, Funny)
Here's TFA (Score:5, Funny)
Very interesting read. Here's a complete copy of the article for anyone who's interested:
Catchable fatal error: Argument 1 passed to TeraWurfl::addTopLevelSettings() must be an array, null given, called in /home/iprogr6/public_html/plugins/mobile/terawurfl/TeraWurfl.php on line 334 and defined in /home/iprogr6/public_html/plugins/mobile/terawurfl/TeraWurfl.php on line 463
"what if" game (Score:3)
and what if a great-grandmother had balls? She'd be a great-grandfather.
The point is that Babbage did succeed, except that his success came through those he inspired, who took his ideas and combined them with better manufacturing processes, newer knowledge of materials and a refined computing model.
Re:"what if" game (Score:5, Interesting)
"The point is that Babbage did succeed, except it was through his inspiration, which took his ideas and better manufacturing processes and newer knowledge of materials and a refined computing model."
This is a common misconception based on earlier analyses. In fact, portions of his engines have been built from the original plans, using techniques available in his day, and it has been determined that it would indeed have worked if only it had been built.
Contrary to popular belief, the two biggest problems that Babbage faced were: (1) his inability to convince investors of the worth of his invention, and (2) his insistence on constant refinement rather than freezing the plans at some viable point, in order to make a working device.
Re: (Score:3)
I think people are overvaluing the idea of "computation" in the abstract, rather than the implementation of actual
Re:"what if" game (Score:5, Insightful)
Yes and no. Mechanical computers do not have the scalability of electronic computers, to be sure, so that line of development would have reached its end.
At the same time, having a Turing complete computer, even a mechanical one, in the first half of the 19th century would have given mathematicians and engineers a whole new grammar to begin working on, much as even the relatively primitive digital computers of the 1940s to 1960s spurred on an absolutely astonishing amount of R&D, some of it still bearing fruit today.
I expect that if the Babbage machines had been built and had been put to use, they would have spurred the digital revolution nearly a century earlier, concentrating huge amounts of R&D by the Great Powers in the post-Napoleonic era. The military value, for instance, of fast and accurate cannon/mortar trajectory calculations would have given whoever developed such machines a considerable edge. The late 19th-early 20th century arms race was transformative in many ways, and the successors of Babbage's machines would have been caught up in that.
Re:"what if" game (Score:5, Informative)
The mechanical approach was still a dead end that was not on the path to anything like where we are today. He was like the guys, previous to the Wright Brothers, who spent their (short) lives working on flapping wings. You could argue they had the right idea - heavier-than-air powered flight - and thus inspired those who came after - but the fact remains, they were barking up the wrong tree.
The difference (ha!) here is that the flapping wings didn't work for powering manned flight while the Babbage machines would have. Sure, they'd have been limited, but they would have worked! From there, as TFA says, refinements would have been implemented. It isn't as though modern computers are what was first designed, implemented or even conceived of. Great progress such as we've seen typically requires LOTS of folks putting their own mark on things.
Somewhat OT but imagine what would have happened had the Greeks realized the true power of steam. That they were tinkering with it is well known. We might have had flying chariots by now!
Re: (Score:2)
Somewhat OT but imagine what would have happened had the Greeks realized the true power of steam. That they were tinkering with it is well known. We might have had flying chariots by now!
Flying chariots? Like these? [media-imdb.com]
Re: (Score:2)
Unfortunately, the link you supplied is broken, due to the referrer being outside imdb.com's domain. Perhaps if you linked to the movie the image belongs to, instead, you would have at least gotten a "funny" mod, instead of being largely ignored because you didn't check your links in the preview pane.
Just saying.
Re: (Score:2)
But even if we go ahead and give Babbage full credit for his invention (pretending he'd marshaled the resources to build a working copy), would computers as we know them have occurred any sooner? Here's the crux of the article in my view:
Re: (Score:2)
The power of steam has been known since ancient times.
What people only discovered more recently is how to do something other than explode things with that power. That came thanks to lots of advances in physics and in working materials, and the latter didn't stop advancing through the Middle Ages.
Re: (Score:2)
So even if a few of Babbage's full-scale machines were only used by rich institutions (like government), smaller and simpler versions would surely have found plenty of good use.
Even a custom-built device, designed to do nothing but calculate cosines, could have had a major impact on war.
Re: (Score:2)
... as spoken by someone who obviously didn't read the article.
Your entire premise is flawed, in that had Babbage been able to fund the production of his machine, then he would have created "an actual machine to do it quickly, reliably, and cheaply." His Analytical Engine was a precursor to modern digital machines, and the article expresses how we might have been exactly where we are now, except 100 years earlier... and with a different power source.
It even postulates that something approximating the intern
Re: (Score:2)
Re: (Score:3)
All of which were set back about 1000 years by the dark ages and the mentality that still pervades.
To further your point: The US has shot itself in the foot by impeding the progress of medical science. All the vehement arguments about stem cell research that caused the US to outlaw accessing the best source of stem cells have resulted in Belgium coming up with a cure for AIDS, instead of the US.
Here's a link [nytimes.com] to the NYTimes story. Please keep in mind while reading it that the story seems to have a massive "sour grapes" slant, deeming the procedure "impractical" due in part to the fact that the patient's im
Re: (Score:2)
So you'd be OK with paying Babbage's descendants a royalty for every computing device implemented since then?
read the book (Score:2, Informative)
The Difference Engine. We'd eventually get to the same place.
Re: (Score:2)
Yeah I read the book but I reckon that scenario used too much energy, particularly once you started talking about GUIs and processors running at GHz. We would have needed transistors then, just as we need photonic logic now to keep improving.
Re: (Score:2)
The energy issue hasn't changed - we'll always need "just a little bit" more than we currently have. We could actually have tapped quite a few sources of reliable, renewable energy, it's just not economically viable to do so (at least, not while fossil fuels are still available at such (artificially) cheap rates).
Re: (Score:2)
I mean that inside your computer, if you wanted to do all that we do with gears and wheels and such like, you would need a lot more energy than we currently use pushing electrons around.
Re: (Score:2)
The middle third (it's split into three parts, with three different protagonists who don't interact much, so it's more like three short books) wasn't bad, but when it ended it felt like the story ended while th
Roman steam engine (Score:2)
Re: (Score:2)
Re: (Score:2)
Naah, they were more expensive. The problem is cultural, in that the leaders came more or less directly from military success, both against barbarians and in civil war. Military success implies capture of slaves. Coming up with a technological "solution" that expressly does not require the leaders' most important product is not gonna fly.
It's like trying to sell electric cars to Americans: no matter how much better they are than gas powered cars, it's culturally unacceptable. Must wait for the culture to s
Re: (Score:2)
Romans didn't use steam engines because slaves were much cheaper. Which is the reason Babbage's engine would have been just a similar toy: just because something is technically feasible doesn't mean society is ready for it.
Re: (Score:2)
Not really (Score:2)
The engines they used (basically a sphere with a couple of nozzles) had very poor efficiency. They were not really suitable for anything but simple toys. They'd have had to invent a lot of new technology to make real piston steam engines. Never mind steam turbines.
That's the same problem as with Babbage's engine.
This is where Steampunk died (or was born) (Score:5, Interesting)
The concept of huge mechanical computers fulfilling any purpose seems hard for us to comprehend today.
Yet huge mechanical computers for specialized use were in actual deployment in several industries, not the least of which were "fire control computers" [wikipedia.org] on US and British battleships and heavy cruisers in the pre-WWII era. These were fairly huge mechanical beasts [wikipedia.org], originally developed around the time of the First World War, and initially totally mechanical in nature. By the Second World War they were electro-mechanical (solenoids and relays and stepper motors), and were enclosed in battle hardened enclosures [wikipedia.org].
Still, 1920 to 1945 is hardly 1833, and the size and complexity of such devices taxed the manufacturing capabilities of the day, and the size and complexity of the problems they could solve was probably more easily worked out on paper than set (programmed) onto the machine.
Having worked out the concepts, one wonders how far Babbage could have progressed with a large budget and a larger machine shop to build his engines. There were precious few problems to which you could apply this technology in that day. But it's a chicken and egg problem. It's hard to know what computations would have been attempted had such equipment been available. The calculation problems any society tackles tend to be near the limits of the computing capabilities available to the task.
A man before his time.
Re: (Score:3)
Babbage was working at the bleeding edge of the engineering of his time. Engines which have been built to his designs, and using the machining available to him, barely work. The long chains of gears frequently jam. There is just too much slack built into his systems. It's not his fault, just a natural consequence of the way engineering was done when he was alive.
So no, I don't think it could have gone far.
Re: (Score:2)
Re: (Score:3)
Well okay so that gets us to The Diamond Age if you assume it has to use moving parts. Maybe working Babbage machines would have brought forward the development of electronics.
Oh no, not this again. (Score:5, Interesting)
He had the money.
The people in 1800's Britain knew a good thing when they saw it. And when small prototypes were demonstrated the British Government committed to build the difference engine. And guess what, they wanted to use it for gunnery on ships! They invested *big*. How much? One fully kitted out battleship's worth. One of these: http://en.wikipedia.org/wiki/HMS_Warrior_(1860) [wikipedia.org] (more or less). That is a huge amount of money.
The skills were available.
Have a look at a British clock from this period. Very intricate work and at a lot smaller scale than Babbage required. Sure, what he was doing was on a large scale, but the skills and tools were out there. Indeed, Babbage teamed up with them and had the money to do it.
But he committed the cardinal sin. Babbage was forever changing the design. Yes Mr Babbage, your analytical engine idea is nice but we are paying you for the difference engine! He could not stay focused to build what was paid for and required. Falling out with the machinists capable of building it hardly helped matters. He did not deliver. As a result he blew not only his own reputation but that of the whole idea, killing it for the best part of a century. That is how bad he was.
You can be the most talented man in the world, but if you are so disorganised and uncivil that nobody wants to work for you it is all for nothing. A lesson we can all still learn from.
Re: (Score:2, Funny)
Babbage = Sheldon
Re: (Score:2)
Obviously not or they'd not have gone to the difficulty of building machines which tax the limits of precision mechanical engineering to solve them. And part of what the mechanical FC computers did - stabilize the guns on a ship that's pitching and turning and rolling - can't be done with a precomputed table.
Re: (Score:2)
And part of what the mechanical FC computers did - stabilize the guns on a ship that's pitching and turning and rolling - can't be done with a precomputed table.
Fire Control computers of that vintage did not attempt to stabilize the guns. That didn't come till much later, and never was used on very large bore guns (>6 inch). Simply too much mass to control. Instead they delayed firing until the ship rolled or pitched through the optimum fire point.
Two words for you... (Score:5, Informative)
Two words for you: "Difference Engine" [wikipedia.org]. Bruce Sterling and William Gibson. That's what would happen if Babbage had succeeded.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
surely you mean mechanical boogaloo?
Not possible. (Score:3, Insightful)
A Victorian computer revolution was not possible, as should be obvious to anyone who understands how computers work. Just think of how massive (and weak) computers were back in the days of vacuum tubes. Now imagine how massive, weak, and prone to break downs they'd be if they were made of clockwork. You'd have an entire warehouse filled with moving parts that might be equivalent to a digital watch... at least until one of the gears breaks. The technology simply didn't exist to make computing feasible.
Re: (Score:2)
But the way a mechanical computing machine works is actually far more straightforward than the way an electronic computer works. It would be much slower, but I don't think it would be any more prone to breakdown than the vacuum tube machines were to burned-out tubes. Compare a mechanical desk-top adding machine of old to the earliest calculators -- they really weren't that much bulkier near the end of the era of the adding machines.
And the odds are, if you find one, the adding machine will still work.
Re: (Score:2)
Also, the computer revolution really took off because computers became ever cheaper and easier to manufacture. The problem with Babbage's design was that a lifetime wasn't long enough to build one without CAD/CAM. So, even if he'd worked twice as fast and got the thing working, it would have been a one-off for another few decades.
If Babbage had succeeded, it would have sparked the "man as machine" line of thought that has changed so much in our society, and that could have changed the course of history in a
butterfly effect my a55 (Score:2)
LoB
Re: (Score:3)
Re: (Score:2)
The V2's accuracy was set by the onboard integrating accelerometer, already a precision mechanical device, which integrated an error of some hundreds of meters over the course of its flight. No mechanical computer, in the sense of a gear-based version of t
Re: (Score:2)
I'm not suggesting that a mechanical computer could have replaced the Apollo flight computer. But
Butterfly effect (Score:2)
If/when something bad happens in my life and something awesome happens later after that, it makes it easier for me to accept the sucky occurrence by reasoning with the butterfly effect, since without it the great thing might not ever have happened. :)
Anyway, it's always amazing to think how the current state of the world is a result of millions of small things coming together. Without everything going exactly like this, even the probability of me existing would be extremely low.
Had He Succeeeded... (Score:2)
If Babbage had succeeded, then there would have been a programming language called Babbage, and a software store chain in malls called Ada's instead of the other way around.
Storage not computing (Score:5, Interesting)
Historically computing has never been a processing problem, but a storage problem. All computing, from embedded stuff to supercomputers, pretty much seems to revolve around turning a computation-bound problem into a storage-bound problem, and waiting for storage to improve so you can roll out faster processors to make use of it.
Try it yourself, if you have the skills. I had a pretty decent bitslice ALU design for a relay CPU down to a total of 20 relays per bit slice, not just a wimpy bare adder but a pretty full featured design complete with comparator and roller/shifter unit. An 8 bit processor is well within my entertainment budget at a couple bucks per relay, and if I package each bitslice into something the size of a ream of paper, which is probably pretty pessimistic, the entire 8 bit CPU is only about the size of a box of bulk laserprinter paper. I figured for about $500 total all costs of all parts I can get a decent reliable relay based 8 bit CPU operational.
But a couple hundred bytes of relay based RAM to run some "real programs" is way outside my budget: financially, physically, and in power. Even tradeoffs don't work; using latching relays saves me considerable (cheap) power at a cost of roughly twice as much per bit. Inevitably you get into weird dynamic electrolytic capacitor designs, strange attempts at homemade core memory... Cheating and using modern SRAM isn't cool. Latching relays at let's say $5 per bit aren't gonna fly if I "need" a K or so of memory to have fun; that would be $40K just of storage relays to say nothing of the address decode logic etc. Also that would be well in excess of 8000 relays for a K of memory, vs a mere 160 relays for the processor. About 80 times bigger. So that goes from a small box sized CPU to basically a room of my house.
This has interesting MTBF implications, in that any "non-trivial" relay computer is going to mostly fight memory breakdowns, not processor failures.
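The arithmetic is easy to check. A quick sketch follows (Python used purely as a calculator; the relay counts and prices are the rough figures from the post above, and the one-latching-relay-per-stored-bit assumption is mine):

RELAYS_PER_BITSLICE = 20       # ALU bitslice with comparator and shifter
CPU_BITS = 8
RELAYS_PER_MEM_BIT = 1         # assumed: one latching relay per stored bit
COST_PER_MEM_BIT = 5.0         # dollars per bit of latching-relay storage
MEMORY_BYTES = 1024            # "a K or so" of RAM

cpu_relays = RELAYS_PER_BITSLICE * CPU_BITS            # 160 relays
mem_bits = MEMORY_BYTES * 8                            # 8192 bits
mem_relays = RELAYS_PER_MEM_BIT * mem_bits             # 8000+ relays
mem_cost = COST_PER_MEM_BIT * mem_bits                 # ~$40K

print(f"CPU: {cpu_relays} relays")
print(f"RAM: {mem_relays} relays, ~${mem_cost:,.0f} (before address decode logic)")
print(f"RAM/CPU relay ratio: {mem_relays / cpu_relays:.0f}x")
# Address decoding and support logic push the ratio further toward the
# ~80x quoted above, and since every relay fails at roughly the same rate,
# nearly all breakdowns in such a machine would be memory breakdowns.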
To an amateur, calculating is the hard part. To a pro, storage is where the real problem lies.
Re: (Score:2)
There are tradeoffs, depending on the problem. If you had several orders of magnitude faster processing, for some applications storage would become less of a bottleneck, because you could just recalculate a lot of data on the fly instead of storing it (the well-known time-space tradeoff). So in a sense storage is a bottleneck in those applications only because processing isn't fast enough--- meaning the bottleneck is processing when you look at it from another perspective.
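A trivial illustration of that tradeoff (a hypothetical example of mine, not something from the parent post): the same sine values can come from a precomputed table that spends memory, or be recomputed on demand, which spends cycles instead.

import math

# Space-heavy: precompute a table once, then answer queries with cheap lookups.
TABLE_SIZE = 3600   # tenths of a degree
sin_table = [math.sin(math.radians(i / 10)) for i in range(TABLE_SIZE)]

def sin_lookup(deg_tenths: int) -> float:
    return sin_table[deg_tenths % TABLE_SIZE]        # O(1) time, O(n) memory

# Time-heavy: store nothing, recompute on every call.
def sin_recompute(deg_tenths: int) -> float:
    return math.sin(math.radians(deg_tenths / 10))   # O(1) memory, more work per call

# Same answers either way; which approach wins depends on whether the
# machine is short of cycles (use the table) or short of storage (recompute).
assert abs(sin_lookup(450) - sin_recompute(450)) < 1e-12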
Re: (Score:2)
Yup, that's the problem with trying to replicate what amounts to a 1980 KIM-1 in relays.
The ancients ran into the same dilemma and their solution was wide word sizes like 60 bits. Thus you end up with simpler shorter programs and more calculation per cycle and less memory size required.
An 8 bit machine with a K of ram is what you get when memory and CPU are fast and space is cheap. What I'm used to, basically.
The ancients' idea of large word length makes sense if memory is expensive. Also lots of CPU regist
Re: (Score:3)
Yeah, there's some amount of primary storage that you can't do without, but it's not entirely fixed. In a lot of scientific-computing apps, for example, the trend over the past decade has been towards ripping out things like lookup tables, because they aren't worth the RAM or L1/L2 cache space: it's cheaper to just re-calculate sin(whatever) every time you need it than to store a big sin table, or recalculate pi to 10,000 digits instead of storing it as a giant constant, which didn't used to be the case. Oc
Too early for production use (Score:5, Interesting)
Much as I like the steampunk concept, Babbage's machine was at the upper end of what was buildable as an expensive prototype. Bear in mind that even consistently-good, moderately priced steel wasn't available until the 1880s. That's why fine machinery was made of brass until the 20th century.
The commercial history of mechanical calculators is not what you'd expect. Leibniz built the first mechanical multiplier in 1694. The commercial version, the "Arithmometer", wasn't produced until 1851. (It took a very long time to commercialize technology before there was industrial infrastructure.) Adding machines came later, because an adding machine is only a marginal improvement over an abacus, but a multiplier is a huge win.
The first high-volume mechanical arithmetic device was the cash register. When, in 1884, cash registers first got tape printers, for the first time merchants had some real mechanical bookkeeping assistance. By then, good steel was available, and stamped parts could be made in volume. That's the point at which something like Babbage's machine might first have been a commercial success.
Which it was. Hollerith's first punched card machines were used for the 1890 census. The Computing-Tabulating-Recording Company manufactured Hollerith machines commercially, and in 1924 it was renamed International Business Machines, which is today's IBM.
By 1880, there was enough manufacturing infrastructure to make stuff, and there was continuous year to year progress in mechanical calculation. The peak in purely mechanical systems was probably the Burroughs Sensimatic, in 1953, which was essentially a spreadsheet program made out of gears. IBM tabulators were more advanced, but they were electromechanical.
Re:Too early for production use (Score:5, Informative)
Bear in mind that even consistently-good, moderately priced steel wasn't available until the 1880s. That's why fine machinery was made of brass until the 20th century.
As an amateur machinist guy I can assure you that fine machinery was made of brass because steel/iron/etc was a nightmare to machine with the tools of the day, but brass is OK, not so labor intensive.
Bulk steel was actually pretty cheap. Not cheap enough to make a bridge out of it, but cheap enough to fill the world with rifles and swords. Before 1880 steel was too expensive to make a steel bridge over every river, or a steel locomotive rail thru every little two horse town, or a steel computer in every house, or a steel computer based internet, which is just as well because they didn't have the proper carbides and HSS to machine it anyway at any affordable rate.
Brass was, is, and probably always will be terribly expensive but it machines and wears (self lubricates, to an extent) like a dream. And the finish is quite attractive and simple, unlike steel or aluminum finishes. To this day, the amateur machinist guys make homemade steam engines out of brass, not steel, if they can afford it, anyway. I certainly prefer to work with brass. There are some issues with the cutting angles on lathe tools etc but it's all really no big deal.
Brass is much closer in cost to being a precious metal than it is to being a structural metal. Always has been. This explains the fascination brass holds for the local meth user population: a little pocket sized outside water hose fitting is worth darn near as much as a small iron sewer/drain grate at the recycler.
Re: (Score:2)
Sci-fi "The Difference Engine" (Score:5, Interesting)
http://en.wikipedia.org/wiki/The_Difference_Engine [wikipedia.org] - by Bruce Sterling and William Gibson is a fascinating and complex exploration of exactly this concept: namely that Babbage succeeded. The key historical difference - the premise of the book - is that England's backing of the American Civil War succeeded, due to cryptography in part. Towards the end of the book it's made clear that the continued war between France and England has turned "cold" and thus much effort is dedicated to sneaking obfuscated "divide by zero" algorithms into the opposing side's Difference Engines. This is one of the only sci-fi books (out of over 500 that I've read) where I actually found it hard to understand even 50% of what was going on. Still made a damn good story, though.
Automatic Telephone Exchange (Score:2)
The Automatic Telephone Exchange was patented in the 1890s and was available [wikipedia.org] in the 1900s. The relays could have been rewired as an electromechanical computer, as was done in 1941 with the Z3 computer [wikipedia.org].
No one thought of it.
It would make a very good Star Trek time travel ep (Score:2)
The Enterprise travels back in time, assists Babbage in finishing his analytical engine, and mankind gets warp drive 150 years before Cochrane did.
In the meantime, Hitler fails to capture power in Germany, thanks to the internet.
Stalin fails to rise to power, because people are quickly informed via their phones about his actions.
etc
What if Babbage had succeeded ? (Score:2)
He would have probably created the first bug !
Well, for one thing... (Score:3)
Your cell phone would use gears...
Economic Justification? (Score:3)
The thing is, a room full of humans can also compute, perhaps aided by simpler mechanical calculators. Redundant calculations could be done to reduce and detect errors. Nobody has shown an economic argument for Babbage's monstrosity, given that it would be damned expensive to build at the time and require lots of maintenance.
Without Boolean Logic? (Score:4, Interesting)
I'm surprised no one has mentioned this yet, but I think the biggest deficiency of Babbage's design was the base-10 numbering paradigm. Sure, he had the computer architecture down to what we would now call the von Neumann architecture [wikipedia.org], with the load, compute, store instructions. But making it all work in base-10 was incredibly messy, and I would think that is mostly why it was so difficult for him to implement.
It was not until 1854 that George Boole invented what we now call Boolean Algebra. [wikipedia.org]
Boolean logic allowed us to simplify computing circuitry, improving its efficiency and size. Take a look at this famous YouTube video [youtube.com]; it shows a mechanical calculator built with marbles, where a marble indicates a one and no marble indicates a zero. AND and OR gates are incredibly simple lever mechanisms, and it is powered by gravity and the weight of the marbles. What if Babbage had thought to use marbles and base-2 numbering instead of gears and base-10 numbering to do computations? He couldn't have, because Boole's idea had not been thought of until some 30 years after his death, and even after that, it was not until Alan Turing (120 years after Babbage) that anyone was clever enough to realize that Boolean logic, like any other logic, could be used to program a computer. Before Turing, Boolean logic was more or less a reasoning language for testing the logical soundness of true/false propositions.
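To see how little machinery base-2 actually needs, here is a small sketch (my own illustration in Python, not taken from the video) of a ripple-carry adder built from nothing but the AND, OR and XOR behaviour those marble gates implement with simple levers:

# Each of these is a single, simple gate (marble present = 1, absent = 0).
def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return a ^ b

def full_adder(a, b, carry_in):
    """Add three bits, returning (sum_bit, carry_out)."""
    s = XOR(XOR(a, b), carry_in)
    carry_out = OR(AND(a, b), AND(carry_in, XOR(a, b)))
    return s, carry_out

def add(x, y, width=8):
    """Ripple-carry addition of two unsigned integers, one bit at a time."""
    carry, result = 0, 0
    for i in range(width):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result

assert add(99, 28) == 127   # the marble machine does exactly this, just with gravity

Compare that handful of identical gate mechanisms with the cascaded carry linkages a base-10 machine needs for every one of its ten-position wheels.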
So, architecturally, Babbage was ahead of his time, and perhaps had his idea succeeded, it may have encouraged research and development leading to the use of Boolean logic in computing much earlier. But that wasn't the case. It is fun to think of what may have happened though: we may have seen immense computing factories powered by mills which lifted grounded marbles into a giant bin above the factory, and all of that weight would filter through the mechanisms of the computer to produce results. Such a thing would have been unbearably noisy, but fast, simple, easily reparable, and effective. And it would have continued that way until someone thought of using electrical charges instead of marbles.
In all, I think if Babbage's design had succeeded, it may have made the computer revolution happen 30 or 40 years earlier, in which case, I would have been born in the mostly ignorant generation of kids comprising the social-networking and internet revolution, and not in the more down-to-earth generation of 8-bit gaming, Q-BASIC and assembler-programming, personal computer revolution folk.
Re: (Score:2, Insightful)
The question was more, "What would the world have been like had he succeeded, then and centuries later". Or did you skip most of it to get a first post?
Re:Of course it was possible (Score:5, Interesting)
It almost seems possible that it could have been done much earlier. Automatic sources of rotary motion were known (waterwheel, steam engine) along with gearing mechanisms.
One of the first "programmable toys" was a cart controlled by a winding string [newscientist.com]
But it wasn't until the Industrial Revolution in the 1850's that the use of punched cards for storing instructions and input data made mathematical calculating machines possible. That's one important factor. The other one is the use of mathematical notation for expressing algebra that can be converted into instructions.
What if he had got both these engines working by 1849? Would he have moved onto more advanced calculations or extended the use of mechanical computation to commerce like Hollerith punched cards did in 1889? If so, that would have advanced computing by 40 years.
First documented geared calculating mechanism (Antikythera) 150 - 100BC [wikipedia.org]
First documented use of waterwheels - 300BC [wikipedia.org]
First documented Steam Engine - 1AD [wikipedia.org]
First use of punched cards - 1725AD [wikipedia.org]
Re: (Score:3)
Re:Of course it was possible (Score:5, Interesting)
Re: (Score:3)
I'd say the internet is much more important than the computer itself, even though a computer is needed to fuel the internet.
Global communication is what made the computer explode in terms of usefulness.
Re:Of course it was possible (Score:5, Interesting)
But it wasn't until the Industrial Revolution in the 1850's that the use of punched cards for storing instructions and input data made mathematical calculating machines possible. That's one important factor. The other one is the use of mathematical notation for expressing algebra that can be converted into instructions.
What if he had got both these engines working by 1849? Would he have moved onto more advanced calculations or extended the use of mechanical computation to commerce like Hollerith punched cards did in 1889? If so, that would have advanced computing by 40 years.
Yes, it would have advanced computing by about 40 years. But computing had reached a plateau in the 1940's (and arguably before then, there just wasn't any impetus to make a digital computer before WWII), and couldn't really advance any further than it had at that point until the invention of the transistor... the transistor itself arose from a chance discovery in late 1947, and wasn't readily available until the mid 1950's. Similarly, the integrated circuit wasn't available until the late 1950's at the earliest. In the absence of those technologies, it's arguable how far computing could have advanced beyond how far it had already advanced by the late 1940's, and neither IC's nor the transistor arose from people researching how to improve computers.
It really is debatable how far computing could have gone if Babbage had succeeded, considering that the computer revolution really didn't take off until integrated circuits made miniaturization possible in the 1960's.
Re:Of course it was possible (Score:5, Insightful)
I'd have to disagree, although the development of computing would have certainly taken a more "leisurely" pace than what it did in the late 20th Century.
What would have happened is that computers would have been seen as these big boxes or even completely separate buildings and have been used mostly for large organizations and governments. Keep in mind that most "Information Technology" departments owe at least some of their heritage to the "high priesthood of computer technicians" where only a select few were granted access to the computers.
I'm old enough to have been alive when programmers didn't even have a terminal on their desktop. Instead they had reams of paper on which they carefully wrote software character by character with a pencil and then handed out sheets of code to transcriptionists who converted those sheets of paper into punch cards.... where you might be lucky to get the results of your software test in about a day or two unless your software was a high priority project. "The computer" was a place you could visit and go inside.
The question is how long that era of computer technology would have lasted. If Babbage had succeeded in getting funding from the British government in the 19th Century to complete his devices, their utility certainly would have been obvious and many of the suggestions made in this article would have occurred. Hinted at by the author would be the driving need to develop material science much earlier than actually happened, especially with the need to develop strong and lightweight materials in an attempt to miniaturize the devices. That would have in turn impacted the British military in some rather profound ways that might have pushed them into advancing in a great many other areas of scientific research.
For instance, how would World War One (not Two) have turned out differently with artillery that had the deadly accuracy that ended up being used in the Gulf War of 1990? Would Rudolf Diesel have developed a more efficient engine having those metal parts designed for computing available for internal combustion? How much earlier would aviation had developed with lightweight metals?
It is very hard to say what would have happened. Communications would have been slower (was slower) in the 19th Century, and that would have in turn slowed the development of computing compared to what we have today... but given a hundred year head start it certainly would have impacted more than just computing.
BTW, the integrated circuit didn't become used on a widespread basis until about 1970 or so. One of the very first significant applications of the technology was the Apollo Guidance Computers used for the lunar exploration program, where NASA ended up purchasing a substantial, high two-digit percentage of total world-wide production (and it was one of the early sources of seed money for developing the semiconductor industry). Most of the computers built in the 1960's used either vacuum tubes or discrete transistors when they were "improved versions". It is hard to say that computing technology hit a plateau until 1970.
Re:Of course it was possible (Score:4, Interesting)
neither IC's nor the transistor arose from people researching how to improve computers.
However, supposing there were mechanical computers, and they were useful, there would be an impetus to improve upon the capabilities / efficiency of the mechanical computers.
The availability of computers to scientists could have had a great effect on their studies, and it's difficult to predict what the results of that would have been.
The transistor could have been discovered a lot earlier if inventors, once all cost-effective mechanical improvements had been exhausted and computing began to stagnate, had gone looking for ways of adapting computation mechanisms into electric circuits.
The transistor might have been discovered much earlier. LEDs might have been discovered, without a lightbulb ever having been invented -- Tesla might have never come to the US, AC may have not been discovered or put to practical use, even to date....
World War I and II might not have happened, because technology had a role in the events leading up to them. With no real moving pictures or radio to use as propaganda tools, and only low-speed media communication, WWI and WWII might simply never have sparked...
With no world wars there would have been no cold war, and therefore no real investment in space travel: no NASA, no satellites, no rockets, no GPS, and many technologies not existing.
Today, there might be no such thing as personal computers or cell phones... no radio, no wireless communication, no television. But no PCs = no internet, just a small global network of big companies' number crunchers at most.
Re: (Score:3)
AC is really a stepping stone, as far as power distribution technology goes. The need for it is only when you don't have electronics with enough power-handling capability to partake in power distribution. Transformers working at 50 or 60Hz are monsters, pretty much. Without those transformers, there'd be no need for AC in power distribution. You can make more efficient switching power converters that are comparably tiny. You can push a couple megawatts through a ferrite ring fitting on a sheet of letter siz
Re:Of course it was possible (Score:5, Interesting)
The difference is that there was no popularly conceived need for such tools back then. Would you rather spend the modern equivalent of millions on a tabulating machine of some sort, or hire several accountants?
Bookkeeping wasn't nearly as complex then as it is today. There was negligible need for anything like this: society at large moved much slower, and there was time to do the basic arithmetic necessary to meet their needs. (Even today, most people don't need anything much more complex than a calculator around tax time...)
Cryptography was the first demonstrated use for modern computing (during WWII). Now, consider cryptography during the US Civil War. It basically didn't exist: they used cipher disks which utilized simple substitution ciphers and what we might today call a seed (by means of a visual or auditory cue). "Something you know and something you have". Imagine how complex, expensive, and precise the machinery needed to perform WWII-era ciphers would be if it were purely mechanical. It would also have to be fairly single-purpose.
The sad fact is, there's really little practicality to computers until you get to electronics. Even with electronics, it was a long time coming until they were practical for common use, and were only significantly used by governments and large corporations for one-off massive computation (code breaking, report generation, number crunching). The IC really was the bottleneck that needed to be beaten to make them generally practical (in terms of time and money vs. the results).
Re: (Score:3)
What if he had got both these engines working by 1849? Would he have moved onto more advanced calculations or extended the use of mechanical computation to commerce like Hollerith punched cards did in 1889? If so, that would have advanced computing by 40 years.
It could have and in fact likely would have resulted in an entirely different way of developing computers. Someone else said that materials science would have become interesting and important much earlier than it did in order to increase the speed and efficiency of mechanical computers, and I agree. I could imagine bypassing electrical computing almost entirely and instead developing nanoscale mechanical technology or something like that.
It's fascinating stuff to think about and definitely provides lots o
Re: (Score:2)
I think he was stuck on the many-worlds interpretation which allows for a universe to exist in which he is not a virgin in his mother's basement.
When you think about it in that context, of course anything could be possible.
Personally, I think that computing as it exists now is much like finding life in the universe similar to ours. If you think about it, so many things could have gone differently. Personal computing may have never taken off, languages could be different, etc. Imagine that the US lost Wor
Re: (Score:2)
Not much about it has changed. In fact I'd say almost nothing changed. It's more software than hardware.
Comment removed (Score:4, Interesting)
Re:Of course it was possible (Score:4, Insightful)
Given the current state of things in the US I'm not entirely sure that deserves to be in the past tense.
Re: (Score:2)
From my study of Catholic cultural intervention
Your knowledge of history seems pretty poor. You must have gone to public school.
Re: (Score:2)
Re:Of course it was possible (Score:4, Interesting)
Re:Of course it was possible (Score:4, Informative)
The Library of Alexandria was mostly destroyed by Julius Caesar, and while it was partially rebuilt it slowly grew smaller and smaller over time as the Roman Empire broke down and Alexandria ceased to be the greatest city of the world.
Actually, scholarly opinion is divided, so don't take it as established fact. The available evidence is ambiguous and rarely first-hand or unbiased, so it's likely to remain controversial.
The fire set by Caesar's troops among the Egyptian navy vessels spread onshore, but only into the port of Alexandria. Many thousands of "books" were burned in the port as a result, but most of them were commercial ledgers and suchlike. The Great Library was not in the port, and likely was relatively unscathed by this fire.
A better case can be made that the Library was destroyed during Emperor Aurelian's conflict with Queen Zenobia, which actually did devastate the requisite part of the city. Of course, being a repository of flammable materials (papyri) with lighting by candles and oil lamps, occasional fires at the Library probably reduced their holdings from time to time.
Re:What if? (Score:5, Funny)
What if the Black Death hadn't occurred? What if Rome had never fallen? What if the Chinese had used gunpowder for more than fireworks? What if Christianity had never caught on? What if Native Americans had thrown off the colonists?
Until we start figuring out how to travel the multiverse, it's all subjective opinion...
What if there were no rhetorical questions?
Re: (Score:2)
Some of those questions are worth considering because they were single point sources. Presumably there existed one rat and one dead dude on a probably Italian sailing ship who brought the plague to Europe... If a cat had killed that rat with the flea carrying the mutated pathogen, life would be quite different.
Some of those questions are kind of pointless, like the fall of Rome. Rome was getting the giant flushing sound because of centuries of bad decisions and cultural failings; there is no "the" fall of Rome.
Re: (Score:2)
Re: (Score:2)
It's traditional to attribute such quotes [uchronia.net].
Re: (Score:2)
Flying cars are totally possible, were it not for the fact that it's politically impossible to have everyone flying.
Re: (Score:2)
Ah so now the true motivation of the 1% is clear: They want flying cars. In order to do that they need to ensure that only they can fly them :P
Now it all makes much more sense :)