
What If Babbage Had Succeeded?

Posted by Soulskill
from the steampunk-personal-computing dept.
mikejuk writes "It was on this day 220 years ago (December 26 1791) that Charles Babbage was born. The calculating machines he invented in the 19th century, although never fully realized in his lifetime, are rightly seen as the forerunners of modern programmable computers. What if he had succeeded? Babbage already had plans for game arcades, chess playing machines, sound generators and desktop publishing. A Victorian computer revolution was entirely possible."
    • by smpoole7 (1467717) on Monday December 26, 2011 @05:37PM (#38496930) Homepage

Stephen Stirling's "Peshawar Lancers" has the British Empire move to India after a catastrophe, and they had an analytical engine as well. Eric Flint's alternate history might make better reading if you're postulating "what if." Flint covers "gearing down," because in order to make advanced technology, there's a logical progression. Many of the things that we take for granted are the result of incremental improvements and discoveries.

Simply put, there's no way to make the leap from a mechanical "analytical engine" OR a mechanical "difference calculator" even to the original IBM PC. (Or for that matter, the first Z80-based 8-bit computers.)

There's no doubt that Babbage might have moved technology forward a few decades. But what you and I know of as "computers" nowadays are based on a number of discoveries, from physics (quantum theory, in particular) to electromagnetism to advanced fab technologies for silicon to you name it.

I love reading alternative history, but I prefer those that are realistic. If you and I were to find ourselves as the "Yankee in King Arthur's Court," we'd actually be frustrated more than anything else. There's so much technology that even our grandparents took for granted that wouldn't be available.

      Just the ability to measure down to microns (and smaller) is vital when making a great deal of modern technology.

      • by smpoole7 (1467717)

        And by the way, I also ought to add ... if Babbage HAD started a revolution that moved technology forward even just a few decades, WWI quite possibly wouldn't have been survivable for the species. There would have been pockets of civilization that survived with a hunter-gatherer or farming level of technology, but it would have been bad. VERY bad.

        Think about it. Given the attitudes and mores of the time (and that's something else that most of us don't think about, by the way), if either side had had nukes (

        • by rickb928 (945187) on Monday December 26, 2011 @06:06PM (#38497122) Homepage Journal

Just being able to refine ballistic tables could have made WWI much more lethal. It might have made longer-range artillery practical, and of course better weapons get used more.
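A toy version of such a table is easy to sketch in Python. This is vacuum (drag-free) ballistics only, so the numbers are illustrative; real WWI firing tables also corrected for drag, wind, and air density:

```python
import math

def range_table(muzzle_v, angles_deg, g=9.81):
    """Vacuum-trajectory range for each elevation angle:
    R = v^2 * sin(2*theta) / g  (no drag, flat earth)."""
    return {a: muzzle_v ** 2 * math.sin(math.radians(2 * a)) / g
            for a in angles_deg}

# Hypothetical 600 m/s gun at three elevations
for angle, rng in range_table(600, [15, 30, 45]).items():
    print(f"{angle:2d} deg: {rng / 1000:5.1f} km")
```

A difference engine wouldn't evaluate the formula directly; it would tabulate it by repeated addition, which is exactly why such tables were its intended market.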

          • by smpoole7 (1467717)

> Just being able to refine ballistic tables could have made WWI much more lethal.

Exactly. Like I said, though, the big problem was the attitudes back then. We easily make the mistake of assuming that people back then thought like we do nowadays. That's NOT the case. Look up that famous image of a young Adolf Hitler standing in the square when WWI was announced, hat waving in the air and cheering. Then look at that equally famous image of Americans just as thrilled when the US entered the war, cheering

          • by znerk (1162519)

            See my comment [slashdot.org], above, for more information concerning WW1 long range artillery. As in, "75 miles (120km) to target" long-range projectile weapons.

        • "Given the attitudes and mores of the time"

Well, for one, it was not the attitude of the time to focus on civilian targets; that was more the WWII attitude.

          • by superwiz (655733)

Well, for one, it was not the attitude of the time to focus on civilian targets; that was more the WWII attitude.

            Industrial production was very labor intensive. So servicing it required large populations. In fact, servicing industrial production is the reason why most modern cities appeared.

        • by znerk (1162519)

          But the Germans *did* have long range targeting, and weapons capable of using that data (indeed, the "Paris Gun" [wikipedia.org] was the reason for discovering how the Coriolis Effect [wikipedia.org] affected their targeting - at ranges of roughly 75 miles (120km), the rotation of the earth was enough to affect the projected 3-minute trajectory of the weapon's explosive projectiles).

          In other words, your conclusion is based on a false premise. More information is always a good thing, when asking questions about possibilities.
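The size of that correction is easy to estimate. A minimal sketch in Python, using the flat-earth approximation for the horizontal Coriolis acceleration (a = 2ωv·sin(latitude)) and the figures above (120 km range, roughly 3-minute flight, ~49°N for northern France), gives a drift on the order of a kilometre:

```python
import math

OMEGA = 7.292e-5  # Earth's angular velocity, rad/s

def coriolis_drift(avg_speed_ms, flight_time_s, latitude_deg):
    """Rough horizontal Coriolis deflection: constant acceleration
    a = 2*omega*v*sin(lat), integrated twice over the flight time."""
    a = 2 * OMEGA * avg_speed_ms * math.sin(math.radians(latitude_deg))
    return 0.5 * a * flight_time_s ** 2

rng, t = 120_000, 180                      # 120 km, ~3 minutes
drift = coriolis_drift(rng / t, t, 49)
print(f"average speed {rng / t:.0f} m/s, drift ~{drift:.0f} m")
```

This crude estimate comes out near a kilometre of sideways drift, consistent with the effect being too large for the gunners to ignore at that range.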

      • by peragrin (659227)

This is why I love conspiracy theories involving aliens in 1949. Literally, the technology to understand one quarter of a crashed alien spaceship wouldn't be invented for another 30+ years.

  • Then.. (Score:5, Funny)

    by Haedrian (1676506) on Monday December 26, 2011 @05:17PM (#38496800)

    1800 would have been the year of Linux on the Desktop.

  • Here's TFA (Score:5, Funny)

    by rgbrenner (317308) on Monday December 26, 2011 @05:18PM (#38496804)

    Very interesting read. Here's a complete copy of the article for anyone who's interested:

    Catchable fatal error: Argument 1 passed to TeraWurfl::addTopLevelSettings() must be an array, null given, called in /home/iprogr6/public_html/plugins/mobile/terawurfl/TeraWurfl.php on line 334 and defined in /home/iprogr6/public_html/plugins/mobile/terawurfl/TeraWurfl.php on line 463

  • by roman_mir (125474) on Monday December 26, 2011 @05:20PM (#38496812) Homepage Journal

and what if a great-grandmother had balls? She'd be a great-grandfather.

The point is that Babbage did succeed, except it was through those he inspired, who took his ideas and combined them with better manufacturing processes, newer knowledge of materials, and a refined computing model.

    • Re:"what if" game (Score:5, Interesting)

      by Jane Q. Public (1010737) on Monday December 26, 2011 @05:39PM (#38496958)

      "The point is that Babbage did succeed, except it was through his inspiration, which took his ideas and better manufacturing processes and newer knowledge of materials and a refined computing model."

      This is a common misconception based on earlier analyses. In fact, portions of his engines have been built from the original plans, using techniques available in his day, and it has been determined that it would indeed have worked if only it had been built.

Contrary to popular belief, the two biggest problems that Babbage faced were: (1) his inability to convince investors of the worth of his invention, and (2) his insistence on constant refinement rather than freezing the plans at some viable point in order to make a working device.
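For context, the reason the engine could work at all with Victorian machining is that it needed only repeated addition: the method of finite differences. A minimal sketch in Python (the polynomial x^2 + x + 41 is just an illustration, not taken from Babbage's plans):

```python
def difference_engine(initial_diffs, steps):
    """Tabulate a polynomial using repeated addition only, the way a
    difference engine does: each column is updated by adding in the
    next-higher finite difference; the top difference stays constant."""
    diffs = list(initial_diffs)   # [f(0), delta f(0), delta^2 f(0), ...]
    out = []
    for _ in range(steps):
        out.append(diffs[0])
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]   # one "turn of the crank" per column
    return out

# f(x) = x^2 + x + 41: f(0)=41, delta f(0)=2, delta^2 f = 2 (constant)
print(difference_engine([41, 2, 2], 6))   # [41, 43, 47, 53, 61, 71]
```

No multiplication, no division: a stack of adding columns with carry, which is precisely what made the design mechanically plausible.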

      • by timeOday (582209)
The mechanical approach was still a dead end that was not on the path to anything like where we are today. He was like the guys, previous to the Wright Brothers, who spent their (short) lives working on flapping wings. You could argue they had the right idea - heavier-than-air powered flight - and thus inspired those who came after - but the fact remains, they were barking up the wrong tree.

        I think people are overvaluing the idea of "computation" in the abstract, rather than the implementation of actual

        • Re:"what if" game (Score:5, Insightful)

          by MightyMartian (840721) on Monday December 26, 2011 @06:24PM (#38497216) Journal

          Yes and no. Mechanical computers do not have the scalability of electronic computers, to be sure, so that line of development would have reached its end.

          At the same time, having a Turing complete computer, even a mechanical one, in the first half of the 19th century would have given mathematicians and engineers a whole new grammar to begin working on, much as even the relatively primitive digital computers of the 1940s to 1960s spurred on an absolutely astonishing amount of R&D, some of it still bearing fruit today.

          I expect that if the Babbage machines had been built and had been put to use, they would have spurred the digital revolution nearly a century earlier, concentrating huge amounts of R&D by the Great Powers in the post-Napoleonic era. The military value, for instance, of fast and accurate cannon/mortar trajectory calculations would have given whoever developed such machines a considerable edge. The late 19th-early 20th century arms race was transformative in many ways, and the successors of Babbage's machines would have been caught up in that.

        • Re:"what if" game (Score:5, Informative)

          by JustNilt (984644) on Monday December 26, 2011 @06:44PM (#38497364) Homepage

The mechanical approach was still a dead end that was not on the path to anything like where we are today. He was like the guys, previous to the Wright Brothers, who spent their (short) lives working on flapping wings. You could argue they had the right idea - heavier-than-air powered flight - and thus inspired those who came after - but the fact remains, they were barking up the wrong tree.

The difference (ha!) here is that the flapping wings didn't work for powering manned flight while the Babbage machines would have. Sure, they'd have been limited, but they would have worked! From there, as TFA says, refinements would have been implemented. It isn't as though modern computers are what was first designed, implemented or even conceived of. Great progress such as we've seen typically requires LOTS of folks putting their own mark on things.

          Somewhat OT but imagine what would have happened had the Greeks realized the true power of steam. That they were tinkering with it is well known. We might have had flying chariots by now!

          • by roman_mir (125474)

            Somewhat OT but imagine what would have happened had the Greeks realized the true power of steam. That they were tinkering with it is well known. We might have had flying chariots by now!

            Flying chariots? Like these? [media-imdb.com]

            • by znerk (1162519)

              Unfortunately, the link you supplied is broken, due to the referrer being outside imdb.com's domain. Perhaps if you linked to the movie the image belongs to, instead, you would have at least gotten a "funny" mod, instead of being largely ignored because you didn't check your links in the preview pane.

              Just saying.

          • by timeOday (582209)

            The difference (ha!) here is that the flapping wings didn't work for powering manned flight while the Babbage machines would have.

            But even if we go ahead and give Babbage full credit for his invention (pretending he'd marshaled the resources to build a working copy), would computers as we know them have occurred any sooner? Here's the crux of the article in my view:

            In addition the need to build more and better machines would have caused a rapid development in materials science. With better, stronger, li

The power of steam has been known since ancient times.

What people only discovered recently is how to do something with that power other than exploding things. That came thanks to lots of advances in physics and in the working of materials, and the latter didn't stop advancing through the Middle Ages.

        • On the contrary: mechanical "adding machines", while far simpler than Babbage's more general device, could be and were built, using principles similar to Babbage's. They were in commercial use clear up until 1985.

So even if a few of Babbage's full-scale machines were only used by rich institutions (like government), smaller and simpler versions would surely have found plenty of good use.

          Even a custom-built device, designed to do nothing but calculate cosines, could have had a major impact on war.
        • by znerk (1162519)

          ... as spoken by someone who obviously didn't read the article.

Your entire premise is flawed, in that had Babbage been able to fund the production of his machine, he would have created "an actual machine to do it quickly, reliably, and cheaply." His Analytical Engine was a precursor to modern digital machines, and the article expresses how we might have been exactly where we are now, except 100 years earlier... and with a different power source.

          It even postulates that something approximating the intern

    • All of which were set back about 1000 years by the dark ages and the mentality that still pervades.

      • by znerk (1162519)

        All of which were set back about 1000 years by the dark ages and the mentality that still pervades.

To further your point: The US has shot itself in the foot by impeding the progress of medical science. All the vehement arguments about stem cell research that caused the US to outlaw accessing the best source of stem cells have resulted in Belgium coming up with a cure for AIDS instead of the US.

        Here's a link [nytimes.com] to the NYTimes story. Please keep in mind while reading it that the story seems to have a massive "sour grapes" slant, deeming the procedure "impractical" due in part to the fact that the patient's im

So you'd be OK with paying Babbage's descendants a royalty for every computing device implemented since then?

  • read the book (Score:2, Informative)

    by Anonymous Coward

    The Difference Engine. We'd eventually get to the same place.

Yeah, I read the book, but I reckon that scenario used too much energy, particularly once you started talking about GUIs and processors running at GHz speeds. We would have needed transistors then, just as we need photonic logic now to keep improving.

      • by znerk (1162519)

        The energy issue hasn't changed - we'll always need "just a little bit" more than we currently have. We could actually have tapped quite a few sources of reliable, renewable energy, it's just not economically viable to do so (at least, not while fossil fuels are still available at such (artificially) cheap rates).

        • I mean that inside your computer, if you wanted to do all that we do with gears and wheels and such like, you would need a lot more energy than we currently use pushing electrons around.

    • Eugh, must I? I really didn't like that book. It's been years, I admit, but all I remember of the story was complaints about pollution, complaints about bureaucracy, a MacGuffin everyone wanted, and a nonsensical epilogue. It felt like it was a modern-day thriller except with a thin coat of Ye Olde.

      The middle third (it's split into three parts, with three different protagonists who don't interact much, so it's more like three short books) wasn't bad, but when it ended it felt like the story ended while th

I always wonder what the world would have been like had the Romans realized that the steam engine (primitive forms of which were used only in temples and in entertainment/toys) could be used as a form of locomotion. Hindsight really is always 20/20, and it makes you wonder if we have anything today that we use that could be used for something we could never dream of.
    • by Coldmoon (1010039)
They still would not have gone there, as human labor was too cheap to spur investment in productivity. It is similar to what the Chinese have now - far cheaper to throw 1,000 people at the problem than to create something that would reduce human labor in deference to a machine approach...
      • by vlm (69642)

Naah, they were more expensive. The problem is cultural, in that the leaders came more or less directly from military success, both against barbarians and in civil war. Military success implies capture of slaves. Coming up with a technological "solution" that expressly does not require the leaders' most important product is not gonna fly.

It's like trying to sell electric cars to Americans; no matter how much better they are than gas-powered cars, it's culturally unacceptable. Must wait for the culture to s

    • by Hentes (2461350)

Romans didn't use steam engines because slaves were much cheaper. Which is the reason Babbage's engine would have been just a similar toy: just because something is technically feasible doesn't mean society is ready for it.

The engines they used (basically a sphere with a couple of nozzles) had very poor efficiency. They were not really suitable for anything but simple toys. They'd have had to invent a lot of new technology to make real piston steam engines, never mind steam turbines.

      That's the same problem as with Babbage's engine.

  • by icebike (68054) * on Monday December 26, 2011 @05:27PM (#38496866)

The concept of huge mechanical computers fulfilling any purpose seems hard for us to comprehend today.

Yet huge mechanical computers for specialized use were in actual deployment in several industries, not the least of which were "fire control computers" [wikipedia.org] on US and British battleships and heavy cruisers in the pre-WWII era. These fairly huge mechanical beasts [wikipedia.org] were originally developed around the time of the First World War and were initially totally mechanical in nature. By the Second World War they were electro-mechanical (solenoids, relays and stepper motors), and were enclosed in battle-hardened enclosures [wikipedia.org].

Still, 1920-to-1945 is hardly 1833. The size and complexity of such devices taxed the manufacturing capabilities of the day, and the problems they could solve were probably more easily worked out on paper than set (programmed) onto the machine.

Having worked out the concepts, one wonders how far Babbage could have progressed with a large budget and a larger machine shop to build his engines. There were precious few problems to which you could apply this technology in that day. But it's a chicken-and-egg problem. It's hard to know what computations would have been attempted had such equipment been available. The calculation problems any society tackles tend to be near the limits of the computing capabilities available to the task.

    A man before his time.

Babbage was working at the bleeding edge of the engineering of his time. Engines which have been built to his designs, using the machining available to him, barely work. The long chains of gears frequently jam. There is just too much slack built into his systems. It's not his fault, just a natural consequence of the way engineering was done when he was alive.

      So no, I don't think it could have gone far.

      • Once a good use was found for it, the technology would have improved. It always has.
Well, okay, so that gets us to The Diamond Age, if you assume it has to use moving parts. Maybe working Babbage machines would have brought forward the development of electronics.

    • by Ga_101 (755815) on Monday December 26, 2011 @06:09PM (#38497138)
      Babbage was not "a man before his time". He didn't need more money. He didn't need a larger machine shop. He blew it!

      He had the money.
The people in 1800s Britain knew a good thing when they saw it. And when small prototypes were demonstrated, the British Government committed to build the difference engine. And guess what, they wanted to use it for gunnery on ships! They invested *big*. How much? One fully kitted-out battleship's worth. One of these: http://en.wikipedia.org/wiki/HMS_Warrior_(1860) [wikipedia.org] (more or less). That is a huge amount of money.

      The skills were available.
Have a look at a British clock from this period: very intricate work, at a much smaller scale than Babbage required. Sure, what he was doing was on a large scale, but the skills and tools were out there. Indeed, Babbage teamed up with them and had the money to do it.

But he committed the cardinal sin. Babbage was forever changing the design. Yes, Mr Babbage, your analytical engine idea is nice, but we are paying you for the difference engine! He could not stay focused to build what was paid for and required. Falling out with the machinists capable of building it hardly helped matters. He did not deliver. As a result he blew not only his own reputation but that of the whole idea, killing it for the best part of a century. That is how bad he was.

You can be the most talented man in the world, but if you are so disorganised and uncivil that nobody wants to work for you, it is all for nothing. A lesson we can all still learn from.
      • Re: (Score:2, Funny)

        by Anonymous Coward

        Babbage = Sheldon

    • the size and complexity of the problems they could solve was probably more easily worked out on paper than set (programmed) onto the machine.

      Obviously not or they'd not have gone to the difficulty of building machines which tax the limits of precision mechanical engineering to solve them. And part of what the mechanical FC computers did - stabilize the guns on a ship that's pitching and turning and rolling - can't be done with a precomputed table.

      • by icebike (68054) *

        And part of what the mechanical FC computers did - stabilize the guns on a ship that's pitching and turning and rolling - can't be done with a precomputed table.

Fire control computers of that vintage did not attempt to stabilize the guns. That didn't come till much later, and it never was used on very large bore guns (> 6 inch); there is simply too much mass to control. Instead, they delayed firing until the ship rolled or pitched through the optimum fire point.

  • Two words for you... (Score:5, Informative)

    by Anonymous Coward on Monday December 26, 2011 @05:28PM (#38496870)

    Two words for you: "Difference Engine" [wikipedia.org]. Bruce Sterling and William Gibson. That's what would happen if Babbage had succeeded.

  • Not possible. (Score:3, Insightful)

    by artor3 (1344997) on Monday December 26, 2011 @05:29PM (#38496876)

A Victorian computer revolution was not possible, as should be obvious to anyone who understands how computers work. Just think of how massive (and weak) computers were back in the days of vacuum tubes. Now imagine how massive, weak, and prone to breakdowns they'd be if they were made of clockwork. You'd have an entire warehouse filled with moving parts that might be equivalent to a digital watch... at least until one of the gears breaks. The technology simply didn't exist to make computing feasible.

    • by msobkow (48369)

But a computing machine is actually far more straightforward than the way an electronic computer works. It would be much slower, but I don't think it would be any more prone to breakdown than the vacuum tube machines were to burned-out tubes. Compare a mechanical desktop adding machine of old to the earliest calculators -- they really weren't that much bulkier near the end of the era of the adding machines.

      And the odds are, if you find one, the adding machine will still work.

    • by melonman (608440)

      Also, the computer revolution really took off because computers became ever cheaper and easier to manufacture. The problem with Babbage's design was that a lifetime wasn't long enough to build one without CAD/CAM. So, even if he'd worked twice as fast and got the thing working, it would have been a one-off for another few decades.

      If Babbage had succeeded, it would have sparked the "man as machine" line of thought that has changed so much in our society, and that could have changed the course of history in a

You or I wouldn't be here to ask the question, for one thing, because the world would be an entirely different place. Probably much stranger weather-wise too, since the industrial revolution would have occurred a century or so earlier, and who knows what military(s) would have used it to the best of their ability.

    LoB
    • by Psion (2244)
      But the Industrial Revolution was in full swing by the 1830s. In many ways, Babbage's ideas were a product of that era. I don't think the world would be too terribly different a place than it is today. Perhaps, with proper error-free reference tables, science and engineering would have made a few more advances, but the complexity of all those moving parts in Babbage's Analytical Engine would have prevented something like Victorian PCs. I think the big change would have happened around the second World War,
      • Might the Germans have used aluminum calculating machines for more accurate V1 and V2 missiles? Could that have made a difference in the Space Race, or would that still have to wait for the weight-saving economy of the transistor and integrated circuits?

        The V2's accuracy was set by the onboard integrating accelerometer, already a precision mechanical device, which integrated an error of some hundreds of meters over the course of its flight. No mechanical computer, in the sense of a gear-based version of t

        • by Psion (2244)

          As for the Apollo flight computer, a very limited orbit-tracking version might have been possible but integrating error would have made it deeply suspect over such a long time period I think. In terms of all the other things the Apollo computer did in terms of attitude control and timing the firing of thrusters correctly, I doubt you could make a one cubic foot mechanical or electromechanical computer do that.

          I'm not suggesting that a mechanical computer could have replaced the Apollo flight computer. But

If/when something bad happens in my life and something awesome happens later after that, it makes it easier for me to accept the sucky occurrence, reasoned with the butterfly effect: without it, the great thing might never have happened. :)

Anyway, it's always amazing to think how the current state of the world is a result of millions of small things coming together. Without everything going exactly like this, even the probability of me existing would be extremely low.

  • If Babbage had succeeded, then there would have been a programming language called Babbage, and a software store chain in malls called Ada's instead of the other way around.

  • by vlm (69642) on Monday December 26, 2011 @06:01PM (#38497100)

Historically, computing has never been a processing problem but a storage problem. All computing, from embedded stuff to supercomputers, pretty much revolves around turning a computation-bound problem into a storage-bound problem, and waiting for storage to improve so you can roll out faster processors to make use of it.

Try it yourself, if you have the skills. I had a pretty decent bitslice ALU design for a relay CPU down to a total of 20 relays per bit slice, not just a wimpy bare adder but a pretty full-featured design complete with comparator and roller/shifter unit. An 8-bit processor is well within my entertainment budget at a couple bucks per relay, and if I package each bitslice into something the size of a ream of paper, which is probably pretty pessimistic, the entire 8-bit CPU is only about the size of a box of bulk laserprinter paper. I figured that for about $500 in parts I could get a decent, reliable relay-based 8-bit CPU operational.

But a couple hundred bytes of relay-based RAM to run some "real programs" is way outside my budget, in money, space, and power alike. Even tradeoffs don't work; using latching relays saves me considerable (cheap) power at a cost of roughly twice as much per bit. Inevitably you get into weird dynamic electrolytic capacitor designs, strange attempts at homemade core memory... Cheating and using modern SRAM isn't cool. Thousands of latching relays at, let's say, $5 per bit isn't gonna fly if I "need" a K or so of memory to have fun; that would be $40K just in storage relays, to say nothing of the address decode logic etc. Also, that would be well in excess of 8000 relays for a K of memory, vs a mere 160 relays for the processor. About 80 times bigger. So that goes from a small box-sized CPU to basically a room of my house.

    This has interesting MTBF implications, in that any "non-trivial" relay computer is going to mostly fight memory breakdowns, not processor failures.

    To an amateur, calculating is the hard part. To a pro, storage is where the real problem lies.
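The arithmetic above is easy to check. A quick sketch in Python, using the comment's own assumed figures (20 relays per bit slice, one latching relay per stored bit at $5 each):

```python
# Figures and prices are the parent comment's assumptions.
BITS = 8
RELAYS_PER_BITSLICE = 20
cpu_relays = BITS * RELAYS_PER_BITSLICE   # the whole 8-bit relay CPU

ram_bits = 1024 * 8                       # a K of memory
ram_relays = ram_bits                     # one latching relay per bit
ram_cost = ram_bits * 5                   # at $5 per stored bit

print(f"CPU: {cpu_relays} relays")
print(f"RAM: {ram_relays} relays, ${ram_cost:,} before decode logic")
```

That is 160 relays of processor against 8192 relays (about $41K) of bare storage, which is the storage-dominates-processing point in numbers.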

    • by Trepidity (597)

      There are tradeoffs, depending on the problem. If you had several orders of magnitude faster processing, for some applications storage would become less of a bottleneck, because you could just recalculate a lot of data on the fly instead of storing it (the well-known time-space tradeoff). So in a sense storage is a bottleneck in those applications only because processing isn't fast enough--- meaning the bottleneck is processing when you look at it from another perspective.
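That time-space tradeoff is the classic lookup-table-versus-recompute choice. A minimal sketch in Python:

```python
import math

# Space-heavy option: precompute a sine table (storage bound).
TABLE = [math.sin(i / 1000) for i in range(1572)]  # 0 .. ~pi/2, step 0.001

def sin_lookup(x):
    """One memory read; costs the table's worth of storage."""
    return TABLE[round(x * 1000)]

def sin_recompute(x):
    """No table; spend processor cycles instead of bytes."""
    return math.sin(x)

# Same answer either way; which resource is the "bottleneck" depends
# on whether cycles or bytes are scarcer.
assert abs(sin_lookup(0.5) - sin_recompute(0.5)) < 1e-3
```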

      • by vlm (69642)

Yup, that's the problem with trying to replicate what amounts to a 1980 KIM-1 in relays.
The ancients ran into the same dilemma, and their solution was wide word sizes like 60 bits. Thus you end up with simpler, shorter programs, more calculation per cycle and less memory required.
An 8-bit machine with a K of RAM is what you get when memory and CPU are fast and space is cheap. What I'm used to, basically.
        The ancients idea of large word length makes sense if memory is expensive. Also lots of CPU regist

        • by Trepidity (597)

          Yeah, there's some amount of primary storage that you can't do without, but it's not entirely fixed. In a lot of scientific-computing apps, for example, the trend over the past decade has been towards ripping out things like lookup tables, because they aren't worth the RAM or L1/L2 cache space: it's cheaper to just re-calculate sin(whatever) every time you need it than to store a big sin table, or recalculate pi to 10,000 digits instead of storing it as a giant constant, which didn't used to be the case. Oc

  • by Animats (122034) on Monday December 26, 2011 @06:09PM (#38497134) Homepage

    Much as I like the steampunk concept, Babbage's machine was at the upper end of what was buildable as an expensive prototype. Bear in mind that even consistently-good, moderately priced steel wasn't available until the 1880s. That's why fine machinery was made of brass until the 20th century.

    The commercial history of mechanical calculators is not what you'd expect. Leibniz built the first mechanical multiplier in 1694. The commercial version, the "Arithmometer", wasn't produced until 1851. (It took a very long time to commercialize technology before there was industrial infrastructure.) Adding machines came later, because an adding machine is only a marginal improvement over an abacus, but a multiplier is a huge win.
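Why a multiplier is such a win over a bare adder: multiplication reduces to at most nine shifted additions per digit of the multiplier, which is roughly what a stepped-reckoner-style mechanism cranks out. A rough sketch in Python (the decimal shift stands in for the machine's carriage shift; this is an illustration, not Leibniz's actual mechanism):

```python
def multiply_by_addition(a, b):
    """Multiply using only repeated addition plus a decimal 'carriage
    shift': at most 9 additions per digit of b, instead of b additions."""
    total = 0
    addend = a                     # shifted copy of a (carriage position)
    while b:
        for _ in range(b % 10):    # crank the wheel 'digit' times
            total += addend
        b //= 10
        addend *= 10               # shift the carriage one decimal place
    return total

print(multiply_by_addition(347, 86))   # 29842
```

Doing this by hand on an adding machine means keying 86 separate additions; mechanizing the shift collapses it to 14 cranks, which is the "huge win" in question.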

    The first high-volume mechanical arithmetic device was the cash register. When, in 1884, cash registers first got tape printers, for the first time merchants had some real mechanical bookkeeping assistance. By then, good steel was available, and stamped parts could be made in volume. That's the point at which something like Babbage's machine might first have been a commercial success.

Which it was. Hollerith's punched card machines were first used for the 1890 census. The Computing-Tabulating-Recording Company manufactured Hollerith machines commercially, and CTR was later renamed International Business Machines, which is today's IBM.

    By 1880, there was enough manufacturing infrastructure to make stuff, and there was continuous year to year progress in mechanical calculation. The peak in purely mechanical systems was probably the Burroughs Sensimatic, in 1953, which was essentially a spreadsheet program made out of gears. IBM tabulators were more advanced, but they were electromechanical.

    • by vlm (69642) on Monday December 26, 2011 @06:31PM (#38497274)

      Bear in mind that even consistently-good, moderately priced steel wasn't available until the 1880s. That's why fine machinery was made of brass until the 20th century.

      As an amateur machinist I can assure you that fine machinery was made of brass because steel, iron, etc. were a nightmare to machine with the tools of the day, while brass is OK and not nearly so labor intensive.

      Bulk steel was actually pretty cheap. Not cheap enough to make a bridge out of it, but cheap enough to fill the world with rifles and swords. Before 1880, steel was too expensive to put a steel bridge over every river, a steel locomotive rail through every little two-horse town, a steel computer in every house, or a steel-computer-based internet, which is just as well because they didn't have the proper carbides and HSS to machine it at any affordable rate anyway.

      Brass was, is, and probably always will be terribly expensive, but it machines and wears (it self-lubricates, to an extent) like a dream. And the finish is quite attractive and simple, unlike steel or aluminum finishes. To this day, amateur machinists make homemade steam engines out of brass, not steel, if they can afford it, anyway. I certainly prefer to work with brass. There are some issues with the cutting angles on lathe tools and so on, but it's all really no big deal.

      Brass is much closer in cost to being a precious metal than to being a structural metal. Always has been. This explains the fascination brass holds for the local meth-user population: a little pocket-sized outside water hose fitting is worth darn near as much at the recycler as a small iron sewer/drain grate.

    • by pmontra (738736)
      The http://en.wikipedia.org/wiki/Antikythera_mechanism [wikipedia.org] was not a general-purpose computer, but as a computing machine it predates any other known device.
  • by lkcl (517947) <lkcl@lkcl.net> on Monday December 26, 2011 @06:32PM (#38497276) Homepage

    http://en.wikipedia.org/wiki/The_Difference_Engine [wikipedia.org] - by Bruce Sterling and William Gibson is a fascinating and complex exploration of exactly this concept: namely, that Babbage succeeded. The key historical difference - the premise of the book - is that England's backing of one side in the American Civil War succeeded, due in part to cryptography. Towards the end of the book it's made clear that the continued war between France and England has turned "cold", and thus much effort is dedicated to sneaking obfuscated "divide by zero" algorithms into the opposing side's Difference Engines. This book is one of the only sci-fi books (out of over 500 that I've read) where I actually found it hard to understand even 50% of what was going on. Still made a damn good story, though.

  • The Automatic Telephone Exchange was patented in the 1890s and was available [wikipedia.org] in the 1900s. The relays could have been rewired as an electromechanical computer, as was done in 1941 on the Z3 computer [wikipedia.org].

    No one thought of it.

  • The Enterprise travels back in time, assists Babbage in finishing his analytical engine, and mankind gets warp drive 150 years before Cochrane did.

    In the meantime, Hitler fails to seize power in Germany, thanks to the internet.

    Stalin fails to rise to power, because people are quickly informed via their phones about his actions.

    etc

  • He would probably have created the first bug!

  • by Black Parrot (19622) on Monday December 26, 2011 @08:43PM (#38498340)

    Your cell phone would use gears...

  • by Tablizer (95088) on Tuesday December 27, 2011 @12:32AM (#38499834) Homepage Journal

    The thing is, a room full of humans can compute too, perhaps aided by simpler mechanical calculators, with redundant calculations used to reduce and detect errors. Nobody has made an economic case for Babbage's monstrosity, given that it would have been damned expensive to build at the time and would have required lots of maintenance.

  • by Ramin_HAL9001 (1677134) on Tuesday December 27, 2011 @03:01AM (#38500412)

    I'm surprised no one has mentioned this yet, but I think the biggest deficiency of Babbage's design was the base-10 numbering paradigm. Sure, he had the computer architecture down to what we would now call the Von Neumann architecture [wikipedia.org], with its load, compute, and store instructions. But making it all work in base-10 was incredibly messy, and I would think that is mostly why it was so difficult for him to implement.
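To make the load/compute/store point concrete, here is a minimal, hypothetical sketch of such an instruction cycle (the opcode names and register layout are invented for illustration, not taken from any real machine):

```python
# A toy load/compute/store machine: one accumulator (the "Mill") and a
# dictionary of named cells (the "Store"), echoing Babbage's own terms.

def run(program, store):
    acc = 0  # the Mill's single accumulator
    for op, arg in program:
        if op == "LOAD":
            acc = store[arg]          # fetch a value from the Store
        elif op == "ADD":
            acc += store[arg]         # compute in the Mill
        elif op == "MUL":
            acc *= store[arg]
        elif op == "STORE":
            store[arg] = acc          # write the result back
        else:
            raise ValueError(f"unknown opcode: {op}")
    return store

store = {"a": 6, "b": 7, "out": 0}
run([("LOAD", "a"), ("MUL", "b"), ("STORE", "out")], store)
print(store["out"])  # 42
```

Every instruction set since has elaborated on this same fetch-compute-writeback shape.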

    It was not until 1854 that George Boole invented what we now call Boolean Algebra. [wikipedia.org]

    Boolean logic allowed us to simplify computing circuitry, improving its efficiency and size. Take a look at this famous YouTube video [youtube.com]: it shows a mechanical calculator built with marbles, where a marble indicates a one and no marble indicates a zero. AND and OR gates are incredibly simple lever mechanisms, and it is powered by gravity and the weight of the marbles. What if Babbage had thought to use marbles and base-2 numbering instead of gears and base-10 numbering to do computations? He couldn't have, because Boole's idea came roughly two decades after Babbage had designed his engines, and even after that, it was not until Alan Turing, a century later, that anyone was clever enough to realize that Boolean logic, like any other logic, could be used to program a computer. Before Turing, Boolean logic was more or less a reasoning language for testing the logical soundness of true/false propositions.
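As a toy illustration (all names here are my own), this is the kind of arithmetic those marble gates mechanize: a ripple-carry adder built from nothing but AND, OR, and XOR, where a marble in a channel is a 1 and an empty channel is a 0:

```python
# Binary addition from Boolean gates alone, the logic a marble machine
# implements mechanically with levers and channels.

def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return a ^ b

def full_adder(a, b, carry_in):
    """Add two bits plus a carry; return (sum_bit, carry_out)."""
    s = XOR(XOR(a, b), carry_in)
    carry_out = OR(AND(a, b), AND(carry_in, XOR(a, b)))
    return s, carry_out

def add(x, y, width=8):
    """Add two integers one bit at a time, least-significant bit first."""
    result, carry = 0, 0
    for i in range(width):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result

print(add(5, 3))    # 8
print(add(27, 15))  # 42
```

Each full adder is just five gates, which is why a base-2 machine needs only a handful of simple, identical mechanisms where Babbage's base-10 design needed elaborate carry machinery on every digit wheel.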

    So, architecturally, Babbage was ahead of his time, and had his idea succeeded, it might have encouraged research and development leading to the use of Boolean logic in computing much earlier. But that wasn't the case. It is fun to think of what might have happened, though: we may have seen immense computing factories powered by mills which lifted the spent marbles back into a giant bin above the factory, with all of that weight filtering down through the mechanisms of the computer to produce results. Such a thing would have been unbearably noisy, but fast, simple, easily repairable, and effective. And it would have continued that way until someone thought of using electrical charges instead of marbles.

    In all, I think if Babbage's design had succeeded, it might have made the computer revolution happen 30 or 40 years earlier, in which case I would have been born into the mostly ignorant generation of kids comprising the social-networking and internet revolution, and not into the more down-to-earth generation of 8-bit gaming, Q-BASIC, and assembler-programming personal computer revolution folk.

I cannot conceive that anybody will require multiplications at the rate of 40,000 or even 4,000 per hour ... -- F. H. Wales (1936)
