Robotics Technology

Krugman: Is the Computer Revolution Coming To a Close? 540

Posted by samzenpus
from the end-of-an-era dept.
ninguna writes "According to Paul Krugman: 'Gordon argues, rightly in my view, that we've really had three industrial revolutions so far, each based on a different cluster of technologies. The analysis in Gordon's paper links periods of slow and rapid growth to the timing of the three industrial revolutions:
IR #1 (steam, railroads) from 1750 to 1830.
IR #2 (electricity, internal combustion engine, running water, indoor toilets, communications, entertainment, chemicals, petroleum) from 1870 to 1900.
IR #3 (computers, the web, mobile phones) from 1960 to present.
What Gordon then does is suggest that IR #3 has already mostly run its course, that all our mobile devices and all that are new and fun but not that fundamental. It's good to have someone questioning the tech euphoria; but I've been looking into technology issues a lot lately, and I'm pretty sure he's wrong, that the IT revolution has only begun to have its impact.' Is Krugman right? Will robots put laborers and even the educated out of work?"
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • I would argue (Score:5, Interesting)

    by geekoid (135745) <dadinportland@ya ... m minus math_god> on Wednesday December 26, 2012 @08:13PM (#42399681) Homepage Journal

    that IR 4 is robotics. Not robotics as a continuation of IT, but as a revolution of its own.

    • I agree. I think that the next "Industrial Revolution" will be robotic factories producing robots for other tasks/industries.

      Now, will the robotic-built robots be single purpose or general purpose? I don't know. But general purpose robots would lead (I believe) to another "hacker" revolution. The same as the general purpose computer did.

      • Tag on 3D printing and automated machining of parts and you don't have to have humans except to fix the things that robots can't. And to do all that, you need energy. Energy is key to producing wealth.

        Want to fix our economy? Give tax breaks for creating energy. Actually, do not tax production of energy at all. Make it THE tax-free enterprise. If we did that, the world would follow, and all the stupid wars over oil would be gone. We could then cut the military, because we wouldn't be in every hellhole in the world.

    • Re:I would argue (Score:5, Interesting)

      by DigiShaman (671371) on Wednesday December 26, 2012 @10:21PM (#42400657) Homepage

      IR 4 is cheap and abundant energy (solar, wind, nuclear, fusion perhaps...etc). Emphasis on 'cheap and abundant'.

      IR 5 would be robotics that require an IR 4.

      • Re:I would argue (Score:5, Insightful)

        by Rockoon (1252108) on Wednesday December 26, 2012 @11:00PM (#42400885)
        It could easily be argued that we have been in "the cheap and abundant energy" phase for a century... oil, coal, gas...
        • by wanax (46819)

          Remember the "and" part. Yes, we have abundant energy, but it's not cheap. My ability to get computations per dollar has increased by many orders of magnitude in the last 30 years (or 60, but I'm not that old), to the level that my smartphone would have been the fastest computer in the world when I was born. Energy, on the other hand, is within an order of magnitude of the same cost: the real coal price is about the same as in 1800 (see: http://econbus.mines.edu/working-papers/wp201210.pdf [mines.edu]), and that's externalizing costs.
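The parent's orders-of-magnitude comparison is easy to make concrete. A minimal sketch, assuming a ~2-year doubling period for computations per dollar (a commonly cited Moore's-law-style figure, my assumption rather than a number from the post):

```python
import math

def compute_growth_factor(years, doubling_period=2.0):
    """Multiplicative improvement in computations per dollar,
    assuming a fixed doubling period (an assumption, not data)."""
    return 2.0 ** (years / doubling_period)

# Over 30 years at a 2-year doubling period:
factor = compute_growth_factor(30)   # 2**15 = 32768x
orders = math.log10(factor)          # ~4.5 orders of magnitude

# The real coal price, by contrast, is claimed to be roughly flat
# since 1800: within a single order of magnitude over two centuries.
```

Under these assumptions, compute per dollar improves by four to five orders of magnitude in 30 years, while the energy price stays within one, which is the "cheap" half of "cheap and abundant" that the parent says is missing.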

  • Not really (Score:5, Interesting)

    by Tough Love (215404) on Wednesday December 26, 2012 @08:18PM (#42399757)

    Only the silicon part of the revolution is slowing down. The software revolution has barely begun, especially after being set back ten years or so during the Microsoft dark ages. What the future holds can scarcely be imagined today. Think of it this way: we already have more processing power available on a single $50 add-in card than in a modest-sized mammalian brain. It isn't our hardware that sucks, it's our algorithms.

    • Re:Not really (Score:5, Insightful)

      by linatux (63153) on Wednesday December 26, 2012 @08:31PM (#42399883)

      I think our algorithms have sucked, but it hasn't mattered much until recently.
      Now we are able to make vast amounts of data available easily, so it matters a lot more.

      Processing power still has a long way to go, but figuring out HOW to make use of the data is currently more important than the speed at which we can do it.

    • Re:Not really (Score:5, Insightful)

      by forkazoo (138186) <wrosecransNO@SPAMgmail.com> on Wednesday December 26, 2012 @08:36PM (#42399923) Homepage

      It's not clear that software is heating up as much as you propose. Most systems depend on vast foundation libraries, and commercial viability frequently depends on vast developer ecosystems. It is getting harder and harder over time to launch novel software stacks. As new computer programs depend on ever larger and more stable platforms, inertia naturally means that the rate of "real" change is less now than it was earlier in the evolution of computer programs.

      I think it's perfectly fair to say that the computer revolution is slowing down. Even as people remain hard at work, and some metrics continue to climb as fast as ever, the difference between a 16 KB home computer and a 16 MB home computer is extraordinary. The difference between a 16 MB system and a 16 GB system is much smaller, even though the systems are separated by the same factor of 1000x (for the sake of a simple argument, assume compute performance and storage capacity scale at roughly the same rate as main memory). A 16 MB 686 running Windows 95 has windows, icons, color graphics, a mouse. A 16 GB Sandy Bridge running Windows 7 has windows, icons, color graphics, a mouse. A user teleported some years into the future would have no problem accepting the faster system. A 16 KB system has a keyboard, text mode, built-in BASIC, and incredibly primitive graphics with limited colors. Moving from that to the 16 MB one would be a revelation.
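The factor-of-1000 point above is simple arithmetic, and worth seeing explicitly: both jumps are the same multiplicative step, even though the qualitative change they bought differs enormously. A quick sketch (binary units assumed):

```python
# Each generation jump is the same multiplicative factor; the parent's
# point is that equal factors no longer buy equal qualitative change.
KB, MB, GB = 2**10, 2**20, 2**30

jump1 = (16 * MB) // (16 * KB)   # 16 KB home computer -> 16 MB system
jump2 = (16 * GB) // (16 * MB)   # 16 MB system -> 16 GB system

# Both jumps are exactly 1024x, yet the first felt like a revolution
# and the second mostly like a faster version of the same machine.
```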

      We've seen massive consolidation of operating systems since the '80s. IT at this point is relatively stable and mature. I don't agree, though, that there were several completely distinct revolutions. I would argue that Facebook is part of the same revolution as the telegraph and radio. Likewise, computers are largely a technology of reliable, small-scale, finely detailed manufacturing, which started quite some time ago.

      • by Bomazi (1875554)

        Progress didn't stop in 1995.

        First there was the multimedia revolution. You can now capture audio, picture and video (even HD) content digitally, transfer it on a computer, edit it, and distribute it world wide, all on a (relatively) cheap PC. Try to do this on a 486.

        Now you have smartphones packed with sensors, ubiquitous high speed internet access and web apps backed by massive datacenters.

        And it is not just the hardware that improves. There have been tremendous improvements in recent years in fields like

  • by sien (35268) on Wednesday December 26, 2012 @08:19PM (#42399769) Homepage

    Gordon's Paper has been thoroughly investigated [thebreakthrough.org] by Roger Pielke Jnr at the Breakthrough Institute.

    Gordon's smoothing of growth fails to show the variability and creates a picture of trends that are not really there. A quote from the article linked above:

    In short, there is no evidence of a stair step reduction in the growth rate of US per capita GDP in either dataset. The US BEA and Census data shows essentially no change (a linear trend, blue line, shows a statistically insignificant downward tick) whereas the Maddison data shows a bit of an increase (red line). The data is sensitive to the time period chosen – for instance, from 1970 the BEA/Census data shows an increase in the annual rate of per capita GDP growth. I can find no evidence of a post-1950 secular decline in per capita economic growth in the United States, and in fact, there is evidence that growth rates have accelerated a bit from 1970.

  • by jabberwock (10206) on Wednesday December 26, 2012 @08:34PM (#42399911) Homepage
    The poster can't read, or summarize.

    Here's the link to Krugman's column: http://krugman.blogs.nytimes.com/2012/12/26/is-growth-over/ [nytimes.com]

    And this means that in a sense we are moving toward something like my intelligent-robots world; many, many tasks are becoming machine-friendly. This in turn means that Gordon is probably wrong about diminishing returns to technology.

    Ah, you ask, but what about the people? Very good question. Smart machines may make higher GDP possible, but also reduce the demand for people — including smart people. So we could be looking at a society that grows ever richer, but in which all the gains in wealth accrue to whoever owns the robots.

    And then eventually Skynet decides to kill us all, but that’s another story.

    Anyway, interesting stuff to speculate about — and not irrelevant to policy, either, since so much of the debate over entitlements is about what is supposed to happen decades from now.
  • Already Happening (Score:5, Interesting)

    by Anonymous Coward on Wednesday December 26, 2012 @08:39PM (#42399955)

    As a physician, I see the future, and it's increasingly moving away from me and towards the computer. There will still be a role for us, but it will be in the areas where big data doesn't come up with the obvious answer. As humans, we suck at reliably following algorithms. For a lot of medical conditions, following an algorithm reliably will give much better results than the haphazard way medicine is practiced now. Let the computer do that, and let us practice the art of medicine where we don't know the correct answer yet.

  • by petes_PoV (912422) on Wednesday December 26, 2012 @08:50PM (#42400015)

    The industrial revolution is driven by man's ability to harness energy. So far that's all been fossil fuel and has limited what we can do - and how fast we can do it.

    That phase of the industrial revolution is still going strong and has nothing to do with electronics, electricity or computers. Those developments are a completely different strand of development, and (themselves) have barely started, either.

    The next phase of humankind's development is when we break out, past the limitations (both of availability and rate of generation) of fossil fuels, into a new era where there is MORE energy available to each human. Probably several times more energy.

    However, if you really want to talk about computers, then we're still in the pre-condensing boiler stage. We can make computing devices that seem pretty powerful (because we have nothing better to compare them with), but they're not particularly powerful, complex or scalable. Also, it's debatable whether there is anything on the horizon (quantum, possibly - but it seems to be a hellishly complicated way to do things and needs a lot of supporting structure, compared to, say, the human brain) to take us to the next phase.

    So, no. We have NOT come to the end of IR3, we're still firmly stuck in the first industrial revolution, probably for another 50 - 100 years until we get our asses into gear and get past fossil fuels. Computing also seems firmly stuck on the bottom rung, with no promising technologies to move up, past the limitations of current semiconductor processors and logic-gate based architectures.

  • IR Dates all Wrong (Score:5, Interesting)

    by tjstork (137384) <todd@bandrowsky.gmail@com> on Wednesday December 26, 2012 @08:53PM (#42400049) Homepage Journal

    Let's see: he cuts off IR #1 at 1830, which pretty much misses the entire steamship revolution and the invention of so many consumer goods of the 19th century, not to mention the facilitation of mass immigration to the USA by all those steamships and the opening of the West by practical railroads.

    Then, he cuts off the next IR at 1900, and thus misses aircraft, the widespread adoption of the telephone and radio, and consumer appliances.

    And then, having decided that aircraft, telephones, radio and steamships were useless, he says that the next 60 years of IT will mean absolutely nothing.

    I would be inclined to think he is totally wrong.

    • by rubycodez (864176)

      we're still in the era of steam; fossil fuel plants, nuclear plants, and nuclear sea vessels all run on steam turbines.
      1900 B.C. to present: era of algebra
      1000 to present: era of explosives
      1450 to present: era of mass media
      1750 to present: era of steam
      1870 to present: era of electricity, also of telephony
      1900 to present: era of flight
      1960 to present: era of space, also of the integrated circuit
      1970 to present: era of the internet
      1975 to present: era of the personal computer

  • No work==good (Score:5, Insightful)

    by Alomex (148003) on Wednesday December 26, 2012 @08:54PM (#42400051) Homepage

    will robots put laborers and even the educated out of work?"

    Let me remind people here that this is, in the long run, a good thing (TM). Machines putting people out of work enabled us to have, in the long run, the 40-hour work week and a society where people are predominantly middle class.

    Short term it can be a disaster, though. For example, the 2nd industrial revolution caused massive unemployment in industrial England and led to asinine ideologies such as fascism, Luddism, and socialism elsewhere. These ideologies were misguided attempts to compensate for this momentous labour-force disruption by addressing the wrong aspects of the industrial revolution (democracy, machines, and capital, respectively).

  • by skine (1524819) on Wednesday December 26, 2012 @09:08PM (#42400149)

    So IR#1 = Steam, Railroads

    IR#3 = Buying Railroad Tycoon on Steam.

  • by conspirator23 (207097) on Wednesday December 26, 2012 @09:31PM (#42400313)

    The development of modern computing and telecommunications is not an industrial revolution of the type characterized by IR #1 and IR #2, and this is where Gordon's assumptions falter and Krugman's skepticism gains traction.

    The "I" in this case refers to Information, not Industry, and it is the 2nd one. The 1st was the development of the printing press. From this standpoint, IR #1 (the printing press and movable type) took centuries for its impact to be fully realized. The depth and breadth of its influence on western civilization is difficult to measure in "simple" macroeconomic terms. Likewise, IR #2 (the electronic digitization of information) is a revolution so fundamental in nature that I don't believe it lends itself to being mapped as cleanly as Gordon implies.

    Krugman starts the conversation in a couple of good spots: robotics and its impact on GDP, and the potential of Big Data to drive decision making. What about desktop manufacturing (aka 3D printing)? MOOCs? Genomics? Realtime translation?

    In fact, the more I think about it, the more I think that Gordon has successfully found an important trend, but has the wrong story to explain it. The first two industrial revolutions owe their economic impact to advances in our energy metabolism as a species. Gordon's IR#1 was about the conversion of hydrocarbons into mechanical energy using steam. Gordon's IR#2 was about the conversion of hydrocarbons into electricity using steam turbines, and into mechanical energy using internal combustion. The economic benefits of the digital revolution have much more to do with efficiency and productivity, and almost nothing to do with finding new sources of energy to exploit. Indeed, we're using more energy than ever to push information around, but each joule expended has had a significant ROI from an economic standpoint. Consider Just In Time [wikipedia.org] production techniques, which depend on the ability to rapidly gather and disseminate information up and down the manufacturing supply chain. There's not a whole hell of a lot more efficiency we're going to wring out of JIT. In fact, Japan's tsunami disaster demonstrated that we are now SO optimized from an industrial standpoint that natural disasters in one part of the world can have nearly immediate impacts across the global economy. In other words, we have reached the point of diminishing returns on the productivity gains that digital information can provide to the industrial economy.

    So Gordon is wrong, but about the right things.

  • by Stirling Newberry (848268) on Wednesday December 26, 2012 @09:51PM (#42400461) Homepage Journal
    1) What he labels "the first industrial revolution" wasn't the first, and for much of its length it occurred only in the UK; he drags down earlier eras by not adjusting for geography the way he does for the eras he focuses on. The first European technological revolution begins in the 1500s, a period when wages in urban centers double in real terms and then double again. It would take another 130 years for them to double a third time.
    2) Steam engines do not produce increased GDP until the 1840s, by which time the telegraph is already a major part of the information infrastructure needed to run railroads; so what he calls IR#1 is really IR#2, and for most of the world it runs 1830-1860.
    3) Much of the period of his so-called IR#2 falls during the Long Depression. As with previous waves of industrialization, city centers grow rapidly, since it is much cheaper per person to extend infrastructure there and profits are higher. The electrical economy does not penetrate much of even the developed countries, as measured by the penetration and cost of electrical devices, until 1930-1950.
    4) In developed countries, the rebound from WWII was the period of fastest GDP growth.
    In productivity terms, the information revolution was not visible until the mid-1990s, and there are still large productivity wins to come.
    Krugman is late to the party, and falls into the "lump of work" fallacy. The real problem is that if the target is a roughly constant standard of living, the amount of work to be done will drop, and it will pay less well. The only way to produce more work is to increase society's demands. This will be opposed by those for whom the present standard is enough, and who enjoy its higher levels of benefit; but that is largely a political problem, for which political will is required. Economics can help ease the transition forward, but it cannot generate political will from nothing.
  • by WaffleMonster (969671) on Wednesday December 26, 2012 @10:14PM (#42400621)

    I think IR #3 is a bit too nebulous and abstract to be useful... I can't imagine how you top "information age" or what could ever possibly come next.

    Instead I think you really need to think in terms of a tech tree with more specific items: cheap high-density batteries, memristors, large-scale 3D stacking, optical or plasmon gates, room-temperature superconductors, optical-frequency Fourier antennas, quantum computers with thousands of entangled qubits, tabletop fusion, warp drives, etc. These specifics, not general themes, are likely to dominate the landscape of future changes.

    I think a mistake is made when you confuse the diminishing first-order returns on information and information-processing technology with the more important secondary effects it has on the world's industries, and the feedback on information technology itself.

    For example faster Internet or a faster computer at this point would continue to provide ever diminishing returns to the average consumer.

      Likewise, always-connected mobile computers and communications provide little additional value over traditional fixed, hardwired systems.

    When you end your analysis with this narrow view of technology itself you are blind to what is really going on in terms of aggregate effects on all of industry.

    All advances in pharmaceuticals, chemistry, and materials science are fully contingent on complex large-scale computation.

    Astronomy and basic research.

    Computational biology and insanely cheap + fast sequencing is just now starting to go apeshit..

    Automation in design, manufacturing and logistics of all kinds throughout all of industry.

    Facebook, mobile phones, twitter and assorted consumer gadgets are red herrings... They are just noise that never really mattered.

  • by countach (534280) on Wednesday December 26, 2012 @10:52PM (#42400855)

    I'm pretty sure the full impact of the internal combustion engine wasn't known in 1900.

  • by cjsm (804001) on Thursday December 27, 2012 @01:07AM (#42401519)
    Don't tell me the computer revolution is slowing down when Microsoft has just released Windows 8. Krugman obviously is out of touch with the computer industry and must be living in a cave, to be unaware of this life changing, revolutionary breakthrough. Windows 8 alone will throw the computer industry back into the dark ages, allowing the growth cycle to start all over again, reigniting the industry.
  • by Tablizer (95088) on Thursday December 27, 2012 @02:33AM (#42401879) Homepage Journal

    I once plotted all the major inventions by time, and didn't really see clustered causes, but did see clustered periods of rapid innovation. I saw an expected bulge roughly around 1910, but I was surprised by the size of the invention bulge centered roughly on the 1950s, at least in terms of when the inventions started impacting our lives (not necessarily their first creation).

    The 50's "cluster" included:

    Atomic weapons
    Electronic computer (mainframe)
    Vaccines
    TV
    Jet travel
    Transistor
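A plot like the one the parent describes can be sketched in a few lines. The invention names are the commenter's 1950s cluster; the impact years are my own rough guesses, added for illustration (assumptions, not from the post):

```python
# Bucket inventions by the decade they started impacting daily life.
# Years are approximate guesses for illustration only.
inventions = {
    "Atomic weapons": 1945,
    "Electronic computer (mainframe)": 1951,
    "Vaccines (polio)": 1955,
    "TV": 1950,
    "Jet travel": 1958,
    "Transistor (in products)": 1954,
}

decade_counts = {}
for name, year in inventions.items():
    decade = (year // 10) * 10
    decade_counts[decade] = decade_counts.get(decade, 0) + 1

# A "bulge" shows up as one decade dominating the counts.
```

With these guessed dates, the 1950s bucket dominates, matching the bulge the commenter describes; the interesting editorial question is whether the bulge survives less charitable dating choices.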

  • Krugman...and (Score:5, Insightful)

    by lilfields (961485) on Thursday December 27, 2012 @02:36AM (#42401887) Homepage
    Krugman is wrong about...a lot of things. He is very good on trade policy (which is how he won his Nobel), but a Nobel for trade policy doesn't make him an expert on anything else. The media seems to think otherwise, but he has zero fiscal-policy experience, zero technology experience, etc. He is pretty incompetent in those regards. Computing has a good ways to go, though I do think the upgrade cycles on tablets and phones will soon get longer. I don't know why people seem to think the upgrade cycle of those devices will somehow never lengthen, the way the PC's cycle did.
