Intel Hardware

End of Moore's Law in 10-15 years? 248

javipas writes "In 1965 Gordon Moore — Intel's co-founder — predicted that the number of transistors on integrated circuits would double every two years. Moore's Law has been with us for over 40 years, but it seems that the limits of microelectronics are now not that far off. Moore has predicted the end of his own law in 10 to 15 years, though he has predicted its end before and been wrong."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Law? (Score:3, Insightful)

    by haystor ( 102186 ) on Wednesday September 19, 2007 @10:54AM (#20667687)
    Can we stop calling a prediction a law?
  • by Anonymous Coward on Wednesday September 19, 2007 @10:59AM (#20667753)
    We will start using a new technology without transistors, but which will exhibit a similar exponential gain so long as there is money to be made from it.

    In fact, now I come to think of it, ALL human endeavour exhibits exponential growth so long as there is money to be made from it. Technology is just one field where it's true. Sex (Malthus), agriculture, you name it - humans do it exponentially!

    I think I'll put that on my t-shirt, and call it 'Anonymous Coward's Law'.

  • Re:Gordon Moore (Score:2, Insightful)

    by sexybomber ( 740588 ) <boccilino @ g m a i> on Wednesday September 19, 2007 @10:59AM (#20667757)

    Actually, I don't think it will matter. In 10-15 years, the emphasis will shift away from traditional binary computing and towards quantum computing anyway, making Moore's law sorta moot.

    And if quantum computing should herald The Singularity, then it's definitely moot, since no predictions (Moore's Law included) can be made about post-Singularity computing.
  • Re:Gordon Moore (Score:3, Insightful)

    by plover ( 150551 ) * on Wednesday September 19, 2007 @11:00AM (#20667767) Homepage Journal
    I don't think quantum computing will be the future for general-purpose computing, and certainly not in 10 years. I think you're nearly right in that the future will lie in parallel computing -- increasing the number of CPUs will be the path to higher throughputs (which coincidentally aligns nicely with Intel's goal: sell more CPUs.)

    Either way, when Gordon Moore eventually dies he will still be overflowing the (long)money; variable.

  • Re:Gordon Moore (Score:3, Insightful)

    by oliverthered ( 187439 ) <`oliverthered' `at' `'> on Wednesday September 19, 2007 @11:00AM (#20667773) Journal
    quantum computers aren't really general purpose machines and wouldn't be able to replace traditional CPUs for a lot of tasks.
  • by vlad_petric ( 94134 ) on Wednesday September 19, 2007 @11:02AM (#20667805) Homepage
    ... there's nothing fundamental about it. Instead, it's a self-fulfilling prophecy. The big players in the silicon world all use the "law" and its corollaries as their business plan. They'll likely discard a feature/product if it falls behind the curve in terms of speed. For the layperson, this "precision" may indeed create the appearance of an actual law, even though it's just an observation (similar to Malthus' "law").
  • by goombah99 ( 560566 ) on Wednesday September 19, 2007 @11:04AM (#20667843)
    Moore's law is not about physics, it's about economics. Basically, the entire industry has built an economic engine that requires that growth pattern to sustain itself.

    To put it another way, growth needs to be geometric, not additive: things need to grow at x% per year, which yields a constant doubling time. If instead x grew linearly (x += D), then as x grew, the proportional growth rate (1/x dx/dt) would shrink with time, and the doubling period would get longer and longer. Eventually it would take a lifetime before your computer was 2x more capable. Then two lifetimes.

    Why would you ever upgrade at that point, except due to wear and tear? Things become commodities, and sales come down to price and other value-adds. So long to Intel's industry domination model.
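    The geometric-vs-additive contrast above can be sketched numerically. (A back-of-envelope illustration: the ~41%/year rate and the D = 50 increment are made-up figures, not from the comment.)

```python
import math

def doublings_geometric(rate_per_year):
    """Years to double at a fixed fractional growth rate (geometric growth)."""
    return math.log(2) / math.log(1 + rate_per_year)

def years_to_double_additive(start, increment_per_year):
    """Years to double when a fixed increment D is added each year."""
    # Need another `start` units of capability, arriving at D per year.
    return start / increment_per_year

# Geometric: ~41% per year doubles roughly every 2 years, at any scale.
print(round(doublings_geometric(0.41), 1))   # 2.0

# Additive: each successive doubling takes longer as x grows.
for x in (100, 200, 400, 800):
    print(x, years_to_double_additive(x, 50))   # 2.0, 4.0, 8.0, 16.0 years
```

    The doubling period under additive growth stretches without bound, which is the comment's point about upgrades eventually taking "a lifetime".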

    Moore's law is also a limit. That very same growth engine will not invest twice as many research dollars to get a slightly faster doubling time; the fact that the rate has held steady tells you this is so. Empirically, this growth rate is the sweet spot between creating innovation at the lowest cost and reaping a profit on it.

    Indeed, the only surprising thing we've seen in the consumer market that seemed (superficially) to violate this was Apple's replacement of the iPod mini with the iPod nano shortly after its introduction. They could easily have milked it for longer. But there the driver was the competition they needed to stay ahead of.

  • Re:Gordon Moore (Score:3, Insightful)

    by Anonymous Coward on Wednesday September 19, 2007 @11:08AM (#20667881)
    One problem in these discussions is that different people use different definitions of "Moore's Law." Strictly the law is an observation about the increasing density of transistors (i.e. decreasing size of each transistor). However, as we all know, many people simply use the term "Moore's Law" loosely, referring to all exponential increases in computing power.

    There is no doubt that we will reach a hard physical barrier beyond which we cannot shrink individual transistors any longer. This limit will be reached in a decade or two, and is probably what Moore is referring to: at our current scaling, we will hit atomic limits rather soon.
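    The "atomic limits" timescale can be estimated with a rough sketch. (My numbers, not the commenter's: I assume 2007's 65 nm node as the starting point, take ~0.5 nm, about one silicon lattice constant, as an absolute floor, and note that doubling transistor density every 2 years shrinks linear features by ~1/sqrt(2) per generation.)

```python
import math

start_nm = 65.0                      # assumed 2007 process node
floor_nm = 0.5                       # assumed atomic-scale floor
shrink_per_step = 1 / math.sqrt(2)   # linear shrink per density doubling

# How many 2-year generations fit between here and the floor?
steps = math.log(floor_nm / start_nm) / math.log(shrink_per_step)
print(round(steps))        # ~14 generations
print(round(steps) * 2)    # ~28 years
```

    So "a decade or two" is the right order of magnitude, with the exact figure depending heavily on where you place the floor.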

    But, that doesn't mean that the exponential increases in computing power will end. There are many other things that may happen, such as figuring out ways to build microprocessors with transistors stacked in 3D (rather than having a single 2D layer of transistors), which will increase the transistor count in our computers by orders-of-magnitude. Improvements in chip designs, layouts, and algorithms are other areas that may see improvement. Specialized and dynamically re-programmable chips may also provide us with further advances. Or perhaps, as you pointed out, quantum computers will become viable and mainstream.

    There is no guarantee that these more exotic technologies will work out. Yet the microelectronics industry has surprised us time and again with its ability to overcome huge technical obstacles. Thus it seems at least possible that it will deliver technology that pushes right up against the physical limits of what is achievable. And with regard to those physical limits, the hard wall that Moore is predicting in 10 years is only one aspect. There are many other ways for this technology to advance.
  • by jcr ( 53032 ) <jcr AT mac DOT com> on Wednesday September 19, 2007 @11:08AM (#20667883) Journal
    Moore's law is not about physics it's about economics.

    Exactly. Whenever one process technology reaches its physical limits, we get a new one, because the new process makes money. X-ray lithography, chip stacking, 3D circuits, and eventually nanotech will all keep us on the Moore's law path probably for the rest of my life, at least.


  • by dave_mcmillen ( 250780 ) on Wednesday September 19, 2007 @11:09AM (#20667903)
    I have no idea if Moore's Law will really start to "fail" on a particular time scale (one of these times it's gotta be true, right?), but a related issue I find interesting is that CPU speeds no longer seem to be touted to computer buyers so heavily. Walk into a big electronics store and look at the desktop offerings: where they used to prominently feature how many GHz they had inside (and people vaguely felt that more of these mysterious GHz was better), the CPUs now carry code names and numbers that don't reflect clock speed: check out this nifty X2, or the Turion 64, or ...

    The new hook for consumers is the number of "cores", and once again most people have probably picked up the vague sense that having more of them inside means the computer is better. I've been told by people who might be in a position to know that it's not that they can't keep cranking up CPU speeds, but that the cost/benefit (profit-wise) stops making sense at some point because of the huge cost of implementing a new fab at a finer length scale, and we're pretty much at that point. So it makes sense that cores are the new GHz, and Moore's Law will have less and less direct impact on the end computer buyer from now on.

    Maybe there's a Core Law to be formulated about how often the average number of processors per computer can be expected to double?

  • by Colin Smith ( 2679 ) on Wednesday September 19, 2007 @11:09AM (#20667905)
    Moore's law will continue until THE SINGULARITY takes US ALL!!!!!!

    Or at least, that's what the singularity nuts claim. Sorry people, there are limits on this planet.
  • two variables (Score:2, Insightful)

    by OwlofCreamCheese ( 645015 ) on Wednesday September 19, 2007 @11:15AM (#20668015)
    He forgets his law has two variables: the number of transistors and the cost. The number of transistors might stop doubling, but there is still huge change to come with the transistor count fixed and the cost falling.

    I mean, an 8-core chip would be an improvement right now, but so would a 4-core chip at half the price. Think about a world where an 80-core chip exists, and how it would change the world. It would do so again each time it went from 999 dollars to 99 dollars to 9 dollars to 99 cents to 9 cents to 9 for a cent.
  • Not yet, (Score:4, Insightful)

    by SharpFang ( 651121 ) on Wednesday September 19, 2007 @11:17AM (#20668031) Homepage Journal
    The law speaks about the number of transistors. Considering the current size of a typical CPU die (about 1cm x 1cm x 1mm) and assuming a "reasonable" maximum size of some 10cm x 10cm x 10cm, we have many years of doubling the SIZE of the CPU (with some challenges like heat dissipation, but nothing nearly as difficult as increasing the density further) without increasing the density at all. So even if the density reaches its limits, CPUs may simply grow in size for a good while.
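    The headroom in that size argument is easy to quantify, using the die and cube dimensions from the comment (and the law's 2-year doubling period):

```python
import math

die_cm3 = 1 * 1 * 0.1        # ~1cm x 1cm x 1mm die, in cm^3
cube_cm3 = 10 * 10 * 10      # hypothetical 10cm cube, in cm^3

growth = cube_cm3 / die_cm3          # 10,000x the volume
doublings = math.log2(growth)        # doublings of transistor count available
print(round(doublings, 1))           # 13.3
print(round(doublings * 2))          # 27 years at one doubling per 2 years
```

    So growing the package alone buys on the order of 13 doublings, though as the reply below notes, power and defect rates make this impractical long before then.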
  • by Urban Garlic ( 447282 ) on Wednesday September 19, 2007 @11:22AM (#20668093)
    Luckily, there are enough geeky pedants on slashdot to make up for the fact that the editors have actually messed up this totemic bit of geek lore.

    Moore was/is a technology manager, and his law is a management law. It says the number of transistors that can be economically placed on an integrated circuit, i.e. the transistor density of the price/performance "sweet spot", will increase exponentially, doubling roughly every two years.

    The original [] refers to "complexity for minimum component cost", which emphasizes the economic aspect of it even more strongly.

    Moore's law has never been about what's possible, it's always been about what's cheap.
  • by hattig ( 47930 ) on Wednesday September 19, 2007 @11:32AM (#20668241) Journal
    That's 32 times as many transistors... whereas today you can get 4 cores on a CPU in under 300mm^2, you'll be getting 128 cores in 2017. (That's simplistic: in practice you'll get a variety of generic cores and application-specific cores, and per-core improvements will increase their size, so say 32 generic cores and 32 application-specific cores.)

    If it's 16 years, that's 256 times as many transistors. 256 generic cores and 256 application-specific cores in 2024? Let's not even imagine the per-core speeds! It's all pretty exciting, and I'm being conservative with the figures here.

    Of course, applications will grow to utilise this stuff, but more and more tasks are getting to the point of 'fast enough', even despite the bloating efforts of their creators. Even if there is a 10 year hiatus in process improvements after 2024, it'll take some time for the applications to catch up apart from certain uses. If those uses are common enough, there will be hardware available for it instead. Of course if only Intel and IBM have fabs that can make these products, because the fabs cost $20b each...
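    The multipliers in that projection follow directly from the 2-year doubling period, as a quick check confirms:

```python
def transistor_multiplier(years, doubling_period=2):
    """Transistor-count multiplier after `years` of Moore's-law doubling."""
    return 2 ** (years // doubling_period)

print(transistor_multiplier(10))       # 32  (5 doublings in 10 years)
print(transistor_multiplier(16))       # 256 (8 doublings in 16 years)
print(4 * transistor_multiplier(10))   # 128 equal-size cores from 4 today
```

    The core counts assume the extra transistors all go into more cores of the same size, which, as the comment itself concedes, is the simplistic case.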
  • Re:Not yet, (Score:4, Insightful)

    by Chirs ( 87576 ) on Wednesday September 19, 2007 @11:35AM (#20668287)
    You've neglected to consider power. The 10cm cube you mention has 10,000x the volume of the "typical" current die. Assuming power scales linearly with volume, that would require approximately 300 kW just for the CPU, which works out to about 1250 A of current at 240 V.

    Nobody wants to increase the size of CPUs... defects scale more than linearly with area, so there is a strong incentive to keep the die area down. And as the physical size increases you run into other problems: power delivery, clock distribution, etc.
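    The power figure works out as follows. (The ~30 W baseline for a 2007-era die is my assumption; the comment gives only the scaled result.)

```python
die_volume_cm3 = 0.1           # ~1cm x 1cm x 1mm die
cube_volume_cm3 = 10 ** 3      # 10cm cube
die_power_w = 30.0             # assumed typical CPU draw, not from the comment

scale = cube_volume_cm3 / die_volume_cm3   # 10,000x the volume
power_w = die_power_w * scale              # 300,000 W = 300 kW
amps_at_240v = power_w / 240               # 1250 A

print(scale, power_w, amps_at_240v)        # 10000.0 300000.0 1250.0
```

    With a different baseline wattage the kW figure shifts proportionally, but the conclusion, that a room-filling CPU is a heat problem first, stands.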
  • Re:it's the law (Score:3, Insightful)

    by tomstdenis ( 446163 ) <tomstdenis AT gmail DOT com> on Wednesday September 19, 2007 @11:47AM (#20668457) Homepage
    But the IQ is the average ... so it can't halve. :-)
  • nonsense (Score:5, Insightful)

    by Quadraginta ( 902985 ) on Wednesday September 19, 2007 @12:22PM (#20668935)
    That's nonsense. The industry grew around the physics, not vice versa. The fact that the industry is predicated on a constant improvement of speed and complexity is because such a thing is achievable in microelectronics, certainly not because microelectronics is the only industry where such a thing is desirable.

    I mean, who wouldn't want cars to become twice as gas efficient (without losing power) every 18 months, ad infinitum? If such a thing were technically possible, it would happen, because all the car makers would jump on the gas-mileage bandwagon to get ahead of their competitors.

    Who wouldn't want the amount of food that can be grown per man-hour to double every 18 months, so the price per pound of beans and broccoli fell as fast as the price per CPU cycle of computers? If such a thing were possible, it would happen, as every farmer raced to lower his costs of production and undersell his neighbors like crazy, earning millions.

    In very few industries other than microelectronics has anything like Moore's Law applied, and that's not from a lack of economic incentive, but from the plain uncooperativity of Mother Nature. You're arguing backwards, from effect (the economic structure of the industry) to cause (the physical nature of microelectronics).
  • sure they do (Score:5, Insightful)

    by Quadraginta ( 902985 ) on Wednesday September 19, 2007 @12:45PM (#20669265)
    They only change in ways that are generally not possible to anticipate, hence which haven't been predicted.

    And of course they would. Technology, like the stock market or the weather, is inherently a chaotic system over a certain characteristic timespan (1-2 weeks for the stock market and the weather, 25-50 years for technology). That is, over the characteristic timespan very small causes can produce enormous, system-wide effects, what you might call the butterfly wing flapping causing the hurricane phenomenon.

    For example, a couple of guys (Jobs and Wozniak) screw around in a garage in the mid-70s, trying to put together a really cheap personal computer. That's a very small cause. And thirty years later, it has had a giant effect: iMacs and iPods and iTunes, oh my. Problem is, there was no practical way in the 1970s to distinguish the small cause that mattered (Jobs and Wozniak) from the other 50 zillion small causes that didn't matter (the other 50 zillion pairs of scruffy entrepreneurs in garages whose brilliant ideas went nowhere).

    This is why predictions of the future out more than 50 years usually end up looking hilarious in hindsight. When sf writers of the 50s and 60s predicted the present, they projected the dominant themes of their time (spaceflight, atomic physics, the struggle with Soviet Communism). They did not -- and could not -- realize that all three themes would pretty much abruptly and surprisingly come to an end in the 90s. When present writers predict the future, they project the dominant themes of our times (e.g. networked computing). It's very likely these projections, too, will end up wildly wrong. Networked computing is likely to become as humdrum and static as telephony within the next half-century or so.
  • by cowscows ( 103644 ) on Wednesday September 19, 2007 @12:47PM (#20669285) Journal
    Moore's Law doesn't really have anything to do with MHz or GHz, or clock speeds at all. It's more about the number of transistors crammed into a cost-effective chip. For a while there, one of the main things Intel decided to do with all those transistors was to push the clock speed as fast as they could. This certainly made computers more powerful, and it was an easy number to work into advertisements and such. But at the end of the day, it wasn't the only way that processors could be improved. To bring in a dreaded car analogy, they were making a car go faster by adding more cylinders to the engine, while mostly ignoring things like aerodynamics and efficiency in other parts of the car. But eventually they reached a point where there wasn't any room in the engine compartment for more cylinders, so now they're looking at making the rest of the vehicle more efficient.

    The transistor count will keep going up, Moore's Law will continue. It's just that those new transistors will be used a little differently.
  • Re:nonsense (Score:2, Insightful)

    by avirrey ( 972127 ) on Wednesday September 19, 2007 @12:52PM (#20669361)
    I mean, who wouldn't want cars to become twice as gas efficient (without losing power) every 18 months, ad infinitum? If such a thing were technically possible, it would happen, because all the car makers would jump on the gas-mileage bandwagon to get ahead of their competitors.

    Hah! Exxon wouldn't, and they don't, so we have no say.
    X's and O's for all my foes.
  • by Miamicanes ( 730264 ) on Wednesday September 19, 2007 @01:51PM (#20670319)
    When the limit of Moore's law is reached insofar as doubling of computing power for a given size, things built from the components in question will simply start getting bigger and drawing more power. The Pentium 4 was just a preview of what's to come. Forget about tech-design fantasies of a "computer in a keyfob" that goes everywhere. The desktop PC of 2050 will occupy a case that would have given a 1995 power user a serious case of "tower envy", draw more power than the half-dozen halogen floodlights illuminating the back yard to stadium levels, and need an active cooling system with an outdoor condenser connected by refrigerant hoses through a hole in the wall or a partly opened door/window. Of course, it'll have the equivalent of 65,536 discrete CPUs, a terabyte of onboard RAM, up to a petabyte of nonvolatile storage, and a GPU subsystem that's more powerful (and draws more power) than the CPU array itself (needed for immersive 3D games if you want to use a haptic bodysuit and gyroscopic G-force simulator platform along with it).

    Believe it. The industry won't come to a halt when exponential growth of component power ends... it'll just embrace the exponential growth of size and power consumption, and convince consumers that they simply MUST have a computer the size of a small refrigerator that renders apps into solid-feeling virtual tablets (haptic glove & visor required for full Genuine Windows Experience) that can float, spin, and (if you angrily hurl one at a wall) smash into a million pieces before turning into pools of mercury and slurping back into the computer in a cute "shatter" effect.
  • by goombah99 ( 560566 ) on Wednesday September 19, 2007 @02:13PM (#20670681)

    "Moore's law stays fixed because the industry invests enough research dollars -- and not one dollar more -- to keep it at that rate. Their entire economic model is built on this."
    "What makes you say that? What about competition? If you knew 'the other guys' were striving to exactly meet Moore's law, wouldn't you try to beat it?"
    No, it's called a Nash equilibrium: a point in competition space where no player can improve his strategy given the other players' moves. I can't say what the costs that drive it are. But it's so fixed it has apparently reached equilibrium.
  • Re:Python (Score:3, Insightful)

    by philipgar ( 595691 ) <(ude.hgihel) (ta) (2gcp)> on Wednesday September 19, 2007 @02:54PM (#20671185) Homepage

    In ten years, according to Moore's Law, Python will be 32 times faster than it is now.

    Moore's law says nothing at all about the speed of a processor, or of a program. It only says the number of transistors will double every 2 years. The fact that performance benefits have traditionally been had by adding transistors does not mean this will hold. In fact today the performance of most applications is no faster on current computers than top of the line computers from 2 years ago (it's definitely not twice).


  • by Anonymous Coward on Wednesday September 19, 2007 @03:11PM (#20671359)

    Of course, applications will grow to utilise this stuff, but more and more tasks are getting to the point of 'fast enough', even despite the bloating efforts of their creators.

    I'm going to call your bull here. Applications will grow, and some will scale to use more cores. Some applications are "embarrassingly parallel". Most are not. Optimizing compilers might extract enough threads to fill 4 cores, maybe splitting an application into obvious "components" will pull a bit more, but all of this is expensive to do. Ever try writing parallel code that does something worthwhile, has good scaling, and is correct? Most experts can't even do that.

    The computer industry is a speeding car headed for a brick wall right now. We don't know how to improve performance of current applications, and we don't know how to make applications that can scale in the future. Research is being done, but it takes a long time for research to trickle down into real life. Within a few years we'll be using 16 core chips that may utilize 4 cores...
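    The scaling pessimism above is usually quantified with Amdahl's law (not named in the comment): a program with serial fraction s achieves at most a speedup of 1/(s + (1-s)/n) on n cores.

```python
def amdahl_speedup(serial_fraction, cores):
    """Upper bound on parallel speedup for a fixed serial fraction."""
    return 1 / (serial_fraction + (1 - serial_fraction) / cores)

# With an (assumed) 25% serial fraction, even a 16-core chip stays
# well under 4x, and infinitely many cores can never exceed 4x.
for n in (1, 4, 16, 10**6):
    print(n, round(amdahl_speedup(0.25, n), 2))
```

    The 25% figure is illustrative, but it matches the comment's worry that a 16-core chip may effectively deliver only about 4 cores' worth of work.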

  • by RandCraw ( 1047302 ) on Wednesday September 19, 2007 @03:22PM (#20671511)
    Moore's Law describes a CPU speedup that died at least 3 years ago (all other legalisms aside).

    To wit: I bought a laptop in 2003 with a 2.2 GHz 32 bit P4. According to The Law, by 2005 CPUs on comparable laptops should have run at 4.4 GHz, and by today they should zip along at 8.8 GHz. But in fact, no commodity CPU runs at that speed nor even *half* that speed.

    And don't believe the claim that multicore or power throttling compensates for or explains The Law's "failure to thrive". The fact is, the industry is no longer delivering CPUs whose SPECmarks/FLOPS/etc. (AKA performance) rise at the rate they did for the previous 20 years. I tell you, "Moore's Law is pushin' up daisies. It's a DEAD parrot."

    What's puzzling to me is that while this emperor is clearly naked, for some reason nobody wants to admit it. Why not? Are we afraid that sexy soothsayers like Ray Kurzweil or Rod Brooks will look laughable when they foresee cool stuff like The Singularity or robots possessing human-level cognition, brought to us by the inexorable exponential march of Moore's Law? Or do we simply dread the day when we have to depend entirely on advances in *software* to deliver our next high-tech fix? Perish *that* thought.

    Well, we'd better get used to it, the emperor is naked *and* dead. There's a new emperor in town, and Moore's Law 2.0 depicts a future that looks a hell of a lot like the past.

