End of Moore's Law in 10-15 years? 248
javipas writes "In 1965 Gordon Moore — Intel's co-founder — predicted that the number of transistors on integrated circuits would double every two years. Moore's Law has been with us for over 40 years, but it seems that the limits of microelectronics are now not that far from us. Moore has predicted the end of his own law in 10 to 15 years, but he has predicted that end before and been wrong."
Law? (Score:3, Insightful)
And I predict that in 10-15 years time.... (Score:2, Insightful)
In fact, now that I come to think of it, ALL human endeavour exhibits exponential growth so long as there is money to be made from it. Technology is just one field where it's true. Sex (Malthus), agriculture, you name it - humans do it exponentially!
I think I'll put that on my t-shirt, and call it 'Anonymous Coward's Law'.
Re:Gordon Moore (Score:2, Insightful)
And if quantum computing should herald The Singularity, then it's definitely moot, since no predictions (Moore's Law included) can be made about post-Singularity computing.
Re:Gordon Moore (Score:3, Insightful)
Either way, when Gordon Moore eventually dies he will still be overflowing the (long)money; variable.
Re:Gordon Moore (Score:3, Insightful)
Moore's law isn't really a law (Score:3, Insightful)
It's a law of economics (Score:3, Insightful)
To put it another way, growth needs to be geometric, not additive. That is, things need to grow at x% per year, which yields a constant doubling time. If instead they grew linearly (x += D), then as x grew, the proportional growth rate (1/x dx/dt) would shrink with time--the doubling period would get longer and longer. Eventually it takes a lifetime before your computer is 2x more capable. Then it takes two lifetimes.
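A minimal sketch of the difference described above, using made-up figures of my own: geometric growth has a constant doubling time, while additive growth's doubling period stretches out without bound.

```python
import math

def doubling_time_geometric(rate):
    """Years to double at a fixed fractional growth rate per year."""
    return math.log(2) / math.log(1 + rate)

def doubling_time_additive(x, step):
    """Years to double starting from x when growth adds a fixed step per year."""
    return x / step  # another x units are needed, gained at `step` per year

# Geometric: ~41% per year doubles roughly every 2 years, forever.
print(round(doubling_time_geometric(0.41), 1))  # -> 2.0

# Additive: each successive doubling takes longer and longer.
for x in (100, 200, 400, 800):
    print(x, "->", doubling_time_additive(x, 50), "years to double")  # 2, 4, 8, 16
```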
Why would you ever upgrade at that point, except due to wear and tear? Things become commodities and sales are based on price and other added value. So long to Intel's industry-domination model.
Moore's law is also a limit. That very same growth engine will not invest twice as many research dollars to get a slightly faster doubling time. The fact that the rate has held steady tells you that this is so. Empirically, this growth rate is the sweet spot between creating innovation at the lowest cost and reaping a profit on it.
Indeed, the only surprising thing we've seen in the consumer market that seemed (superficially) to violate this was Apple's replacement of the iPod mini with the iPod nano shortly after its introduction. They could easily have milked the mini for longer. But there the driver was the competition that they needed to stay ahead of.
Re:Gordon Moore (Score:3, Insightful)
There is no doubt that we will reach a hard physical barrier beyond which we cannot shrink individual transistors any longer. This limit will be reached in a decade or two, and is probably what Moore is referring to: at our current scaling, we will hit atomic limits rather soon.
But that doesn't mean that the exponential increases in computing power will end. There are many other things that may happen, such as figuring out ways to build microprocessors with transistors stacked in 3D (rather than a single 2D layer of transistors), which would increase the transistor count in our computers by orders of magnitude. Chip design, layout, and algorithms are other promising areas. Specialized and dynamically reprogrammable chips may also provide us with further advances. Or perhaps, as you pointed out, quantum computers will become viable and mainstream.
There is no guarantee that these more exotic technologies will work out. Yet the microelectronics industry has surprised us time and again with its ability to overcome huge technical obstacles. Thus it seems at least possible that it will deliver technology that is very much up against the physical limits of what is achievable. And with regard to those physical limits, the hard wall that Moore is predicting in 10 years is only one aspect. There are many other ways for this technology to be advanced.
Comment removed (Score:3, Insightful)
CPU speed already on the wane as consumer bait (Score:5, Insightful)
The new hook for consumers is the number of "cores", and once again most people have probably picked up the vague sense that having more of them inside means the computer is better. I've been told by people who might be in a position to know that it's not that they can't keep cranking up CPU speeds, but that the cost/benefit (profit-wise) stops making sense at some point because of the huge cost of implementing a new fab at a finer length scale, and we're pretty much at that point. So it makes sense that cores are the new GHz, and Moore's Law will have less and less direct impact on the end computer buyer from now on.
Maybe there's a Core Law to be formulated about how often the average number of processors per computer can be expected to double?
NO NO NO NO NO!!!!! (Score:3, Insightful)
Or at least, that's what the singularity nuts claim. Sorry people, there are limits on this planet.
two variables (Score:2, Insightful)
I mean, an 8-core chip would be an improvement right now, but so would a 4-core chip at half the price. Think about a world where an 80-core chip exists, and how it would change things. It would do so again each time it went from 999 dollars to 99 dollars to 9 dollars to 99 cents to 9 cents to nine for a cent.
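The price progression above can be sanity-checked. Assuming the cost of a fixed-capability chip halves every two years (the flip side of Moore's Law; the figures here are purely illustrative):

```python
import math

# How long for a $999 part to fall to about a tenth of a cent
# ("nine for a cent"), if its price halves every two years?
start_price, end_price, years_per_halving = 999.0, 0.001, 2
halvings = math.log2(start_price / end_price)
print(round(halvings * years_per_halving))  # roughly 40 years
```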
Not yet, (Score:4, Insightful)
Obligatory Pedantry -- it's about what's cheap (Score:3, Insightful)
Moore was/is a technology manager, and his law is a management law. It says the number of transistors that can be economically placed on an integrated circuit, i.e. the transistor density of the price/performance "sweet spot", will increase exponentially, doubling roughly every two years.
The original [wikipedia.org] refers to "complexity for minimum component cost", which emphasizes the economic aspect of it even more strongly.
Moore's law has never been about what's possible, it's always been about what's cheap.
10 years is 5 more cycles (Score:3, Insightful)
If it's 16 years, that's 256 times as many transistors. 256 generic cores and 256 application-specific cores in 2024? Let's not even imagine the per-core speeds! It's all pretty exciting, and I'm being conservative with the figures here.
Of course, applications will grow to utilise this stuff, but more and more tasks are reaching the point of 'fast enough', even despite the bloating efforts of their creators. Even if there is a 10-year hiatus in process improvements after 2024, it'll take some time for applications to catch up, apart from certain uses. If those uses are common enough, there will be dedicated hardware for them instead. That is, of course, if anyone besides Intel and IBM still has fabs that can make these products, since the fabs cost $20b each...
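The arithmetic in the comment above is easy to check: transistor count compounds as one doubling per cycle, so 10 years gives 5 doublings (32x) and 16 years gives 8 doublings (256x).

```python
def moore_factor(years, doubling_period=2):
    """Multiplicative transistor-count increase after `years`,
    at one doubling per `doubling_period` years."""
    return 2 ** (years // doubling_period)

print(moore_factor(10))  # 5 doublings -> 32x
print(moore_factor(16))  # 8 doublings -> 256x
```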
Re:Not yet, (Score:4, Insightful)
Nobody wants to increase the size of CPUs... defects scale more than linearly with area, so there is a strong incentive to keep the die area down. Also, as the physical size increases you run into other problems: power delivery, clock distribution, etc.
Re:it's the law (Score:3, Insightful)
nonsense (Score:5, Insightful)
I mean, who wouldn't want cars to become twice as gas efficient (without losing power) every 18 months, ad infinitum? If such a thing were technically possible, it would happen, because all the car makers would jump on the gas-mileage bandwagon to get ahead of their competitors.
Who wouldn't want the amount of food that can be grown per man-hour to double every 18 months, so the price per pound of beans and broccoli fell as fast as the price per CPU cycle of computers? If such a thing were possible, it would happen, as every farmer raced to lower his costs of production and undersell his neighbors like crazy, earning millions.
In very few industries other than microelectronics has anything like Moore's Law applied, and that's not from a lack of economic incentive, but from the plain uncooperativeness of Mother Nature. You're arguing backwards, from effect (the economic structure of the industry) to cause (the physical nature of microelectronics).
sure they do (Score:5, Insightful)
And of course they would. Technology, like the stock market or the weather, is inherently a chaotic system over a certain characteristic timespan (1-2 weeks for the stock market and the weather, 25-50 years for technology). That is, over the characteristic timespan very small causes can produce enormous, system-wide effects, what you might call the butterfly wing flapping causing the hurricane phenomenon.
For example, a couple of guys (Jobs and Wozniak) screw around in a garage in the late 70s, trying to put together a really cheap personal computer. That's a very small cause. And thirty years later, it has had a giant effect: iMacs and iPods and iTunes, oh my. Problem is, there was no practical way back then to distinguish the small cause that mattered (Jobs and Wozniak) from the other 50 zillion small causes that didn't matter (the other 50 zillion pairs of scruffy entrepreneurs in garages whose brilliant ideas went nowhere).
This is why predictions of the future out more than 50 years usually end up looking hilarious in hindsight. When sf writers of the 50s and 60s predicted the present, they projected the dominant themes of their time (spaceflight, atomic physics, the struggle with Soviet Communism). They did not -- and could not -- realize that all three themes would pretty much abruptly and surprisingly come to an end in the 90s. When present writers predict the future, they project the dominant themes of our times (e.g. networked computing). It's very likely these projections, too, will end up wildly wrong. Networked computing is likely to become as humdrum and static as telephony within the next half-century or so.
Re:CPU speed already on the wane as consumer bait (Score:5, Insightful)
The transistor count will keep going up, Moore's Law will continue. It's just that those new transistors will be used a little differently.
Re:nonsense (Score:2, Insightful)
Hah! Exxon wouldn't, and they don't, so we have no say.
--
X's and O's for all my foes.
Re:It's a law of econmics (Score:2, Insightful)
Believe it. The industry won't come to a halt when exponential growth of component power ends... it'll just embrace the exponential growth of size and power consumption, and convince consumers that they simply MUST have a computer the size of a small refrigerator that renders apps into solid-feeling virtual tablets (haptic glove & visor required for full Genuine Windows Experience) that can float, spin, and (if you angrily hurl one at a wall) smash into a million pieces before turning into pools of mercury and slurping back into the computer in a cute "shatter" effect.
Re:Corollary to moores law (Score:3, Insightful)
Re:Python (Score:3, Insightful)
Moore's law says nothing at all about the speed of a processor, or of a program. It only says the number of transistors will double every two years. The fact that performance gains have traditionally come from adding transistors does not mean this will continue to hold. In fact, today the performance of most applications is no faster on current computers than on top-of-the-line machines from two years ago (and it's certainly not twice as fast).
Phil
Re:10 years is 5 more cycles (Score:1, Insightful)
I'm going to call your bull here. Applications will grow, and some will scale to use more cores. Some applications are "embarrassingly parallel". Most are not. Optimizing compilers might extract enough threads to fill 4 cores, maybe splitting an application into obvious "components" will pull a bit more, but all of this is expensive to do. Ever try writing parallel code that does something worthwhile, has good scaling, and is correct? Most experts can't even do that.
The computer industry is a speeding car headed for a brick wall right now. We don't know how to improve the performance of current applications, and we don't know how to write applications that will scale in the future. Research is being done, but it takes a long time for research to trickle down into real life. Within a few years we'll be using 16-core chips that may only keep 4 cores busy...
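Amdahl's law puts numbers behind this pessimism: if only a fraction p of a program parallelizes perfectly, the speedup on n cores is capped at 1/((1-p) + p/n), no matter how large n gets. A quick illustration:

```python
def amdahl_speedup(p, n):
    """Maximum speedup with parallel fraction p on n cores (Amdahl's law)."""
    return 1.0 / ((1.0 - p) + p / n)

# Even a 90%-parallel program never gets past 10x, however many cores:
for n in (4, 16, 256):
    print(n, "cores ->", round(amdahl_speedup(0.9, n), 2), "x")
```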
Moore's Law is a DEAD parrot (Score:2, Insightful)
To wit: I bought a laptop in 2003 with a 2.2 GHz 32 bit P4. According to The Law, by 2005 CPUs on comparable laptops should have run at 4.4 GHz, and by today they should zip along at 8.8 GHz. But in fact, no commodity CPU runs at that speed nor even *half* that speed.
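The extrapolation above, spelled out (note that it treats Moore's Law as a clock-speed law, which strictly it isn't; the 2.2 GHz / 2003 starting point is the commenter's):

```python
def extrapolated_ghz(year, base_ghz=2.2, base_year=2003, doubling_period=2):
    """Clock speed if it had doubled every `doubling_period` years (it didn't)."""
    return base_ghz * 2 ** ((year - base_year) / doubling_period)

print(extrapolated_ghz(2005))  # 4.4
print(extrapolated_ghz(2007))  # 8.8
```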
And don't you believe the claim that multicore or power throttling compensates for, or explains, The Law's "failure to thrive". The fact is, the industry is no longer delivering CPUs whose SPECmarks/FLOPS/etc. (AKA performance) rise at the rate they did for the previous 20 years. I tell you, "Moore's Law is pushin' up daisies. It's a DEAD parrot."
What's puzzling to me is that while this emperor is clearly naked, for some reason nobody wants to admit it. Why not? Are we afraid that sexy soothsayers like Ray Kurzweil or Rod Brooks will look laughable when they foresee cool stuff like The Singularity or robots possessing human-level cognition, brought to us by the inexorable exponential march of Moore's Law? Or do we simply dread the day when we have to depend entirely on advances in *software* to deliver our next high-tech fix? Perish forbid *that* thought.
Well, we'd better get used to it, the emperor is naked *and* dead. There's a new emperor in town, and Moore's Law 2.0 depicts a future that looks a hell of a lot like the past.
Randy