End of Moore's Law in 10-15 years?

javipas writes "In 1965 Gordon Moore — Intel's co-founder — predicted that the number of transistors on integrated circuits would double every two years. Moore's Law has held for over 40 years, but it seems the limits of microelectronics are now not that far off. Moore has again predicted the end of his own law in 10 to 15 years, though he has predicted that end before and been wrong."
This discussion has been archived. No new comments can be posted.

  • by User 956 ( 568564 ) on Wednesday September 19, 2007 @10:53AM (#20667659) Homepage
    Moore has predicted the end of his own law in 10 to 15 years, but he predicted that end before, and failed.

    So then it seems with regards to his Law, Moore has fallen prey to Murphy.
    • by click2005 ( 921437 ) on Wednesday September 19, 2007 @10:56AM (#20667713)
      What about the inverse of Moore's Law... Every 2 years, the average IQ of all users on the internet halves.
      • Every 2 years, the average IQ of all users on the internet halves.
        So true. I believe we have now coined the click2005 law.
      • Re: (Score:3, Insightful)

        by tomstdenis ( 446163 )
        But the IQ is the average ... so it can't halve. :-)
        • It can if you count all the people that don't use the internet? As long as they gain in current IQ terms sufficiently to balance it all out :P
        • Re: (Score:3, Funny)

          Normally that holds true, but this is the internet we're talking about...the average IQ can decrease relative to itself because people can be just that stupid. Not even mathematics can keep up with the drop in the internet's IQ.
      • I would agree the average IQ of the people on the internet is lower, but it's not like an inverse of Moore's law; it's approaching the point where it matches the world's average IQ.

        Early on, the IQ of people who used the Internet was much higher than the average general-population IQ, because in order to use the Internet you normally needed to be in college and/or have attended college.
        Then it dropped to the people that could afford the high price of the Internet plus the cost of the computer.

        Now as the internet is ge
      • by fbjon ( 692006 )

        What about the inverse of Moore's Law.. Every 2 years, the average IQ of all users on the internet halves.
        That's related to Godwin's law. In fact, I'll make a prediction:


        In 10-15 years, we will see the End of Godwin's Law, as internet forums reach their theoretical limit of inanity and flaming. Unfortunately, this prediction will also fail.

    • Moore's law is not about physics, it's about economics. Basically the entire industry has built an economic engine that requires that growth pattern to sustain itself.

      To put it another way, growth needs to be geometric, not additive: things need to grow at x% per year, which gives a constant doubling time. If instead x grew linearly, as x += D, then as x grew, the proportional rate (1/x dx/dt) of growth would shrink with time, i.e. the doubling period would get longer and longer. Eventually it takes a lifetime before y
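The geometric-versus-additive distinction can be sketched numerically (the growth figures below are illustrative, not industry data):

```python
import math

# Sketch: doubling behaviour under geometric vs. additive growth.
# All numbers are illustrative, not industry data.

def doubling_time_geometric(rate):
    """Years to double at a fixed fractional growth `rate` per year: constant."""
    return math.log(2) / math.log(1 + rate)

def doubling_times_additive(x0, d, steps):
    """Cumulative years at which x doubles when it grows by a fixed increment d per year."""
    times, x, target, t = [], x0, 2 * x0, 0
    for _ in range(steps):
        while x < target:
            x += d
            t += 1
        times.append(t)
        target *= 2
    return times

print(round(doubling_time_geometric(0.41), 1))   # ~2.0 years per doubling at 41%/year, forever
print(doubling_times_additive(100, 100, 4))      # [1, 3, 7, 15]: each doubling takes ever longer
```

With geometric growth the doubling time stays fixed; with additive growth the gaps between doublings stretch out (1, then 2, then 4, then 8 years in this sketch), which is the point the comment is making.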
      • Re: (Score:3, Insightful)

        Comment removed based on user account deletion
        • so you're 75? 80?
        • Exactly. Whenever one process technology reaches its physical limits, we get a new one, because the new process makes money. X-ray lithography, chip stacking, 3D circuits, and eventually nanotech will all keep us on the Moore's law path probably for the rest of my life, at least.

          Ye be forgettin' one thing, matey, they be makin' multiple cores now. Eventually we be lookin at distributed computing on an individual platform. Ye may be layin' claim to Moore's law applyin', but it be tenuous a claim at best

          • Why do you think that multiple cores and Moore's Law mutually exclusive? All Moore's law does is predict that the number of transistors on a given chunk of silicon will increase exponentially. Put 1 or 80 cores on the same chunk of silicon... it makes no difference as far as Moore's law is concerned.

            Oh yeah, yaaaaarrrr.
        • by hackstraw ( 262471 ) on Wednesday September 19, 2007 @12:40PM (#20669193)
          Whenever one process technology reaches its physical limits, we get a new one, because the new process makes money.

          I kinda agree and kinda disagree.

          Moore's "Law" is clearly stated in terms of physics. It says that the number of transistors will double, not that the speed will double over time.

          However, as Kurzweil and others have observed, the speed of _computation_ has doubled on schedule since before Moore's law, and there is no reason or hint that this will stop once Moore's law is obsolete.

          Take a peek at http://www.kurzweilai.net/articles/art0134.html?printable=1 [kurzweilai.net] specifically http://www.kurzweilai.net/articles/images/chart03.jpg [kurzweilai.net]

          ICs have been good for a while, but then so was the abacus at one time.

          CPUs are simply different than they were a few years ago. Things like the Niagara chip from Sun and the multi-core stuff from AMD and Intel are a pretty different design (SMP on a chip -- yes, that is an oversimplification).

          10-15 years from now centers on about 2020, which seems to be a common target for a number of interesting predictions. Physics computations are predicted to be pretty interesting by then. Computers are predicted to be interesting by then. Who knows what else.

          It's not hardware that I think is the problem or challenge, it's the pains of software that seem more challenging. I mean, it's 2007 and what do we have for software? OSes and compilers and whatnot have pretty much stagnated since the early 70s. Sure, we have 4G languages that are easier for us stupid people to program with, but from a performance and efficiency POV they are a step backwards, not forwards. The JIT stuff in .NET and Java is a little interesting, but programming computers is still a PITA.

          I guess we will have to wait and see.

      • Re: (Score:3, Interesting)

        by Jeff DeMaagd ( 2015 )
        You are right, but that's also because the fabs get more expensive on each generation, I think each feature size shrink requires a fab that costs 50% more than the previous fab.
        • You are right, but that's also because the fabs get more expensive on each generation, I think each feature size shrink requires a fab that costs 50% more than the previous fab.

          See my post below about the corollary for more discussion. But right now your point does not hold simply because the size of the market is increasing and revenues are also increasing. Therefore 1) cost-per-cpu cycle and the cost per unit of computation is falling despite the increasing cost of fabs 2) the growing cost of the fabs as a fraction of growing revenue is not increasing (I believe) yet.

      • by goombah99 ( 560566 ) on Wednesday September 19, 2007 @11:23AM (#20668103)
        If you accept the statement I just made about moore's law being sustained because of economics then here's a corollary which makes an observable prediction.
        Moores law stays fixed because the industry invests enough research dollars--and not one dollar more-- to keep it at that rate. Their entire economic model is built on this.

        Therefore, if we ever do reach a point where we are simply running out of available physics and computer science (multiprocessing), then the first sign of this will be an increasing fraction of research dollars spent to sustain Moore's law.

        Plot the industry's margin, smooth the curve, and you will be able to extrapolate to the point where the research dollars cross the profit line. Somewhere shortly before that is when Moore's law will end.

        The only way that would not be true is if the nature of innovation changes from frequent small leaps to massive leaps spaced far apart.

        • Moores law stays fixed because the industry invests enough research dollars--and not one dollar more-- to keep it at that rate. Their entire economic model is built on this.
          What makes you say that? What about competition? If you knew "the other guys" were striving to exactly meet Moore's law, wouldn't you try to beat it?
          • Re: (Score:3, Insightful)

            by goombah99 ( 560566 )

            Moores law stays fixed because the industry invests enough research dollars--and not one dollar more-- to keep it at that rate. Their entire economic model is built on this.

            What makes you say that? What about competition? If you knew "the other guys" were striving to exactly meet Moore's law, wouldn't you try to beat it?

            No, it's called a Nash equilibrium: a point in competition space where no player can improve his strategy given the other players' moves. I can't say what the costs are that drive it. But it's so fixed it apparently has reached equilibrium.

        • Plot the industry's margin, smooth the curve, and you will be able to extrapolate to the point where the research dollars cross the profit line. somewhere shortly before that is when moore's law will end.
          Because the future can always be predicted from current trends...
      • nonsense (Score:5, Insightful)

        by Quadraginta ( 902985 ) on Wednesday September 19, 2007 @12:22PM (#20668935)
        That's nonsense. The industry grew around the physics, not vice versa. The fact that the industry is predicated on a constant improvement of speed and complexity is because such a thing is achievable in microelectronics, certainly not because microelectronics is the only industry where such a thing is desirable.

        I mean, who wouldn't want cars to become twice as gas efficient (without losing power) every 18 months, ad infinitum? If such a thing were technically possible, it would happen, because all the car makers would jump on the gas-mileage bandwagon to get ahead of their competitors.

        Who wouldn't want the amount of food that can be grown per man-hour to double every 18 months, so the price per pound of beans and broccoli fell as fast as the price per CPU cycle of computers? If such a thing were possible, it would happen, as every farmer raced to lower his costs of production and undersell his neighbors like crazy, earning millions.

        In very few industries other than microelectronics has anything like Moore's Law applied, and that's not from a lack of economic incentive, but from the plain uncooperativity of Mother Nature. You're arguing backwards, from effect (the economic structure of the industry) to cause (the physical nature of microelectronics).
      • Re: (Score:3, Informative)

        by FuzzyDaddy ( 584528 )
        Moore's law is not about physics it's about economics.

        Your point on economics is well taken. However, there is one aspect of physics in Moore's law - that the equations governing a MOS transistor scale with size. That is, if you make a transistor that is 1/2 the size in all dimensions, and you run it at 1/2 the voltage, it will behave exactly the same as the original. So there has always been a clear development path for doubling your transistor density - cutting the size in half.

        Other technologies
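The scaling property described above is classical constant-field (Dennard) scaling, and it can be made concrete. This is a first-order sketch of the idealized textbook relations, not a real device model:

```python
# Idealized constant-field (Dennard) scaling: divide every linear
# dimension and the voltage by k, and the electric field stays
# constant while density rises as k^2. First-order relations only.

def scale_transistor(length, width, oxide, voltage, k):
    return {
        "length": length / k,                  # gate length shrinks by k
        "width": width / k,                    # gate width shrinks by k
        "oxide": oxide / k,                    # oxide thickness shrinks by k
        "voltage": voltage / k,                # supply voltage shrinks by k
        "field": (voltage / k) / (oxide / k),  # E-field unchanged: scaling cancels
        "density_gain": k ** 2,                # transistors per unit area grow as k^2
    }

t = scale_transistor(length=1.0, width=1.0, oxide=0.02, voltage=5.0, k=2)
print(t["field"])         # ~250, same as the original 5.0 / 0.02
print(t["density_gain"])  # 4: halving dimensions quadruples density
```

Because the field is unchanged, the device behaves the same, which is why halving the size was always a "clear development path" until leakage effects broke the idealization.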

  • Law? (Score:3, Insightful)

    by haystor ( 102186 ) on Wednesday September 19, 2007 @10:54AM (#20667687)
    Can we stop calling a prediction a law?
    • When a prediction is confirmed over a long enough time and with enough consistency, that makes it a law.

      Although at this point, I think it can only be justified as between "hypothesis" and "theory", but I'm no expert.
    • by Surt ( 22457 )
      I'm sorry, but law is completely correct, and calling it Moore's law complies with definition (1A!) from Webster's:
      http://www.m-w.com/dictionary/law [m-w.com]
      1 a (1) : a binding custom or practice of a community
    • Can we stop calling a prediction a law?

      Not when that prediction is also a law, no. All it takes for something to become a law (in the scientific sense) is a consistent observed pattern. The law of gravity, for example, is nothing more than concise expression of a consistently observed pattern of behaviour. Moore's law is also a concise expression of a consistently observed pattern of behaviour; it is thus a law.

  • by Marc Desrochers ( 606563 ) on Wednesday September 19, 2007 @10:55AM (#20667689)
    It will be just in time for the arrival of cold fusion.
  • by MyLongNickName ( 822545 ) on Wednesday September 19, 2007 @10:55AM (#20667691) Journal
    Moore's second law: "Moore's first law will only work for 10-15 more years".
    Moore's third law: "Moore's second law applies from the time it is quoted not from when it was originally uttered".
    • by irtza ( 893217 )
      Wait, I thought the third law was "Moore's second law takes effect 10-15 years before Moore's first law no longer holds true". I hope I get this straight before the test.... there is going to be a test, right?
    • I thought the second law was:

      The number of experts predicting the end of the first law doubles every 18 months.
  • The complexity for minimum component costs has increased at a rate of roughly a factor of two per year ... Certainly over the short term this rate can be expected to continue, if not to increase. Over the longer term, the rate of increase is a bit more uncertain, although there is no reason to believe it will not remain nearly constant for at least 10 years. That means by 1975, the number of components per integrated circuit for minimum cost will be 65,000. I believe that such a large circuit can be built o

    • Pretty much what he is saying now. So a corollary might be that when Moore stops predicting, Moore's law only has 10 years to run. Which means we all better hope he doesn't die any time soon.
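The arithmetic behind the 1965 quote checks out: doubling annually from roughly 64 components (an approximate starting count for 1965) lands right at Moore's "65,000" figure for 1975:

```python
# Moore's 1965 extrapolation: component count doubling every year.
# The starting count of ~64 in 1965 is an approximation.
components = 64
for year in range(1965, 1975):  # ten annual doublings
    components *= 2
print(components)  # 65536, i.e. the "65,000" components Moore predicted for 1975
```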
  • Again? (Score:5, Interesting)

    by dylan_- ( 1661 ) on Wednesday September 19, 2007 @10:58AM (#20667731) Homepage
    There are always a few of these [techdirt.com].

    I do recall someone telling me that no CPU would ever run at more than 2GHz, as it would then start emitting microwave radiation...
    • by JoelKatz ( 46478 )
      When I was in college, I learned all the reasons the features on a CPU couldn't be significantly less than 1 micron in size. I also learned that 20Kbps was about the theoretical upper limit for modems.
      • I also learned that 20Kbps was about the theoretical upper limit for modems
        This wasn't too far off. I never got much more than this from a MODEM, since I lived in a rural community with quite noisy lines (26.4Kb/s, I think, was the most I got). Of course, now we don't use MODEMs for communication (well, technically we do, but not in the sense that the word is commonly used).
    • Re:Again? (Score:4, Interesting)

      by krray ( 605395 ) on Wednesday September 19, 2007 @11:25AM (#20668135)
      I do recall someone telling me that no CPU would ever run at more than 2GHz, as it would then start emitting microwave radiation...

      I remember having / making a similar claim myself way back when -- with the 486/33 and 486/66 being the hot systems of the day. I predicted they'd have a hard time getting above ~80MHz because of FM radio interference / shielding problems. Boy was I wrong.... :*)

      Today I predict "Moore's Law" to hold pretty true -- even in 10 or 15 years. IBM has been playing with using atoms as the gate / switch which will make today's CPU's look like Model T's.

      In the 90's they had http://www-03.ibm.com/ibm/history/exhibits/vintage/vintage_4506VV1003.html [ibm.com]
      Not too long ago they've done http://domino.watson.ibm.com/comm/pr.nsf/pages/news.20040909_samm.html [ibm.com]
      And recently it has been http://www.physorg.com/news107703707.html [physorg.com]

      This will be a boon both for storage and for the chips themselves IMHO (not to mention my stock :).
    • I do recall someone telling me that no CPU would ever run at more than 2GHz, as it would then start emitting microwave radiation...
      Thinking of it, it would make sense. Really, how come they don't? Are they just sufficiently shielded?
  • by Anonymous Coward
    We will start using a new technology without transistors, but which will exhibit a similar exponential gain so long as there is money to be made from it.

    In fact, now I come to think of it, ALL human endeavour exhibits exponential growth so long as there is money to be made from it. Technology is just one field where it's true. Sex (Malthus), agriculture, you name it - humans do it exponentially!

    I think I'll put that on my t-shirt, and call it 'Anonymous Coward's Law'.

       
  • by vlad_petric ( 94134 ) on Wednesday September 19, 2007 @11:02AM (#20667805) Homepage
    ... there's nothing fundamental about it. Instead, it's a self-fulfilling prophecy. The big players in the silicon world all use the "law" and its corollaries as their business plan. They'll likely discard a feature/product if it falls behind the curve in terms of speed. For the layperson, this "precision" may indeed create the appearance of an actual law, even though it's just an observation (similar to Malthus' "law")
    • -As someone already posted, a law IS an observation (law of gravitation: objects attract each other depending on their masses and their distance; theory of gravitation: something about gravitons, or something else, no one is really sure what it really is).
      -Moore's law is just a particular case of a learning curve (any industry tends to regularly improve its production techniques as long as the improvement has a positive ROI to justify investing money, whether the result is cheaper or better products is a ch
  • that means 2^5 ... that's 32 times more computing power on a single chip ... that's about 16 times the computing power I need.
    • Re: (Score:2, Funny)

      by mce ( 509 )
      Nobody will ever need more than twice the computing power he already has available on the moment he makes this assessment.
    • by kebes ( 861706 )
      Well, it's about 1/1000 of what I would "need."

      Of course, by "need" I mean what I would like to have. People keep talking about computers being "fast enough," but every time I have to wait for something to finish (whether it's a Photoshop filter, compiling code, running an optimization, or waiting for a Slashdot page with hundreds of comments to load), that's time I could have used a faster computer.

      If my computer were 1000 times faster, then things that currently take minutes would finish "instantly." It's
  • by gurps_npc ( 621217 ) on Wednesday September 19, 2007 @11:08AM (#20667879) Homepage
    Is that even if you are wrong, you are still right.

    Wow, that Moore guy was so smart he outsmarted Moore.

  • by smitty97 ( 995791 ) on Wednesday September 19, 2007 @11:09AM (#20667891)
    from tfa:

    "We're not far from that," Moore said on Monday. "Before we had our Hafnium breakthrough, we were down to the point where we were at five molecular layers in the gate structure of the transistors. When you clearly can't go below one...you get into other types of problems," Moore said.
    The law will continue, they just need breakthroughs with Quarternium, Eighthnium, etc..
  • by mshmgi ( 710435 ) on Wednesday September 19, 2007 @11:09AM (#20667893)
    Next year, they'll tell us that Moore's Law will end in 5-7.5 years.
  • by dave_mcmillen ( 250780 ) on Wednesday September 19, 2007 @11:09AM (#20667903)
    I have no idea if Moore's Law will really start to "fail" in a particular time scale (one of these times it's gotta be true, right?), but a related issue I find interesting is that CPU speeds don't seem to be touted to computer buyers so heavily anymore. Walk into a big electronics store and look at their desktop offerings: where they used to prominently feature how many GHz they had inside (and people vaguely felt that more of these mysterious GHz was better), now the CPUs are given code names and numbers that don't reflect CPU speed: Check out this nifty X2, or the Turion 64, or ...

    The new hook for consumers is the number of "cores", and once again most people have probably picked up the vague sense that having more of them inside means the computer is better. I've been told by people who might be in a position to know that it's not that they can't keep cranking up CPU speeds, but that the cost/benefit (profit-wise) stops making sense at some point because of the huge cost of implementing a new fab at a finer length scale, and we're pretty much at that point. So it makes sense that cores are the new GHz, and Moore's Law will have less and less direct impact on the end computer buyer from now on.

    Maybe there's a Core Law to be formulated about how often the average number of processors per computer can be expected to double?

    • Re: (Score:3, Interesting)

      by Cutie Pi ( 588366 )
      As the parent implied, Moore's Law will likely not end because of technological constraints but rather economic ones.

      We reached a wall a few years ago in terms of transistor speed, mostly due to the thin gate oxides giving rise to significant leakage current, which translates into heat. The upcoming high-k metal gate technology mitigates but doesn't eliminate this problem. Thus, Intel and the like are putting those smaller transistors to work in redundant cores rather than faster, monolithic circuits. Howeve
    • CPU speed isn't necessarily a draw, but power consumption is. Power consumption is something most computer buyers care a huge amount about. Mind you, most computer buyers call their computer a 'phone' these days for some strange historical reason.
    • by cowscows ( 103644 ) on Wednesday September 19, 2007 @12:47PM (#20669285) Journal
      Moore's Law doesn't really have anything to do with MHz or GHz, or clock speeds at all. It's more about the number of transistors crammed into a cost effective chip. For a while there, one of the main things that intel decided to do with all those transistors was to push the clock-speed as fast as they could. This certainly made computers more powerful, and it was an easy number to work into advertisements and such. But at the end of the day, it wasn't the only way that processors could be improved. To bring in a dreaded car analogy, they were making a car go faster by adding more cylinders to the engine, while mostly ignoring things like aerodynamics and efficiency in other parts of the car. But eventually they reached a point where there wasn't any room in the engine compartment for more cylinders, so now they're looking at making the rest of the vehicle more efficient.

      The transistor count will keep going up, Moore's Law will continue. It's just that those new transistors will be used a little differently.
    • Re: (Score:3, Informative)

      If you think that Moore's Law is about the frequency of the processor, you are badly mistaken (but it's mostly not your fault as this is what it has been summarized to in the media).

      Moore's Law is a law of economics, scale and progress.
      The gist is that computing power at a given cost will double every 18 months. It does not matter if this progression is achieved by cranking the frequency (MHz rule) or by increasing the number of transistors and parallel processing (Core rule). This is all about the economics
  • by Colin Smith ( 2679 ) on Wednesday September 19, 2007 @11:09AM (#20667905)
    Moore's law will continue until THE SINGULARITY takes US ALL!!!!!!

    Or at least, that's what the singularity nuts claim. Sorry people, there are limits on this planet.
     
  • He forgets his law has two variables: the number of transistors and the cost. The number of transistors might stop doubling, but there is still huge change to come with the transistor count fixed and the cost changing.

    I mean, a 8 core chip would be an improvement right now, but so would a 4 core chip at half the price. Think about a world where a 80 core chip exists, and how it would change the world. It would do so again when it went from 999 dollars to 99 dollars to 9 dollars to 99 cents to 9 cents to 9
  • Not yet, (Score:4, Insightful)

    by SharpFang ( 651121 ) on Wednesday September 19, 2007 @11:17AM (#20668031) Homepage Journal
    The law speaks about the number of transistors. Considering the current size of a typical CPU die (about 1cm x 1cm x 1mm) and assuming a "reasonable" maximum size of some 10cm x 10cm x 10cm, we have about 15 years of doubling the SIZE of the CPU (with some challenges like heat dissipation, but nothing nearly as difficult as increasing the density further), and that's without increasing the density any more. So even if the density reaches its limits, CPUs may simply grow in size for a good while.
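The headroom estimate can be checked with the die figures given, treating growth as doubling the die area (the 10cm x 10cm footprint is the comment's own assumed ceiling):

```python
import math

die_area = 1.0          # cm^2: roughly today's 1 cm x 1 cm CPU die
max_area = 10.0 * 10.0  # cm^2: the "reasonable" 10 cm x 10 cm ceiling assumed above
doublings = math.log2(max_area / die_area)
print(round(doublings, 1))   # ~6.6 area doublings available
print(round(doublings * 2))  # ~13 years at one doubling per two years
```

So growing the footprint alone buys a bit over a decade at Moore's-law pace, roughly matching the "about 15 years" claim once the third dimension is included.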
    • by mce ( 509 )
      Sorry, but 10cmx10cmx10cm is not reasonable. Not only because you run into functional yield issues, but also because you can only fit so many dies of that size and form factor on a single wafer. And it ain't many. Manufacturing cost would skyrocket even in case the yield would be 100%, which it never will be.
    • A typical processor is not 100mm^2, they're more like 140mm^2 (core2duo is about 143mm^2 according to a quick google search). Do you have any idea how expensive a 1000mm^2 die would be? That would cut yields down, and also be really hard to package, not to mention that if you packed 1000mm^2 with 65nm transistors the thing would draw 300 Amps.

      Sure I suppose it's possible to make a 1000mm^2 die, but it would cost $25,000 USD and probably either be a super-many-core or run really slow (think longest wire
      • Sure I suppose it's possible to make a 1000mm^2 die, but it would cost $25,000 USD
        Of course, according to Moore's Law, in 18 months it would cost $12,500. That's rather the point. Even if we stop being able to shrink dies, the process technology will become better understood and cheaper, and so we will be able to make more of them for a fixed investment.
    • Re:Not yet, (Score:4, Insightful)

      by Chirs ( 87576 ) on Wednesday September 19, 2007 @11:35AM (#20668287)
      You've neglected to consider power issues. The 10cm cube you mention is 10000x the volume of the "typical" current size you mention. Assuming power scales linearly with volume, that would require approximately 300KW of power just for the cpu. That works out to about 1250Amps of current at 240V.

      Nobody wants to increase the size of cpus...defects scale more than linearly with area, so there is a strong incentive to keep the die area down. Also, as the physical size increases you run into other problems...power transfer, clock pulse transfer, etc.
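The power arithmetic above, spelled out (the 30 W baseline for a current CPU is an assumed figure; the volumes come from the thread):

```python
# Volumes in mm^3 so the arithmetic stays exact.
typical_volume = 10 * 10 * 1           # a 1 cm x 1 cm x 1 mm die
cube_volume = 100 ** 3                 # a 10 cm cube
scale = cube_volume // typical_volume  # volume ratio
power_w = 30 * scale                   # assume power scales linearly with volume
amps = power_w / 240                   # current drawn at 240 V
print(scale)    # 10000: the cube is 10000x the die's volume
print(power_w)  # 300000 W, i.e. 300 kW
print(amps)     # 1250.0 A
```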
  • by Random832 ( 694525 ) on Wednesday September 19, 2007 @11:20AM (#20668073)
    I predict the number of predictions of the end of Moore's Law will double every six months.
  • by Urban Garlic ( 447282 ) on Wednesday September 19, 2007 @11:22AM (#20668093)
    Luckily, there are enough geeky pedants on slashdot to make up for the fact that the editors have actually messed up this totemic bit of geek lore.

    Moore was/is a technology manager, and his law is a management law. It says the number of transistors that can be economically placed on an integrated circuit, i.e. the transistor density of the price/performance "sweet spot", will increase exponentially, doubling roughly every two years.

    The original [wikipedia.org] refers to "complexity for minimum component cost", which emphasizes the economic aspect of it even more strongly.

    Moore's law has never been about what's possible, it's always been about what's cheap.
  • Computing power will continue to grow in direct relation to the finite amount of knowledge we have regarding physics. Each advancement in our knowledge of particle physics makes us more apt to apply it toward electronics in general.
  • Nope, nope, and nope (Score:3, Informative)

    by Ancient_Hacker ( 751168 ) on Wednesday September 19, 2007 @11:31AM (#20668227)
    First of all you've misquoted Moore's law.

    Secondly it's not so much a "law", as a consequence of how long it takes to amortize the cost of a fab plant.

    Thirdly, it's tied to 2-D circuit layouts. If and when 3-D IC technology becomes practical, then all we need is a factor of 2^(1/3) in each linear dimension, about a 21% shrink per doubling, which is somewhat more maintainable for a few more generations.
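The cube-root figure can be checked directly:

```python
# Doubling transistor count in 3-D needs each linear dimension to
# shrink only by the cube root of two (vs. the square root of two in 2-D).
shrink_3d = 2 ** (-1.0 / 3.0)  # new size / old size per dimension, 3-D
shrink_2d = 2 ** (-1.0 / 2.0)  # same, 2-D
print(round((1 - shrink_3d) * 100, 1))  # 20.6 (% linear shrink per doubling in 3-D)
print(round((1 - shrink_2d) * 100, 1))  # 29.3 (% linear shrink per doubling in 2-D)
```

A ~21% shrink per doubling instead of ~29% is the easing the comment is pointing at.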

  • by hattig ( 47930 ) on Wednesday September 19, 2007 @11:32AM (#20668241) Journal
    That's 32 times as many transistors... whereas today you can get 4 cores on a CPU in under 300mm^2, you'll be getting 128 cores in 2017 (simplistic; you'll get a variety of generic cores and application-specific cores, and per-core improvements will increase their size, so say 32 generic cores and 32 application-specific cores).

    If it's 16 years, that's 256 times as many transistors. 256 generic cores and 256 application-specific cores in 2024? Let's not even imagine the per-core speeds! It's all pretty exciting, and I'm being conservative with the figures here.

    Of course, applications will grow to utilise this stuff, but more and more tasks are getting to the point of 'fast enough', even despite the bloating efforts of their creators. Even if there is a 10 year hiatus in process improvements after 2024, it'll take some time for the applications to catch up apart from certain uses. If those uses are common enough, there will be hardware available for it instead. Of course if only Intel and IBM have fabs that can make these products, because the fabs cost $20b each...
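The extrapolation above is just repeated doubling, holding core size constant (which, as the comment itself notes, won't quite hold):

```python
cores_today = 4                 # cores in ~300 mm^2 today, per the comment
projections = {}
for years in (10, 16):
    factor = 2 ** (years // 2)  # one transistor-count doubling every two years
    projections[years] = cores_today * factor
print(projections)  # {10: 128, 16: 1024}
```

The 10-year figure matches the 128 cores above; the 16-year figure comes out at 1024 same-sized cores before the per-core size adjustments the comment applies.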
  • by gelfling ( 6534 ) on Wednesday September 19, 2007 @11:38AM (#20668329) Homepage Journal
    It's not a law, it's simply an observation that within Intel, that's more or less the rate of progress. As we saw with the P4 chip, the problem we bumped into was not Moore's Law but the laws of thermodynamics. So we found a good enough reason to go to multicore CPUs. Eventually, though, you do bump into Albert Einstein: in 1 billionth of a second, light travels about 1 foot, so in order to have a switching period of 1 billionth of a second, the entire circuit length from end to end has to be less than one foot.
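The light-travel bound is easy to check (this uses the vacuum speed of light; signals on real chips propagate slower, so the practical limit is even tighter):

```python
c = 299_792_458                       # m/s, speed of light in vacuum
period_s = 1e-9                       # one billionth of a second (a 1 GHz clock period)
distance_m = c * period_s
print(round(distance_m, 2))           # 0.3 metres travelled per nanosecond
print(round(distance_m / 0.3048, 2))  # 0.98 feet, i.e. "about 1 foot"
```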
  • Why buy a computer this year, when I can get a faster one next year? [thetoque.com]

    The fuzzy logic behind not buying a computer due to Moore's Law.
  • We're already near the end of Moore's Law. The problem is not feature size, it's getting rid of the heat. CPUs are already hitting heat and power limits, which is why CPU speeds stalled out around 3GHz.

    Feature size alone matters for memory devices, and we can expect continued progress in memory density. Even for DRAM, getting rid of the heat is becoming a problem, so the future is with devices that don't require refresh cycles. We'll see progress in flash memories and static memory technologies.

  • the number of transistors on integrated circuits would double every two years.

    The solution is simple. Just make integrated circuit dies twice as big every two years.

  • The end of Moore's law! New solar panels with double efficiency! Flying cars now only 5 years away!!!!

    Are these articles being generated by a script or what?
  • by Trailer Trash ( 60756 ) on Wednesday September 19, 2007 @01:17PM (#20669781) Homepage
    Moore's 2nd law is that Moore's 1st law is going to come to an end in about 10 years. Always.
  • by mschuyler ( 197441 ) on Wednesday September 19, 2007 @01:58PM (#20670445) Homepage Journal
    Moore is being short-sighted about his own law. It's not about silicon. If you extrapolate backwards from the first integrated chip, you see that "Moore's Law" has been in effect for over 100 years. It started with manual switches, then moved to electromechanical switching, then to vacuum tubes, then to transistors, then to integrated circuits. Every one of those mediums has been subject to and demonstrates Moore's Law. Graph it and you'll see: it's a perfectly straight line on a logarithmic plot. Every time the method itself peaks of its own accord, a new medium is found which can continue the progress. (Anyone familiar with the growth of telco equipment can see this in the switching systems: electric switches to step systems to crossbar to ESS.) If ICs do run out, there is a future of possibilities: holographic, quantum, bio, etc. Moore's Law is like the Energizer Bunny. It just keeps going.
  • Sounds About Right (Score:3, Interesting)

    by YetAnotherBob ( 988800 ) on Thursday September 20, 2007 @12:09AM (#20677033)
    ICs today are made photographically, on a flat surface. Manufacturers keep working to reduce the area needed for a component, be it transistor, resistor, capacitor, or trace wire. We already know from lab work what the minimum possible sizes are for each basic component. We've come up against the minimum possible size several times in the past, and each time it was related to the possibilities of the light source we were using. Now we are deep into the UV range, with minimum feature sizes that are actually smaller than the wavelength of the light used. The best commercial plants are at 45 nm. At about 30 nm, the traces (on-chip wires) become unstable and may no longer be conductors. That is a fundamental limit that clever plant engineering will not be able to surmount. Current commercial plants are using a 60 to 90 nm minimum feature size, if memory serves. That means we have about 6 or 7 doublings left (each doubling shrinks feature size to about 70% of its previous value and takes 2 to 3 years to realize). That gives us 12 to 20 years.
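    The doubling arithmetic above can be sanity-checked in a few lines: doubling transistor density at fixed die area halves the area per transistor, so each linear dimension shrinks to 1/√2, roughly 70%, and the runway is just doublings times years per doubling (the 6-7 and 2-3 figures are the estimates above, not derived here):

    ```python
    import math

    # Doubling density at fixed area halves the area per transistor,
    # so each linear dimension scales by 1/sqrt(2) -- about 70%.
    linear_scale = 1 / math.sqrt(2)
    print(f"per-doubling linear scale: {linear_scale:.3f}")  # ~0.707

    # Runway from the figures above: 6-7 doublings, 2-3 years each.
    low = 6 * 2   # 12 years
    high = 7 * 3  # 21 years
    print(f"remaining runway: {low} to {high} years")
    ```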

    Going to still smaller wavelengths means that the photons pack more punch. It's like trying to play billiards by shooting the cue ball with a high-powered rifle: you get pieces of cue ball everywhere. When random photon collisions are pushing random atoms by several dozen radii, your nice ordered atomic lattice becomes a horrid mess. We are nearing the limits of what nature allows for photolithography now.

    Increasing chip size is not a viable solution, as the full wafer is used now. Increase chip size, and yield drops quickly. Yes, they could double the size of the chip to double the transistor count, but that would mean increasing the cost of the chip by roughly 4X. That's not the direction we want chip cost to go.
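    The 2X-area-to-4X-cost figure falls out of a simple random-defect yield model: under a Poisson model, yield is e^(-D·A), so doubling die area squares the yield fraction. A toy calculation, with a hypothetical 50% baseline yield:

    ```python
    import math

    def poisson_yield(defect_density, area):
        """Fraction of good dies under a simple Poisson defect model."""
        return math.exp(-defect_density * area)

    # Hypothetical numbers: defect density chosen so a 1-unit die yields 50%.
    d = math.log(2)              # defects per unit area
    y1 = poisson_yield(d, 1.0)   # 0.5
    y2 = poisson_yield(d, 2.0)   # 0.25 -- doubling area squares the yield

    # Cost per *good* die scales as area / yield:
    cost_ratio = (2.0 / y2) / (1.0 / y1)
    print(f"cost multiplier for 2x die area: {cost_ratio:.0f}x")  # 4x
    ```

    With a better baseline yield the multiplier shrinks toward 2X; with a worse one it blows up even faster.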

    Off in the distance there are more truly hard boundaries, beyond which no amount of effort will yield additional benefits. One of those is component size. Minimum transistor size is 7 atoms (it's been done). Minimum diode size is about 5. Minimum trace size varies with material; the best I've seen is benzene, at about 6 atoms wide. Keep in mind that at room temperature benzene is a liquid. It's going to be very hard to make wires of the stuff; we really need a solid. Aluminum, silver, and gold have all been used, and all need to be 30 to 60 atoms wide or more, and several atoms thick, to be even a poor conductor. Some creative metallo-insulator engineered materials might allow for smaller trace sizes, but probably not. Note that this is still smaller than buckytubes, which are also as tall as they are wide, creating other connection problems, so don't peddle those as a panacea. That means the trace sizes required will probably be the final limit. Real capacitors are larger than the traces, but their size is really controlled by the number of electrons needed to operate the transistor/switch. I'm still betting on the traces as establishing the limit.

    Heat dissipation is also a problem, and it gets worse as densities go up. Current best designs are operating halfway to melt now. Switching to silicon carbide would let us run hotter, say 400 to 800 °C. Diamond/graphite substrates would let it get higher still, though diamond heated to 1,200 °C in an oxygen atmosphere isn't going to last very long; some creative packaging is needed there. Heat dissipation is the real reason we can't go 3D. The systems that tried to be truly 3D, or near to it, all relied on the chips being immersed in some coolant and having channels for the coolant through the chip. Liquid nitrogen cooled some that IBM did a few years ago; bubbles were a problem. Move the coolant fast enough to transport the heat before it boils, and erosion becomes a problem.

    Some of these issues can be fixed; some can never be fixed. So, when our components are fully at the 30 nm size, it all stops. It's a problem with the wiring. Solve that, and we would be close to being able to compute with atoms. But with what we think we can do now, the shrinkage stops in about 20 years.

    Enjoy it while you can.


FORTRAN is not a flower but a weed -- it is hardy, occasionally blooms, and grows in every computer. -- A.J. Perlis
