Hardware

Moore's Law Disputed 252

Kumiorava writes "The number of transistors that can be packed onto a chip doubles every 18 months. This, Moore's Law, has been repeated for over 30 years. Computers get faster and the IT economy grows, but Moore's Law doesn't actually hold, argues researcher Ilkka Tuomi. You can read the research in the First Monday article The Lives and Death of Moore's Law." 'tho, to be fair, it seems to me that Moore's Law has lasted a lot longer than the throng of people who keep predicting its death.
This discussion has been archived. No new comments can be posted.
  • YADOMLA (Score:5, Funny)

    by Xner ( 96363 ) on Monday January 06, 2003 @10:19AM (#5025329) Homepage
    (Yet Another Death of Moore's Law Article)

    My guess is that the reports of the death of Moore's Law will turn out to be greatly exaggerated.

    • Re:YADOMLA (Score:2, Interesting)

      by haus ( 129916 )
      Then again, I thought that Michael Vick was full of it when he predicted that he would go into Lambeau Field in January and beat the Packers.

      Now I still do not think that we are going to see the end to Moore's Law in the near future. But as I have stated, I have been wrong before.
    • Actually, this one is slightly different. They claim Moore's Law has always been dead.

      Nice to see a refreshing new touch.
    • Re:YADOMLA (Score:2, Funny)

      by Anonymous Coward
      Yes, the number of people saying that Moore's law is bunk seems to double about every 18 months.
    • Re:YADOMLA (Score:4, Informative)

      by smallpaul ( 65919 ) <paul@prescod.CURIEnet minus physicist> on Monday January 06, 2003 @12:34PM (#5026249)

      I think that the article makes the point that the death of Moore's law _as Moore stated it_ is inevitable:

      "Moore noted that the complexity of minimum cost semiconductor components had doubled per year since the first prototype microchip was produced in 1959. This exponential increase in the number of components on a chip became later known as Moore's Law. In the 1980s, Moore's Law started to be described as the doubling of number of transistors on a chip every 18 months. At the beginning of the 1990s, Moore's Law became commonly interpreted as the doubling of microprocessor power every 18 months. In the 1990s, Moore's Law became widely associated with the claim that computing power at fixed cost is doubling every 18 months."

      Once we reach quantum boundaries, the first statement of Moore's law will fail. There may be something like Moore's law in the future, but it will be just another restatement:

      ""Speculations on the extended lifetime of Moore's Law are therefore often centered on quantum computing, bio-computing, DNA computers, and other theoretically possible information processing mechanisms. Such extensions, obviously, extend beyond semiconductor industry and the domain of Moore's Law. Indeed, it could be difficult to define a "component" or a "chip" in those future devices.""

  • by Da Fokka ( 94074 ) on Monday January 06, 2003 @10:20AM (#5025336) Homepage
    First the 2nd law of thermodynamics fails, then Moore's Law... When will things start falling upward?
  • Bad article title (Score:4, Interesting)

    by coug_ ( 63333 ) on Monday January 06, 2003 @10:23AM (#5025353) Homepage
    The linked article does not dispute Moore's Law, it merely does the following (from the article):


    "The present paper argues that Moore's Law has not been a driver in the development of microelectronics or information technology. "


    A better title might have been: "Moore's Law - Not All It's Cracked Up To Be"

    • by MrWa ( 144753 )
      The linked article does not dispute Moore's Law, it merely does the following (from the article):
      "The present paper argues that Moore's Law has not been a driver in the development of microelectronics or information technology. "

      I guess that depends on what you mean by driver. In the hardware world, engineers and managers - especially at Intel - are acutely aware of the impact of Moore's Law. It has become the primary driver for the rapid advancement of processor speed. The paper basically says this same thing.

      Whether Moore's Law has accurately described the rate at which processors have advanced is insanely trivial to study: did the number double in x amount of time? To say that Moore's Law is wrong misses the point that it was an estimate that has been adopted by the industry, the press, and the public to express expectations of processor advancement and a simple measure to view that advancement. It isn't a law like gravity, nor is it a law like the speed limit: it is a driver in the development of microprocessor technology, though.

    • Finally, someone who actually read the damned article. I agree - further, the only point the guy ever made seemed to be that Moore and crew fudged the doubling time from 1 year to 2 years, maybe even three. Whatever.

      Looks to me like some jackass with no credibility is trying to make a name for himself by "publishing" a junk article in a "peer-reviewed" online journal by "proving" that Moore's law isn't a fundamental phenomenon. Well, duh. Hell, I wouldn't be surprised if he posted his own article to /.

      • by King Babar ( 19862 ) on Monday January 06, 2003 @12:02PM (#5026041) Homepage
        Finally, someone who actually read the damned article. I agree - further, the only point the guy ever made seemed to be that Moore and crew fudged the doubling time from 1 year to 2 years, maybe even three. Whatever.

        No. Wrong. Sorry, try reading the *whole* article again. The BIG major point of the article, which he points out at the very beginning, by the way, is just this:

        Moore's Law has never really existed in any form that is consistent or interesting to us.

        It isn't "just" that the doubling times was fudged (although when you're talking about a presumably exponential process a little fudge goes a *long* way). The above bold point really breaks up into three major claims:

        1. Moore's Law lacks a consistent formulation.
        2. Possible choices of formulations that appear to be most consistent with the 1965 original or 1975 revised presentations of the law do not fit the data.
        3. Extensions, of either the tech-savvy, popular, or raw economic (price/performance) variety, do not work empirically, either.

        Seriously, it *is* a really big deal when an idea as big and as potentially important as Moore's Law turns out to have little or no substance. It is always a rude awakening when you find out that a growth process that appears to be exponential has hit some limit. It may be worse in some ways to find out that not only were you not looking at some coherent or unitary process, but that none of the obvious possibilities really ever seemed to show an exponential growth curve for more than 5 years or so.

        Looks to me like some jackass with no credibility is trying to make a name for himself by "publishing" a junk article in a "peer-reviewed" online journal by "proving" that Moore's law isn't a fundamental phenomenon. Well, duh.

        I don't think you read this very carefully. I don't think the author cares at all about fundamental phenomena, just whether there is any testable content to various formulations of Moore's Law, and, if there is something you can test, whether the empirical data fit the law. Very, very embarrassingly (in my opinion), nobody much bothered to do this before, and the actual data lend very little support to any statement more concrete than "technology has improved significantly and rapidly since the invention of the IC".
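
        (An aside: the "little fudge goes a long way" point is easy to see with a quick, purely illustrative Python sketch; the 30-year span and the candidate periods are round numbers I picked, not the article's data.)

          # How sensitive is 30 years of "doubling" to the assumed period?
          years = 30
          for months in (12, 18, 24):
              doublings = years * 12 / months
              print(f"{months}-month doubling: factor of 2**{doublings:.0f} = {2 ** doublings:.3g}")

        Over three decades, a one-year versus a two-year doubling period differs by a factor of 2**15, about 32,000 -- which is why "roughly exponential" is nearly meaningless until the exponent is pinned down.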

        • Miss the point (Score:5, Insightful)

          by siskbc ( 598067 ) on Monday January 06, 2003 @12:23PM (#5026179) Homepage
          The BIG major point of the article, which he points out at the very beginning, by the way, is just this:
          Moore's Law has never really existed in any form that is consistent or interesting to us.

          Right...but since nothing else was ever claimed for Moore's law by anyone with intelligence, I hardly see the point. Yes, I read the article. Yes, what you say is right. Moore's law has never been strictly correct. I'm kind of surprised you thought otherwise.

          Hell, it's never been a law, in that there is no fundamental, scientific *reason* for there to be *any* link between the number of transistors on a chip, processing power, or whatever, and time. Intel *could* have ratcheted up the doubling times if they wanted, say in response to competition. Like what's happened in the last ~4 years thanks to AMD. That alone should have made it obvious that Moore's law is bunk.

          Very, very embarrassingly (in my opinion), nobody much bothered to do this before, and the actual data lend very little support to any statement more concrete than "technology has improved significantly and rapidly since the invention of the IC".

          To me, that's like saying it's embarrassing that no one has ever done a test to prove that concrete is harder than styrofoam. No one bothered because it's so trivially obvious. The only people who considered Moore's law to be anything but a marketing construct over the last 30+ years are journalists, most of whom have no tech training.

          It is always a rude awakening when you find out that a growth process that appears to be exponential has hit some limit.

          Now, *that* wasn't in the article. He just proved that Moore's law never really had a point. He gives *no* technical reason why whatever validity it has now will cease to be. Nothing regarding power consumption/loss, tunnelling across junctions, etc. In fact, I saw nothing technical in the "article" whatsoever. Partly, that's fitting, since Moore's "law" isn't technical. But for the claim that it has some technical, fundamental limit, such proof is needed.

          So I'll stay with my original point - this article used 10 pages to prove the mundane. Also, what most people will assume the article proved wasn't in the article at all.

        • by ergo98 ( 9391 )
          Seriously, it *is* a really big deal when an idea as big and as potentially important as Moore's Law turns out to have little or no substance.

          Is this a joke? Moore's law isn't E or the speed of sound: It's a general hypothesis about the rate of technological progress. No one expects there to be an absolute correlation, and really any correlation that there has been has largely been perceived as humorous in the context of the "law" (it isn't a "law", of course, but is rather an "observation").

          Should we go back and re-engineer all of the processors because of this amazing new research into Moore's Law?
    • Of course it's not a driver of the development! It's a side-effect. Moore's Law exists because of our thirst for data processing. We need to speed up entertainment (video games, embedded entertainment devices, etc.); information processing (DSP, telemetry processing, etc.); and many other applications, both theoretical and concrete.

      As long as the market demands bigger, faster, stronger, new methods and materials will continue to be developed.
    • You're right, the title was misleading. But is it not also true that if Moore's Law were not being actively met, we would likely have cooler-running, more efficient (yet still faster than the previous) processors designed more along the lines of Astro and Crusoe rather than P4 and Athlon? Personally, I'd like it if chipmakers would strive for some efficiency and cooler-running chips. As much as I love Athlons (I run two at home), I'd rather heat my house with the central heating system than with my CPUs. As it is, since I spend all my home time, with the exception of sleep and bathroom time, in my computer room anyway, I don't even have to turn on my heater.
  • by Cy Guy ( 56083 ) on Monday January 06, 2003 @10:23AM (#5025354) Homepage Journal
    Likely I'm not the first to propose this, but based on my monitoring of the IT industry I would propose this corollary to Moore's Law
    Every six months some pundit will predict that we have reached the end of Moore's Law...and that the pundit's prediction will be posted on SlashDot...and that within three months some innovation will occur that ensures the continuity of Moore's Law... and finally, that SlashDot will post a story about that innovation.


    • I had something like that, but I called it Mohr's Corollary to Moore's law.
    • by vr ( 9777 )
      you should probably add something about the story on slashdot being posted multiple times.
      • you should probably add something about the story on slashdot being posted multiple times.

        Yeah, and it will probably be posted multiple times.
    • Re:Cy Guy's Law (Score:4, Insightful)

      by Junks Jerzey ( 54586 ) on Monday January 06, 2003 @10:59AM (#5025617)
      Every six months some pundit will predict that we have reached the end of Moore's Law

      I know, you're being funny, but I think the difference this time around is that we're in the land of Monster Heat Sinks, Active Cooling, and 70W CPUs. Chip designers *know* how to make things go faster, at the expense of more transistors, but it's the power consumption and heat dissipation problems that are stopping them.
      • Re:Cy Guy's Law (Score:2, Informative)

        by madcow_ucsb ( 222054 )

        Also, if there's one thing that's been drilled into my head in the VLSI classes I've taken, it's that the parasitics associated with the interconnect are what really limit the speed, to a much greater extent than transistor numbers/characteristics.

        So even if we didn't care about power, and heat could magically dissipate itself, the circuit could still only go as fast as the metal inside it would allow.
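
        (To make the interconnect point concrete: a hedged back-of-the-envelope in Python for the distributed RC delay of an on-chip wire. The dimensions are illustrative values I picked for a circa-2003 process, not real process data.)

          # Distributed RC delay of a wire is roughly 0.38 * R_total * C_total.
          rho = 2.2e-8                # copper resistivity, ohm*m
          eps = 3.9 * 8.85e-12        # SiO2 permittivity, F/m
          L, w, t, h = 1e-3, 0.2e-6, 0.2e-6, 0.2e-6  # 1 mm wire, 0.2 um geometry
          R = rho * L / (w * t)       # total wire resistance
          C = eps * L * w / h         # parallel-plate estimate; fringing ignored
          print(f"R = {R:.0f} ohm, C = {C * 1e15:.0f} fF, delay ~ {0.38 * R * C * 1e12:.1f} ps")

        Both R and C grow with wire length, so the delay grows with its square: shrinking the transistors does nothing for a signal that still has to cross the whole die.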

    • by sharkey ( 16670 ) on Monday January 06, 2003 @11:11AM (#5025695)
      and finally, that SlashDot will post a story about that innovation.

      Twice. In a 24 hour period.
    • I agree that the law/behavior you suggest occurs. If you read the article, you'll see that this is really a different sort of article however.

      The article doesn't say that Moore's law won't continue. It says, and attempts to show empirically, that the ill-defined Moore's Law never really was in effect to begin with; that the data in many cases doesn't really support Moore's Law(!) This is a new and distinctly different sort of claim.

      --LP

      P.S. I hate to bitch. Well, not always. But sigh: "2002-12-14 19:29:50 Moore's Law: the data doesn't fit (articles,hardware) (rejected)"
  • Well, eventually... (Score:5, Interesting)

    by tiltowait ( 306189 ) on Monday January 06, 2003 @10:23AM (#5025358) Homepage Journal
    It will stop, right? I mean, if the marathon record gets 10 minutes shorter every few years, for example, that doesn't necessarily mean that 100 years from now we'll be running a 20 minute marathon.

    Aren't there limits to materials and stuff like that, or do we come up with Infinite Probability Drives, Dimensional Transfunctioners, Flux Capacitors, Heisenberg Compensators, Ludicrous Speeds....
    • by Carbonite ( 183181 ) on Monday January 06, 2003 @10:38AM (#5025478)
      I mean, if the marathon record gets 10 minutes shorter every few years, for example, that doesn't necessarily mean that 100 years from now we'll be running a 20 minute marathon.

      Just a track and field nitpick:

      The marathon world record is usually broken by seconds, not 10 minutes. Since 1908, the record has never been broken by more than seven minutes. The improvement the last five times the record has been broken:

      2002: 4 seconds
      1999: 23 seconds
      1998: 45 seconds
      1988: 22 seconds
      1985: 47 seconds

      The current record is held by Khalid Khannouchi of the US. On April 14, 2002, he ran the London marathon in 2:05:38, breaking his old record by 4 seconds.

      You can see the whole progression here:

      http://www.kajakstandf.org/wr_progression/men/marathon.shtml

    • That's not 100% accurate. The "Law" applies to traditional silicon transistors. When some technology comes along to displace silicon (optical semiconductors, quantum computers, etc.), then all bets are off. There is no reason at that point that all processing couldn't be done simultaneously in a very short time frame. At that point, I/O would be the limiting factor.
    • by Zathrus ( 232140 ) on Monday January 06, 2003 @10:57AM (#5025601) Homepage
      Poor example... you've compared an expanding rule of thumb (Moore's Law) vs a contracting one (time to run a marathon). Furthermore you did exponential vs subtractive.

      On a purely theoretical level you could take an expanding series out to infinity. And you'll reach it fairly quickly since, in this case, the series is exponential in nature and not merely additive or multiplicative. Ok, yes, you can never "reach" infinity, but you get the idea. With a subtractive series that has a hard limit (in this case, 0) you're going to reach the limit at some point, and that's it.

      Moore's Law isn't a law anyway... it's a rule of thumb. And eventually we'll hit the limit of physics - a single quantum changing states in picoseconds (if that long - I dunno, I'm not a physicist). We'll probably hit other limits well before then, but who knows -- every time someone thinks we're up against the wall, someone else discovers a way around the wall and we keep on going for another year or two. Keep in mind that we're using, by and large, the exact same semiconductor process that was invented by TI back in 1954. There have been thousands or even millions of refinements in the process, but we haven't switched to a non-silicon substrate, moved to light-based computing, quantum computing, or anything else.
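
      (The series argument in a few lines of Python, with arbitrary starting values: the doubling series has no ceiling built into it, while the fixed-decrement series must hit its floor.)

        # Exponential expansion vs. subtraction toward a hard limit (toy numbers).
        grow, shrink, steps = 1.0, 120.0, 0
        while shrink > 0:      # the contracting series is guaranteed to stop
            grow *= 2          # the expanding series just keeps going
            shrink -= 10
            steps += 1
        print(f"after {steps} steps: grow = {grow:.0f}, shrink has hit its floor")
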
      • Poor example... you've compared an expanding rule of thumb (Moore's Law) vs a contracting one (time to run a marathon).

        Oh, but Moore's Law can easily be rephrased so that instead of an expanding rule (density of transistors) it describes a contracting rule (area used by a single transistor and its interconnects).
        • Not really. It says that the number of transistors will double -- it says nothing about how this is accomplished.

          It's entirely possible that some researcher could discover a method that would allow production of larger dies without an increase in cost... I'll admit that this is deeply unlikely, and it wouldn't help ramp up speeds, but it could allow continued growth of transistor counts.

          Alternately someone could finally figure out how to do three dimensional dies effectively... which could certainly help perpetuate Moore's Law, increase speeds, etc. all while keeping the density constant.
      • Keep in mind that we're using, by and large, the exact same semiconductor process that was invented by TI back in 1954. There have been thousands or even millions of refinements in the process, but we haven't switched to a non-silicon substrate, moved to light-based computing, quantum computing, or anything else.


        Actually, the original transistors used by TI were germanium, not silicon. And they were bipolar junction transistors, not the CMOS transistors used in most chips today. And lastly, there have been huge changes to manufacturing, such as self-aligned gate technology, thermal oxide deposition, etc. etc.
      • The world is run by idiots because they're more efficient than hamsters.

        I don't know, sometimes I think we'd be better off putting the hamsters in charge.

        -
    • The problem is that every time we hit a wall, there's enough economic incentive for SOMEONE to find a way around the wall.

      The possible growth is probably limited somewhat, but by limited we're talking about a scale that we're not even close to. Quantum Computing and Nanotech are currently leading us down some interesting paths, and who knows what's next.

      I suspect that you'll never be able to perform more parallel logic operations in a given volume than a smallish multiple of the number of atoms in that volume. Atomic nuclei have properties that we understand well enough that controlling them for purposes of logic is currently believable SF. To think that we'll be able to control a single electron or proton, and its component quarks, enough to make it perform logic for us is something I'm not YET willing to accept, but even still I think that we'll be sharply limited in how far down that ramp we can go. At some point we'll need a different model of the universe before we're allowed to extract any meaningful data.
    • by K8Fan ( 37875 ) on Monday January 06, 2003 @11:57AM (#5026009) Journal

      Right, it was silly to call it a law. It's not a law, it was Intel's marketing plan - i.e. "We plan to double chip density every 18 months". By stating it the way he did, the Intel CEO provided a goal for the troops, and a very quotable phrase for the pundits. Possibly the most successful memetic infection ever.

    • You must be right.

      People are spending all their time going after your analogy, but never addressing your point.

      It will end. Even if we keep being able to use fewer electrons to throw a gate, eventually we will be down to one electron on a transistor that's the size of five atoms, and that pretty much puts an end to it - unless we stop using electrons, but that would be such a radical change that Moore's Law wouldn't apply anyway.
    • One of the interesting things about the "law", tho, is its self-fulfilling nature. I believe, and many of you probably do, too, that computing power will continue to double every year and a half or so, for a long time. Why? Well, because people are pretty damn ingenious when they have a goal to meet. Moore's Law, as it is often interpreted (not as it was originally expounded) is that goal.

      As an aside, a story that strikes me as somewhat apropos. Where I used to rock climb (in the Shawangunks, when I was 30 pounds lighter) there was a story of a route no one had managed to climb before. Two world-class climbers were attempting to be the first. Despite many efforts, neither had been able to make it. One of the climbers arrived one morning and met a friend, to try one last attempt before declaring it impossible. His friend gave him the bad news: his rival had climbed it the day before. Not willing to be outdone, the climber went up the route on his first try. Only afterwards did his friend congratulate him: he had lied about his rival climbing it, and he was in fact the first. (The rival climbed it the next day.)

      It seems the history of human progress is littered with examples of fast followers: a new technology is developed, and immediately afterwards it is developed in many other places. I think that knowing it can be done is, perhaps, the biggest hurdle. Maybe Moore's Law bridges that hurdle for us.
  • by uberdave ( 526529 ) on Monday January 06, 2003 @10:24AM (#5025368) Homepage
    It was never a law (as in an operating principle of existence). It was merely a trend in manufacturing. Keen observers could probably note similar trends in other industries, e.g. the gas mileage of cars.
    • Well said.

      Moore's law has never been anything but an observation. I guess after calling it a law for thirty years, people start confusing it with a foundation of the computing industry.

      I mean, most people have started thinking Windows is an OS right? (sorry.. after all, this is slashdot)

      • Well said.

        Moore's law has never been anything but an observation. I guess after calling it a law for thirty years, people start confusing it with a foundation of the computing industry.

        Please read the article. Pretty please. It is waaay more serious than that. If the author of the article is correct, Moore's Law, in either its original, revised, or vastly mutated forms, does not really fit ANY concrete observational data we have. This is important because exponential and sub-exponential growth rates are very different things.

  • by nolife ( 233813 ) on Monday January 06, 2003 @10:25AM (#5025374) Homepage Journal
    Let me introduce the Slashdot law [google.com]. Its effect is inversely proportional to the decline of Moore's law.

    Already taking over 60 seconds to load up..

  • Moore's Theory (Score:5, Informative)

    by Lemmeoutada Collecti ( 588075 ) <obereon&gmail,com> on Monday January 06, 2003 @10:25AM (#5025378) Homepage Journal
    <RANT>
    The oft-quoted 'Moore's Law', as some have said before, is not in fact a law at all, but instead a theory proposed by Moore based on the economic and technological trends of his time. He by no means meant to imply that this measurement be used as a benchmark of the technology industry. The fact that it is not only known, but hotly debated in the industry shows not the accuracy of the 'law', but instead the success of the marketing campaigns based on that quote. To be quite realistic, some manufacturers have pushed out technology that has not been completely tested in order to compete in the marketing game of Moore's Law, and thus we have cheap, unreliable PCs. (Don't get me wrong, this is only one of many reasons for this effect!)
    </RANT>
  • Who Cares? (Score:5, Informative)

    by aardwolf64 ( 160070 ) on Monday January 06, 2003 @10:26AM (#5025385) Homepage
    Moore's Law has never really been a hard and fast law. It's more of a rule of thumb... I've read a few books that mention it, and some of them even disagree on the time period in which the doubling takes place. Some say a year, while some say 18 months. I've also seen articles which claim, as a part of "Moore's Law", that prices also get cut in half.

    Defying Moore's Law isn't like defying gravity. We know that at some point, miniaturization will no longer be possible. It's hard to double the number of transistors in one space when they're already at the atomic level. Do you think we could do that in 18 months?
    • Re:Who Cares? (Score:2, Informative)

      by muyuubyou ( 621373 )
      Maybe, but did you read the article? It's very good indeed.

      It's about the evolution of the microchip and Moore's Law's deviations.
  • by Duds ( 100634 ) <dudley@NospAm.enterspace.org> on Monday January 06, 2003 @10:28AM (#5025401) Homepage Journal
    The number of people incorrectly predicting its demise will double every 18 months.
  • Wait wait wait (Score:4, Interesting)

    by sielwolf ( 246764 ) on Monday January 06, 2003 @10:28AM (#5025405) Homepage Journal
    I thought the General version of Moore's Law was "The speed of a computer will double every 18 months or so".

    Fine, originally it was "transistors", but I thought that if dual CPUs became a de facto standard in 12 months, that would count towards Moore's Law instead of being illegal since the transistors aren't all on the same die.

    It just sounds like nit-picking bullshit. I've always thought of Moore's Law as "the IT industry will find a way of doubling computing power every 18 months" not some stupid unit of measure.

    Shit, if superior engineering can double computation with the same number of transistors (via better design) shouldn't that count? It just sounds like someone getting into a huff about it and having too much time on their hands to fiddle with Excel.
    • Oh well. I thought it was that every 18 months you'd need a machine twice as powerful to run the current version of the Micro$oft OS.

      Stephen

    • If people are making billion-dollar investments based on "Moore's law", don't you think it is a problem if "Moore's law" is a fluffy thing that shifts around every decade and has no single coherent definition? If nobody understands what it is or why it works or what might prevent it from working in the future?
    • I believe it is the number of transistors that can be put into the same wafer space that will double every 18 months.
      Meaning:
      today, you can fit 10,000,000 transistors in a square inch of wafer; in 18 months, you'll be able to put 20,000,000 transistors into the same space. This should make computers more powerful, but not necessarily 'faster'. Clock speed isn't everything.
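
      (Spelled out in Python, using the parent's made-up 10,000,000-per-square-inch starting figure:)

        # Project wafer density forward, doubling every 18 months (toy figure).
        density = 10_000_000   # transistors per square inch, per the parent post
        for month in range(0, 73, 18):
            print(f"month {month:2d}: {density:>13,} transistors per square inch")
            density *= 2
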
  • by tbspit ( 460062 ) on Monday January 06, 2003 @10:30AM (#5025424) Homepage
    Every 18 months, computer software will be made to take twice the processing power for the same task.
  • by taylor ( 11728 ) on Monday January 06, 2003 @10:33AM (#5025442) Journal
    Another factor is the great disparity between actual processing power (often measured in FLOPS etc.) and the number of transistors on a chip. For a while, transistor counts were doubling every 12 months, but computing power was only doubling every 24 months. Why? The need for pipelining and data management meant more and more of the chip had to be dedicated to pre- and post-processing of the actual calculation, along with intelligent caching and the related machinery of predictive streams.

    An alternative approach has been to build specialized hardware to put all those transistors to use, at the expense of turning your general purpose computer into a very special purpose machine. This has been used, sometimes to great effect, in for example N-body calculations (GRAPE 1-6 [u-tokyo.ac.jp]), yielding 50 or more TFlops of performance for the general computer cost of a 500 GFlop machine. It provides yet another example of the misappropriation of Moore's law.
    • The main reason clock speed isn't increasing as fast as transistor density is cache. The transistors are just more useful as fast, on-chip memory. The PA-RISC computer I'm typing this on (HP B-2600) has 4MB of on-chip cache! Now back to work!
  • by MtViewGuy ( 197597 ) on Monday January 06, 2003 @10:35AM (#5025457)
    I think that while we may be starting to reach the point where the laws of physics limit how much faster a CPU can go, don't forget that other parts of the computer are getting major speed boosts, too.

    First, there is the connection between chipsets on the motherboard. AMD's Hypertransport and others could make big differences on overall motherboard speed.

    Second, system memory speeds are getting quite a bit faster, too. Developments in DDR-SDRAM technology could eventually result in throughput 2-3 times what we have now with DDR333 technology.

    Third, expansion slots are getting faster, too. There are upcoming standards for both PCI and AGP that will substantially increase data throughput on expansion slots.

    Fourth, mass storage devices are getting faster, too. IDE hard drives have now reached ATA-133 speed, and future IDE hard drives using the new Serial ATA connection will eventually reach the equivalent of ATA-600 speed! SCSI interface hard drives are benefiting from Ultra 160 and Ultra 320 speeds, too. Even optical recorders are getting faster; we've reached 48X speeds for CD-R writers, and DVD recorders will go past 12X speeds some time in 2004.

    Fifth, hot-docked external connections are getting faster, too. USB 2.0 supports 480 megabit/second connections, and the next generation of IEEE-1394 connectors will support 800 megabit/second connections.

    Finally, graphics cards have seen VERY dramatic performance increases for 3-D graphics. Today's ATI Radeon 9700 Pro and the upcoming nVidia GeForce FX chipset graphics can achieve 3-D rendering that no one could have dreamed of even five years ago.

    In short, CPUs will probably reach their limits before 2010, but overall system speed will still increase dramatically thanks to other system components speeding up.
  • Which came first? (Score:5, Interesting)

    by markholmberg ( 631311 ) on Monday January 06, 2003 @10:38AM (#5025475)
    The idea of the paper is to show that Moore's law can't be used to predict trends in economics.

    So

    a) "Moore's law" shows us the effect of demand vs. supply

    b) It does not mean that the demand (or demanded quantity) would increase infinitely

    c) You cannot call it a law because the variations have been too big (first it was one year, then two, now 18 months), and as the formula is that of exponential growth, those variations mean huge differences in the number of transistors over a period of, say, five years.

    In short, this article looks at the economics (as in macroeconomics) side of Moore's law. It doesn't claim that you couldn't pack more transistors or whatever on a microchip.

    You could also claim that Moore's law might actually hinder economic development as Intel wants to obey the law. What results is that we are actually saying "wow, Intel is keeping up with the R&D forecasts stated in their company strategy". Yippee.

    Okay, a shitty explanation but please read the paper and look at the idea behind it before saying it's total bullshit.
  • Adobe's Law (Score:5, Funny)

    by Mr_Silver ( 213637 ) on Monday January 06, 2003 @10:44AM (#5025515)
    However big, fast and/or powerful your computer is, Adobe Photoshop will always take an age to start up.
  • RTFA, please (Score:5, Insightful)

    by Anonymous Coward on Monday January 06, 2003 @10:49AM (#5025547)
    The guy isn't a person saying that Moore's law is doomed; he clearly points out that it never existed in the first place. The claims of transistor counts doubling every 18 months, processing power doubling every 18 months, and the like are all historical inaccuracies that Moore himself never made. He also uses numbers to show that Moore's law has in fact NOT been valid.

    It is also shown that Moore's law is often used as a reason by people who don't know better, and by those who don't bother to verify their facts. The main point of the article, though, is that Moore's law is not the driving force in the IT industry. It all comes down to supply and demand. Unlike slashdotters, who seem to like pulling figures out of their ass, this guy actually has real and valid numbers which prove his point.

    Before you make ridiculous comments, please, RTFA.
    • but unfortunately it's been, well, you know [slashdot.org].
    • The author messes up by paying too much attention to the constant: that is, whether the doubling time is 18 months, 2 years, or some other number. He also worries too much about whether it's an exact exponential or not. It's not. So what? The most amazing thing is that a doubling time exists, meaning that we have exponential growth.

      Moore's Law should be read as saying that various measures of transistor density on chips grows as O(exp(t)); this has held for 40 years. Of course, no exponential growth can continue forever.

      Much of the recent history of the electronics industry has consisted of treating Moore's Law like a human law, that is, as the marching order for the entire industry. Everyone from the fabs to the electronic design software houses to the microprocessor manufacturers to the systems houses plans in terms of generations of exponentially increasing density. Even the computer science notion that "all problems can be solved by adding an extra level of indirection" implicitly assumes that since processors are getting faster all the time, we can make the code slower if we get more function out of it.

      Keeping this exponential scaling process going is a massive undertaking; those interested in the problems at the cutting edge might want to look at the International Technology Roadmap for Semiconductors [itrs.net].

      In any case, Moore's law is doomed in the long term. I think it's got another decade or so of life, though, as the researchers have a pretty good handle on the next couple of generations of scaling.
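
      (If you want to check the O(exp(t)) reading against actual numbers, here's a minimal Python sketch: a least-squares fit of log2(transistor count) against year yields an empirical doubling time. The part list uses commonly cited Intel figures quoted from memory; substitute your own data.)

        import math

        # (year, transistor count) -- commonly cited figures, quoted from memory
        parts = [(1971, 2.3e3), (1978, 29e3), (1985, 275e3),
                 (1989, 1.2e6), (1993, 3.1e6), (2000, 42e6)]
        xs = [year for year, _ in parts]
        ys = [math.log2(count) for _, count in parts]
        n = len(parts)
        mx, my = sum(xs) / n, sum(ys) / n
        slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
                 / sum((x - mx) ** 2 for x in xs))    # doublings per year
        print(f"fitted doubling time: {12 / slope:.0f} months")

      On these figures the fit lands near 25 months -- closer to the two-year reading than to the popular 18-month one, which is consistent with the article's complaint.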

  • Gates Law (Score:2, Funny)

    by Ridgelift ( 228977 )
    The number of illicit, dishonest and monopolistic tactics employed by Microsoft doubles every 18 weeks :-| Have a Day
  • "it seems to me that Moore's Law has lasted a lot longer then the throng of people who keep predicting its death."

    You could say the same thing about Apple.
  • True... (Score:3, Funny)

    by artemis67 ( 93453 ) on Monday January 06, 2003 @10:59AM (#5025608)
    'tho, to be fair, it seems to me that Moore's Law has lasted a lot longer than the throng of people who keep predicting its death.

    I just saw a throng pass away last week!
  • Moore's Law is like Moore's love, hard and fast.
  • by Junks Jerzey ( 54586 ) on Monday January 06, 2003 @11:14AM (#5025720)
    Did anyone actually read the damn article?

    It's about how the entire concept of Moore's Law is vague and has been applied to all sorts of other things exhibiting exponential growth, even though Moore was not referring to them. And specifically Moore never gave the time frame of "18 months." He said "1 year" one time, then later said "2 years." And if you look at the data, the transistor count of chips doubles roughly every 26 months, not 18. The point of the article is that Moore's Law is more of a hazy myth than anything else.
    • by King Babar ( 19862 ) on Monday January 06, 2003 @11:37AM (#5025872) Homepage
      Did anyone actually read the damn article?

      It looks like about 3 people so far, but some read it more carefully than others. Please, everybody who is reading this: read the article, because it is very important. Again, though, even people who have read the article (or skimmed it) appear not to have gotten the full message. So Junks Jerzey writes:

      And specifically Moore never gave the time frame of "18 months." He said "1 year" one time, then later said "2 years." And if you look at the data, the transistor count of chips doubles roughly every 26 months, not 18.

      It's much worse than that, actually. When he really takes the gloves off and looks at the hard data over the entire 43-year history of the industry, he finds *no* simple doubling time for almost any measure of interest that has been claimed to be Moore's Law or any folk version of it. Even for transistor counts. What you can sometimes sort of show is iffy exponential fits to the data for 5-10 year periods. Strikingly, though, the doubling rates for several of the measures the author investigates have *slowed*. Improvements do keep on happening, but the pace of the improvement is not as consistent or rapid as you might have expected.

      Now the big deal about this is simple. Anybody who tries to project that our problems will be solved when X doubles in Y months is really walking on thin ice. It is also important because chip technology has often been held up as some special and amazing business whose success should be inspirational to us all, since it improves so fast. Clearly, improvements in raw components have been rapid (although not as rapid as you might expect), but the Big Changes caused by technology are rarely tightly coupled to the speed of improvement in underlying technology. Hey, the *big* change of the last decade is that your grandma now probably has email. I'm not sure it makes sense to calculate how many transistors that took.

  • Whew (Score:5, Funny)

    by digidave ( 259925 ) on Monday January 06, 2003 @11:15AM (#5025729)
    For a second I thought that the headline read "Murphy's Law Disputed". I was going to argue it bitterly.
  • by mwmurphy ( 631277 ) on Monday January 06, 2003 @11:19AM (#5025749) Homepage
    Failed? Why does that not surprise me? I'm pretty sure Moore's law started as an offhand comment based on a few points of data, and has been more of a guidepost that people make reality... not even close to being a law.

    Murphy's Law, now that's a law.

  • Godwin's law fails!

    USENET authorities are disturbed by the failure of a law that some thought to be a lynchpin of internet discussion: Godwin's Law. Simply stated, "As a Usenet discussion grows longer, the probability of a comparison involving Nazis or Hitler approaches one." Beginning last week observers began to notice something was wrong. Says one 'lurker', "I came across this thread on abortion, you see. I started reading--and that's when I noticed something strange. Every post in the thread simply got better and better as each participant read the other's arguments and replied calmly. It was then when it hit me--no Nazi references anywhere. I went back to read it again, and I was sure--Godwin's Law has been broken."

    The violation of Godwin's law is hailed by some as a doomsday scenario for USENET. "These threads will just keep going and going forever! There is nothing to stop them. Eventually it'll all just reach critical mass and collapse in on itself," says a popular USENET troll. Others don't see it as cataclysmic, but painful all the same. "World War II is a large part of the world's history--I don't want to see that forgotten," reads one post to alt.military.history.
    • by Henry V .009 ( 518000 ) on Monday January 06, 2003 @11:22AM (#5025777) Journal
      Goddamn cut and paste from the spell checker--mod the above post down. Anyone else been having problems with ctrl-c, ctrl-p, and Phoenix?

      Godwin's law fails!

      USENET authorities are disturbed by the failure of a law that some thought to be a lynchpin of internet discussion: Godwin's Law. Simply stated, "As a Usenet discussion grows longer, the probability of a comparison involving Nazis or Hitler approaches one." Beginning last week observers began to notice something was wrong. Says one 'lurker', "I came across this thread on abortion, you see. I started reading--and that's when I noticed something strange. Every post in the thread simply got better and better as each participant read the other's arguments and replied calmly. It was then when it hit me--no Nazi references anywhere. I went back to read it again, and I was sure--Godwin's Law has been broken."

      The violation of Godwin's law is hailed by some as a doomsday scenario for USENET. "These threads will just keep going and going forever! There is nothing to stop them. Eventually it'll all just reach critical mass and collapse in on itself," says a popular USENET troll. Others don't see it as cataclysmic, but painful all the same. "World War II is a large part of the world's history--I don't want to see that forgotten," reads one post to alt.military.history.
      • Goddamn cut and paste from the spell checker--mod the above post down. Anyone else been having problems with ctrl-c, ctrl-p, and Phoenix?

        Yeah, I am! I was having this weird problem where every time I try that the web page starts coming out of my printer!! I fixed it by using Ctrl-V instead. ;)

        • Hehehe--got me there. But the ctrl-c, ctrl-v really seems to be having some problems. At first I thought it was some sort of stuck key problem on the keyboard--that didn't turn out to be it though.
  • Since it's a rule of thumb and not a law, why should we care if it's *about* to be broken? Let me know when you have 6 months or maybe 2 years of data showing that the cycle is lengthening or shortening. And since we know that progress will eventually double capabilities, isn't the length of the cycle the only thing that can change?
  • by Rai ( 524476 ) on Monday January 06, 2003 @11:49AM (#5025953) Homepage
    If Moore's Law does officially die, will there be sightings of it afterwards, like those of Elvis?
  • That has always been my favorite.

    The article, although very long and intense, is well-written and very educational on the many interpretations of Moore's Law. A good read.

  • Moore's Law, Moore's Law, Moore's Law...Christmas!

    What "Law"? How about "Postulate", or "Theorem"...no, not theorem...how about "conjecture"? How about "cockamamie bullshit"? You could probably make a similar "law" that describes the performance of light bulb technology over the last 100 years..."well, lightbulbs become (sort-of) 5% less expensive to make and 5% brighter every decade! Whoopee!!!"

    Everybody's seen the graph. It's not linear. It's not exponential...it's just up. Hit or miss. No "law" involved here at all.

    The whole idea that Moore's Law is a Law is stupid from the get-go. Damnit, I wish I could remember the name of the... oh yeah, the Ig Nobel. They should give the Ig Nobel to the cat who disproved Moore's Law. I mean, come on people, duh!

    This is almost as stupid as those clone-aid wackos...
  • When Moore said it, he was observing the pace at which things were going. He didn't say it was always going to be that way. Why it ever started getting called a "Law" is probably the fault of some idiot in Intel Marketing.

    It's really just "Moore's Semi-Accurate Observation That We Can Use To Help Figure Out How Fast Things Are Changing".

  • This question has been puzzling me for a while... because, unlike other 'laws' that have fallen into disfavor, we never even expected Moore's "law" to ring true for even a relatively short duration. That we would call this set of observations a law in the first place strikes me as odd, considering that its expression is dependent on so many other socio-economic factors.
  • "The number of /. readers that don't read the article will double every 1 to 2 years"
  • by jbischof ( 139557 ) on Monday January 06, 2003 @12:47PM (#5026346) Journal
    Why must we constantly hear about people guessing the future of Moore's Law? Is this even news anymore?

    Intel itself has already said that Moore's law is over, explained in slashdot here [slashdot.org]. Of course, other people [slashdot.org] are always predicting the end of it as well. Then again, some people think it will continue [slashdot.org].

    I really wish people would get over Moore's prediction and talk about relevant stuff. There is no way to predict how long unknown scientific breakthroughs will allow Moore's Prediction to remain true. There is one absolute, though: the end will come some day. You can only fit so many atoms in a given amount of space according to the rules of quantum physics - that is the absolute barrier.

    Until it is actually abandoned I could do without hearing more of Moore's law.

  • Don't mess with Mrs. Moore.
  • There are more factors to Moore's Law than just how many transistors can be put on a piece of silicon. It's also affected by:
    * How fast motherboards are created to use the new higher-speed processor
    * RAM bandwidth (compare SIMM performance to DIMMs)
    * Software that utilizes the newer/more advanced features (8-bit vs 16-bit vs 32-bit vs 64-bit)
    * Additional load placed on a processor by a new GUI's look and feel (eye candy slows your machine down)
    * Lack of advancement in storage devices (slow drives = slow machine)
    * Lack of advancement in IO ports and devices (my ISA video card isn't as fast as my PCI card)

    Processors do get faster. Nevertheless, other factors limit them.

    Try this: compare the startup time of a fresh install of Windows 95c to a fresh install of Windows XP on the same hardware. You will find that the 95 system starts much faster than the XP system because there isn't nearly as much OS baggage slowing it down.
  • I really like Figure 7 in the paper; you can clearly see a 5-to-7-year cycle in the averaged-over-the-year price changes. (Of course, there are smaller cycles that can occur within a year, but we don't see them in the graph.)

    What's most amazing is how many people believe that their boom/bust cycle is the only one - I see at least five in that graph :-)

  • by twisty ( 179219 ) on Monday January 06, 2003 @01:17PM (#5026515) Homepage Journal
    "It's not that transistors use less material, but that the atoms themselves in the material shrink!" declares Special Agent Fox Mulder, expert in conspiracy theories.

    "For eons we've wondered how come Dinosaurs were so much larger than modern mammals, but it's because the closer you get to the Big Bang, the largers those atoms were. I have something in my pocket that will astonish you..."

    Agent Mulder removes from his pocket an atom the size of a tennis ball. "This is an atom from the Dawn of Time itself. Al Qaeda has been trying to get their hands on this puppy, because you can split it with a butter knife."

    (Portions of this post were lifted from a bit of Fan Video called "The Fed-EX Files" produced by a film crew in Montreal, Canada.)
  • There were all these irregular-looking graphs. But when I looked at them, they had axes with titles like
    "average cost of chips" by year.

    This is great if you want to guess what people are paying, less good if you are trying to estimate what they are buying. "Cost per chip" without identifying the chip is... difficult to interpret. It could be sloppy labeling, but I'm afraid that after looking at a couple of those graphs I estimated that the article wasn't worth reading.

  • It used to be that the stock price of Microsoft doubled every two years, hence a split. Bill Gates would get ten times richer every five years, with predictions of him becoming the first trillionaire sometime in the first decade of the 21st century.
    Well, MSFT has been stagnant for the past four years. Bill gave over a third to charity and he's been stuck at $30-$40 billion for a while.
  • by cartman ( 18204 ) on Monday January 06, 2003 @04:58PM (#5028351)
    The author of the article seems to misunderstand completely the intent of Moore's law. The article notes a few things:

    1. Increases in transistor count do not precisely follow an exact, continuous, exponential mathematical function. Some years it grows faster, others slower, etc. WELL FUCKING DUH. The article seriously thinks this is original and insightful, but actually it was known to everyone. OBVIOUSLY, Intel releases new processor architectures on some years but not others, therefore the increase in transistor count will be faster on those years and slower on others.

    2. A few journalists have misrepresented Moore's law, by publishing versions that were not identical with what Moore actually said. AMAZING. A journalist misquotes, or misunderstands a technical issue? Who would have thought it possible? I'm glad we have this article to expose such shocking truths.

    ...Moore's law was always intended as a rough rule of thumb that applies relatively well over a long period of time. If anything, the article buttresses Moore's law. The article notes that the original microprocessor in 1975 had 2,500 transistors, and that the P4 has ~40 million. If we assume a doubling time of 2 years, then Moore's law was substantially correct, within a 10% margin of error. This was far more accurate than I was expecting, and far more accurate than Moore was expecting.
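
    (The parent's arithmetic is easy to sanity-check in Python; the 2,500 and ~40 million figures are taken from the comment above, not verified independently.)

      import math

      t0, n0 = 1975, 2.5e3    # figures as quoted in the parent comment
      t1, n1 = 2002, 40e6
      doublings = math.log2(n1 / n0)   # about 14 doublings
      print(f"implied doubling time: {(t1 - t0) / doublings:.2f} years")

    That comes out to roughly 1.93 years per doubling -- within a few percent of the two-year rule, as the parent says.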

  • by Animats ( 122034 ) on Monday January 06, 2003 @08:51PM (#5029929) Homepage
    For the entire history of Moore's Law, integrated circuits have been made by photolithography on flat silicon. The techniques for making ICs have improved enormously, but the underlying technology has not changed.

    Within a decade, that technology hits a wall - atoms and electrons are too big. That's the ultimate limit for photolithography on flat silicon. We may hit a fab limit, a device physics limit, or a power dissipation limit before that. Right now, power looks like the limiting factor. We're headed for hundreds of amps at fractions of a volt going into physically small ICs. Heat dissipation per unit area is approaching levels normally associated with cooking equipment. But somebody may find a way to get power dissipation down; it's been done before.
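
    (To put "cooking equipment" in perspective, a hedged back-of-the-envelope in Python. The wattage and die size are illustrative round numbers for a 2003-era desktop CPU, and the stovetop figure is an order-of-magnitude guess.)

      cpu_watts, die_cm2 = 70.0, 1.3   # illustrative 2003-era desktop CPU
      hotplate_w_per_cm2 = 10.0        # order-of-magnitude stovetop figure
      print(f"CPU: {cpu_watts / die_cm2:.0f} W/cm^2 vs hotplate: {hotplate_w_per_cm2:.0f} W/cm^2")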

    Even after the size limit is reached, it may be possible to push on cost. IC cost per unit area has increased over time as fabs became more expensive. New fab technologies, or improvements to existing ones, might improve the situation. It's of course possible to build physically bigger parts, as well. (Wafer-scale integration turned out to be a dead end. You can make a RAM chip several inches across, and it's been done. But the chip, plus its massive stiffener, is bigger, more expensive, and harder to cool than the current packaging systems.)

    Alternative IC technologies are possible, but none of them seem to provide a lower cost per gate. Gallium is too rare. 3D layering doesn't bring cost down and makes cooling harder. Quantum computing is a long way from the desktop. Nanotechnology is still vaporware. Some of these technologies may eventually work, but to keep digital logic on the Moore's Law curve, they'd have to be further along than they are now.

    It's much like aircraft, circa 1970. Aviation people were talking about bigger supersonic transports, hypersonic transports, suborbital ballistic transports, and large VTOL craft as near-term possibilities. None of them were feasible. 30 years later, aircraft are about like they were in 1970.

    We're going to see a slowdown in IC progress within a decade.
