Moore's Law Disputed
Kumiorava writes "Twice as many transistors can be packed onto the same chip every 18 months. This, Moore's Law, has been repeated for over 30 years. Computers get faster and the IT economy grows, but Moore's Law doesn't hold. That has been shown by researcher Ilkka Tuomi. You can read the research in the First Monday article The Lives and Death of Moore's Law." Though, to be fair, it seems to me that Moore's Law has lasted a lot longer than the throng of people who keep predicting its death.
YADOMLA (Score:5, Funny)
My guess is that the reports of the death of Moore's Law will turn out to be greatly exaggerated.
Re:YADOMLA (Score:2, Interesting)
Now I still do not think that we are going to see the end to Moore's Law in the near future. But as I have stated, I have been wrong before.
Re:YADOMLA (Score:2)
A nice, refreshing new touch.
Re:YADOMLA (Score:2, Funny)
Re:YADOMLA (Score:4, Informative)
I think that the article makes the point that the death of Moore's law _as Moore stated it_ is inevitable:
"Moore noted that the complexity of minimum cost semiconductor components had doubled per year since the first prototype microchip was produced in 1959. This exponential increase in the number of components on a chip became later known as Moore's Law. In the 1980s, Moore's Law started to be described as the doubling of number of transistors on a chip every 18 months. At the beginning of the 1990s, Moore's Law became commonly interpreted as the doubling of microprocessor power every 18 months. In the 1990s, Moore's Law became widely associated with the claim that computing power at fixed cost is doubling every 18 months."
Once we reach quantum boundaries, the first statement of Moore's law will fail. There may be something like Moore's law in the future, but it will be just another restatement:
""Speculations on the extended lifetime of Moore's Law are therefore often centered on quantum computing, bio-computing, DNA computers, and other theoretically possible information processing mechanisms. Such extensions, obviously, extend beyond semiconductor industry and the domain of Moore's Law. Indeed, it could be difficult to define a "component" or a "chip" in those future devices.""
There goes another one of my solid beliefs :) (Score:5, Funny)
obligatory simpsons reference (Score:5, Funny)
Sorry, couldn't help myself.
Re:IN SOVIET RUSSIA (Score:3, Funny)
Re:There goes another one of my solid beliefs :) (Score:3, Interesting)
Sorry--I just had to be contrarian about a new "discovery" in physics for a bit.
Re:There goes another one of my solid beliefs :) (Score:2)
Bad article title (Score:4, Interesting)
"The present paper argues that Moore's Law has not been a driver in the development of microelectronics or information technology. "
A better title might have been: "Moore's Law - Not All It's Cracked Up To Be"
Re:Bad article title (Score:3, Insightful)
"The present paper argues that Moore's Law has not been a driver in the development of microelectronics or information technology. "
I guess that depends on what you mean by driver. In the hardware world, engineers and managers - especially at Intel - are acutely aware of the impact of Moore's Law. It has become the primary driver for the rapid advancement of processor speed. The paper basically says this same thing.
Whether Moore's Law has accurately described the rate at which processors have advanced is insanely trivial to study: did the number double in x amount of time? To say that Moore's Law is wrong misses the point that it was an estimate that has been adopted by the industry, the press, and the public to express expectations of processor advancement and a simple measure to view that advancement. It isn't a law like gravity, nor is it a law like the speed limit: it is a driver in the development of microprocessor technology, though.
Re:Bad article....period (Score:2)
Looks to me like some jackass with no credibility is trying to make a name for himself by "publishing" a junk article in a "peer-reviewed" online journal by "proving" that Moore's law isn't a fundamental phenomenon. Well, duh. Hell, I wouldn't be surprised if he posted his own article to /.
Re:Bad article....period (Score:5, Insightful)
No. Wrong. Sorry, try reading the *whole* article again. The BIG major point of the article, which he points out at the very beginning, by the way, is just this:
Moore's Law has never really existed in any form that is consistent or interesting to us.
It isn't "just" that the doubling times was fudged (although when you're talking about a presumably exponential process a little fudge goes a *long* way). The above bold point really breaks up into three major claims:
Seriously, it *is* a really big deal when an idea as big and as potentially important as Moore's Law turns out to have little or no substance. It is always a rude awakening when you find out that a growth process that appears to be exponential has hit some limit. It may be worse in some ways to find out that not only were you not looking at some coherent or unitary process, but that none of the obvious possibilities really ever seemed to show an exponential growth curve for more than 5 years or so.
I don't think you read this very carefully. I don't think the author cares at all about fundamental phenomena, just whether there is any testable content to various formulations of Moore's Law, and, if there is something you can test, whether the empirical data fit the law. Very, very embarrassingly (in my opinion), nobody much bothered to do this before, and the actual data lend very little support to any statement more concrete than "technology has improved significantly and rapidly since the invention of the IC".
Miss the point (Score:5, Insightful)
Moore's Law has never really existed in any form that is consistent or interesting to us.
Right...but since nothing else was ever claimed for Moore's law by anyone with intelligence, I hardly see the point. Yes, I read the article. Yes, what you say is right. Moore's law has never been strictly correct. I'm kind of surprised you thought otherwise.
Hell, it's never been a law, in that there is no fundamental, scientific *reason* for there to be *any* link between the number of transistors on a chip, processing power, or whatever, and time. Intel *could* have ratcheted up the doubling times if they wanted, say in response to competition. Like what's happened in the last ~4 years thanks to AMD. That alone should have made it obvious that Moore's law is bunk.
Very, very embarrassingly (in my opinion), nobody much bothered to do this before, and the actual data lend very little support to any statement more concrete than "technology has improved significantly and rapidly since the invention of the IC".
To me, that's like saying it's embarrassing that no one has ever done a test to prove that concrete is harder than styrofoam. No one bothered because it's so trivially obvious. The only people who considered Moore's law to be anything but a marketing construct over the last 30+ years are journalists, most of whom have no tech training.
It is always a rude awakening when you find out that a growth process that appears to be exponential has hit some limit.
Now, *that* wasn't in the article. He just proved that Moore's law never really had a point. He gives *no* technical reason why whatever validity it has now will cease to be. Nothing regarding power consumption/loss, tunnelling across junctions, etc. In fact, I saw nothing technical in the "article" whatsoever. Partly, that's fitting, since Moore's "law" isn't technical. But for the claim that it has some technical, fundamental limit, such proof is needed.
So I'll stay with my original point - this article used 10 pages to prove the mundane. Also, what most people will assume the article proved wasn't in the article at all.
Re:Bad article....period (Score:2, Interesting)
Is this a joke? Moore's law isn't E or the speed of sound: It's a general hypothesis about the rate of technological progress. No one expects there to be an absolute correlation, and really any correlation that there has been has largely been perceived as humorous in the context of the "law" (it isn't a "law", of course, but is rather an "observation").
Should we go back and re-engineer all of the processors because of this amazing new research into Moore's Law?
Re:Bad article title (Score:2)
As long as the market demands bigger, faster, stronger, new methods and materials will continue to be developed.
Bad article title about a bad law (Score:2)
I forgot to make my point... (Score:2)
Cy Guy's Law (Score:4, Funny)
Re:Cy Guy's Law (Score:2)
Re:Cy Guy's Law (Score:3, Funny)
Re:Cy Guy's Law (Score:2)
Yeah, and it will probably be posted multiple times.
Re:Cy Guy's Law (Score:4, Insightful)
I know, you're being funny, but I think the difference this time around is that we're in the land of Monster Heat Sinks, Active Cooling, and 70W CPUs. Chip designers *know* how to make things go faster, at the expense of more transistors, but it's the power consumption and heat dissipation problems that are stopping them.
Re:Cy Guy's Law (Score:2, Informative)
Also, if there's one thing that's been drilled into my head in the VLSI classes I've taken, it's that the parasitics associated with the interconnect are what really limit the speed, to a much greater extent than transistor numbers/characteristics.
So even if we didn't care about power, and heat could magically dissipate itself, the circuit could still only go as fast as the metal inside it would allow.
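If you want a feel for how badly wires can lose to gates, here's a minimal Python sketch. The RC values per millimeter and the 20 ps gate delay are made-up, illustrative assumptions, not real process parameters:

# Rough RC delay of a wire vs. a gate delay, to show why interconnect
# parasitics can dominate. All numbers are illustrative assumptions.
def wire_rc_delay(length_mm, r_per_mm=100.0, c_per_mm=0.2e-12):
    # Distributed RC line: Elmore delay is about 0.5 * R_total * C_total.
    r_total = r_per_mm * length_mm   # ohms
    c_total = c_per_mm * length_mm   # farads
    return 0.5 * r_total * c_total   # seconds

gate_delay = 20e-12  # assume a 20 ps gate delay for comparison

for length in (0.1, 1.0, 10.0):  # wire lengths in mm
    d = wire_rc_delay(length)
    print(f"{length:5.1f} mm wire: {d * 1e12:8.2f} ps "
          f"({d / gate_delay:6.1f}x the gate delay)")

Since both resistance and capacitance grow linearly with length, the delay grows with the *square* of the length - which is why the long wires, not the transistors, set the ceiling.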
Re:Cy Guy's Law (Score:5, Funny)
Twice. In a 24 hour period.
Re:Cy Guy's Law (Score:4, Funny)
Re:Cy Guy's Law (Score:2)
Twice! In a 15 minute period! Uphill! Both ways!
The issue is that Moore's Law never worked (Score:2)
The article doesn't say that Moore's law won't continue. It says, and attempts to show empirically, that the ill-defined Moore's Law never really was in effect to begin with; that the data in many cases doesn't really support Moore's Law(!) This is a new and distinctly different sort of claim.
--LP
P.S. I hate to bitch. Well, not always. But sigh: "2002-12-14 19:29:50 Moore's Law: the data doesn't fit (articles,hardware) (rejected)"
Well, eventually... (Score:5, Interesting)
Aren't there limits to materials and stuff like that, or do we come up with Infinite Probability Drives, Dimensional Transfunctioners, Flux Capacitors, Heisenberg Compensators, Ludicrous Speeds....
Re:Well, eventually... (Score:4, Informative)
Just a track and field nitpick:
The marathon world record is usually broken by seconds, not 10 minutes. Since 1908, the record has never been broken by more than seven minutes. The improvement the last five times the record has been broken:
2002: 4 seconds
1999: 23 seconds
1998: 45 seconds
1988: 22 seconds
1985: 47 seconds
The current record is held by Khalid Khannouchi of the US. On April 14, 2002, he ran the London marathon in 2:05:38, breaking his old record by 4 seconds.
You can see the whole progression here:
http://www.kajakstandf.org/wr_progression/men/m
Re:Well, eventually... (Score:2)
Re:Well, eventually... (Score:4, Interesting)
On a purely theoretical level you could take an expanding series out to infinity. And you'll reach it fairly quickly since, in this case, the series is exponential in nature and not merely additive or multiplicative. Ok, yes, you can never "reach" infinity, but you get the idea. With a subtractive series that has a hard limit (in this case, 0) you're going to reach the limit at some point, and that's it.
Moore's Law isn't a law anyway... it's a rule of thumb. And eventually we'll hit the limit of physics - a single quantum changing states in picoseconds (if that long - I dunno, I'm not a physicist). We'll probably hit other limits well before then, but who knows -- every time someone thinks we're up against the wall, someone else discovers a way around the wall and we keep on going for another year or two. Keep in mind that we're using, by and large, the exact same semiconductor process that was invented by TI back in 1954. There have been thousands or even millions of refinements in the process, but we haven't switched to a non-silicon substrate, moved to light-based computing, quantum computing, or anything else.
Re:Well, eventually... (Score:2)
Oh, but Moore's Law can easily be rephrased so that instead of an expanding rule (density of transistors) it describes a contracting rule (area used by a single transistor and its interconnects).
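To make the equivalence concrete, a tiny Python sketch (the starting density, starting area, and 18-month period are all just assumed placeholders):

# Density doubling and per-transistor area halving are the same statement.
d0, a0, T = 1e6, 1e-6, 18.0  # assumed: transistors/mm^2, mm^2/transistor, months

for months in (0, 18, 36, 54):
    density = d0 * 2 ** (months / T)    # expanding rule
    area = a0 * 2 ** (-months / T)      # contracting rule
    print(months, density, area, density * area)  # the product stays constant

Same exponential, just read in the other direction.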
Re:Well, eventually... (Score:2)
It's entirely possible that some researcher could discover a method that would allow production of larger dies without an increase in cost... I'll admit that this is deeply unlikely, and it wouldn't help ramp up speeds, but it could allow continued growth of transistor counts.
Alternately someone could finally figure out how to do three dimensional dies effectively... which could certainly help perpetuate Moore's Law, increase speeds, etc. all while keeping the density constant.
Re:Well, eventually... (Score:2)
that we're using, by and large, the exact same semiconductor process that was invented by TI back in 1954. There have been thousands or even millions of refinements in the process, but we haven't switched to a non-silicon substrate, moved to light-based computing, quantum computing, or anything else.
Actually the original transistors used by TI were Germanium not silicon. And they were Bipolar Junction transistors, not the CMOS transistors used in most chips today. And lastly, there have been huge changes to manufacturing, such as self-aligned gate technology, thermal oxide deposition, etc. etc.
Re:Well, eventually... (Score:2)
I don't know, sometimes I think we'd be better off putting the hamsters in charge.
Re:Well, eventually... (Score:2)
The possible growth is probably limited somewhat, but by limited we're talking about a scale that we're not even close to. Quantum Computing and Nanotech are currently leading us down some interesting paths, and who knows what's next.
I suspect that you'll never be able to perform more parallel logic operations in a given volume than a smallish multiple of the number of atoms in that volume. Atomic nuclei have properties that we understand well enough that controlling them for purposes of logic is currently believable SF. To think that we'll be able to control a single electron or a proton, and its component quarks, enough to make it perform logic for us is something I'm not YET willing to accept, but even still I think that we'll be sharply limited in how far down that ramp we can go. At some point we'll need a different model of the universe before we're allowed to extract any meaningful data.
Re:Well, eventually... (Score:5, Insightful)
Right, it was silly to call it a law. It's not a law; it was Intel's marketing plan - i.e. "We plan to double chip density every 18 months". By stating it the way he did instead, the Intel CEO provided a goal for the troops, and a very quotable phrase for the pundits. Possibly the most successful memetic infection ever.
Re:Well, eventually... (Score:2)
People spend all their time going after your analogy, but never address your point.
It will end. Even if we keep managing to use fewer electrons to throw a gate, eventually we will be down to one electron on a transistor the size of five atoms, and that pretty much puts an end to it, unless we stop using electrons - but that would be such a radical change that Moore's Law wouldn't apply anyway.
Re:Well, eventually... (Score:2)
As an aside, a story that strikes me as somewhat a propos. Where I used to rock climb (in the Shawangunks, when I was 30 pounds lighter) there was a story of a route no one had managed to climb before. Two world-class climbers were attempting to be the first. Despite many efforts, neither had been able to make it. One of the climbers arrived one morning and met a friend, to try one last attempt before declaring it impossible. His friend gave him the bad news: his rival had climbed it the day before. Not willing to be outdone, the climber went up the route first try. Only afterwards did his friend congratulate him: he had lied about his rival climbing it, he was the first. (The rival climbed it the next day.)
It seems the history of human progress is littered with examples of fast followers: a new technology is developed, and immediately afterwards it is developed in many other places. I think that knowing it can be done is, perhaps, the biggest hurdle. Maybe Moore's Law bridges that hurdle for us.
It was never a "Law" (Score:5, Insightful)
Re:It was never a "Law" (Score:2)
Moore's law has never been anything but an observation. I guess after calling it a law for thirty years, people start confusing it with a foundation of the computing industry.
I mean, most people have started thinking Windows is an OS right? (sorry.. after all, this is slashdot)
Re:It was never a "Law" (Score:2)
Please read the article. Pretty please. It is waaay more serious than that. If the author of the article is correct, Moore's Law, in either its original, revised, or vastly mutated forms, does not really fit ANY concrete observational data we have. This is important because exponential and sub-exponential growth rates are very different things.
Re:It was never a "Law" (Score:2)
Alright. I'll go see if Blockbuster has it on video.
Moore's Law and web servers (Score:5, Funny)
Already taking over 60 seconds to load up..
Moore's Theory (Score:5, Informative)
The oft-quoted 'Moore's Law', as some have said before, is not in fact a law at all, but instead a theory proposed by Moore based on the economic and technological trends of his time. He by no means meant to imply that this measurement be used as a benchmark of the technology industry. The fact that it is not only known, but hotly debated, in the industry shows not the accuracy of the 'law', but instead the success of the marketing campaigns based on that quote. To be quite realistic, some manufacturers have pushed out technology that has not been completely tested in order to compete in the marketing game of Moore's Law, and thus we have cheap, unreliable PCs. (Don't get me wrong, this is only one of many reasons for this effect!)
</RANT>
Who Cares? (Score:5, Informative)
Defying Moore's Law isn't like defying gravity. We know that at some point, miniaturization will no longer be possible. It's hard to double the number of transistors in one space when they're on the atomic level. Do you think we could do that in 18 months?
Re:Who Cares? (Score:2, Informative)
It's about the evolution of the microchip and Moore's Law's deviations.
The Moores Law of Moores Law (Score:5, Funny)
Re:The Moores Law of Moores Law (Score:2)
I'm wondering, though. I think spam obeys Moore's Law. I'm definitely getting double what I got 18 months ago.
Wait wait wait (Score:4, Interesting)
Fine, originally it was "transistors", but I thought that if dual CPUs became a de facto standard in 12 months, that would count towards Moore's Law instead of being illegal since the transistors aren't all on the same die.
It just sounds like nit-picking bullshit. I've always thought of Moore's Law as "the IT industry will find a way of doubling computing power every 18 months" not some stupid unit of measure.
Shit, if superior engineering can double computation with the same number of transistors (via better design) shouldn't that count? It just sounds like someone getting into a huff about it and having too much time on their hands to fiddle with Excel.
Re:Wait wait wait (Score:2)
Oh well. I thought it was that every 18 months you'd need a machine twice as powerful to run the current version of the Micro$oft OS.
Stephen
Re:Wait wait wait (Score:2)
Re:Wait wait wait (Score:2)
meaning:
today, you can fit 10,000,000 transistors in a square inch of wafer; in 18 months you'll be able to put 20,000,000 transistors into the same space. This should make computers more powerful, but not necessarily faster. Clock speed isn't everything.
Another law... (Score:5, Funny)
Re:Another law... (Score:2)
Processing power vs. chip complexity (Score:5, Interesting)
An alternative approach has been to build specialized hardware to put all those transistors to use, at the expense of turning your general purpose computer into a very special purpose machine. This has been used, sometimes to great effect, in for example N-body calculations (GRAPE 1-6 [u-tokyo.ac.jp]), yielding 50 or more TFlops of performance for the general computer cost of a 500 GFlop machine. It provides yet another example of the misappropriation of Moore's law.
Re:Processing power vs. chip complexity (Score:2)
Other computer components speeding up (Score:4, Interesting)
First, there is the connection between chipsets on the motherboard. AMD's Hypertransport and others could make big differences on overall motherboard speed.
Second, system memory speeds are getting quite a bit faster, too. Developments in DDR-SDRAM technology could eventually result in throughput 2-3 times what we have now with DDR333 technology.
Third, expansion slots are getting faster, too. There are now standards upcoming for both PCI and AGP that will substantially increase data throughput on expansion slots.
Fourth, mass storage devices are getting faster, too. IDE hard drives have now reached ATA-133 speed, and future IDE hard drives using the new Serial ATA connection will eventually reach the equivalent of ATA-600 speed! SCSI interface hard drives are benefiting from Ultra 160 and Ultra 320 speeds, too. Even optical recorders are getting faster; we've reached 48X speeds for CD-R writers, and DVD recorders will go past 12X speeds some time in 2004.
Fifth, hot-docked external connections are getting faster, too. USB 2.0 supports 480 megabits/second connections, and the next generation of IEEE-1394 connectors will support 800 megabits/second connections.
Finally, graphics cards have seen VERY dramatic performance increases for 3-D graphics. Today's ATI Radeon 9700 Pro and the upcoming nVidia GeForce FX chipset graphics can achieve 3-D rendering that no one could have dreamed of even five years ago.
In short, CPUs will probably reach their limits before 2010, but overall system speed will still increase dramatically thanks to other system components speeding up.
Which came first? (Score:5, Interesting)
So
a) "Moore's law" shows us the effect of demand vs. supply
b) It does not mean that the demand (or demanded quantity) would increase infinitely
c) You cannot call it a law, because the variations have been too big (first it was one year, then two, now 18 months) and, as the formula is that of exponential growth, those variations mean huge differences in the number of transistors over a period of, say, five years.
In short, this article looks at the economics (as in macroeconomics) side of Moore's law. It doesn't claim that you couldn't pack more transistors or whatever on a microchip.
You could also claim that Moore's law might actually hinder economic development, as Intel wants to obey the law. What results is that we are actually saying "wow, Intel is keeping up with the R&D forecasts stated in their company strategy". Yippee.
Okay, a shitty explanation, but please read the paper and look at the idea behind it before saying it's total bullshit.
Adobe's Law (Score:5, Funny)
RTFA, please (Score:5, Insightful)
It is also shown that Moore's law is often used as a reason by people who don't know better, and by those who don't bother to verify their facts. The main point of the article, though, is that no version of Moore's law is the driving force in the IT industry. It all comes down to supply and demand. Unlike slashdotters, who seem to like pulling figures out of their ass, this guy actually has real and valid numbers which prove his point.
Before you make ridiculous comments, please, RTFA.
you know, i'd love to... (Score:2)
Don't waste time on TFA, author misses the point (Score:2)
The author messes up by paying too much attention to the constant: that is, whether the doubling time is 18 months, 2 years, or some other number. He also worries too much about whether it's an exact exponential or not. It's not. So what? The most amazing thing is that a doubling time exists, meaning that we have exponential growth.
Moore's Law should be read as saying that various measures of transistor density on chips grows as O(exp(t)); this has held for 40 years. Of course, no exponential growth can continue forever.
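The empirical test really is that cheap to run. Here's a Python sketch of it; the (year, transistor count) pairs below are hypothetical stand-ins, not the real data the author uses:

import math

# Hypothetical (year, transistor count) pairs -- substitute real chip data.
data = [(1971, 2.3e3), (1978, 2.9e4), (1985, 2.8e5),
        (1993, 3.1e6), (2000, 4.2e7)]

# Least-squares fit of log2(count) against year; the slope is doublings/year.
n = len(data)
xs = [year for year, _ in data]
ys = [math.log2(count) for _, count in data]
xbar, ybar = sum(xs) / n, sum(ys) / n
slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) \
        / sum((x - xbar) ** 2 for x in xs)
print(f"implied doubling time: {12 / slope:.1f} months")

If the points really followed a single exponential, the log2 values would sit on a straight line; the author's complaint is that, on the real data, they don't.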
Much of the recent history of the electronics industry has consisted of treating Moore's Law like a human law, that is, it is the marching order for the entire industry. Everyone from the fabs to the electronic design software houses to the microprocessor manufacturers to the systems houses plans in terms of generations of exponentially increasing density. Even the computer science notion "all problems can be solved by adding an extra level of indirection" implicitly assumes that, since the processors are getting faster all the time, we can make the code slower if we get more function out of it.
Keeping this exponential scaling process going is a massive undertaking; those interested in the problems at the cutting edge might want to look at the International Technology Roadmap for Semiconductors [itrs.net].
In any case, Moore's law is doomed in the long term. I think it's got another decade or so of life, though, as the researchers have a pretty good handle on the next couple of generations of scaling.
Gates Law (Score:2, Funny)
Apple feels the same way... (Score:2)
You could say the same thing about Apple.
True... (Score:3, Funny)
I just saw a throng pass away last week!
Moore's Law (Score:2, Funny)
This is not predicting the death of Moores's Law! (Score:5, Interesting)
It's about how the entire concept of Moore's Law is vague and has been applied to all sorts of other things exhibiting exponential growth, even though Moore was not referring to them. And specifically Moore never gave the time frame of "18 months." He said "1 year" one time, then later said "2 years." And if you look at the data, the transistor count of chips doubles roughly every 26 months, not 18. The point of the article is that Moore's Law is more of a hazy myth than anything else.
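And because the growth is exponential, the difference between 18 and 26 months compounds hard. A quick Python sketch (pure arithmetic, no data):

# Compounded growth over a decade under two claimed doubling times.
months = 120
for T in (18, 26):
    print(f"doubling every {T} months: {2 ** (months / T):6.1f}x in 10 years")

Roughly 100x versus 25x: a "small" fudge in the doubling time is a 4x error after ten years.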
Re:This is not predicting the death of Moores's La (Score:5, Interesting)
It looks like about 3 people so far, but some read it more carefully than others. Please, everybody who is reading this: read the article, because it is very important. Again, though, even people who have read the article (or skimmed it) appear not to have gotten the full message. So Junks Jerzey writes:
It's much worse than that, actually. When he really takes the gloves off and looks at the hard data over the entire 43-year history of the industry, he finds *no* simple doubling time for almost any measure of interest that has been claimed to be Moore's Law or any folk version of it. Even for transistor counts. What you can sometimes sort of show is iffy exponential fits to the data for 5-10 year periods. Strikingly, though, the doubling rates for several of the measures the author investigates have *slowed*. Improvements do keep on happening, but the pace of the improvement is not as consistent or rapid as you might have expected.
Now the big deal about this is simple. Anybody who tries to project that our problems will be solved when X doubles in Y months is really walking on thin ice. It is also important because chip technology has often been held up as some special and amazing business whose success should be inspirational to us all, since it improves so fast. Clearly, improvements in raw components have been rapid (although not as rapid as you might expect), but the Big Changes caused by technology are rarely tightly coupled to the speed of improvement in underlying technology. Hey, the *big* change of the last decade is that your grandma now probably has email. I'm not sure it makes sense to calculate how many transistors that took.
Whew (Score:5, Funny)
A Law Based on Three Points of Data (Score:3, Interesting)
Murphy's Law, now that's a law.
News Flash! (Score:2, Funny)
USENET authorities are disturbed by the failure of a law that some thought to be a lynchpin of internet discussion: Godwin's Law. Simply stated, "As a Usenet discussion grows longer, the probability of a comparison involving Nazis or Hitler approaches one." Beginning last week observers began to notice something was wrong. Says one 'lurker', "I came across this thread on abortion, you see. I started reading--and that's when I noticed something strange. Every post in the thread simply got better and better as each participant read the other's arguments and replied calmly. It was then when it hit me--no Nazi references anywhere. I went back to read it again, and I was sure--Godwin's Law has been broken."
The violation of Godwin's law is hailed by some as a doomsday scenario for USENET. "These threads will just keep going and going forever! There is nothing to stop them. Eventually it'll all just reach critical mass and collapse in on itself," says a popular USENET troll. Others don't see it as cataclysmic, but painful all the same. "World War II is a large part of the world's history--I don't want to see that forgotten," reads one post to alt.military.history.
Re:News Flash! (Score:5, Funny)
Godwin's law fails!
USENET authorities are disturbed by the failure of a law that some thought to be a lynchpin of internet discussion: Godwin's Law. Simply stated, "As a Usenet discussion grows longer, the probability of a comparison involving Nazis or Hitler approaches one." Beginning last week observers began to notice something was wrong. Says one 'lurker', "I came across this thread on abortion, you see. I started reading--and that's when I noticed something strange. Every post in the thread simply got better and better as each participant read the other's arguments and replied calmly. It was then when it hit me--no Nazi references anywhere. I went back to read it again, and I was sure--Godwin's Law has been broken."
The violation of Godwin's law is hailed by some as a doomsday scenario for USENET. "These threads will just keep going and going forever! There is nothing to stop them. Eventually it'll all just reach critical mass and collapse in on itself," says a popular USENET troll. Others don't see it as cataclysmic, but painful all the same. "World War II is a large part of the world's history--I don't want to see that forgotten," reads one post to alt.military.history.
Re:News Flash! (Score:2)
Yeah, I am! I was having this weird problem where every time I try that the web page starts coming out of my printer!! I fixed it by using Ctrl-V instead. ;)
Re:News Flash! (Score:2)
Call me AFTER moore's "law" is broken (Score:2, Insightful)
Sighting after death? (Score:4, Funny)
Laws, Gravity is the most strictly enforced! (Score:2, Funny)
The article, although very long and intense, is well-written and very educational about the many interpretations of Moore's Law. A good read.
Who was the jackass that called it a "Law" anyways (Score:4, Funny)
What "Law"? How about "Postulate", or "Theorem"...no, not theorem...how about "conjecture"? How about "cockamamie bullshit"? You could probably make a similar "law" that describes the performance of light bulb technology over the last 100 years..."well, lightbulbs become (sort-of) 5% less expensive to make and 5% brighter every decade! Whoopee!!!"
Everybody's seen the graph. It's not linear. It's not exponential...it's just up. Hit or miss. No "law" involved here at all.
The whole idea that Moore's Law is a Law is stupid from the get-go. Dammit, I wish I could remember the name of the...oh yeah, the Ig Nobel. They should give the Ig Nobel to the cat who disproved Moore's Law. I mean, come on people, duh!
This is almost as stupid as those clone-aid wackos...
Not much of a law at all (Score:2, Interesting)
It's really just "Moore's Semi-Accurate Observation That We Can Use To Help Figure Out How Fast Things Are Changing".
How is Moore's law a *law*? (Score:2, Insightful)
geekoid's Law (Score:2)
no point in guessing. (Score:3, Insightful)
Intel itself has already said that Moore's law is over, explained in slashdot here [slashdot.org]. Of course, other people [slashdot.org] are always predicting the end of it as well. Then again, some people think it will continue [slashdot.org].
I really wish people would get over Moore's prediction and talk about relevant stuff. There is no way to predict how long unknown scientific breakthroughs will allow Moore's Prediction to remain true. There is one absolute, though: the end will come some day. You can only fit so many atoms into a given amount of space, according to the rules of quantum physics - that is the absolute barrier.
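For a rough sense of where that barrier sits, here's a back-of-the-envelope Python sketch. The 130 nm starting point, the atomic spacing, and the 18-month rule are all loudly assumed placeholders:

import math

feature = 130e-9   # assume a 130 nm feature size today
atom = 0.25e-9     # rough silicon interatomic spacing
T = 1.5            # assume one density doubling per 18 months

# Each density doubling shrinks the linear feature size by sqrt(2).
doublings = math.log(feature / atom, math.sqrt(2))
print(f"~{doublings:.0f} doublings, ~{doublings * T:.0f} years to atomic scale")

About 18 doublings, or roughly 27 years under these assumptions - finite, whatever the exact numbers turn out to be.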
Until it is actually abandoned I could do without hearing more of Moore's law.
Moores first law. (Score:2)
Moore factors to Moore's Law (Score:2)
* Motherboards are created to use the new higher speed processor
* RAM bandwidth (compare SIMMs performance to DIMMs)
* Software that utilizes the newer/more advanced features (8bit vs 16bit vs 32bit vs 64bit)
* Additional load placed on a processor by a new GUI's look feel (eye candy slows your machine down)
* Lack of advancement of storage devices (slow drives = slow machine)
* Lack of advancement of IO ports and devices (My ISA video card isn't as fast as my PCI card)
Processors do get faster. Nevertheless, other factors limit them.
Try this. Compare the startup time of a fresh install of Windows 95c to a fresh install of Windows XP on the same hardware. You will find that the 95 system is much faster than the XP startup because there isn't nearly as much OS baggage slowing it down.
Cyclical prices in the memory industry (Score:2)
What's most amazing is how many people believe that their boom/bust cycle is the only one - I see at least five in that graph :-)
Dinosaurs and Atoms Shrink! (Score:3, Funny)
"For eons we've wondered how come Dinosaurs were so much larger than modern mammals, but it's because the closer you get to the Big Bang, the largers those atoms were. I have something in my pocket that will astonish you..."
Agent Mulder removes from his pocket an atom the size of a tennis ball. "This is an atom from the Dawn of Time itself. The Al Queda has been trying to get there hands on this puppy, because you can split it with a butter knife."
(Portions of this post were lifted from a bit of Fan Video called "The Fed-EX Files" produced by a film crew in Montreal, Canada.)
Interesting graphic titles (Score:2)
"average cost of chips" by year.
This is great if you want to guess what people are paying, less good if you are trying to estimate what they are buying. "Cost per chip" without identifying the chip is
MicroSoft "Moore's Law" (Score:2)
Well, MSFT has been stagnant for the past four years. Bill gave over a third to charity and he's been stuck at $30-$40 billion for a while.
The author misunderstands Moore's law (Score:3, Interesting)
1. Increases in transistor count do not precisely follow an exact, continuous, exponential mathematical function. Some years it grows faster, others slower, etc. WELL FUCKING DUH. The article seriously thinks this is original and insightful, but actually it was known to everyone. OBVIOUSLY, Intel releases new processor architectures on some years but not others, therefore the increase in transistor count will be faster on those years and slower on others.
2. A few journalists have misrepresented Moore's law, by publishing versions that were not identical with what Moore actually said. AMAZING. A journalist misquotes, or misunderstands a technical issue? Who would have thought it possible? I'm glad we have this article to expose such shocking truths.
Photolithography on flat silicon is nearing an end (Score:3, Insightful)
Within a decade, that technology hits a wall - atoms and electrons are too big. That's the ultimate limit for photolithography on flat silicon. We may hit a fab limit, a device physics limit, or a power dissipation limit before that. Right now, power looks like the limiting factor. We're headed for hundreds of amps at fractions of a volt going into physically small ICs. Heat dissipation per unit area is approaching levels normally associated with cooking equipment. But somebody may find a way to get power dissipation down; it's been done before.
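To see how close "cooking equipment" already is, a quick Python sketch with illustrative numbers (the voltage, current, and die size are assumptions, not any real part's specs):

# Power density of a hypothetical CPU vs. a kitchen hot plate.
volts, amps = 1.2, 80.0   # assumed core voltage and current
die_area_cm2 = 1.5        # assumed die area
watts = volts * amps
print(f"{watts:.0f} W over {die_area_cm2} cm^2 = "
      f"{watts / die_area_cm2:.0f} W/cm^2 (a hot plate is ~10 W/cm^2)")

The hot-plate figure is only an order-of-magnitude comparison, but it makes the point.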
Even after the size limit is reached, it may be possible to push on cost. IC cost per unit area has increased over time as fabs became more expensive. New fab technologies, or improvements to existing ones, might improve the situation. It's of course possible to build physically bigger parts, as well. (Wafer-scale integration turned out to be a dead end. You can make a RAM chip several inches across, and it's been done. But the chip, plus its massive stiffener, is bigger, more expensive, and harder to cool than the current packaging systems.)
Alternative IC technologies are possible, but none of them seem to provide a lower cost per gate. Gallium is too rare. 3D layering doesn't bring cost down and makes cooling harder. Quantum computing is a long way from the desktop. Nanotechnology is still vaporware. Some of these technologies may eventually work, but to keep digital logic on the Moore's Law curve, they'd have to be further along than they are now.
It's much like aircraft, circa 1970. Aviation people were talking about bigger supersonic transports, hypersonic transports, suborbital ballistic transports, and large VTOL craft as near-term possibilities. None of them were feasible. 30 years later, aircraft are about like they were in 1970.
We're going to see a slowdown in IC progress within a decade.
Yeltsin's Law (Score:4, Funny)
Re:of course not (Score:4, Interesting)
I'm not saying Moore's law will last forever, but that's because of the physical limitations, not because the actual function hits an asymptote.
Re:of course not (Score:3, Insightful)
The function doesn't take real world problems into account.
Eventually the size of transistors will reach a near-molecular level and be too expensive or impossible to make any smaller. (We are nowhere near this point yet.)
OR
Eventually the transistors will be small enough for an arc to bridge them, even at low voltages. Then it goes from being a transistor to being bridged. This isn't good for logic circuits :)
These are two good reasons why the number of transistors you can squeeze into a given area of real estate is finite.
Of course, you can simply make the die bigger and lower the voltage if necessary. Even this has practical limitations.
Problem:
heat becomes an issue, and the wires need to get bigger as the current rises (the current goes up because the voltage comes down).
While the size of the die is not limited, eventually, to keep up with Moore's Law, the chip would get too big to be practical once the transistor minimum-size limit is reached and a couple more generations of the device had passed.
Don't take my word for it, I am not an EE or computer scientist. I am simply a professional programmer/hobbyist (with an electronics background) who likes to read a lot.
While we may not run into these issues in the next ten years, or even in your lifetime, it is a mathematical certainty that we will eventually get there. This is the fundamental problem with Moore's Law: transistors can only get so small.
Of course by the time we reach this point, we will have found a better control device than the transistor, and a better logic device than computers and chips as we know them today.
Biocomputing comes to mind (no pun intended).
l8,
AC
Uhm, wrong (Score:3, Insightful)
Moore's Law, on the other hand, is merely a mathematical function, made to predict the evolution of microchip technology, and being an exponential one, it, by definition, does not have an asymptote.
You're falsely assuming that Moore's Law is an absolute reflection of the actual evolution of microchips, when it is in fact just a prediction (although so far a pretty good one IMHO).
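The distinction is easy to see numerically. A Python sketch with made-up parameters: the bare exponential never flattens, while the same curve with a physical ceiling baked in (a logistic cap, here) does:

import math

cap = 1e10           # pretend physical ceiling on transistor count
n0, r = 2.3e3, 0.5   # assumed starting count and growth rate per year

for t in (0, 20, 40, 60):
    expo = n0 * math.exp(r * t)                            # pure prediction
    logi = cap / (1 + (cap / n0 - 1) * math.exp(-r * t))   # physics-capped
    print(f"t={t:2d}y  exponential={expo:10.3g}  capped={logi:10.3g}")

The prediction and the physics agree early on and only part ways near the ceiling - which is exactly the distinction being drawn above.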
Re:Who cares? (Score:2)
It was AMD. Bet you feel fulfilled now
Re:Linux killed Moore's law (Score:3, Interesting)
Re:Linux killed Moore's law (Score:2)
If you only load those programs when the user wants to use them, you will help minimize system bloat, and if the user wants all that stuff, then bloat would not matter.
Of course for me bloat has always meant how tight the code is, not its size. Clearly the more you want to do, the larger the code will be.