Fifty Years of Moore's Law 101
HughPickens.com writes: IEEE is running a special report on "50 Years of Moore's Law" that considers "the gift that keeps on giving" from different points of view. Chris Mack begins by arguing that nothing about Moore's Law was inevitable. "Instead, it's a testament to hard work, human ingenuity, and the incentives of a free market. Moore's prediction may have started out as a fairly simple observation of a young industry. But over time it became an expectation and self-fulfilling prophecy—an ongoing act of creation by engineers and companies that saw the benefits of Moore's Law and did their best to keep it going, or else risk falling behind the competition."
Andrew "bunnie" Huang argues that Moore's Law is slowing and will someday stop, but the death of Moore's Law will spur innovation. "Someday in the foreseeable future, you will not be able to buy a better computer next year," writes Huang. "Under such a regime, you'll probably want to purchase things that are more nicely made to begin with. The idea of an "heirloom laptop" may sound preposterous today, but someday we may perceive our computers as cherished and useful looms to hand down to our children, much as some people today regard wristwatches or antique furniture."
Vaclav Smil writes about "Moore's Curse" and argues that there is a dark side to the revolution in electronics for it has had the unintended effect of raising expectations for technical progress. "We are assured that rapid progress will soon bring self-driving electric cars, hypersonic airplanes, individually tailored cancer cures, and instant three-dimensional printing of hearts and kidneys. We are even told it will pave the world's transition from fossil fuels to renewable energies," writes Smil. "But the doubling time for transistor density is no guide to technical progress generally. Modern life depends on many processes that improve rather slowly, not least the production of food and energy and the transportation of people and goods."
Finally, Cyrus Mody tackles the question: what kind of thing is Moore's Law? "Moore's Law is a human construct. As with legislation, though, most of us have little and only indirect say in its construction," writes Mody. "Everyone, both the producers and consumers of microelectronics, takes steps needed to maintain Moore's Law, yet everyone's experience is that they are subject to it."
Andrew "bunnie" Huang argues that Moore's Law is (Score:1)
> Andrew "bunnie" Huang argues that Moore's Law is slowing and will someday stop,
I think we've been hearing about the end of Moore's law for the last 15 years... inevitably, some process improvement comes along and it all keeps on going.
Yeah, it may "eventually" stop when transistors are built with just 3 atoms. Then will switch over to photonics or quantum, then some weird hyper-dimensional shit.
Re: Andrew "bunnie" Huang argues that Moore's Law (Score:2)
That "weird hyperdimensional shit" is the kind of innovation he is talking about. That stuff doesn't just fall from the sky. Lots of people have to innovate the hell out of that stuff for years or decades. What website do you think you are reading, anyway?
Re: (Score:2)
>> weird hyper-dimensional shit
if you are thinking spacetime or something like warp speed, not sure if there is enough power ever to achieve that in our lifetime ... that could happen. granted, I like to dream, but the thought of a trinary chip just seems like wishful thinking
if you are thinking LxWxH + trinary chips
Re: Andrew "bunnie" Huang argues that Moore's Law (Score:5, Insightful)
I think we've been hearing about the end of Moore's law for the last 15 years... inevitably, some process improvement comes along and it all keeps on going.
I don't think that it's necessarily "inevitable". Take aviation, for example. There were arguably exponential increases in the capability of aircraft for 55 years from 1903 to 1958, when the Boeing 707 was introduced. Ever since, further progress on economically viable aircraft has been pretty much limited to incremental increases in fuel economy and marketing strategies to keep costs down by keeping planes full.
Re: (Score:2)
I don't know, switching from aluminium and titanium to composite materials, such as carbon fiber, is a really big deal in aviation. But this is something that you don't see and thus don't recognize. Would you know that the A350 and 787 are almost entirely made of plastic?
I agree that Moore's Law is slowing, but I doubt that we will see a slowdown in innovation. We have already seen a shift from more powerful to smaller and more energy efficient. The number of applications that need raw power are getting le
Re: (Score:2)
All the plastic helps with the incremental improvements in fuel economy: approximately 2X better over the past 57 years. I also neglected to mention safety, which has improved a good deal more than fuel economy. That's all OK, but it's nothing like the dramatic changes that happened prior to the 707. After nearly six decades, today's planes still look very similar to a 707, are about the same size, and go the same speed.
Re: (Score:2)
Yes, but where is the difference to CPUs? Many little breakthroughs in technology, most of them you don't see.
Re: (Score:2)
> Andrew "bunnie" Huang argues that Moore's Law is slowing and will someday stop,
I think we've been hearing about the end of Moore's law for the last 15 years... inevitably, some process improvement comes along and it all keeps on going.
Yeah, it may "eventually" stop when transistors are built with just 3 atoms. Then will switch over to photonics or quantum, then some weird hyper-dimensional shit.
15 years ago they were talking about some weird 3 dimensional transistor shit.
Re: (Score:2)
>> Yeah, it may "eventually" stop when transistors are built with just 3 atoms
Funny, I was thinking along the same lines. I recall when they got to 9 or 10 atoms as being the nearest they could be, then 2 or 3 years later someone came out with 8. I do like Moore's Law as a benchmark of what can be achieved. And not just in chips but in data storage and power consumption.
I really wish I could find more benchmarks on progress. It's just fun to learn stuff like this.
Oh, by the way... I'm guessing (using Moore's L
Don't tell Kurzweil (Score:4, Funny)
That guy is going to be pissed when we don't get cold supercomputers with billions of times more power than the brain using reversible computing.
Re: (Score:2, Interesting)
That guy is going to be pissed when we don't get cold supercomputers with billions of times more power than the brain using reversible computing.
Kurzweil may or may not be nuts, but the data [singularity.com] seems to be going his way so far.
Re: (Score:2, Insightful)
Re: (Score:2)
Re: (Score:1)
Re: (Score:2)
Fully 3D circuitry is limited more by the requirement to have a single-crystal and the economics of circuit fabrication, than by power density. Furthermore, neuromorphic computing (which is advancing rapidly) has the potential to solve power density and yield issues, but Si wafers are still cheap compared to mask steps.
Hate to tell them, but... (Score:5, Funny)
Uh.... [google.com]
Well... [networkworld.com]
cough-cough [ajmc.com]
You see... [theguardian.com]
Aww screw it. [bloomberg.com]
Could there have been worse examples of "LOL those crazy promises!"?
Re: (Score:1)
Seeing as every one of those, except maybe the fossil-fuels-to-renewable-energies transition, doesn't exist, yeah, there could have been worse examples.
Case in point: point me to where I can buy a fully autonomous car right now? How about I ask again 5 years from now? Also, where can I buy a ticket on even a supersonic aircraft, let alone a hypersonic one? The only existing supersonic aircraft was taken out of service years ago as it wasn't economically feasible. And what? There's a cure for cancer let alone a custom
Re: (Score:1)
Did Google say they were 5 years from having their cars on the market in 2010? Or are you just clumsily trying to apply the old saw about cold fusion to them?
It's cold fusion nonsense. Google didn't start their project until 2012: http://spectrum.ieee.org/robot... [ieee.org]
Re: (Score:1)
Google is 5 years out from having their cars on the market, just like they were 5 years ago.
Google is a lot closer than 5 years. The computing and sensing technology now exists to make this reasonably priced. The problem is an engineering problem - developing the algorithms to work properly as a driver.
Well within 5 years (try 2 years), both Google and Uber will be running low speed taxi services in dense city areas using their respective vehicles. You may not be able to purchase the vehicle or drive the freeways, but Uber and Google will replace a lot of Uber drivers and cabbies. Google is ver
Re: Hate to tell them, but... (Score:1)
Not nonsense. The plan is to start testing in Singapore this year. http://www.technologyreview.co... [technologyreview.com]
The software may be a bit farther along than it seems.
Re: (Score:2)
Well within 5 years (try 2 years), both Google and Uber will be running low speed taxi services in dense city areas using their respective vehicles.
If they are lucky, within five years they will have the algorithms necessary to self-drive a car. From there, expect another 5-10 years debugging the software and making it safe enough for the public.
Go look up how long it takes to build flight-safe airplane software, and then realize that car software is much more complicated.
Re: (Score:2)
These are all things that people (especially reporters selling headlines) want very badly, but not necessarily things that will ever be able to become practical enough to make it out of R&D and into common use.
Re: (Score:2)
Yes, but people thought that these things would happen at a pace similar to the pace of computer technology development. It didn't work out that way; it took a lot longer.
Also, a lot of these are still in development, not yet real products, or seeing only limited adoption.
Speed isn't all there is... (Score:2)
The idea of an "heirloom laptop" may sound preposterous today, but someday we may perceive our computers as cherished and useful looms to hand down to our children, much as some people today regard wristwatches or antique furniture."
It is preposterous... Even if it were impossible to make computers faster in any way in the future (extremely unlikely given the countless avenues there are to explore in terms of speed), even then the innovation in computers is not and would not be limited to speed, so no, computer heirlooms won't ever happen, stupid person.
Re:Speed isn't all there is... (Score:5, Funny)
“It’s your father’s Sinclair ZX Spectrum. This is the weapon of a computer hacker. Not as clumsy or as random as an iphone, but a more elegant weapon for a more civilized age. For years, the hackers were the guardians of peace and justice in the internet. Before the dark times, before the NSA.”
Re: (Score:2)
"It's your father's Sinclair ZX Spectrum. This is the weapon of a computer hacker. Not as clumsy or as random as an iphone, but a more elegant weapon for a more civilized age.
Um, yeah. I had one of those, and elegant is not a word that was used to describe them, even when new. Being that I was alive back then, I can also assure you that it was not a more civilized age either. Crime and pollution were much worse than now. Racial prejudices were starting to die off, but prejudices about sexual orientation were very prevalent.
For years, the hackers were the guardians of peace and justice in the internet. Before the dark times, before the NSA."
I'll give you that. Hackers were pretty damn benevolent. Most cracking was meant to be more for humor or to see if you could do it, than anything harmful. But the
Re: (Score:2)
Um, yeah. I had one of those, and elegant is not a word that was used to describe them, even when new.
Elegant depends upon context, and I would argue that those computers were elegant in the context of their era. Difficult to use, sure. Yet compare that to the technology that preceded it. If you needed to type something out, typewriters sure were simple. Needed to make changes, then you needed to use a correction tape. Except that wasn't always appropriate, so you had that thing called drafts. What about doing calculations? There were machines of various sorts that could handle that, yet you had to
Re: (Score:1)
Re: (Score:2)
Um, yeah. I had one of those, and elegant is not a word that was used to describe them, even when new.
Elegant depends upon context, and I would argue that those computers were elegant in the context of their era. Difficult to use, sure. Yet compare that to the technology that preceded it. If you needed to type something out, typewriters sure were simple. Needed to make changes, then you needed to use a correction tape. Except that wasn't always appropriate, so you had that thing called drafts. {snip}Spreadsheets {snip} Accounting software {snip}
We're talking about a ZX Spectrum, right?
I know they had a Word-processor and probably a spreadsheet for the Speccy, but that's hardly an average use case.
Manic Miner and Jet Set Willy were elegant tho, elegant and awesome!
Re: (Score:3)
Let's compare 0 AD and 1000AD. Sure, there are some advances and changes, but by and large not too different. Jumping from one time to the next, technology is going to be the least of your concerns as far as difference.
Now let's go from 1000AD to 1500AD. Changes are a little more appa
Or you can say things are now slowing down (Score:2)
Before that it was jet planes and antibiotics - mid 50s
Before that motor cars - 1900 or so
Before that railroads - 1830 or so
Now it may be that we are waiting for the next major breakthrough.
Re: (Score:2)
Re: (Score:3)
The last major, world changing thing, was the internet - some 25 years ago. Since then we've just seen it get better and better - but no real breakthroughs
Um, 25 years ago was... 1990. In that time we've gone from computers being a comparative rarity (many people didn't even have a home PC) to nearly 80% of the population carrying around a computer in their pocket. No one had cellphones in 1990, to a first approximation. Now almost everyone does.
Re: (Score:2)
omg - what rock are you under?
I'll list some for you:
1) smart phones - world changing - 13 years ago with the BlackBerry when people started to use them
2) human genome sequencing - world changing - 15 years ago - completed 2000, but finalized in 2003
3) digital cameras - world changing - average people didn't start using them until 16 years ago, 1999.
4) LCD monitor - world changing - 17 years ago
5) rebirth of the electric cars - world changing - 7 years ago
6) Linux - 24 years ago
7) Amazon - 21 years ago -- a
Interesting list (Score:2)
However the central experience of western life - of living in nuclear families in dispersed suburbs, travelling to work in
Never is a LONG time... (Score:3)
It doesn't matter how long (Score:2)
At some point it will cease to make sense to update your computer on a regular basis. I have a 10 year old one that is fine for internet browsing and word processing
Regular, yes; heirloom, no. The time between purchase and physical obsolescence to the point of uselessness has increased and will continue to increase, but a whole generation through which zero innovation in computers happens? Absent a post-apocalyptic scenario, that's not going to happen.
...A nail clipper is extremely limited in its purpose and possible number of designs; it has a very attainable optimal design after which no substantial improvement can be made. The current and most prevalent nail clipper design is extremely ele
Moore's Law is over (Score:5, Interesting)
Incidentally, Moore's law technically died sometime last year, as Intel failed to ship its new node within "18-24 months" of its last one, meaning the density of transistors did not, for anyone, double within the time limits specified by Moore's Law. With the other foundries (TSMC/GloFo/Samsung) still ramping up the same feature density with finfet transistors that Intel had 3 years ago, and 10nm bringing even more difficulties than Intel's "14nm", it's a question how much longer feature size can continue to shrink at all, let alone within the Moore's Law cadence of every 18-24 months.
Re: (Score:2)
Why is the only factual statement in the comments not modded up?
Re: (Score:2)
Moore's Meta-Law (Score:5, Funny)
Andrew "bunnie" Huang argues that Moore's Law is slowing and will someday stop
Moore's Meta-Law:
The number of people predicting the end of Moore's Law doubles every eighteen months!
Remember those memory cartridges on Star Trek TOS? (Score:4, Interesting)
I remember watching Star Trek (TOS) and thinking how fantastic it would be to have all that storage in that little cartridge the size of a matchbook; books, movies, medical records, the Encyclopedia Galactica, all on one little memory device. I never expected it to happen in my lifetime.
Then in 1985, once the initial glow of the original Macintosh had worn off a little, my brother and I brainstormed on what our _ultimate_ computer would be: 1024x768 TrueColor display, a whole _8_ megabytes of memory, and a 50 MHz 68000-series CPU. Wheee!
Now we have 128 GB microSD cards smaller than your fingernail. And that super-computer in your pocket that happens to make phone calls? It's more powerful than a 4 processor Cray YMP M90 circa 1992.
We've come a long way!
--aj;
Re:Remember Adolescensce of P-1? (Score:1)
For me it was reading The Adolescence of P1 in the late 1980s, with its mention of 'gigantic' 70 MB disc drives that gave me a laugh.
https://en.wikipedia.org/wiki/The_Adolescence_of_P-1 [wikipedia.org]
Re: (Score:2)
Unfortunately high density Flash memory has a retention of months to years unless it is scrubbed. That makes it great for SSDs which are regularly used but useless for archival purposes or even as a replacement for magnetic and optical removable media in many applications.
Re: (Score:2)
Re: (Score:2)
The manufacturers do not like to advertise this so specifications are in short supply. I ran some of my own tests on various unused USB Flash drives I had laying around and none of them retained data more than a year whether powered or unpowered so I assume they do no background scrubbing. SSDs generally have better documentation and will specify something like 1 year of unpowered retention. Beware of "typical" specifications which have almost no meaning.
Koomey's law (Score:5, Interesting)
There is another interesting aspect to Koomey's law: it hints at an answer to the question "for how long can this continue?" The hinted answer is "until 2050", because by 2050 computations will require so little energy that they will face a fundamental thermodynamic constraint—Landauer's principle [wikipedia.org]. The only way to avoid that constraint is with reversible computing [wikipedia.org].
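As a rough illustration of the Landauer bound mentioned above, here is a back-of-the-envelope sketch in Python using the standard value of Boltzmann's constant (the function name is just for illustration):

```python
import math

# Landauer's principle: erasing one bit of information dissipates at least
# k_B * T * ln(2) joules of heat. This is a simple calculation, not a simulation.
K_B = 1.380649e-23  # Boltzmann constant, J/K (exact value in the 2019 SI)

def landauer_limit(temp_kelvin: float) -> float:
    """Minimum energy (in joules) required to erase one bit at temperature T."""
    return K_B * temp_kelvin * math.log(2)

# At room temperature (~300 K) the bound is about 2.87e-21 J per bit --
# many orders of magnitude below what today's transistors dissipate per switch.
print(f"Landauer limit at 300 K: {landauer_limit(300.0):.3e} J/bit")
```

Cooling the hardware lowers the bound linearly in T, which is one reason proposals for ultimate computers often assume cryogenic operation.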
Re: (Score:2)
Reversible computing requires infinite storage. Won't and can't happen.
Re: (Score:1)
Re: (Score:2)
Reversible computing is nothing more than making everything a one-to-one function.
Current computing is merely functional, but not one-to-one. Operations such as XOR are not one-to-one functions because XOR(0,1) = 1 and XOR(1,0) = 1; given the output 1 and the function XOR you cannot recover the inputs.
Reversible computing makes all operations one-to-one, and thus reversible. This is achieved by storing some of the inputs for many-to-one functions. If you want to reverse more than one step (the whole poin
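The one-to-one idea described above can be sketched in a few lines of Python. This models the classical CNOT (controlled-NOT) gate, the standard reversible counterpart of XOR; the code is illustrative, not from any particular library:

```python
# XOR alone is many-to-one, but the pair (a, a XOR b) is a bijection on
# two bits, so it is reversible: applying the same gate again recovers
# the original inputs.

def cnot(a: int, b: int) -> tuple[int, int]:
    """Reversible XOR: keep the control bit a, replace b with a ^ b."""
    return a, a ^ b

# The gate is its own inverse: cnot(cnot(a, b)) == (a, b) for all inputs.
for a in (0, 1):
    for b in (0, 1):
        x, y = cnot(a, b)
        assert cnot(x, y) == (a, b)  # round-trips for every input pair
print("CNOT is self-inverse on all four input pairs")
```

Carrying the control bit along is exactly the "storing some of the inputs" cost the comment describes: the price of reversibility is extra state.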
Re: (Score:1)
B=A XOR B (leaving A unchanged) is a reversible operation & is what I meant. More generally, B=f(A) XOR B is reversible (in fact, self-inverse), where f can be any (even irreversible) function.
Sure, you need to save the input to otherwise-irreversible steps, but the point is that you can erase a known value, & since there was some method to compute the intermediate values in the first place, they can be removed from memory in reverse order. (This is a known method—I did not come up with it.) T
Re: (Score:2)
With XOR you don't need any additional storage, but there are functions (whether they be at the transistor level or at the application level) where many variables result in a much smaller output. You need additional storage in these cases. You are also not always free to overwrite variables if you can recover them, because many functions may take a single variable as input. Recovering X may involve stupidly long chains, and you need storage to go back through the whole chain for as far as you want to rec
Re: (Score:1)
Hmm. I suppose that can be true in an iterative setting (needing to store some data from every iteration), & that the only hope of avoiding that is rewriting the whole loop to be fully reversible so it does not consume space every iteration. (It cannot take more space than linear in the run time, at any rate.) I was imagining recursive functions with stack allocation for each, but I should know better since I use tail recursion all the time. So I guess I was only right about iteration- & tail-recurs
Re: (Score:1)
Re: (Score:2)
Yes, you can trade off time with storage, but it's still on the order of n for storage, which is infinite in the general sense and impractical in the real-world sense.
Think of all the non-reversible operations a single CPU core @ 3 GHz chews through in a year.
Re: (Score:1)
Re: (Score:2)
Well, in the 35 years until 2050, there will be approximately 23 more Moore's law doublings, which means computing chips will be around 8.4 million times more powerful than now. So around 60 iPhone 41s in 2050 will have the same computing power as all of the 500 million iPhones currently on the planet.
That should allow us to do a lot of cool stuff.
As an aside, I consider Moore's law as more a product of the geometric progression of chip lithography. You increase feature resolution by a linear amount and yo
Re: (Score:2)
Mooers' law may apply if Moore's law is false (Score:1)
When Moore's Law Slows Down (Score:3)
Regarding Andrew "bunnie" Huang's ridiculous article....
As commercial success and product differentiation start to depend less on quickly leveraging the latest hardware and more on algorithmic improvements, companies will not magically become more inclined to publish source code. When the path to improved performance involves massive man-hours optimizing code, small teams & startups will not somehow gain an advantage.
Click baiting "open source" and an interactive graph might bring a lot of page views, but the entire premise is truly absurd.
Heirloom laptop concept makes me wanna puke (Score:1)
I am feeling sick and sad that my generation could be the failure that couldn't keep up with Moore's law and is looking for excuses and marketing incompetence as innovation.
Especially considering that we can't even fucking go to the moon anymore, and the motherfuckers who did it used fucking 64kb computers.
2 out of 2. We are self-appointed lazy losers full of ourselves and deserve no respect from our ancestors.
Re: (Score:2)
Yeah, I wonder why this generation hasn't discovered new elements or new fundamental forces, or new Euclid's theorems. Stupid generation.
Are you seriously this stupid?
Re: (Score:2)
It was still inevitable... (Score:2)
Chris Mack begins by arguing that nothing about Moore's Law was inevitable. "Instead, it's a testament to hard work, human ingenuity, and the incentives of a free market.
Humans working hard and having ingenuity, and being incentivized by the free market are all things that are sort of inevitable in themselves. I don't mean to diminish those positive features of humanity, but I think it's ok to take them for granted in the sense that I don't think it is likely for those things to stop being features of humanity barring some kind of catastrophe.
Was Moore's Law going to be as true as it was with 100% probability? No, some stuff could have gone wrong. Some people might have
(R)evolutional progress & what people make of (Score:1)
From the summary:
But the doubling time for transistor density is no guide to technical progress generally. Modern life depends on many processes that improve rather slowly, not least the production of food and energy and the transportation of people and goods.
A lot of progress depends on information technology, though. For example our understanding of biochemical processes. Or the capability of satellites that monitor what's going on with our planet. Or our understanding of quantum effects in semiconductor materials, in turn the basis for IC's, LED lighting, and a whole slew of other applications. Our use of smartphones & related communication technology. Or even something as "low-tech" as logistics.
Make computation cheaper, a
technically Moore's law is still in effect (Score:1)
Yes, I know technically the number of transistors on a chip is still doubling every 18 months or so; and yes, that means cheaper chips that use less power. Yes, that is all fine and good. But kids today don't seem to remember back when having twice as many transistors pretty much meant having twice the computing power. That 486 could do twice as much at the same clock speed as the 386 -- and the 486 was eventually going to be sold at higher clock speeds. And you didn't need to recompile anything to take advant
Re:technically Moore's law is still in effect (Score:5, Insightful)
I don't think it's that you're older. Home computing was very much in its infancy in the 80s, and only started growing up in the 90s. As with all things, it was a period of wild optimism, rapid change, rapid improvements and huge variety. Now it's settling down and becoming much more boring as all the low hanging fruit has gone and larger and more expensive operations are required to squeeze out the remaining performance.
The exact same thing happened in both the automobile and aeroplane industries as well, but I was born long after they entered the boring phase.
In the early 1900s, any yahoo with a bicycle garage, a couple of petrol engines, a good supply of wood, some optimism and some giant brass ones could build and fly a primitive aircraft. And they did in huge quantities. There were all sorts of wacky things like rotary engines where the whole crank case rotates, wings that twisted, weird patterning and layouts of wings, on-wing gantries for in-flight servicing of broken down engines and so on and so forth.
Now it's about bumping 0.1% off the fuel burn by optimising for short-haul versus long haul flights and so on.
IOW, it's not "things were better when we were kids"; rather, many industries have gone through these transitions and computing is no exception.
Ah, Moore's Law... (Score:2)
For the past thirty years, experts have told us that Moore's Law is likely to end within ten years.
What do the experts today think? Predictions are in: Moore's Law will probably end in about ten years [pcworld.com].
Good to see some things never change.
Like Grosch's Law? (Score:1)
Better software (Score:3)
Opened the flood gate (Score:1)
Re: (Score:1)
In 5 years? Why? Computers 5 years ago were really the same as today's computers. Same with computers 20 years ago.
Computers have been the same for many years now. Just faster.
And there is no such thing as AI or machine intelligence. There has been ZERO progress in that field.
Thanks (Score:1)
And in 18 months (Score:1)
Not the only dark side (Score:2)
Electronics are progressing faster than we meat puppets can deal with. We're going to have issues as electronics gain the capability to take over more and more of what we humans do.
When you ask someone, "What do you do?" you generally get their job as an answer. It's part of our internal definition. What happens when you do nothing (and get paid nothing)?
Re: (Score:2)
Then you run into the predictions of the Technocracy movement [wikipedia.org], where the price system collapses and most people have no job and zero income. Without job there are no consumers, without consumers there are no jobs. It's inevitable as the amount of work an individual can do increases with technology.
Moore's Law ends when.. (Score:2)
...we all give up.
Even if we have to invest exponentially more resources into shrinking transistors, the industry is very likely to continue to invest. They will give up when the R&D costs are high enough that there is no longer any profit. But marketing has really pushed people to upgrade to new devices they don't need; if marketing continues to do its job, then we'll see Moore's Law working for quite some time to come.
What exactly is Moore's Law? (Score:2)
I've heard the term for years and thought I understood it. However, this thread seems to contain a lot of debate on exactly what Moore's Law means... I don't believe it actually has anything to do with CPU power doubling or transistor density. Can somebody clarify a precise definition?
Here is my interpretation...
If I buy a CPU today for X dollars in 18 months a CPU will exist that contains roughly twice the number of transistors that will also cost X dollars to purchase.
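For what it's worth, Moore's original 1965 observation concerned the number of components on the most cost-effective chip doubling roughly every year; he revised the period to about two years in 1975, and the popular "18 months" figure is a later conflation of density with per-transistor speed gains. The fixed-price reading above can be sketched like this (the baseline count and the 18-month period are illustrative assumptions, not figures from Moore's paper):

```python
# Sketch of the fixed-price interpretation: transistor count available
# for X dollars doubles every `doubling_months` months.

def projected_transistors(base_count: int, months: float,
                          doubling_months: float = 18.0) -> float:
    """Transistor count after `months`, assuming a fixed doubling period."""
    return base_count * 2 ** (months / doubling_months)

# After 18 months, a same-priced chip holds ~2x the transistors;
# after 15 years (180 months), ~1024x.
print(projected_transistors(1_000_000, 18))   # -> 2000000.0
print(projected_transistors(1_000_000, 180))  # -> 1024000000.0
```

Note the sketch says nothing about clock speed or single-thread performance, which is precisely the gap the "technically Moore's law is still in effect" comment above is pointing at.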