Moore's Law Will Die Without GPUs
Stoobalou writes "Nvidia's chief scientist, Bill Dally, has warned that the long-established Moore's Law is in danger of joining phlogiston theory on the list of superseded laws, unless the CPU business embraces parallel processing on a much broader scale."
An observation (Score:5, Informative)
Moore's is not a law, but an observation!
I am The Law (Score:5, Informative)
I didn't realise Moore's Law was the driving force behind CPU development rather than just an observation about semiconductor development. Surely we just say Moore's Law held until a certain point, and then someone else's law takes over?
As for phlogiston theory: it was just that, a theory, and one that was debunked.
Moore's Law will apply until it doesn't (Score:5, Informative)
Once transistors get below a certain size, of course it will end. Parallel or serial doesn't change things. We either have more processors in the same space, more complex processors or simply smaller processors. There's no "saving" to be done.
Let's not play fast-and-loose with the word "law." (Score:5, Informative)
I'm probably being overly pedantic about this, but of course the word "law" in "Moore's Law" is almost tongue-in-cheek. There's no comparison between a simple observation that some trend or another is exponential--most trends are, over a limited period of time--and a physical "law." Moore is not the first person to plot an economic trend on semilog paper.
There isn't even any particular basis for calling Moore's Law anything more than an observation. New technologies will not automatically come into being in order to fulfill it. Perhaps you can call it an economic law--people will not bother to go through the disruption of buying a new computer unless it is 30% faster than the previous one, therefore successive product introductions will always be 30% faster, or something like that.
In contrast, something like "Conway's Law"--"organizations which design systems ... are constrained to produce designs which are copies of the communication structures of these organizations"--may not be in the same category as Kepler's Laws, but it is more than an observation--it derives from an understanding of how people work in organizations.
Moore's Law is barely in the same category as Bode's Law, which says that "the radius of the orbit of planet #N is 0.4 + 0.3 * 2^(N-1) astronomical units, if you call the asteroid belt a planet, pretend that 2^-1 is 0, and, of course, forget Pluto, which we now do anyway."
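For the curious, the rule quoted above is easy to check numerically. A quick sketch (the `bode_radius` helper and body list are mine, not from the comment):

```python
def bode_radius(n):
    # Titius-Bode rule as quoted above: 0.4 + 0.3 * 2^(n-1) AU,
    # with planet #0 (Mercury) as the "pretend 2^-1 is 0" special case.
    return 0.4 + (0.0 if n == 0 else 0.3 * 2 ** (n - 1))

bodies = ["Mercury", "Venus", "Earth", "Mars",
          "asteroid belt", "Jupiter", "Saturn", "Uranus"]
for n, name in enumerate(bodies):
    print(f"{name}: {bode_radius(n):.1f} AU")
```

The fit is decent out to Uranus (predicted 19.6 AU vs. about 19.2 actual) and, as the comment notes, falls apart after that.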
Re:Nvidia says GPUs are the future? (Score:5, Informative)
Marketing guy?
Before going to Nvidia maybe two years ago, Bill Dally was a professor in (and the chairman of) the computer science department at Stanford. He's a fellow of the ACM, the IEEE, and the AAAS.
http://cva.stanford.edu/billd_webpage_new.html
You might criticize this position, but don't dismiss him as a marketing hack. Nvidia managed to poach him from Stanford to become their chief scientist because he believed in the future of GPUs as a parallel-processing tool, not because he started drinking the Kool-Aid for lack of other options.
Re:Moore's Law will apply until it doesn't (Score:4, Informative)
But parallelism is not a magic bullet. Unless the data being worked on can be chopped into independent parts that do not influence each other (or do so only minimally), the task remains more or less linear and will be done at single-core speed.
The only benefit for most users is that one is more likely to get something done while other, unrelated tasks run in the background. But if each task wants to do something with the storage media, one is still sunk.
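The limit described above is usually stated as Amdahl's law: if some fraction of the work is inherently serial, that fraction caps the speedup no matter how many cores you throw at it. A minimal sketch (function name and numbers are my own illustration):

```python
def amdahl_speedup(serial_fraction, n_cores):
    # Amdahl's law: upper bound on speedup when serial_fraction of
    # the work cannot be parallelised across n_cores.
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_cores)

# A task that is 10% serial tops out at 10x, even with absurd core counts.
print(amdahl_speedup(0.1, 4))      # ~3.08
print(amdahl_speedup(0.1, 1000))   # ~9.91
```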
Infinite? (Score:3, Informative)
Dude, this is clearly some sense of the word "infinite" of which I haven't been previously aware. A couple of things: 1) Atoms -> electrons -> quarks is three levels, which is not exactly infinity. 2) I'm not sure if this is what you meant, but electrons are not made of quarks. They're truly elementary particles. 3) No one thinks there's anything below quarks; the Standard Model may have some issues, but no one seriously questions the elementary status of quarks. 4) You can't do anything with quarks anyway. Practically speaking, you can't even see an individual quark: they're tightly bound to each other in the form of hadrons.
I think that in practice, we're going to run into problems before we even get to the level of atoms. Lithographic processes can only get you so far: we're already into the extreme ultraviolet, so to get smaller features we're going to have to start getting into x-rays/gamma rays, which have rather unfortunate health and safety issues associated with them, not to mention the difficult engineering problems involved in generating tightly focused beams. And even if you can solve that problem, you have to deal with noise introduced by electrons simply leaking from one lead to another. I think 246 doublings is way, way generous.
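A back-of-the-envelope check supports that. Assuming a 32 nm process node and a silicon interatomic spacing of roughly 0.24 nm (both figures are my assumptions, not from the comment), the number of remaining feature-size halvings is tiny:

```python
import math

feature_nm = 32.0   # assumed process node of the era
atom_nm = 0.24      # rough silicon interatomic spacing
halvings = math.log2(feature_nm / atom_nm)
print(f"{halvings:.1f}")  # roughly 7 halvings before features are one atom wide
```

Single-digit halvings before hitting atomic scale, versus the 246 doublings claimed: generous indeed.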
Re:Misleading headline (Score:3, Informative)
He doesn't say that it should be done via the GPU.
He says Intel and AMD need to focus on parallelism. This is true.
The GPU-vs-CPU angle was injected by the author of the article, clearly as an attempt to drum up some sort of flame war to drive hits.
Now, I would assume part of his job is to figure out how to do that properly with GPUs; but nowhere does he imply that only Nvidia can do this, or that it can only be done on the GPU.