Moore's Law Will Die Without GPUs
Stoobalou writes "Nvidia's chief scientist, Bill Dally, has warned that the long-established Moore's Law is in danger of joining phlogiston theory on the list of superseded laws unless the CPU business embraces parallel processing on a much broader scale."
Objectivity? (Score:5, Insightful)
The industry has moved away from "more horsepower than you'll ever need!" to "uses less power than you can ever imagine!" Perpetuating Moore's Law isn't an industry requirement, it's a prediction by a guy who was in the chip industry.
Nvidia says GPUs are the future? (Score:5, Insightful)
So, a graphics card manufacturer says that graphics cards are the future? And this is news?
inevitable (Score:5, Insightful)
Umm? (Score:5, Insightful)
Moore's law, depending on the exact formulation you go with, posits either that transistor density will double roughly every two years or that density at minimum cost/transistor increases at roughly that rate.
It is pretty much exclusively a prediction concerning IC fabrication (a business that NVIDIA isn't even in; TSMC handles all of their actual fabbing), without any reference to what those transistors are used for.
Now, it is true that, unless parallel processing can be made to work usefully on a general basis, Moore's law will stop implying more powerful chips and start implying merely cheaper ones. If the limits of effective parallelism mean you get essentially no performance improvement going from X billion transistors to 2X billion, Moore's law will continue; but instead of shipping faster chips each generation, vendors will just ship smaller, cheaper ones.
In the case of servers, of course, the amount of cleverness and fundamental CS development needed to make parallelism work is substantially lower. An outfit running 10,000 Apache instances, or 5,000 VMs, will always be happy to have more cores per chip, since that means more Apache instances or VMs per chip, which means fewer servers (or the same number of single/dual-socket servers instead of much more expensive quad/octal-socket servers), even if each instance/VM uses no parallelism at all and just sits at one core = one instance.
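The parent's point about diminishing returns from more transistors can be made concrete with Amdahl's law. A quick back-of-the-envelope sketch (the 90%-parallel fraction is purely illustrative, not a benchmark):

```python
# Amdahl's law: speedup from n cores when a fraction p of the
# work is parallelizable. Illustrative numbers only.

def amdahl_speedup(p, n):
    """Overall speedup with parallel fraction p on n cores."""
    return 1.0 / ((1.0 - p) + p / n)

# A workload that is 90% parallel tops out well below n cores:
for n in (2, 4, 8, 64):
    print(n, round(amdahl_speedup(0.9, n), 2))

# As n -> infinity the speedup approaches 1/(1-p) = 10x here,
# so doubling the core count stops doubling the performance.
```

Which is exactly why "2X billion transistors" buys you cheaper chips rather than faster ones once the serial fraction dominates.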
Re:An observation (Score:3, Insightful)
It is also a modestly self-fulfilling prediction, as planners have had it in mind as they were setting targets and research investments.
Re:An observation (Score:3, Insightful)
Yeah, pretty much this. It's akin to the oil companies advertising the fact that you should use oil to heat your home...otherwise, you're wasting money!
How to solve it - rename "Moore's Law"? (Score:4, Insightful)
Moore's Law isn't exactly "a law". It isn't like "the law of gravity" where it is a certain thing that can't be ignored*. It's more "Moore's Observation" or "Moore's General Suggestion" or "Moore's Prediction". Any of those are only fit for a finite time and are bound to end.
* Someone's bound to point out some weird branch of Physics that breaks whatever law I pick or says it is wrong, but hopefully gravity is quite safe!
Re:Who would have thunk it (Score:3, Insightful)
Parallel processing *is* the way to go if we ever desire to solve the problem of AI.
Human brains have a low clock speed, and each processor (neuron) is quite small, but there are a lot of them working at once.
Just because he might be biased doesn't mean he's wrong.
--
BMO
Re:Moores law will apply until it doesn't (Score:4, Insightful)
Hence the need to embrace parallel processing. But the trend seems to be heading toward multiple low power RISC cores, not offloading processing to the video card.
Re:An observation (Score:4, Insightful)
Yep, the "law" basically results in one of two things: more performance for the same price, or the same performance for a cheaper price.
Thing is, though, that all of IT is hitched to the higher margins the first option produces, and does not want to go the route of the second. The second, however, is what netbooks hinted at.
The IT industry is used to boutique pricing, but is rapidly dropping toward commodity.
Heat and power consumption. (Score:5, Insightful)
What's a 'law'? (Score:3, Insightful)
'Laws' in science are like version numbers in software: there's no agreed-upon definition whatsoever, yet people still attribute massive importance to them.
If anything, a 'law' is a scientific statement that dates from the 18th or 19th century, more or less.
Hooke's law is an empirical approximation.
The Ideal Gas law is exact, but only as a theoretical limit.
Ohm's law is actually a definition (of resistance).
The Laws of Thermodynamics are (likely) the most fundamental properties of nature that we know of.
The only thing these have in common is that they're from before the 20th century, really.
Re:Objectivity? (Score:5, Insightful)
If your CPU is running at 60%, you need more or faster memory, and faster main storage, not a faster CPU. The CPU is being starved for data. More parallel processing would mean that your CPU would be even more underutilized.
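The "starved for data" claim is the standard roofline argument: a workload is memory-bound when its arithmetic intensity (FLOPs per byte moved) falls below the machine's compute-to-bandwidth ratio. A minimal sketch, with made-up hardware numbers purely for illustration:

```python
# Roofline-style estimate: attainable throughput is capped by
# either peak compute or memory bandwidth times arithmetic
# intensity. All hardware numbers below are hypothetical.

def attainable_gflops(intensity, peak_gflops, bandwidth_gbs):
    """min(peak compute, bandwidth * FLOPs-per-byte)."""
    return min(peak_gflops, bandwidth_gbs * intensity)

peak = 100.0   # hypothetical peak, GFLOP/s
bw = 20.0      # hypothetical memory bandwidth, GB/s

# A daxpy-like loop does ~2 FLOPs per 24 bytes of doubles moved
# (read x and y, write y), so intensity is tiny:
intensity = 2 / 24
print(attainable_gflops(intensity, peak, bw))  # memory-bound, ~1.7
```

At that intensity the CPU sits mostly idle waiting on memory, which is the parent's point: more cores sharing the same bus just starve faster.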
Closed source computation won't fly (Score:3, Insightful)
Perhaps nVidia's chief scientist wrote his piece because nVidia wants its very niche CUDA/OpenCL computational offering to expand and become mainstream. There's a problem with that though.
The computational ecosystems that surround CPUs can't work with hidden, undocumented interfaces such as nVidia is used to producing for graphics. Compilers and related tools hit the user-mode hardware directly, while operating systems fully control every last register on CPUs at supervisor level. There is no room for nVidia's traditional GPU secrecy in this new computational area.
I rather doubt that the company is going to change its stance on openness, so Dr. Dally's statement opens up the parallel computing arena very nicely to its traditional rival ATI, which under AMD's ownership is now a strongly committed open-source company.
Re:An observation (Score:3, Insightful)
The law states that the number of transistors on a chip that you can buy for a fixed investment doubles every 18 months. CPUs remaining the same speed but dropping in price would continue to match this prediction as would things like SoCs gaining more domain-specific offload hardware (e.g. crypto accelerators).
Actually, parallel processing is completely external to Moore's Law, which refers only to transistor quantity/size/cost, not what they are used for.
So while he's right that for CPU makers to continue to realize performance benefits, parallel computing will probably need to become the norm, it doesn't depend upon nor support Moore's Law. We can continue to shrink transistor size, cost, and distance apart without using parallel computing; similarly by improving speed with multiple cores we neither depend upon nor ensure any improvement in transistor technology.
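To underline how little Moore's Law itself says about parallelism: the law as discussed here is just a compounding exponential in transistor count. A trivial sketch (starting count and doubling period are the illustrative figures from this thread):

```python
# Moore's law as stated in this thread: transistor count doubles
# roughly every two years, independent of what the transistors do.

def transistors(n0, years, doubling_period=2.0):
    """Projected transistor count after `years`, starting from n0."""
    return n0 * 2 ** (years / doubling_period)

# e.g. 1 billion transistors today -> 8 billion in six years
print(transistors(1e9, 6))
```

Whether those 8 billion transistors become one fast core, many slow cores, or crypto offload blocks is an architecture decision the law is silent on.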
Parallel processing isn't magic. (Re:Moores law... (Score:3, Insightful)
Nine processors can't render an image of a baby in one system clock tick, sonny.
Re:Heat and power consumption. (Score:3, Insightful)
The article has Dally advocating more efficient processing done in parallel. The potential benefits of this are obvious if you've compared the power consumption of a desktop computer decoding h.264@1080p in software (CPU) and in hardware (GPU). My own machine, for example, consumes less than 10W over idle (+16%) when playing 1080p, and ~30W over idle (+45%) using software decoding. And no fires. See also the phenomenon of CUDA and PS3s being used as mini supercomputers, again, presumably without catching on fire a lot.
What was your point again?
He's conflicted, but he's still right (Score:3, Insightful)
Obviously there's a conflict-of-interest here, but that doesn't mean the guy is necessarily wrong. It just means you should exercise skepticism and independent judgment.
In my independent judgment, I happen to agree with the guy. Clock speeds have been stalled at ~3 GHz for nearly a decade now. There are only so many ways of getting more work per clock cycle, and radical parallelization is a good answer. Many research communities, such as fluid dynamics, are already performing real computational work on the GPU, and the entire industry is shifting toward a GPGPU paradigm. Programming languages are also being written to take further advantage of parallelization. In my humble opinion, we're approaching the point where every computation that can be processed in parallel will be. For what it's worth, I actually think both Intel and AMD/ATI are doing a much better job at this than Nvidia.
Re:Moores law will apply until it doesn't (Score:3, Insightful)
Parallelism is a decent magic bullet. The number of interesting computing tasks I've seen that cannot be partitioned into parallel subtasks is quite small. That's why 100% of the top 500 supercomputers are parallel machines.
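For the common "embarrassingly parallel" case, the partitioning the parent describes is a few lines with Python's stdlib multiprocessing. A minimal sketch; the work function is a stand-in for any independent chunk of real computation:

```python
# Partitioning an independent workload across cores.
# `work` is a placeholder for any per-chunk computation.
from multiprocessing import Pool

def work(chunk):
    # stand-in task: sum of squares over one slice of the input
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = list(range(1000))
    chunks = [data[i::4] for i in range(4)]  # 4 disjoint slices
    with Pool(4) as pool:
        partials = pool.map(work, chunks)    # chunks run in parallel
    # recombining partial results gives the sequential answer
    print(sum(partials) == sum(x * x for x in data))  # True
```

The tasks that resist this treatment are the ones with tight data dependencies between steps, which is exactly where the "no magic bullet" caveat bites.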
Re:CPU's are not holding back Moore's Law (Score:3, Insightful)
Well, without the slowness of Windows, we wouldn't need faster computers, so there'd be nothing driving innovation.
Re:Moores law will apply until it doesn't (Score:3, Insightful)
I don't understand ... user interaction on the desktop typically uses 1% of a modern CPU.
Virus scans are getting to be more CPU-bound if you have an SSD.