Nvidia CEO "Not Afraid" of CPU-GPU Hybrids 228
J. Dzhugashvili writes "Is Nvidia worried about the advent of both CPUs with graphics processor cores and Larrabee, Intel's future discrete graphics processor? Judging by the tone adopted by Nvidia's CEO during a financial analyst conference yesterday, not quite. Huang believes CPU-GPU hybrids will be no different from (and just as slow as) today's integrated graphics chipsets, and he thinks people will still pay for faster Nvidia GPUs. Regarding Larrabee, Huang says Nvidia is going to 'open a can of whoop-ass' on Intel, and that Intel's strategy of reinventing the wheel by ignoring years of graphics architecture R&D is fundamentally flawed. Nvidia also has some new hotness in the pipeline, such as its APX 2500 system-on-a-chip for handhelds and a new platform for VIA processors."
Re:Not scared... no kidding? (Score:5, Insightful)
Yes, I know ATI got bought by AMD, but they still exist and they still make GPUs, AFAIK.
And if your argument is that nVidia is better than ATI, let me remind you that ATI/nVidia and Intel/AMD keep leapfrogging each other every few years.
Let's Face It (Score:3, Insightful)
Did anyone expect him to surrender? (Score:5, Insightful)
Is Creative still around? Last I heard, they were making MP3 players...
Re:Multi Core GPUs (Score:3, Insightful)
The PC is just a toy (Score:4, Insightful)
Re:CPU and GPU intergation. (Score:2, Insightful)
"Bus and train integration is quite logical progression of technology. There are things the plane is not optimal and same goes to the bus. It seems that when combined, they prove successful. So let's put wings on a bus."
Now, I think there are plenty of good reasons why CPU/GPU integration is a good idea (as well as a few good reasons why it's not), but there's nothing logical about the statement you made. Just because a CPU does something well and a GPU does something different well, it doesn't necessarily follow that slapping them together is a better idea than having them be discrete components.
The key insight is that the modern CPU and the modern GPU are starting to converge in a lot of areas of functionality. The main difference is that CPUs are optimized for serial processing of at most a few threads of arbitrarily complex software, while GPUs are optimized for massively parallel processing of large numbers of pixels using similar, fairly simple programs (shaders).
Now, the logic needed to perform these two tasks is highly specialized, which is why we have separate CPUs and GPUs to begin with. But there's a lot to be gained by integrating the two more closely. You can share memory interfaces, for example, and perhaps more relevantly for the high-end graphics segment, you can tightly couple CPU and GPU operations across a bus that's going to be a hundred times faster than anything PCI Express can provide, and with latency to die for.
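To make that serial-versus-parallel distinction concrete, here's a minimal sketch in C (all names hypothetical, a toy rather than how any real driver or shader compiler works). The per-pixel function is what a GPU would execute as thousands of simultaneous threads, one per pixel; the CPU has to walk the framebuffer one pixel at a time.

#include <stdio.h>

#define W 640
#define H 480

/* A trivial "shader": darken one pixel by 25%. On a GPU, this body would
   run as one lightweight thread per pixel, all at once. */
static unsigned char shade(unsigned char in)
{
    return (unsigned char)(in * 3 / 4);
}

int main(void)
{
    static unsigned char framebuffer[W * H]; /* one 8-bit channel, for brevity */

    /* CPU style: a single thread visits every pixel in order. */
    for (int i = 0; i < W * H; i++)
        framebuffer[i] = shade(framebuffer[i]);

    /* GPU style (conceptually): launch W*H threads, each running shade()
       on its own pixel simultaneously. Same program, different execution. */
    printf("shaded %d pixels serially\n", W * H);
    return 0;
}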
In short, I agree with your basic point, but I don't think you made a very good case for it.
Re:Not scared... no kidding? (Score:4, Insightful)
Competitive enough anyway. As long as I'm still on AGP, I'm still getting ATI cards (nVidia's AGP offerings have historically been crippled beyond just running on AGP). But sure, I'm a niche, and truth be told, my next system will probably have nVidia.
But gamer video cards aren't everything, and I daresay not even the majority. If you have a flatscreen TV, chances are good it's got ATI parts in it. Then there are laptops and integrated video, nothing to sneeze at.
Re:Ray tracing for the win (Score:3, Insightful)
I don't buy the 'raytracing is so much better than raster' argument. I do agree that it makes it algorithmically simpler to create near-photorealistic renders, but that doesn't mean that raster's only redeeming quality is that it's less burdensome for simpler scenes.
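That said, the "algorithmically simpler" part is real. The heart of a ray tracer is plain intersection math; here's a minimal C sketch (a hypothetical toy, not anyone's production renderer) of the classic ray-sphere test, solving |origin + t*dir - center|^2 = radius^2 for t:

#include <math.h>
#include <stdio.h>

typedef struct { double x, y, z; } vec3;

static double dot(vec3 a, vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static vec3 sub(vec3 a, vec3 b) { vec3 r = { a.x - b.x, a.y - b.y, a.z - b.z }; return r; }

/* Returns the distance t along the ray to the nearest hit, or -1 on a miss. */
static double hit_sphere(vec3 center, double radius, vec3 origin, vec3 dir)
{
    vec3 oc = sub(origin, center);
    double a = dot(dir, dir);
    double b = 2.0 * dot(oc, dir);
    double c = dot(oc, oc) - radius * radius;
    double disc = b * b - 4.0 * a * c;    /* quadratic discriminant */
    if (disc < 0.0) return -1.0;          /* ray misses the sphere */
    return (-b - sqrt(disc)) / (2.0 * a); /* nearer of the two roots */
}

int main(void)
{
    vec3 center = { 0, 0, -5 }, origin = { 0, 0, 0 }, dir = { 0, 0, -1 };
    printf("hit at t = %f\n", hit_sphere(center, 1.0, origin, dir)); /* t = 4 */
    return 0;
}

The hard part of ray tracing isn't this math, it's doing it fast enough for millions of rays per frame, which is exactly where raster hardware's years of head start matter.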
Re:Ray tracing for the win (Score:1, Insightful)
Worse than that, people like me would be quite happy using our 4MB ISA graphics cards, if some sod hadn't gone and invented PCI.
In fact, 3/4 of all computer users would probably be happy using text mode and printing in 10-pitch Courier if it wasn't for the noise those damned daisy-wheel printers made.
NVidia are about to get shafted, and, as someone who cannot get his NVidia card to work properly in FreeBSD or Win2k, I say "good riddance." (It works with Ubuntu 7.10, if anyone actually cares.)
Re:Intel? (Score:3, Insightful)
nVidia should be very careful that they don't make the same mistake as Creative. Twenty years ago, if you wanted sound from a PC, you bought a Soundblaster. Ten years ago, if you wanted good sound in games, you bought a Soundblaster (or, if you had more taste, a card that did A3D), and it would offload the expensive computations from the CPU and give you a better gaming experience. Now, who buys discrete sound cards? The positional audio calculations are so cheap by today's standards that you can do them all on the CPU and barely notice.
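To put numbers on "so cheap": a basic positional-audio update is a handful of arithmetic operations per source. Here's a minimal C sketch (hypothetical names and a simple inverse-distance model, not how any particular sound API actually does it):

#include <math.h>
#include <stdio.h>

typedef struct { float x, y, z; } pos3;

/* Compute left/right channel gains for one source relative to the listener:
   inverse-distance rolloff plus a crude stereo pan. A few multiplies and one
   square root per source per update. */
static void position_source(pos3 src, pos3 listener, float *left, float *right)
{
    float dx = src.x - listener.x;
    float dy = src.y - listener.y;
    float dz = src.z - listener.z;
    float dist = sqrtf(dx * dx + dy * dy + dz * dz);
    float gain = 1.0f / (1.0f + dist);            /* quieter with distance */
    float pan = (dist > 0.0f) ? dx / dist : 0.0f; /* -1 = hard left, +1 = hard right */

    *left  = gain * (1.0f - pan) * 0.5f;
    *right = gain * (1.0f + pan) * 0.5f;
}

int main(void)
{
    pos3 src = { 3.0f, 0.0f, -4.0f }, listener = { 0.0f, 0.0f, 0.0f };
    float l, r;
    position_source(src, listener, &l, &r);
    printf("gains: L=%.3f R=%.3f\n", l, r);
    return 0;
}

Even with a few dozen sources and fancier filtering, that's noise to a multi-GHz CPU, and that's the trap nVidia has to avoid: they need graphics to stay expensive enough that a discrete part keeps mattering.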
Re:Translation: "nVidia needs a better top manager (Score:3, Insightful)
Maybe he was out of chairs?
Re:Intel graphics suck... (Score:3, Insightful)
By the way, onboard video uses about as much RAM as a browser does (and about as much as Win98 needs to boot, but I digress), which is a drop in the bucket with 1GB sticks being so cheap now. If 8-32MB of RAM is that much of a problem for you, you have more problems than poor video.
Re:Not scared... no kidding? (Score:3, Insightful)
Quite true. With the 8500 and 8600 models, and now the 9500, nVidia trounces AMD even on budget cards.
But nVidia got pummeled prior to their acquisition of Yahoo!^H^H^H^H^H^H Voodoo, and the two were quite neck and neck for a long time. So it's more of "the tables have turned (again)" rather than "they have no competition."
Until AMD completely quits making higher-end video cards, nVidia will have to keep doing something to stay competitive. Same thing with Firefox - I don't think IE8 would have looked any different from IE5 without something biting at their heels (or completely surpassing them).
Re:Just like the FPU (Score:1, Insightful)
Re:Ray tracing for the win (Score:4, Insightful)
Perhaps the limitation is in the ability of humans to model the scene rather than the ability of the computer to render it.
Re:Ray tracing for the win (Score:5, Insightful)
The reason we can so easily tell the difference between CGI creatures and real creatures is not the photorealism of it, but the animation. Evaluate a screen cap of Lord of the Rings with Gollum in it, and then evaluate that entire scene in motion. The screen cap will look astonishingly realistic compared to the video.
Computers are catching up to the computational challenges of rendering scenes, but humans haven't quite figured out how to program every muscle movement living creatures make. Attempts at complete realism in 3D animation still fall somewhere in the Uncanny Valley [wikipedia.org].
Re:Did anyone expect him to surrender? (Score:1, Insightful)
Re:Not scared... no kidding? (Score:3, Insightful)
I haven't, but I have heard of Carmack, and Carmack "seems to think that Intel's direction using traditional ray tracing methods is not going to work [slashdot.org]." I didn't understand anything in that article, but assuming that the blurb was correct (and Carmack didn't seem to refute it in the 3 times he replied to that story), then I'd say that they may not be "less and less interested" but maybe they are "less and less right about the direction to take." Take your pick.
And while my little blurb may have been fundamentally incorrect, I haven't heard anyone say they were looking forward to the new Radeon, and I've heard even fewer people say they were looking forward to the new Intel graphics chipset. Have you?
Splitting hairs on this seems kinda useless. nVidia really is it right now in the graphics world, at least as far as the public is concerned, and I don't see that changing in the near future.