Nvidia CEO "Not Afraid" of CPU-GPU Hybrids
J. Dzhugashvili writes "Is Nvidia worried about the advent of both CPUs with graphics processor cores and Larrabee, Intel's future discrete graphics processor? Judging by the tone adopted by Nvidia's CEO during a financial analyst conference yesterday, not quite. Huang believes CPU-GPU hybrids will be no different from (and just as slow as) today's integrated graphics chipsets, and he thinks people will still pay for faster Nvidia GPUs. Regarding Larrabee, Huang says Nvidia is going to 'open a can of whoop-ass' on Intel, and that Intel's strategy of reinventing the wheel by ignoring years of graphics architecture R&D is fundamentally flawed. Nvidia also has some new hotness in the pipeline, such as its APX 2500 system-on-a-chip for handhelds and a new platform for VIA processors."
Ray tracing for the win (Score:5, Informative)
Ray vs. raster. The reason we have so much tech invested in rasterization is that processing power wasn't sufficient to do ray tracing. If it had been, we'd never have started down the raster branch of development, because it just doesn't work as well. The results are less realistic with rasterization: shadows don't look right, you can't do CSG (constructive solid geometry), and you get edge effects. There are a thousand work-arounds for things like reflections of reflections, lens effects, and audio reflections. Rasterization is a hack, and once we have the CPU power for real-time ray tracing, raster rendering will go away.
Rasterization was a way to make some fairly believable (if cartoonish) video games. They still require some deliberate suspension of disbelief. Only with ray tracing do you get the surreal live-or-Memorex feeling of not being able to tell a rendered scene from a photo, except for the fact that the realistic scene may depict something physically impossible.
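To see why shadows come "for free" in a ray tracer, here's a minimal sketch (a hypothetical one-sphere scene with a point light, not any shipping renderer): after a primary ray hits a surface, you just fire one more ray toward the light and check whether anything blocks it. A rasterizer has to fake the same result with shadow maps or stencil volumes.

```python
import math

def hit_sphere(origin, direction, center, radius):
    """Return the nearest positive ray parameter t, or None if the ray misses.
    Solves the quadratic |origin + t*direction - center|^2 = radius^2."""
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t if t > 1e-6 else None

def in_shadow(point, light, blocker_center, blocker_radius):
    """Secondary (shadow) ray: march from the hit point toward the light.
    The point is shadowed only if the occluder sits between them (0 < t < 1)."""
    direction = [l - p for l, p in zip(light, point)]
    t = hit_sphere(point, direction, blocker_center, blocker_radius)
    return t is not None and t < 1.0
```

For example, a camera ray from the origin along (0, 0, -1) hits a unit sphere at (0, 0, -5) at t = 4, and a ground point directly below that sphere is shadowed from a light directly above it, with no shadow-map resolution or bias tricks involved.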
NOTHING to do with existing games. (Score:5, Informative)
If Intel is right, there won't be much of an effect on existing games.
Intel is focusing on raytracers, something Crytek has specifically said that they will not do. Therefore, both Crysis and any sequels won't really see any improvement from Intel's approach.
If Intel is right, what we are talking about is the Crysis-killer -- a game that looks and plays much better than Crysis (and maybe with a plot that doesn't completely suck [penny-arcade.com]), and only on Intel hardware, not on nVidia.
Oh, and Beryl has been killed and merged. It's just Compiz now, and Compiz Fusion if you need more.
Re:He should be afraid (Score:4, Informative)
Re:Not scared... no kidding? (Score:3, Informative)
AMD is in a world of hurt right now, with Intel consistently maintaining a lead over them in the CPU segment, and NVIDIA maintaining a lead over them in the GPU segment. They're doing some interesting, synergistic things between the CPU and GPU sides, but who knows if that'll pan out. Meanwhile, they're being forced to compete on price alone, which is never a position you want to be in.
(The driver quality situation hasn't exactly helped them any, either, although I'm looking forward to good things post-acquisition, especially now that open source drivers are becoming a reality.)
ouch (Score:3, Informative)
If you don't believe Intel will ever compete with Nvidia, now is probably a good time to buy. NVDA has a forward P/E of 14. That's a "value stock" price for a leading tech company... you don't get opportunities like that often. NVDA also has no debt on the books, so the credit crunch does not directly affect them.
Sigh (Score:3, Informative)
So they already HAVE low-power GPUs. However, you can't have both low power and super-high performance. If you want something that performs like an 8800, well, you need an 8800.
Re:Translation: "nVidia needs a better top manager" (Score:3, Informative)
Yeah, them slanty-eyed furriners just can't speak English right, can they?
Huang is over 40 years old and has lived in the US since he was a child. Idiot.
Re:Let's Face It (Score:2, Informative)
Re:Not scared... no kidding? (Score:2, Informative)
Re:Ray tracing for the win (Score:5, Informative)
Raytracing doesn't magically get you better image quality. EXCEPT for shadows, the results look just like rasterization. As usual, people mix up raytracing with path tracing, photon mapping, radiosity, and other global-illumination (GI) algorithms. Note: GI can be applied to rasterization as well.
So, which "benefits" are left? Refraction/reflection, haze, and any kind of ray distortion -- SECONDARY ray effects. Primary rays can be fully modeled with rasterization, which gives you much better performance because of the trivial cache coherency and simpler calculations. (In a sense, rasterization can be seen as a cleverly optimized primary-ray pass.) This is why hybrid renderers make PERFECT sense. Yes, I know about ray bundles; they are hard to get right, and again: for primary rays, raytracing makes no sense.
"Suspension of disbelief" is necessary with raytracing too. You're confusing the rendering technique with lighting models, animation quality, and so on. "Edge effects" is laughable; aliasing WILL occur with raytracing as well unless you shoot multiple rays per pixel (and guess what... rasterizers commonly HAVE MSAA).
Jeez, when will people stop believing all this BS about raytracing? As if it were a magical thingie capable of miraculously enhancing your image quality...
Raytracing has its place -- as an ADDITION to a rasterizer, to ease implementation of secondary-ray effects (which are hard to simulate with pure rasterization). This is the future.
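The aliasing point above is easy to demonstrate. This toy sketch (a made-up one-dimensional "scene" consisting of a hard edge at x = 0.5, not real renderer code) shoots either one ray or many jittered rays per pixel; averaging many samples is the ray-tracing analogue of a rasterizer's MSAA.

```python
import random

def shade(x):
    """Toy primary-ray hit test: white (1.0) left of the edge, black (0.0) right."""
    return 1.0 if x < 0.5 else 0.0

def render_pixel(px, width, samples, rng):
    """Average `samples` jittered rays through pixel column `px` of `width` pixels."""
    total = 0.0
    for _ in range(samples):
        x = (px + rng.random()) / width   # jittered sample position in [0, 1)
        total += shade(x)
    return total / samples

rng = random.Random(42)
# With one ray per pixel, the middle pixel snaps to all-white or all-black: aliasing.
one_sample = [render_pixel(p, 3, 1, rng) for p in range(3)]
# With 256 rays per pixel, the middle pixel settles near its true 50% coverage.
many_samples = [render_pixel(p, 3, 256, rng) for p in range(3)]
```

The edge crosses the middle of the three pixels, so the multi-sample render gives that pixel a gray value near 0.5, while the single-sample render can only produce 0.0 or 1.0 there; that is the difference between a jagged and a smoothed edge, and it costs extra rays either way.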
Re:I think AMD has a better plan (Score:3, Informative)
And if you decide to bump it up a notch and buy a 3450, it operates in Hybrid CrossFire, so your onboard graphics aren't totally disabled. Explain to me how that isn't cool?