Nvidia CEO "Not Afraid" of CPU-GPU Hybrids 228
J. Dzhugashvili writes "Is Nvidia worried about the advent of both CPUs with graphics processor cores and Larrabee, Intel's future discrete graphics processor? Judging by the tone adopted by Nvidia's CEO during a financial analyst conference yesterday, not quite. Huang believes CPU-GPU hybrids will be no different from (and just as slow as) today's integrated graphics chipsets, and he thinks people will still pay for faster Nvidia GPUs. Regarding Larrabee, Huang says Nvidia is going to 'open a can of whoop-ass' on Intel, and that Intel's strategy of reinventing the wheel by ignoring years of graphics architecture R&D is fundamentally flawed. Nvidia also has some new hotness in the pipeline, such as its APX 2500 system-on-a-chip for handhelds and a new platform for VIA processors."
CPU and GPU integration (Score:1, Interesting)
As a side note, maybe we'll see an Nvidia GPU-based Folding@home release some day, but at least ATI's latest GPUs have a new client to play with:
http://folding.typepad.com/news/2008/04/gpu2-open-beta.html [typepad.com]
Multi Core GPUs (Score:2, Interesting)
He should be afraid (Score:5, Interesting)
My Mac mini has a maximum load of 110W. That's the Core 2 Duo CPU, the integrated GMA950, 3GB of RAM, a 2.5" drive and a DVD burner, not to mention FireWire 400 and four USB 2.0 ports under maximum load (the FW400 port being 8W alone).
Granted, the GMA950 sucks compared to Nvidia's current offerings, but do they have any plans for low-power GPUs? I'm pretty sure the whole company can't survive on revenue from FPS-crazed gamers alone.
They should start thinking about asking Intel to integrate their (current) laptop GPUs into Intel CPUs.
I think AMD has a better plan (Score:5, Interesting)
Can of Whoop Ass?? (Score:3, Interesting)
The risk for NVidia isn't that Intel will surpass them, or even necessarily approach their best performance. The risk is that Intel might start catching up, cutting (further) into NVidia's market share.
AMD's acquisition of ATI seems to imply that they see tight integration of graphics to be at least cheaper for a given level of performance, or higher performance for a given price. Apply that same reasoning to Intel, since they certainly aren't likely to let AMD have that advantage all to themselves.
Now try to apply that logic to NVidia - what are they going to do, merge with a distant-last-place x86 maker?
Re:Ray tracing for the win (Score:3, Interesting)
For instance, modern GUIs often use the 3d hardware to handle window transforms, blending and placement. These are fundamentally polygonal objects for which triangle transformation and rasterization is a perfectly appropriate tool and ray tracing would be silly.
The current polygon model will never vanish completely, even if high-end graphics eventually go to ray tracing instead.
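The window-compositing point above can be sketched in a few lines. This is a hypothetical illustration (the function and names are mine, not from any real compositor): a window is just a textured quad, and moving or scaling it is a per-vertex affine transform, exactly the kind of work the triangle pipeline was built for.

```python
# Toy sketch: window compositing as per-vertex quad transformation.
# A compositor hands these four transformed vertices to the GPU,
# which rasterizes two triangles and samples the window as a texture.

def transform_quad(corners, scale, dx, dy):
    """Apply a uniform scale plus a translation to each quad vertex."""
    return [(x * scale + dx, y * scale + dy) for (x, y) in corners]

# A 100x80 window at the origin, shrunk to 50% and moved to (300, 200),
# as an Expose/Flip-3D style effect might do every frame:
window = [(0, 0), (100, 0), (100, 80), (0, 80)]
print(transform_quad(window, 0.5, 300, 200))
# -> [(300.0, 200.0), (350.0, 200.0), (350.0, 240.0), (300.0, 240.0)]
```

Four multiply-adds per vertex, eight vertices per window even with a fancy effect: this is trivially cheap for rasterization hardware, and nothing about it benefits from casting rays.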
Re:Let's Face It (Score:3, Interesting)
ATI and Nvidia do not. I know who I'm rooting for to come up with good hardware...
Just like the FPU (Score:5, Interesting)
Such FPUs do not exist today.
I think Nvidia should be worried about this.
The problem... (Score:3, Interesting)
Re:Ray tracing for the win (Score:3, Interesting)
The fact is that "artificial" raster shadows, lighting and reflections typically look more impressive than the "more realistic" results of ray tracing. This alone explains why raster will maintain its dominance, and why ray tracing will not catch on.
Re:Ray tracing for the win (Score:3, Interesting)
Re:Ray tracing for the win (Score:1, Interesting)
Are you sure about that?
4MB is not enough to store a 1280x1024 framebuffer at 32bpp; that takes 5MB. I also believe that extra video card memory can be used in 2D to store extra bitmaps.
ISA also has a real-world bandwidth of under 4 MB/s, which is not enough for 320x240 16bpp 30fps video (about 4.4 MB/s).
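A quick sanity check of those two numbers (a rough sketch; the function names are mine):

```python
# Back-of-the-envelope arithmetic for framebuffer size and video bandwidth.

def framebuffer_bytes(width, height, bits_per_pixel):
    """Bytes needed to hold one uncompressed frame."""
    return width * height * bits_per_pixel // 8

def video_bandwidth_bytes_per_sec(width, height, bits_per_pixel, fps):
    """Bytes per second needed to push uncompressed frames at a given rate."""
    return framebuffer_bytes(width, height, bits_per_pixel) * fps

# 1280x1024 at 32bpp needs exactly 5 MiB, so a 4 MB card can't hold it:
print(framebuffer_bytes(1280, 1024, 32) / 2**20)   # -> 5.0

# 320x240 at 16bpp, 30 fps needs ~4.4 MiB/s, more than ISA delivers in practice:
print(video_bandwidth_bytes_per_sec(320, 240, 16, 30))  # -> 4608000 bytes/s
```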
If you want to talk about those old graphics cards, try turning off all 2D acceleration and see how smooth moving windows and scrolling is. That's why they did window outlines.
CPU+GPU is mostly a cost-cutting measure (Score:3, Interesting)
Nobody expects the CPU+GPU to yield gaming performance worth a damn, because the two big companies looking into this amalgam both have underperforming graphics technology. Do they both make excellent budget solutions? Yes, they certainly do, but for those who crave extreme speed, the only option is Nvidia.
That said, not everyone plays shooters. Back in my retail days, I'd say I moved 50 times more bottom-end GPUs than top-end ones. Those Radeon 9250s were $29.99 piles of alien poop, but cheap poop is good enough for the average user. The only people who spent more than $100 on a video card were teenagers and comic book guys (and of course, my awesome self).
Okay, here is more detail about why he is foolish. (Score:4, Interesting)
Quote from the article: "Nvidia CEO Jen-Hsun Huang was quite vocal on those fronts, arguing hybrid chips that mix microprocessor and graphics processor cores will be no different from systems that include Intel or AMD integrated graphics today."
My opinion: There would be no need for all the talk if there were no chance of competition. Everyone knows there will be new competition from Intel's Larrabee and AMD/ATI. Everyone knows that "no different" is a lie. Lying exposes the Nvidia CEO as a weak man.
"... he explained that Nvidia is continuously reinventing itself and that it will be two architectural refreshes beyond the current generation of chips before Larrabee launches."
The entire issue is that Intel+Larrabee and AMD+ATI will make Nvidia irrelevant for most users. The GPU will be on the motherboard. Nvidia will sell only to gamers who are willing to pay extra, a lot extra.
"Huang also raised the prospect of application and API-level compatibility problems with Larrabee. Intel has said Larrabee will support the DirectX 10 and OpenGL application programming interfaces just like current AMD and Nvidia GPUs, but Huang seemed dubious Intel could deliver on that front."
Intel, in this case, is Intel and Microsoft working together. Both are poorly managed companies in many ways, but they are both managed well enough to ensure that the Microsoft product works with the Intel hardware. Sure, it is an easy guess that Microsoft will release several buggy versions, because Microsoft has a history of treating its customers as though they were beta testers, but eventually everything will work correctly.
'[NVidia VP] Tamasi went on to shoot down Intel's emphasis on ray tracing, which the chipmaker has called "the future for games." '
Ray tracing is certainly the future for games; there is no question about that. The question is when, because the required processing power is huge. It's my guess, but an easy one, that Mr. Tamasi is lying; he is apparently trying to take advantage of the ignorance of financial analysts.
"Additionally, Tamasi believes rasterization is inherently more scalable than ray tracing. He said running a ray tracer on a cell phone is 'hard to conceive.'"
This is apparently another attempt to confuse the financial analysts, who often have only a pretend interest in technical things. Anyone understanding the statement knows it is nonsense. No one is suggesting that there will be ray tracing on cell phones. My opinion is that this is another lie.
"We're gonna be highly focused on bringing a great experience to people who care about it," he explained, adding that Nvidia hardware simply isn't for everyone.
That was a foolish thing to say. That's the whole issue! In the future, Nvidia's sales will drop because "Nvidia hardware simply isn't for everyone." Most computers will no longer have separate video adapters, as they did before. Only buyers of powerful game machines will need to buy from Nvidia.
'Huang added, "I would build CPUs if I could change the world [in doing so]." ' Later in the article, it says, "Nvidia is readying a platform to accompany VIA's next-generation Isaiah processor, which should fight it out with Intel's Atom in the low-cost notebook and desktop arena"
Translation: Until now, every desktop computer needed a video adapter, which came from a company other than the CPU maker, a company like Nvidia. Now, video adapters will mostly be supplied by the CPU makers themselves. In response, Nvidia will start making low-end CPUs. It is questionable whether Nvidia can compete with Intel and AMD at making any kind of CPU.
Re:Multi Core GPUs (Score:3, Interesting)
Re:Ray tracing for the win (Score:1, Interesting)
Raytracing, on the other hand, requires no hacking whatsoever to produce the same quality of results as the totally massaged rasterized scene, just a little tweaking. Over ten years ago I saw scenes rendered with POV-Ray that I could not pick out as fake; the newer versions do even better. I don't know where you've seen raytracing suck compared to rasterized images, as that doesn't match anything I've ever seen. Even if I take you at your word on that, though, it still doesn't matter. Once raytracing is able to run fast enough to get full screen resolution at 60 fps, it's going to be a lot harder to justify the immense design and effort that goes into new rasterization tricks when you can achieve the same evolution by inching up the quality knob (level of antialiasing, number of caustics rays, etc.) on the raytracer every time a new breed of processors comes out.
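The "quality knob" idea is easy to sketch. Here is a toy one-sphere ray tracer where the only tuning parameter is samples per pixel; everything below is illustrative (my own names, not from POV-Ray or any real renderer). The same code produces coarse or smooth edges purely by turning that one number up:

```python
# Toy ray tracer: one sphere, one knob (samples per pixel for antialiasing).
import math
import random

def hit_sphere(ox, oy, oz, dx, dy, dz, cx, cy, cz, r):
    """Smallest positive t where origin + t*dir hits the sphere, or None.
    Assumes (dx, dy, dz) is unit length, so the quadratic's 'a' term is 1."""
    lx, ly, lz = ox - cx, oy - cy, oz - cz
    b = 2.0 * (dx * lx + dy * ly + dz * lz)
    c = lx * lx + ly * ly + lz * lz - r * r
    disc = b * b - 4.0 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def render(width, height, samples_per_pixel):
    """Trace a unit sphere at (0, 0, -3) from a camera at the origin.
    Returns per-pixel coverage in [0, 1]; more samples = smoother edges."""
    image = []
    for y in range(height):
        row = []
        for x in range(width):
            hits = 0
            for _ in range(samples_per_pixel):
                # Jitter the sample inside the pixel for antialiasing.
                u = (x + random.random()) / width * 2 - 1
                v = (y + random.random()) / height * 2 - 1
                dx, dy, dz = u, v, -1.0
                n = math.sqrt(dx * dx + dy * dy + dz * dz)
                if hit_sphere(0, 0, 0, dx / n, dy / n, dz / n,
                              0, 0, -3, 1.0) is not None:
                    hits += 1
            row.append(hits / samples_per_pixel)
        image.append(row)
    return image

img = render(16, 16, 4)  # bump 4 to 64 and only the edges change
```

No new algorithm is needed to improve the picture, which is the point: quality scales with the sample budget, not with per-effect engineering.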
Yes, rasterization might be faster for the same scene at the same level of quality. It might even continue to be faster forever, I don't know. But it's irrelevant - I'm sure a lot of people thought MIDI and MOD game music would never disappear back in the day, because there was so little space on a disk that you would never want to waste some of it on actual recorded music. The optimizations that help performance in the early years of a technology are always discarded once you reach a point where the "real thing" comes relatively cheap. Once you can simulate reality close enough that a casual observer can't tell the difference, optimizations will no longer be aimed at processor speed, but will focus on ease of creation, and raytracing has a massive edge there.
As to why brilliant guys like Carmack see no future in raytracing? Simple - they are knee deep in the extremely difficult optimizations required to get tomorrow's results out of today's machines, and this closeness with hacking the guts of an imperfect system makes it really hard to imagine a day when that imperfect system is unnecessary. That's fine while today's results suck. But there will come a day when squeezing another factor of two out of your graphics card's polygon count won't help you because you're already close enough to reality that nobody cares anymore.
That's when we turn to physics. And the rigid body experts will reign supreme, talking about how large scale molecular physical simulation will never overtake their methods coupled to special purpose soft-body solvers. And they'll be right for ten years, and then we'll have enough processing power that it doesn't make sense to make "stupid" simplifications like the rigid body assumption, and eventually physics will be a solved problem as we start simulating what really happens as opposed to a high level approximation of it. God knows what we'll turn to after that...
Meanwhile, in the corporate world (Score:3, Interesting)
I have done CAD/CAM for ages, and my P3-750 with a Quadro4 700XGL isn't noticeably slower than a P4-3.4 with a Radeon X300SE running Unigraphics NX 5. I have a P3-500 with a QuadroFX-1000 card that freaking flies running CATIA V5. Again, in contrast, my 1.8GHz P4 laptop with integrated Intel graphics sucks balls running either UG or CATIA.
Speaking for the workstation users out there, please keep making high performance GPUs, Nvidia.