
Nvidia CEO "Not Afraid" of CPU-GPU Hybrids

J. Dzhugashvili writes "Is Nvidia worried about the advent of both CPUs with graphics processor cores and Larrabee, Intel's future discrete graphics processor? Judging by the tone adopted by Nvidia's CEO during a financial analyst conference yesterday, not quite. Huang believes CPU-GPU hybrids will be no different from (and just as slow as) today's integrated graphics chipsets, and he thinks people will still pay for faster Nvidia GPUs. Regarding Larrabee, Huang says Nvidia is going to 'open a can of whoop-ass' on Intel, and that Intel's strategy of reinventing the wheel by ignoring years of graphics architecture R&D is fundamentally flawed. Nvidia also has some new hotness in the pipeline, such as its APX 2500 system-on-a-chip for handhelds and a new platform for VIA processors."
Comments:
  • by symbolset ( 646467 ) on Friday April 11, 2008 @04:24PM (#23040638) Journal

    Ray vs. raster. The reason we have so much tech invested in raster is that processing power was not sufficient to do ray tracing. If it had been, we'd never have started down the raster branch of development, because it just doesn't work as well. The results are less realistic with raster: shadows don't look right, you can't do CSG, and you get edge effects. There are a thousand workarounds for things like reflections of reflections, lens effects, and audio reflections. Raster is a hack, and when we have the CPU power to do real-time ray tracing, raster rendering will go away.

    Raster was a way to make some fairly believable (if cartoonish) video games. They still require some deliberate suspension of disbelief. Only with raytracing do you get the surreal live-or-Memorex feeling of not being able to tell a rendered scene from a photo, except for the fact that the scene depicts something that might be physically impossible.
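
    To make the shadow claim concrete, here is a minimal, purely illustrative sketch (hypothetical code, not from any real renderer) of why shadows fall out of raytracing almost for free: one extra ray from the surface point toward the light, versus the shadow-map machinery a rasterizer needs to answer the same question.

    ```python
    import math

    def ray_sphere(origin, direction, center, radius):
        """Distance t to the nearest hit of the ray origin + t*direction with
        a sphere, or None on a miss. Direction is assumed normalized."""
        oc = [o - c for o, c in zip(origin, center)]
        b = 2.0 * sum(d * o for d, o in zip(direction, oc))
        c = sum(o * o for o in oc) - radius * radius
        disc = b * b - 4.0 * c
        if disc < 0:
            return None
        t = (-b - math.sqrt(disc)) / 2.0
        return t if t > 1e-6 else None  # epsilon avoids self-intersection

    def lit(point, light_pos, spheres):
        """Hard-shadow test: cast one ray from the surface point toward the
        light and see whether anything blocks it before the light is reached."""
        to_light = [l - p for l, p in zip(light_pos, point)]
        dist = math.sqrt(sum(x * x for x in to_light))
        direction = [x / dist for x in to_light]
        for center, radius in spheres:
            t = ray_sphere(point, direction, center, radius)
            if t is not None and t < dist:
                return False  # occluded: the point is in shadow
        return True

    spheres = [((0.0, 0.0, -3.0), 1.0)]            # one sphere above the floor
    light = (0.0, 5.0, -3.0)                       # light directly above it
    print(lit((0.0, -2.0, -3.0), light, spheres))  # False: directly under the sphere
    print(lit((2.0, -2.0, -3.0), light, spheres))  # True: off to the side
    ```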

  • by SanityInAnarchy ( 655584 ) <ninja@slaphack.com> on Friday April 11, 2008 @04:25PM (#23040654) Journal

    Until Intel can show us Crysis

    If Intel is right, there won't be much of an effect on existing games.

    Intel is focusing on raytracing, something Crytek has specifically said it will not do. Therefore, neither Crysis nor any sequel will really see any improvement from Intel's approach.

    If Intel is right, what we are talking about is the Crysis-killer -- a game that looks and plays much better than Crysis (and maybe with a plot that doesn't completely suck [penny-arcade.com]), and only on Intel hardware, not on nVidia.

    Oh, and Beryl has been killed and merged. It's just Compiz now, and Compiz Fusion if you need more.

  • by forsey ( 1136633 ) on Friday April 11, 2008 @04:36PM (#23040818)
    Actually, nVidia is working on a new technology called HybridPower, which puts both an onboard and a discrete graphics card in the same computer: the low-power onboard card is used most of the time (when you are just in your OS environment of choice), but when you need the power (for things like games) the discrete card powers up.
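
    Sketched as pseudocode, the policy might look like the snippet below. This is a hypothetical illustration of the switching idea described above; the names and thresholds are made up, not nVidia's actual driver API.

    ```python
    # Hypothetical sketch of a HybridPower-style policy: route work to the
    # integrated GPU until demand crosses a threshold, then wake the discrete
    # card. All names and numbers here are illustrative.

    class GpuSwitcher:
        WAKE_AT = 0.75   # fraction of integrated-GPU capacity in use
        SLEEP_AT = 0.25  # drop back to integrated below this load

        def __init__(self):
            self.discrete_on = False

        def route(self, load: float) -> str:
            """Pick a GPU for the current frame based on estimated load."""
            if load > self.WAKE_AT and not self.discrete_on:
                self.discrete_on = True   # power up the discrete card
            elif load < self.SLEEP_AT and self.discrete_on:
                self.discrete_on = False  # power it down to save energy
            return "discrete" if self.discrete_on else "integrated"

    switcher = GpuSwitcher()
    for load in (0.1, 0.2, 0.9, 0.8, 0.1):  # desktop, desktop, game, game, desktop
        print(f"load={load:.1f} -> {switcher.route(load)}")
    ```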
  • by Anonymous Coward on Friday April 11, 2008 @04:36PM (#23040820)
    ATI/AMD hasn't been competitive with NVIDIA for two product cycles. That doesn't look likely to change in the near future, either; ATI/AMD's next generation GPU architecture isn't looking so hot.

    AMD is in a world of hurt right now, with Intel consistently maintaining a lead over them in the CPU segment, and NVIDIA maintaining a lead over them in the GPU segment. They're doing some interesting, synergistic things between the CPU and GPU sides, but who knows if that'll pan out. Meanwhile, they're being forced to compete on price alone, which is never a position you want to be in.

    (The driver quality situation hasn't exactly helped them any, either, although I'm looking forward to good things post-acquisition, especially now that open source drivers are becoming a reality.)
  • ouch (Score:3, Informative)

    by Lord Ender ( 156273 ) on Friday April 11, 2008 @04:38PM (#23040840) Homepage
    NVDA was down 7% in the stock market today. As an Nvidia shareholder, that hurts!

    If you don't believe Intel will ever compete with Nvidia, now is probably a good time to buy. NVDA has a forward P/E of 14. That's a "value stock" price for a leading tech company... you don't get opportunities like that often. NVDA also has no debt on the books, so the credit crunch does not directly affect them.
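
    For anyone checking the arithmetic: forward P/E is just the share price divided by estimated earnings per share over the coming year. The numbers below are illustrative placeholders, not actual NVDA figures.

    ```python
    # Forward P/E = share price / estimated forward (next-12-month) EPS.
    # Placeholder numbers for illustration, not actual NVDA figures.
    price = 19.60        # hypothetical share price
    forward_eps = 1.40   # hypothetical consensus EPS estimate
    print(f"forward P/E = {price / forward_eps:.1f}")  # forward P/E = 14.0
    ```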
  • Sigh (Score:3, Informative)

    by Sycraft-fu ( 314770 ) on Friday April 11, 2008 @05:14PM (#23041298)
    Of COURSE they do; in fact, they already HAVE low-power offerings. I'm not sure why people seem to think the 8800 is the only card nVidia makes. nVidia is quite adept at taking its technology and scaling it down: just reduce the clock speed, cut shader units, and so on. In the 8 series they have the 8400. I don't know what its power draw is, but it has no extra power connectors, so it is under 75 watts peak by definition (that's all a PCIe slot can deliver). They have even lower-power cards in other lines, and integrated on the motherboard.

    So they already HAVE low-power GPUs. However, you can't have both low power and super-high performance. If you want something that performs like an 8800, well, you need an 8800.
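
    As a back-of-envelope illustration of why scaling down works so well: dynamic power grows roughly with active units x clock x voltage squared, so modest cuts on each axis compound. The numbers below are made up for illustration, not real 8-series figures.

    ```python
    # Rough dynamic-power scaling model: P ~ units * frequency * voltage^2.
    # Illustrative numbers only; real cards also have leakage, memory, etc.

    def relative_power(units_frac, freq_frac, volt_frac):
        """Power of a cut-down part relative to the full chip."""
        return units_frac * freq_frac * volt_frac ** 2

    full_watts = 150.0  # hypothetical high-end part
    # Half the shader units, 80% clock, 90% voltage:
    cut_down = full_watts * relative_power(0.5, 0.8, 0.9)
    print(f"~{cut_down:.0f} W")  # ~49 W, comfortably under the 75 W slot limit
    ```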
  • by nuzak ( 959558 ) on Friday April 11, 2008 @05:50PM (#23041618) Journal
    Huang is obviously ethnic Chinese. It is likely he is imitating something he heard in a movie or TV show.

    Yeah, them slanty-eyed furriners just can't speak English right, can they?

    Huang is over 40 years old and has lived in the US since he was a child. Idiot.

  • Re:Let's Face It (Score:2, Informative)

    by hr.wien ( 986516 ) on Friday April 11, 2008 @06:15PM (#23041848)
    ATI have open specs. At least for a lot of their hardware. They are releasing more and more documentation as it gets cleaned up and cleared by legal. Open Source ATI drivers are coming on in leaps and bounds as a result.
  • by GigaplexNZ ( 1233886 ) on Friday April 11, 2008 @08:16PM (#23042846)
    I thought it was DirectX 9 they were left out of, which made their FX range (5200 -> 5900) fairly useless compared to ATI's. They were legends in the DirectX 8 arena with the GeForce4 Ti series.
  • by ardor ( 673957 ) on Friday April 11, 2008 @09:17PM (#23043158)
    Wrong. All of it.

    Raytracing doesn't magically get you better image quality. EXCEPT for shadows, the results look just like rasterization's. As usual, people mix up raytracing with path tracing, photon mapping, radiosity, and other GI algorithms. Note: GI can be applied to rasterization as well.

    So, which "benefits" are left? Refraction/reflection, haze, and any kind of ray distortion: SECONDARY ray effects. Primary rays can be fully modeled with rasterization, which gives you much better performance because of the trivial cache coherency and simpler calculations. (In a sense, rasterization can be seen as a cleverly optimized primary-ray pass.) This is why hybrid renderers make PERFECT sense. Yes, I know about ray bundles; they are hard to get right, and again: for primary rays, raytracing makes no sense.

    "Suspension of disbelief" is necessary with raytracing too. You confuse the rendering technique with lighting models, animation quality and so on. "edge effects" is laughable, aliasing WILL occur with raytracing as well unless you shoot multiple rays per pixel (and guess what... rasterizers commonly HAVE MSAA).

    Jeez, when will people stop believing all this BS about raytracing? As if it were a magical thing capable of miraculously enhancing your image quality...

    Raytracing has its place: as an ADDITION to a rasterizer, to ease implementation of the secondary-ray effects (which are hard to simulate with pure rasterization). This is the future.
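
    As a concrete sketch of that hybrid structure (placeholder code, not any shipping engine's pipeline): resolve primary visibility with a raster pass, then spawn traced rays only where a material actually needs secondary effects.

    ```python
    # Conceptual hybrid-pipeline sketch: a (stubbed) raster pass resolves
    # primary visibility, then rays are traced only for secondary effects.
    # Every name here is a placeholder, not a real engine API.

    from dataclasses import dataclass

    @dataclass
    class Surface:
        base_color: float  # grayscale stand-in for a shaded color
        reflective: bool

    def raster_pass(width, height):
        """Stand-in for the rasterizer: per-pixel surface info, like a
        G-buffer of depth, normals, and material IDs in a real engine."""
        return [[Surface(base_color=0.5, reflective=(x % 4 == 0))
                 for x in range(width)] for y in range(height)]

    def trace_secondary(surface):
        """Stand-in for a reflection ray; a real tracer walks the scene."""
        return 0.3  # pretend the reflected ray picked up this much light

    def render(width, height):
        gbuffer = raster_pass(width, height)        # pass 1: primary rays
        image = []
        for row in gbuffer:
            out_row = []
            for surf in row:
                color = surf.base_color             # direct shading
                if surf.reflective:                 # pass 2: secondary rays,
                    color += trace_secondary(surf)  # only where needed
                out_row.append(min(color, 1.0))
            image.append(out_row)
        return image

    print(render(4, 2))  # reflective pixels (column 0 here) come out brighter
    ```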
  • by scumdamn ( 82357 ) on Saturday April 12, 2008 @07:50AM (#23045750)
    In the non-serious gamer market it's TOTALLY a big hit. And the benefit of this chipset for many users is that you get decent 3D performance in a motherboard for the same price you would pay for a motherboard without integrated graphics.

    And if you decide to bump it up a notch and buy a 3450, it operates in Hybrid CrossFire, so your onboard graphics aren't totally disabled. Explain to me how that isn't cool.
