
Nvidia CEO "Not Afraid" of CPU-GPU Hybrids

J. Dzhugashvili writes "Is Nvidia worried about the advent of both CPUs with graphics processor cores and Larrabee, Intel's future discrete graphics processor? Judging by the tone adopted by Nvidia's CEO during a financial analyst conference yesterday, not quite. Huang believes CPU-GPU hybrids will be no different from (and just as slow as) today's integrated graphics chipsets, and he thinks people will still pay for faster Nvidia GPUs. Regarding Larrabee, Huang says Nvidia is going to 'open a can of whoop-ass' on Intel, and that Intel's strategy of reinventing the wheel by ignoring years of graphics architecture R&D is fundamentally flawed. Nvidia also has some new hotness in the pipeline, such as its APX 2500 system-on-a-chip for handhelds and a new platform for VIA processors."
This discussion has been archived. No new comments can be posted.

  • by Yvan256 ( 722131 ) on Friday April 11, 2008 @04:13PM (#23040470) Homepage Journal
    No competition? What? Did ATI die or something?

    Yes I know they got bought by AMD, but they still exist and they still make GPUs AFAIK.

    And if your argument is that nVidia is better than ATI, let me remind you that ATI/nVidia and Intel/AMD keep leapfrogging each other every few years.
  • Let's Face It (Score:3, Insightful)

    by DigitalisAkujin ( 846133 ) on Friday April 11, 2008 @04:15PM (#23040494) Homepage
    Until Intel can show us Crysis in all its GPU-raping glory running on its chipset at 1600x1200 with every setting on Ultra High, Nvidia and ATI will still be the kings of high-end graphics. Then again, if all Intel wants to do is create a substandard alternative to those high-end cards just to run Vista Aero and *nix Beryl, then they have already succeeded.
  • by WoTG ( 610710 ) on Friday April 11, 2008 @04:21PM (#23040590) Homepage Journal
    IMHO, Nvidia is stuck as the odd-man out. When integrated chipsets and GPU-CPU hybrids can easily handle full-HD playback, the market for discrete GPUs falls and falls some more. Sure, discrete will always be faster, just like a Porsche is faster than a Toyota, but who makes more money (by a mile)?

    Is Creative still around? Last I heard, they were making MP3 players...
  • Re:Multi Core GPUs (Score:3, Insightful)

    by Wesley Felter ( 138342 ) <wesley@felter.org> on Friday April 11, 2008 @04:23PM (#23040618) Homepage
    Modern GPUs already have 8-16 cores.
  • by klapaucjusz ( 1167407 ) on Friday April 11, 2008 @04:24PM (#23040626) Homepage
    If I understand them right, they're claiming that integrated graphics and CPU/GPU hybrids are just a toy, and that you want discrete graphics if you're serious. Ken Olsen famously said that "the PC is just a toy". When did you last use a "real" computer?
  • by Anonymous Coward on Friday April 11, 2008 @04:45PM (#23040976)

    > CPU and GPU integration is quite logical progression of technology. There are things the GPU is not optimal and same goes to the CPU. It seems that when combined, they prove successful.
    Let's examine this statement:

    "Bus and train integration is quite logical progression of technology. There are things the plane is not optimal and same goes to the bus. It seems that when combined, they prove successful. So let's put wings on a bus."

    Now, I think there are plenty of good reasons why CPU/GPU integration is a good idea (as well as a few good reasons why it's not), but there's nothing logical about the statement you made. Just because a CPU does something well and a GPU does something different well, it doesn't necessarily follow that slapping them together is a better idea than having them be discrete components.

    The key insight is that the modern CPU and the modern GPU are starting to converge in a lot of areas of functionality. The main difference is that CPUs are optimized for serial processing of at most a few threads of arbitrarily complex software, while GPUs are optimized for massively parallel processing of large numbers of pixels using similar, fairly simple programs (shaders).

    Now, the logic core needed to perform these two tasks is highly specific, which is why we have separate CPUs and GPUs to begin with. But there's a lot to be gained by integrating the two more closely. You can share memory interfaces, for example, and perhaps more relevantly for the high-end graphics segment, you can tightly couple CPU and GPU operations across a bus that's going to be a hundred times faster than anything PCI Express can provide, and with latency to die for.

    In short, I agree with your basic point, but I don't think you made a very good case for it.
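
    A minimal CUDA sketch of the two ideas above (illustrative only; every name and number in it is an assumption of this example, not something from the thread): a shader-like kernel applies the same simple program to every pixel in parallel, while a managed allocation gives the CPU and GPU a shared view of one buffer, a rough software stand-in for the shared-memory-interface argument.

```cuda
// Hypothetical sketch: massively parallel "shader-style" work on the GPU
// over a buffer that the CPU and GPU both see (unified/managed memory).
#include <cstdio>
#include <cuda_runtime.h>

// Each thread brightens one pixel -- the "fairly simple program" run in
// parallel over a large number of pixels that the comment describes.
__global__ void brighten(unsigned char* pixels, int n, int amount)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        pixels[i] = min(255, pixels[i] + amount);
}

int main()
{
    const int n = 1920 * 1080;      // one 1080p grayscale frame
    unsigned char* pixels = nullptr;

    // One allocation visible to both the CPU and the GPU.
    cudaMallocManaged(&pixels, n);

    for (int i = 0; i < n; ++i)     // serial CPU-side setup
        pixels[i] = 100;

    brighten<<<(n + 255) / 256, 256>>>(pixels, n, 50);  // parallel GPU pass
    cudaDeviceSynchronize();

    printf("pixel[0] = %d\n", pixels[0]);  // CPU reads the result directly
    cudaFree(pixels);
    return 0;
}
```

    The kernel body is trivial; the argument in the comment above is really about whether that shared buffer sits behind an on-die memory controller or at the far end of a PCI Express link.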
  • by nuzak ( 959558 ) on Friday April 11, 2008 @04:49PM (#23041016) Journal
    > ATI/AMD hasn't been competitive with NVIDIA for two product cycles

    Competitive enough anyway. As long as I'm still on AGP, I'm still getting ATI cards (nVidia's AGP offerings have classically been highly crippled beyond just running on AGP). But sure, I'm a niche, and truth be told, my next system will probably have nVidia.

    But gamer video cards aren't everything, and I daresay not even the majority. If you have a flatscreen TV, chances are good it's got ATI parts in it. Then there's laptops and integrated video, nothing to sneeze at.

  • by koko775 ( 617640 ) on Friday April 11, 2008 @05:09PM (#23041262)
    Even raytracing needs hacks like radiosity.

    I don't buy the 'raytracing is so much better than raster' argument. I do agree that it makes it algorithmically simpler to create near-photorealistic renders, but that doesn't mean that raster's only redeeming quality is that it's less burdensome for simpler scenes.
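
    For what the "algorithmically simpler" claim above means in practice, here is a hypothetical CUDA sketch of the core ray-tracing loop: one primary ray per pixel, intersected against a single hard-coded sphere. The scene, camera, and names are assumptions made up for this example.

```cuda
// Hypothetical sketch of the inner loop of a ray tracer: cast one ray per
// pixel, test it against the scene (here, one sphere), and shade the hit.
#include <cstdio>
#include <cuda_runtime.h>

struct Vec3 { float x, y, z; };

__device__ Vec3  sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
__device__ float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

__global__ void trace(float* image, int w, int h)
{
    int px = blockIdx.x * blockDim.x + threadIdx.x;
    int py = blockIdx.y * blockDim.y + threadIdx.y;
    if (px >= w || py >= h) return;

    // Camera at the origin looking down -z; one primary ray per pixel.
    Vec3 origin = {0.f, 0.f, 0.f};
    Vec3 dir    = {(px - w * 0.5f) / h, (py - h * 0.5f) / h, -1.f};

    // Single-sphere "scene": center (0,0,-3), radius 1. Ray-sphere test is
    // just the discriminant of a quadratic.
    Vec3  center = {0.f, 0.f, -3.f};
    Vec3  oc     = sub(origin, center);
    float b      = 2.f * dot(oc, dir);
    float c      = dot(oc, oc) - 1.f;
    float disc   = b * b - 4.f * dot(dir, dir) * c;

    // Shade: white where the ray hits the sphere, black elsewhere.
    image[py * w + px] = (disc > 0.f) ? 1.f : 0.f;
}

int main()
{
    const int w = 640, h = 480;
    float* image = nullptr;
    cudaMallocManaged(&image, w * h * sizeof(float));

    dim3 block(16, 16), grid((w + 15) / 16, (h + 15) / 16);
    trace<<<grid, block>>>(image, w, h);
    cudaDeviceSynchronize();

    printf("center pixel = %.1f\n", image[(h / 2) * w + w / 2]);  // expect 1.0 (hit)
    cudaFree(image);
    return 0;
}
```

    Everything interesting in a real renderer (acceleration structures, secondary rays, global illumination such as radiosity) layers on top of that loop, which is where the "hacks" mentioned above come in.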
  • by Anne Thwacks ( 531696 ) on Friday April 11, 2008 @05:15PM (#23041302)
    The real reason we have raster is that computers spend more hours rendering Word docs than rendering game images, by at least a factor of 100,000.

    Worse than that, people like me would be quite happy using our 4MB ISA graphics cards, if some sod hadn't gone and invented PCI.

    In fact, 3/4 of all computer users would probably be happy using text mode and printing in 10-pitch Courier if it wasn't for the noise those damned daisy-wheel printers made.

    NVidia are about to get shafted, and, as someone who cannot get his NVidia card to work properly in FreeBSD or Win2k, I say "good riddance". (It works with Ubuntu 7.10, if anyone actually cares.)

  • Re:Intel? (Score:3, Insightful)

    by TheRaven64 ( 641858 ) on Friday April 11, 2008 @05:16PM (#23041312) Journal
    nVidia beating Intel in the GPU market would indeed be news. Intel currently has something like 40% of the GPU market, while nVidia is closer to 30%. Reading the quote from nVidia, I hear echoes of what the management at SGI said just before a few of their employees left, founded nVidia, and destroyed the premium workstation graphics market by delivering almost-as-good consumer hardware for a small fraction of the price.

    nVidia should be very careful that they don't make the same mistake as Creative. Twenty years ago, if you wanted sound from a PC, you bought a Soundblaster. Ten years ago, if you wanted good sound in games, you bought a Soundblaster (or, if you had more taste, a card that did A3D), and it would offload the expensive computations from the CPU and give you a better gaming experience. Now, who buys discrete sound cards? The positional audio calculations are so cheap by today's standards that you can do them all on the CPU and barely notice.
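
    To give a sense of why the positional-audio math mentioned above is now trivial for a CPU, here is a back-of-the-envelope sketch in plain host-side code (no GPU involved). The attenuation and pan model is a made-up simplification for illustration, not any real sound API.

```cuda
// Hypothetical sketch: the kind of per-source math a positional-audio
// engine does each frame -- distance attenuation plus a simple stereo pan.
#include <cmath>
#include <cstdio>

struct Gains { float left, right; };

// Listener-relative source position; +x is to the listener's right.
Gains positionalGains(float x, float y, float z)
{
    float dist  = std::sqrt(x * x + y * y + z * z);
    float atten = 1.0f / (1.0f + dist);                 // inverse-distance falloff
    float pan   = 0.5f + 0.5f * (x / (dist + 1e-6f));   // 0 = hard left, 1 = hard right
    return { atten * (1.0f - pan), atten * pan };
}

int main()
{
    Gains g = positionalGains(2.0f, 0.0f, -3.0f);  // a source ahead and to the right
    printf("L=%.3f R=%.3f\n", g.left, g.right);    // right channel comes out louder
    return 0;
}
```

    Per source and per audio frame that is a square root and a handful of multiplies, which is noise next to what a modern CPU does for one frame of a game.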

  • by ozbird ( 127571 ) on Friday April 11, 2008 @05:25PM (#23041396)
    > This is a VERY SERIOUS problem for the entire world. There are apparently no people available who have both technical understanding and social sophistication.

    Maybe he was out of chairs?
  • by Nullav ( 1053766 ) <moc@noSPAM.liamg.valluN> on Friday April 11, 2008 @05:38PM (#23041516)
    Next you're going to tell me the sky is blue or that too much water can kill me. Onboard video isn't meant to be shiny, just to serve a basic need: being able to see what the hell you're doing. Rather than dismissing Intel because they (and many other board manufacturers) provide a bare-bones video solution, I'm interested in seeing what they'll pop out when they're actually trying.

    By the way, onboard video uses about as much RAM as a browser will (and about as much as Win98 needs to boot in, but I digress), barely a drop in the bucket with 1GB sticks being so cheap now. If 8-32MB of RAM is that much of a problem for you, you have more problems than poor video.
  • by Z34107 ( 925136 ) on Friday April 11, 2008 @05:58PM (#23041690)

    Quite true. With the 8500 and 8600 models, and now the 9500, nVidia trounces AMD even on budget cards.

    But, nVidia got pummeled prior to their acquisition of Yahoo!^H^H^H^H^H^H Voodoo, and the two were quite neck and neck for a long time. So it's more of "the tables have turned (again)" rather than "they have no competition."

    Until AMD completely quits making higher-end video cards, nVidia will have to keep on doing something to stay competitive. Same thing with Firefox - I don't think IE8 would have looked any different than IE5 without something biting at their heels-slash-completely surpassing them.

  • by Anonymous Coward on Friday April 11, 2008 @06:15PM (#23041852)
    That's true, but FPUs didn't require 100+ GB/s dedicated memory systems, analog and digital video output circuitry, or any of the other things GPUs contain beyond their floating-point hardware... You're absolutely right that there's a possibility of things going the way they did for the x87 and Weitek floating-point chips, but it's also possible that having the GPU as an independent device will continue to make sense for a variety of other reasons going forward.
  • by mrchaotica ( 681592 ) * on Friday April 11, 2008 @07:05PM (#23042270)

    Perhaps the limitation is in the ability of humans to model the scene rather than the ability of the computer to render it.

  • by 75th Trombone ( 581309 ) on Friday April 11, 2008 @07:35PM (#23042540) Homepage Journal
    Parent +1 Insightful.

    The reason we can so easily tell the difference between CGI creatures and real creatures is not the photorealism but the animation. Evaluate a screen cap of Lord of the Rings with Gollum in it, and then evaluate that entire scene in motion. The screen cap will look astonishingly realistic compared to the video.

    Computers are catching up to the computational challenges of rendering scenes, but humans haven't quite figured out how to program every muscle movement living creatures make. Attempts at complete realism in 3D animation still fall somewhere in the Uncanny Valley [wikipedia.org].
  • by Anonymous Coward on Friday April 11, 2008 @07:41PM (#23042588)
    Actually, relative to the size of the outfit, Porsche is quite a bit more profitable, AFAIK. They have been labelled 'the most profitable car maker in the world' for a while. Extrinsic numbers don't say too much ('The US is much richer than Luxembourg', big deal!); intrinsic quantities matter. Sorry for being completely off topic. Otherwise, I completely agree with the video part. Audio cards have become a non-issue, and 'standard video' (i.e. playing HD at full frame rate) will do so, too.
  • by OMNIpotusCOM ( 1230884 ) * on Friday April 11, 2008 @09:24PM (#23043204) Homepage Journal

    I haven't, but I have heard of Carmack, and Carmack "seems to think that Intel's direction using traditional ray tracing methods is not going to work [slashdot.org]." I didn't understand anything in that article, but assuming the blurb was correct (and Carmack didn't seem to refute it in the three times he replied to that story), I'd say that they may not be "less and less interested" but maybe they are "less and less right about the direction to take." Take your pick.

    And while my little blurb may have been fundamentally incorrect, I haven't heard anyone say they were looking forward to the new Radeon, and I've heard even fewer say they were looking forward to the new Intel graphics chipset. Have you?

    Splitting hairs on this seems kinda useless. nVidia is really it right now in the graphics world, at least as far as the public is concerned, and I don't see that changing in the near future.
