
Nvidia CEO "Not Afraid" of CPU-GPU Hybrids 228

J. Dzhugashvili writes "Is Nvidia worried about the advent of both CPUs with graphics processor cores and Larrabee, Intel's future discrete graphics processor? Judging by the tone adopted by Nvidia's CEO during a financial analyst conference yesterday, not quite. Huang believes CPU-GPU hybrids will be no different from (and just as slow as) today's integrated graphics chipsets, and he thinks people will still pay for faster Nvidia GPUs. Regarding Larrabee, Huang says Nvidia is going to 'open a can of whoop-ass' on Intel, and that Intel's strategy of reinventing the wheel by ignoring years of graphics architecture R&D is fundamentally flawed. Nvidia also has some new hotness in the pipeline, such as its APX 2500 system-on-a-chip for handhelds and a new platform for VIA processors."
This discussion has been archived. No new comments can be posted.


  • Intel? (Score:4, Funny)

    by icydog ( 923695 ) on Friday April 11, 2008 @03:05PM (#23040380) Homepage
    Did I hear that correctly? NVidia is going to beat Intel in the GPU department? What a breaking development!

    In other news, Aston Martin makes better cars than Hyundai!
    • by symbolset ( 646467 ) on Friday April 11, 2008 @03:24PM (#23040638) Journal

      Ray vs. raster. The reason we have so much tech in raster is that processing power was not sufficient to do ray tracing. If it had been, we'd never have started down the raster branch of development, because it just doesn't work as well. The results are not as realistic with raster. Shadows don't look right. You can't do CSG. You get edge effects. There are a thousand work-arounds for things like reflections of reflections, lens effects and audio reflections. Raster is a hack, and when we have the CPU power to do real-time ray-traced rendering, raster composition will go away.

      Raster was a way to make some fairly believable (if cartoonish) video games. They still require some deliberate suspension-of-disbelief. Only with raytracing do you get the surreal Live-or-memorex feeling of not being able to tell a rendered scene from a photo, except for the fact that the realistic scene depicts something that might be physically impossible.
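A side note for readers who haven't written one of these: the reason shadows "just work" under ray tracing is that occlusion is answered by a second ray per hit point rather than by a shadow-map or stencil trick. A minimal C++ sketch, with all names invented for this example rather than taken from any real renderer:

```cpp
// Minimal illustration: hard shadows via a secondary "shadow ray" per point.
#include <cmath>
#include <cstdio>
#include <vector>

struct Vec3 { double x, y, z; };
static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

struct Sphere { Vec3 center; double radius; };

// Nearest positive ray parameter t along origin + t*dir, or -1 on a miss.
static double hitSphere(Vec3 origin, Vec3 dir, const Sphere& s) {
    Vec3 oc = sub(origin, s.center);
    double a = dot(dir, dir);
    double b = 2.0 * dot(oc, dir);
    double c = dot(oc, oc) - s.radius * s.radius;
    double disc = b * b - 4.0 * a * c;
    if (disc < 0.0) return -1.0;
    double t = (-b - std::sqrt(disc)) / (2.0 * a);
    return t > 1e-6 ? t : -1.0;
}

// A point is shadowed if anything blocks the segment from it to the light.
static bool inShadow(Vec3 point, Vec3 light, const std::vector<Sphere>& scene) {
    Vec3 toLight = sub(light, point);
    for (const Sphere& s : scene) {
        double t = hitSphere(point, toLight, s);
        if (t > 0.0 && t < 1.0) return true;  // occluder between point and light
    }
    return false;
}

int main() {
    std::vector<Sphere> scene = { {{0, 0, -5}, 1.0}, {{0, 3, -5}, 0.5} };
    Vec3 light = {0, 10, -5};
    Vec3 below = {0, -1.5, -5};  // a point directly underneath the big sphere
    std::printf("point below the sphere is %s\n",
                inShadow(below, light, scene) ? "shadowed" : "lit");
}
```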

      • Re: (Score:3, Interesting)

        by caerwyn ( 38056 )
        This is true to some extent, but raster will never completely go away; there are situations where raster is completely appropriate.

        For instance, modern GUIs often use the 3d hardware to handle window transforms, blending and placement. These are fundamentally polygonal objects for which triangle transformation and rasterization is a perfectly appropriate tool and ray tracing would be silly.

        The current polygon model will never vanish completely, even if high-end graphics eventually go to ray tracing instead.
        • If realism is the goal, global illumination techniques, of which ray tracing and ray casting are a part, would be your best bet. Yes, rasters have their place. But it is a small place in the grand scheme of things.

          All bets are off if the intention is not photorealism. Some hybrid of the two may be best depending on the situation.
          • by mikael ( 484 )
            One-level-deep ray tracing is no different from triangle rasterisation, even with supersampling. The problem is when you try to do reflections with an entire scene. A single character will consist of over 10,000 triangles, and the scene itself might consist of 1 million triangles. Then you would have to find the space to store all those coordinates, textures and tangent-space information (octrees?).
            There were experimental systems (PixelPlanes and PixelFlow [unc.edu]) which investigated this problem.
            • I see your 1 million triangles and raise you 349 million more [openrt.de]. (Disclaimer: I'm not affiliated with OpenRT, I just think this is a neat demonstration.)

              Textures, vertices, and the like take up memory with ray tracing just like rasterization. There's a bit more flexibility with a ray tracer, though, to store non-triangular primitives; most fast real-time ray tracers just use triangles.

              As for acceleration structures, octrees have largely been superseded by kd-trees (preferably constructed using
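For anyone unfamiliar with the acceleration structures being discussed: a kd-tree recursively partitions the primitives with axis-aligned splitting planes, so a ray only has to visit a small subset of them. The toy C++ sketch below builds a median-split kd-tree over primitive centroids; it is purely illustrative, and production tracers usually drive the build with a surface-area heuristic instead:

```cpp
// Toy median-split kd-tree over primitive centroids (illustrative only).
#include <algorithm>
#include <cstdio>
#include <memory>
#include <vector>

struct Point { double x, y, z; };

struct KdNode {
    int axis = -1;                      // -1 marks a leaf
    double split = 0.0;                 // position of the splitting plane
    std::vector<int> prims;             // primitive indices (leaves only)
    std::unique_ptr<KdNode> left, right;
};

static double axisValue(const Point& p, int axis) {
    return axis == 0 ? p.x : (axis == 1 ? p.y : p.z);
}

// Recursively split the index list at the median centroid, cycling the axes.
static std::unique_ptr<KdNode> build(const std::vector<Point>& centroids,
                                     std::vector<int> idx, int depth) {
    auto node = std::make_unique<KdNode>();
    if (idx.size() <= 4 || depth > 20) {        // small enough: make a leaf
        node->prims = std::move(idx);
        return node;
    }
    int axis = depth % 3;
    auto mid = idx.begin() + idx.size() / 2;
    std::nth_element(idx.begin(), mid, idx.end(), [&](int a, int b) {
        return axisValue(centroids[a], axis) < axisValue(centroids[b], axis);
    });
    node->axis = axis;
    node->split = axisValue(centroids[*mid], axis);
    node->left = build(centroids, std::vector<int>(idx.begin(), mid), depth + 1);
    node->right = build(centroids, std::vector<int>(mid, idx.end()), depth + 1);
    return node;
}

int main() {
    std::vector<Point> centroids;
    for (int i = 0; i < 1000; ++i)
        centroids.push_back({i * 0.37, (i * 7 % 100) * 0.11, (i * 13 % 50) * 0.2});
    std::vector<int> idx(centroids.size());
    for (int i = 0; i < (int)idx.size(); ++i) idx[i] = i;
    auto root = build(centroids, idx, 0);
    std::printf("built a kd-tree over %zu centroids\n", centroids.size());
}
```

Traversal then only descends into the children a ray actually crosses, which is where the win over a brute-force loop across a million triangles comes from.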

      • Only with raytracing do you get the surreal Live-or-memorex feeling of not being able to tell a rendered scene from a photo, except for the fact that the realistic scene depicts something that might be physically impossible.

        Photos, maybe. But we're still a loooooong way off for real time video when you consider that it is still relatively easy to tell CGI from live action in the highest budget prerendered movies. At close-ups anyway.
        • Re: (Score:3, Interesting)

          by wattrlz ( 1162603 )
          There must be a conspiracy behind that. There's no way studios with seven- and eight-figure budgets and virtually limitless cpu cycles at their disposal could be releasing big-screen features that are regularly shown up by video games and decade-old tv movies. Maybe it has something to do with greenscreening to meld the cgi with live action characters, perhaps it's some sort of nostalgia, or they think that the general public just isn't ready to see movie-length photo-realistic features, but ther
          • by mrchaotica ( 681592 ) * on Friday April 11, 2008 @06:05PM (#23042270)

            Perhaps the limitation is in the ability of the humans to model the scene rather than the ability of the computer to render it.

            • by 75th Trombone ( 581309 ) on Friday April 11, 2008 @06:35PM (#23042540) Homepage Journal
              Parent +1 Insightful.

              The reason we can so easily tell the difference between CGI creatures and real creatures is not the photorealism but the animation. Evaluate a screen cap of Lord of the Rings with Gollum in it, and then evaluate that entire scene in motion. The screen cap will look astonishingly realistic compared to the video.

              Computers are catching up to the computational challenges of rendering scenes, but humans haven't quite yet figured out how to program every muscle movement living creatures make. Attempts for complete realism in 3D animation still fall somewhere in the Uncanny Valley [wikipedia.org].
      • Re: (Score:3, Insightful)

        by koko775 ( 617640 )
        Even raytracing needs hacks like radiosity.

        I don't buy the 'raytracing is so much better than raster' argument. I do agree that it makes it algorithmically simpler to create near-photorealistic renders, but that doesn't mean that raster's only redeeming quality is that it's less burdensome for simpler scenes.
        • Even raytracing needs hacks like radiosity.

          Hacks like radiosity? Indeed, a ray tracer needs to be supplemented with a global illumination algorithm in order to produce plausibly realistic images, but there are many global illumination algorithms to choose from, and radiosity is the only one I know of that can be implemented without tracing rays.

          Photon mapping, for instance, is quite nice, and performs comparatively well. (Normal computers aren't fast enough to do it in real time yet, but they'll get th
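Since photon mapping came up: the core idea is small enough to sketch in a few dozen lines. Shoot photons from the light, record where they land, then estimate irradiance from the local photon density. The C++ below is a bare-bones illustration under simplifying assumptions (a point light over the plane y = 0, no bounces, no kd-tree lookup), not Jensen's full algorithm:

```cpp
// Bare-bones photon-mapping sketch: emit, store, then density-estimate.
#include <cmath>
#include <cstdio>
#include <random>
#include <vector>

struct Photon { double x, z, power; };     // hit records on the plane y = 0

int main() {
    const int photonCount = 100000;
    const double lightPower = 100.0;       // watts emitted into the lower hemisphere
    const double lightY = 4.0;             // light sits at (0, 4, 0)
    const double kPi = 3.14159265358979;

    std::mt19937 rng(42);
    std::uniform_real_distribution<double> uni(-1.0, 1.0);

    // Emission pass: each photon carries an equal share of the light's power.
    std::vector<Photon> map;
    map.reserve(photonCount);
    for (int i = 0; i < photonCount; ++i) {
        double dx, dy, dz, len2;
        do {   // rejection-sample a downward direction inside the unit sphere
            dx = uni(rng); dy = uni(rng); dz = uni(rng);
            len2 = dx * dx + dy * dy + dz * dz;
        } while (len2 > 1.0 || len2 < 1e-9 || dy >= 0.0);
        double t = -lightY / dy;           // where the ray meets the plane y = 0
        map.push_back({t * dx, t * dz, lightPower / photonCount});
    }

    // Gather pass: irradiance ~= power landing within a disc / disc area.
    const double r = 0.25;                 // gather radius around the query point
    double gathered = 0.0;
    for (const Photon& p : map)
        if (p.x * p.x + p.z * p.z <= r * r) gathered += p.power;
    std::printf("estimated irradiance under the light: %.2f W/m^2\n",
                gathered / (kPi * r * r));
}
```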

      • Re: (Score:3, Interesting)

        The results are not as realistic with raster. Shadows don't look right.
        As John Carmack mentioned in a recent interview, this is in fact a bonus, for shadows as well as other things.

        The fact is that "artificial" raster shadows, lighting and reflections typically look more impressive than the "more realistic" results of ray tracing. This alone explains why raster will maintain its dominance, and why ray tracing will not catch on.
      • by ardor ( 673957 ) on Friday April 11, 2008 @08:17PM (#23043158)
        Wrong. All of it.

        Raytracing doesn't magically get you better image quality. EXCEPT for shadows, the results look just like rasterization. As usual, people mix up raytracing with path tracing, photon mapping, radiosity, and other GI algorithms. Note: GI can be applied to rasterization as well.

        So, which "benefits" are left? Refraction/reflection, haze, and any kind of ray distortion - SECONDARY ray effects. Primary rays can be fully modeled with rasterization, which gives you much better performance because of the trivial cache coherency and simpler calculations. (In a sense, rasterization can be seen as a cleverly optimized primary-ray-pass). This is why hybrid renderers make PERFECT sense. Yes, I know ray bundles, they are hard to get right, and again: for primary rays, raytracing makes no sense.

        "Suspension of disbelief" is necessary with raytracing too. You confuse the rendering technique with lighting models, animation quality and so on. "edge effects" is laughable, aliasing WILL occur with raytracing as well unless you shoot multiple rays per pixel (and guess what... rasterizers commonly HAVE MSAA).

        Jeez, when will people stop thinking all this BS about raytracing? As if it were a magical thingie capable of miraculously enhancing your image quality....

        Raytracing has its place - as an ADDITION to a rasterizer, to ease implementation of the secondary ray effects (which are hard to simulate with pure rasterization). This is the future.
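To make the hybrid argument concrete, here is a structural C++ sketch. Everything in it is hypothetical (no shipping engine's API): primary visibility is assumed to have been produced by a raster pass into a G-buffer, and rays are spent only on a secondary effect, in this case a single mirror bounce:

```cpp
// Structural sketch of a hybrid renderer: rasterized primary visibility
// (faked here as a ready-made G-buffer sample) plus one traced reflection.
#include <algorithm>
#include <cstdio>

struct Vec3 { double x, y, z; };
static Vec3 add(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 scale(Vec3 a, double s) { return {a.x * s, a.y * s, a.z * s}; }
static double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// One pixel's worth of data the raster pass would leave behind.
struct GBufferSample {
    Vec3 position, normal, albedo;
    double reflectivity;   // 0 = diffuse only, 1 = mirror
};

// Stand-in for the secondary ray: here it just looks up a sky gradient.
// A real hybrid renderer would intersect the scene's acceleration structure.
static Vec3 traceSecondary(Vec3 origin, Vec3 dir) {
    (void)origin;
    double t = 0.5 * (dir.y + 1.0);
    return add(scale({1.0, 1.0, 1.0}, 1.0 - t), scale({0.5, 0.7, 1.0}, t));
}

// Shade a rasterized sample: cheap direct lighting plus one traced reflection.
static Vec3 shade(const GBufferSample& g, Vec3 lightDir, Vec3 viewDir) {
    double diffuse = std::max(0.0, dot(g.normal, lightDir));
    Vec3 direct = scale(g.albedo, diffuse);
    // Mirror the view direction about the normal and trace only that ray.
    Vec3 refl = add(viewDir, scale(g.normal, -2.0 * dot(viewDir, g.normal)));
    Vec3 bounce = traceSecondary(g.position, refl);
    return add(scale(direct, 1.0 - g.reflectivity), scale(bounce, g.reflectivity));
}

int main() {
    GBufferSample g{{0, 0, 0}, {0, 1, 0}, {0.8, 0.2, 0.2}, 0.3};
    Vec3 c = shade(g, {0, 1, 0}, {0, -0.7071, 0.7071});
    std::printf("shaded pixel: %.2f %.2f %.2f\n", c.x, c.y, c.z);
}
```

The point about cache coherency falls out of this structure: the raster pass keeps its cheap, coherent primary visibility, and rays are reserved for the effects rasterization genuinely can't express.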
    • ... In other news, Aston Martin makes better cars than Hyundai!
      In light of the often facetious nature of any sentence containing the words "British engineering", the comparison of Aston Martin's reputation for reliability with Hyundai's, and the comparison of their current parent companies' reputations and stock prices... My word! That is news, indeed!
      • by Sciros ( 986030 )
        Aston Martin's privately owned. Bought from Ford by rich Kuwaitis for $850 million or something.

        Can't say that's necessarily a good thing, but I guess Ford wanted the money.

        And yeah, Hyundais are better built than Astons. But Astons are better in many other regards of course.
    • Re: (Score:3, Insightful)

      by TheRaven64 ( 641858 )
      nVidia beating Intel in the GPU market would indeed be news. Intel currently have something like 40% of the GPU market, while nVidia is closer to 30%. Reading the quote from nVidia, I hear echoes of the same thing that the management at SGI said just before a few of their employees left, founded nVidia, and destroyed the premium workstation graphics market by delivering almost as good consumer hardware for a small fraction of the price.

      nVidia should be very careful that they don't make the same mistake a

  • More details here in HotHardware's coverage: http://www.hothardware.com/News/NVIDIA_Gets_Aggressive_Dismisses_CPUGPU_Fusion/ [hothardware.com] Jen-Hsun squarin' off!
  • Multi Core GPUs (Score:2, Interesting)

    by alterami ( 267758 )
    What AMD should really try to do is start combining their cpu technology and their graphics technology and make some multi core GPUs. They might be better positioned to do this than Intel or Nvidia.
    • Re: (Score:3, Insightful)

      Modern GPUs already have 8-16 cores.
    • That is basically what GPUs do already. You'll notice modern ones have something like 128 "stream processors".
      • by makomk ( 752139 )
        Yep, and I think they're essentially dumbed-down general CPU cores; apparently they don't even support vector operations and are scalar-only.
        • apparently they don't even support vector operations and are scalar-only.
          Yes, that's true. The reason is that it's hard to keep those vector units fully utilized. You get better utilization with more scalar units rather than fewer vector ones (for the same area of silicon).
        • Yep, and I think they're essentially dumbed-down general CPU cores; apparently they don't even support vector operations and are scalar-only.

          I'm not sure that statement makes sense. Sure, you could talk about the GPU operating on 128 scalar values, but you could just as well talk about it operating on a single 128-dimensional vector. (Or any combination thereof: N vectors of degree (128 / N).)
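The equivalence described here can be shown in a few lines: the same arithmetic reads either as 128 independent scalar lanes or as one operation on a 128-dimensional value. The C++ below uses std::valarray purely for notation; real GPUs schedule the lanes in hardware, of course:

```cpp
// The same math as 128 scalar lanes (the loop) or one 128-wide vector op
// (the valarray expression). Illustrative only.
#include <cstdio>
#include <valarray>

int main() {
    const int lanes = 128;
    std::valarray<float> a(2.0f, lanes), b(3.0f, lanes), c(1.0f, lanes);

    // "Scalar" view: each lane applies the same op to its own element.
    std::valarray<float> perLane(lanes);
    for (int i = 0; i < lanes; ++i)
        perLane[i] = a[i] * b[i] + c[i];

    // "Vector" view: one multiply-add on a 128-dimensional value.
    std::valarray<float> asVector = a * b + c;

    std::printf("lane 0 = %.1f, lane 127 = %.1f, views agree: %s\n",
                perLane[0], perLane[127],
                (perLane == asVector).min() ? "yes" : "no");
}
```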

        • Re: (Score:3, Interesting)

          by et764 ( 837202 )
          Why does your CPU need vector operations if you have a vector of CPUs?
    • by Zebra_X ( 13249 )
      Lol... Just like 64-bit and multicore, AMD was talking about Fusion way before anyone else was. Intel has "stolen" yet another idea from AMD. Unfortunately, the reality is that AMD doesn't have the capital to refresh its production lines as often as Intel - and, I think, to some extent the human capital to execute on the big ideas.
  • Let's Face It (Score:3, Insightful)

    by DigitalisAkujin ( 846133 ) on Friday April 11, 2008 @03:15PM (#23040494) Homepage
    Until Intel can show us Crysis in all its GPU-raping glory running on its chipset at 1600x1200 with all settings on Ultra High, Nvidia and ATI will still be kings of high-end graphics. Then again, if all Intel wants to do is create a substandard alternative to those high-end cards just to run Vista Aero and *nix Beryl, then they have already succeeded.
    • by SanityInAnarchy ( 655584 ) <ninja@slaphack.com> on Friday April 11, 2008 @03:25PM (#23040654) Journal

      Until Intel can show us Crysis

      If Intel is right, there won't be much of an effect on existing games.

      Intel is focusing on raytracers, something Crytek has specifically said that they will not do. Therefore, both Crysis and any sequels won't really see any improvement from Intel's approach.

      If Intel is right, what we are talking about is the Crysis-killer -- a game that looks and plays much better than Crysis (and maybe with a plot that doesn't completely suck [penny-arcade.com]), and only on Intel hardware, not on nVidia.

      Oh, and Beryl has been killed and merged. It's just Compiz now, and Compiz Fusion if you need more.

      • When Intel has a ray tracing Crysis-killing demo, we'll be in 2013 playing things 10-100x more complex/faster on raster hardware.
    • Re: (Score:3, Interesting)

      by LurkerXXX ( 667952 )
      Intel has open specs on their integrated video hardware, so Open Source folks can write their own stable drivers.

      ATI and Nvidia do not. I know who I'm rooting for to come up with good hardware...
      • Re: (Score:2, Informative)

        by hr.wien ( 986516 )
        ATI have open specs. At least for a lot of their hardware. They are releasing more and more documentation as it gets cleaned up and cleared by legal. Open Source ATI drivers are coming on in leaps and bounds as a result.
    • by Alioth ( 221270 )
      The thing is, nvidia's statement feels spookily like "famous last words".

      I'm sure DEC engineers pooh-poohed Intel back in the early 90s when the DEC Alpha blew away anything Intel made by a factor of four. But a few short years later, Chipzilla had drawn level. Now Alpha processors aren't even made and DEC is long deceased.

      Nvidia ought not to rest on its laurels like DEC did, or Intel will crush them.
  • Some of the comments made were very interesting. He really slammed someone that I take to be either an Intel rep or an associate. The best was when that rep/associate/whoever criticized Nvidia about their driver issues in Vista, and the slam-dunk response, which I paraphrase: "If we [Nvidia] only had to support the same product/application that Intel has [Office 2003] for the last 5 years then we probably wouldn't have as many driver issues as well. But since we have new products/applications that our dr
  • by WoTG ( 610710 ) on Friday April 11, 2008 @03:21PM (#23040590) Homepage Journal
    IMHO, Nvidia is stuck as the odd-man out. When integrated chipsets and GPU-CPU hybrids can easily handle full-HD playback, the market for discrete GPUs falls and falls some more. Sure, discrete will always be faster, just like a Porsche is faster than a Toyota, but who makes more money (by a mile)?

    Is Creative still around? Last I heard, they were making MP3 players...
    • by bmajik ( 96670 )

      just like a Porsche is faster than a Toyota, but who makes more money

      Porsche, actually, if you're referring to profit

      Porsche is one of the most profitable automakers. If you've ever looked at a Porsche options sheet, it will become clear why this is the case. They also have brilliant/lucky financial people.

      http://www.bloomberg.com/apps/news?pid=20601087&sid=aYvaIoPRz4Vg&refer=home [bloomberg.com]

      • by WoTG ( 610710 )
        Wow, point taken. I guess that wasn't the best example! I knew I should have said Ferrari.

        I wonder what portion of Porsche's profit comes from Volkswagen?
    • by Miseph ( 979059 )
      Pretty decent ones, too. I managed to pick up a 1GB Zen Stone for under $10, and the only complaint I have is that it won't play ogg and there isn't a Rockbox port out there for it (yet). Since I dislike iTunes anyway (it has too many "features" that serve only to piss me off, and any UI that treats me like a 5-year-old just sets me on edge), it pretty much does everything that I want an mp3 player to do.

      Plus, as an added bonus, I don't have to pretend that I'm hip or trendy while I listen to it; if people thi
    • by sustik ( 90111 )
      Regarding full HD playback:

      ASUS M2N-VM DVI (with integrated video) + AMD64 BE-2300 (45W) plays true 1080p at no more than 80% CPU (and only one of the two cores is used!). Tested on a trailer for Pirates of the Caribbean (since I have no Blu-ray player yet).

      This is on Linux 2.6, with recent mplayer using the nvidia driver.

      Quite happy with it!
  • He should be afraid (Score:5, Interesting)

    by Yvan256 ( 722131 ) on Friday April 11, 2008 @03:22PM (#23040608) Homepage Journal
    I, for one, don't want a GPU which requires 25W+ in standby mode.

    My Mac mini has a maximum load of 110W. That's the Core 2 Duo CPU, the integrated GMA950, 3GB of RAM, a 2.5" drive and a DVD burner, not to mention FireWire 400 and four USB 2.0 ports under maximum load (the FW400 port being 8W alone).

    Granted, the GMA950 sucks compared to nVidia's current offerings, but do they have any plans for low-power GPUs? I'm pretty sure the whole company can't survive on FPS-crazed gamers' revenues alone.

    They should start thinking about asking Intel to integrate their (current) laptop GPUs into Intel CPUs.
    • by forsey ( 1136633 ) on Friday April 11, 2008 @03:36PM (#23040818)
      Actually nVidia is working on a new technology called HybridPower, which involves a computer with both an onboard and a discrete graphics card, where the low-power onboard card is used most of the time (when you are just in your OS environment of choice), but when you need the power (for stuff like games) the discrete card boots up.
      • ...the low power on board card is used most of the time (when you are just in your OS environment of choice), but when you need the power (for stuff like games) the discrete card boots up.


        Cool, does that mean I can use my 3dfx voodoo2's pass-through cable again?
    • Nvidia already makes IGPs that are pretty low power; they don't even need fans.

      For ultimate low power, there's the future VIA/Nvidia hookup: http://www.dailytech.com/NVIDIA%20Promises%20Powerful%20Sub45%20Processing%20Platform%20to%20Counter%20Intel/article11452.htm [dailytech.com]
    • Sigh (Score:3, Informative)

      by Sycraft-fu ( 314770 )
      Of COURSE they do; in fact, they already HAVE low-power offerings. I'm not sure why people seem to think the 8800 is the only card nVidia makes. nVidia is quite adept at taking their technology and scaling it down. Just reduce the clock speed, cut off shader units and such, and there you go. In the 8 series they have an 8400. I don't know what the power draw is, but it doesn't have any extra power connectors, so it is under 75 watts peak by definition (that's all PCIe can handle). They have even lower power cards
      • by Yvan256 ( 722131 )
        Since you seem to know nVidia's lineup, what would you recommend for a fan-less PCI or AGP card good enough to run Final Fantasy XI in 1280x1024?

        And before you say "FF XI is old, any current card will do", let me remind you that it's an MMORPG with more than a few dozen players on the screen at once (in Jeuno, for example), and my target is to have around 20 FPS in the worst-case scenario.

        I'm asking for PCI because I might try to dump my current AMD Athlon 2600+/KT6 Delta box (which is huge) with a fanless mini-
        • Hmmm, well I don't know that game in particular however in general I think a 7600GS should work ok. That's enough to run WoW at a decent rez with decent settings. It's also one of the few models available for AGP (the AGP market is getting real thin these days). http://www.newegg.com/Product/Product.aspx?Item=N82E16814121064 [newegg.com] is a link to buy it.

          If you got a system with PCIe, there's more options, but AGP is being phased out so there's less cards available for it.

          If you need more power, you'll have to get a
      • by jhol13 ( 1087781 )
        But why can't, e.g., the 8800 shut down most of its cores when they are not used? I.e. use only one or two for Compiz or Aero and use all of them with some whizbang game. Or drop the cores' frequency depending on load?
  • by klapaucjusz ( 1167407 ) on Friday April 11, 2008 @03:24PM (#23040626) Homepage
    If I understand them right, they're claiming that integrated graphics and CPU/GPU hybrids are just a toy, and that you want discrete graphics if you're serious. Ken Olsen famously said that "the PC is just a toy". When did you last use a "real" computer?
    • by Alioth ( 221270 )
      And of course they and game developers slam Intel's product... but 99.9% of PCs are never used to play games.

      Nvidia's statement sounds like famous last words, too. I think their laurels are getting pressed too flat from resting on them. Just as Intel's CPUs eventually caught up and overtook the Alpha, the same might happen with their graphics chipsets.
  • So, if you have a hybrid chip, why not put it on a motherboard with a slot for a nice nvidia card. Then you'll get all the raytracing goodness from intel, plus the joyful rasterbation of nvidia's finest offerings. The word "coprocessor" exists for a reason. Or am I missing something here?
    • Or am I missing something here?
      Yeah, you're missing some money from your wallet. Most people won't waste their money on two different GPUs, just like they won't buy PPUs or Killer NICs.
  • "Huang says Nvidia is going to 'open a can of whoop-ass' on Intel..."

    This is a VERY SERIOUS problem for the entire world. There are apparently no people available who have both technical understanding and social sophistication.

    Huang is obviously ethnic Chinese. It is likely he is imitating something he heard in a movie or TV show. He certainly did not realize that only ignorant angry people use that phrase.

    Translating, that phrase, and the boasting in general, says to me: "Huang must be fired. nVid
    • only ignorant angry people use that phrase.
      Only ignorant angry people make such generalizations.
    • This is a VERY SERIOUS problem for the entire world. There are apparently no people available who have both technical understanding and social sophistication.

      Maybe he was out of chairs?
    • Re: (Score:3, Informative)

      by nuzak ( 959558 )
      Huang is obviously ethnic Chinese. It is likely he is imitating something he heard in a movie or TV show.

      Yeah, them slanty-eyed furriners just can't speak English right, can they?

      Huang is over 40 years old and has lived in the US since he was a child. Idiot.

  • ouch (Score:3, Informative)

    by Lord Ender ( 156273 ) on Friday April 11, 2008 @03:38PM (#23040840) Homepage
    NVDA was down 7% in the stock market today. As an Nvidia shareholder, that hurts!

    If you don't believe Intel will ever compete with Nvidia, now is probably a good time to buy. NVDA has a forward P/E of 14. That's a "value stock" price for a leading tech company... you don't get opportunities like that often. NVDA also has no debt on the books, so the credit crunch does not directly affect them.
  • by scumdamn ( 82357 ) on Friday April 11, 2008 @03:38PM (#23040850)
    Intel is and always has been CPU-centric. That's all they ever seem to focus on, because it's what they do best. Nvidia is focusing 100% on GPUs because it's what they do best. AMD seems to have it right with their combination of the two (by necessity) because they're focusing on a mix between the two. I'm seriously stoked about the 780G chipset they rolled out this month because it's an integrated chipset that doesn't suck and actually speeds up an ATI video card if you add the right one. Granted, AMD isn't the fastest when it comes to either graphics or processors, but at least they have a platform with a chipset, CPU, and graphics that work together. Chipsets have needed to be a bit more powerful for a long-ass time.
    • by Kjella ( 173770 )
      Only if you think the non-serious gamer market is a big hit. It's been a long time since I heard anyone complain about GPU performance except in relation to a game. Graphics cards run at the resolution you want, and play a lot of, well... not GPU-intensive games, or rather non-GPU games at all, but games you could run on highly inefficient platforms like Flash and still do alright. And for the games that do consider GPU performance, more is almost always better. Sure, it's an integrated chipset but I've yet
      • Re: (Score:3, Informative)

        by scumdamn ( 82357 )
        The non-serious gamer market is TOTALLY a big hit. And the benefit of this chipset for many users is that you get decent 3D performance with a motherboard for the same price you would pay for a motherboard without the integrated graphics.

        And if you decide to bump it up a notch and buy a 3450 it operates in Hybrid Crossfire so your onboard graphics aren't totally disabled. Explain to me how that isn't cool?

  • Can of Whoop Ass?? (Score:3, Interesting)

    by TomRC ( 231027 ) on Friday April 11, 2008 @03:56PM (#23041108)
    Granted, NVidia is way out ahead in graphics performance, but generally you can tell someone is getting nervous when they start in with the belligerent bragging.

    The risk for NVidia isn't that Intel will surpass them, or even necessarily approach their best performance. The risk is that Intel might start catching up, cutting (further) into NVidia's market share.
    AMD's acquisition of ATI seems to imply that they see tight integration of graphics to be at least cheaper for a given level of performance, or higher performance for a given price. Apply that same reasoning to Intel, since they certainly aren't likely to let AMD have that advantage all to themselves.

    Now try to apply that logic to NVidia - what are they going to do, merge with a distant-last-place x86 maker?
  • NVidia may just put a CPU or two in their graphics chips. They already have more transistors in the GPU than Intel has in many of their CPUs. They could license a CPU design from AMD. A CPU design today is a file of Verilog, so integrating that onto a part with a GPU isn't that big a deal.

  • Just like the FPU (Score:5, Interesting)

    by spitzak ( 4019 ) on Friday April 11, 2008 @04:22PM (#23041360) Homepage
    Once upon a time, floating point was done on a separate chip. You could buy a cheaper "non-professional" machine that emulated the FPU in software and ran slower. You could also upgrade your machine by adding the FPU chip.

    Such FPUs do not exist today.

    I think Nvidia should be worried about this.

    • by Kjella ( 173770 )
      The FPU is a sparring partner; the GPU is a pipeline. Except for a few special uses, the GPU doesn't need to talk back to the CPU. The FPU (that's floating point unit, doing non-integer math) very often returned a result that the CPU needed back. The result is that it makes sense to put the CPU and FPU very close; the same doesn't hold true for the GPU. It's substantially more difficult to cool one 100W chip than two 50W chips. Hell, even on a single chip Intel and nVidia could draw up "this is my part, this is
    • I think Nvidia should be worried about this.
      I think Nvidia should talk to IBM's Power division about doing a JV, preferably with a Cell CPU core. After all, the PS/3, the XBox 360, and the Wii all use PowerPC cores... and Nvidia developed the PS/3 GPU. Worst case x86 scenario is VIA buying Nvidia and doing it. If that's the way the market is going, so be it.
  • The problem... (Score:3, Interesting)

    by AdamReyher ( 862525 ) * <adam@@@pylonhosting...com> on Friday April 11, 2008 @04:22PM (#23041370) Homepage
    ...is that Nvidia is saying that Intel is ignoring years of GPU development. Umm, wait. Isn't a GPU basically a mini-computer/CPU by itself that exclusively handles graphics calculations? By making this statement, I think they've forgotten who Intel is. Intel has more than enough experience in the field to go off on their own and make GPUs. Is it something to be scared of? Probably not, because as he correctly points out, a dedicated GPU will be more powerful. However, it's not something that can be ignored. We'll just have to wait and see.
    • by slittle ( 4150 )
      If Intel wants into the market for real, I think ATI/nVidia should be plenty afraid.

      Intel not only has their own fabrication plants, but they're high-end ones, not three or four process generations behind. Apple (of all companies) switched to Intel's CPUs for a reason - they have the capacity and the technology to produce lots of fast, low-power components and a market base large enough (CPUs and chips of all kinds) to keep their GPU technology ahead of the curve.

      With the market tending towards lower power, mobile comput
  • by billcopc ( 196330 ) <vrillco@yahoo.com> on Friday April 11, 2008 @05:45PM (#23042100) Homepage
    Having the GPU built into the CPU is primarily a cost-cutting measure. Take one low-end CPU, add one low-end GPU, and you have a single-chip solution that consumes a bit less power than separate components.

    Nobody expects the CPU+GPU to yield gaming performance worth a damn, because the two big companies that are looking into this amalgam both have underperforming graphics technology. Do they both make excellent budget solutions? Yes, they certainly do, but for those who crave extreme speed, the only option is NVidia.

    That said, not everyone plays shooters. Back in my retail days, I'd say I moved 50 times more bottom-end GPUs than top-end ones. Those Radeon 9250s were $29.99 piles of alien poop, but cheap poop is enough for the average norm. The only people who spent more than $100 on a video card were teenagers and comic book guys (and of course, my awesome self).
  • I think this comment on the story is pretty insightful:

    When you start talking pre-emptively about your competitor's vapor, you're officially worried.
  • Lrrr: "This is Earth's most foolish GPU vendor. Why doesn't Nvidia, the largest of the GPU vendors, simply eat the other competitors?"
    Ndnd: "Maybe they're saving it for sweeps."
  • I can't for the life of me imagine someone with a thick Taiwanese accent saying "we're going to open up a can of whoop-ass on Intel".
  • by CompMD ( 522020 ) on Saturday April 12, 2008 @02:38AM (#23044824)
    The large corporations and engineering companies that have *THOUSANDS* of high-end workstations need graphics hardware compatible with complex, specialized software. I'm talking Unigraphics, CATIA, Patran, Femap, etc. You need to use the hardware certified by the software publisher, otherwise you don't get support and you can't trust the work you are doing to be correct. And the vast majority of the cards that are up to the challenge are nvidia cards.

    I have done CAD/CAM for ages, and my P3-750 with a Quadro4 700XGL isn't noticeably slower than a P4-3.4 with a Radeon X300SE running Unigraphics NX 5. I have a P3-500 with a QuadroFX-1000 card that freaking flies running CATIA V5. Again, in contrast, my 1.8GHz P4 laptop with integrated Intel graphics sucks balls running either UG or CATIA.

    Speaking for the workstation users out there, please keep making high performance GPUs, Nvidia.
