Nvidia CEO "Not Afraid" of CPU-GPU Hybrids 228
J. Dzhugashvili writes "Is Nvidia worried about the advent of both CPUs with graphics processor cores and Larrabee, Intel's future discrete graphics processor? Judging by the tone adopted by Nvidia's CEO during a financial analyst conference yesterday, not quite. Huang believes CPU-GPU hybrids will be no different from (and just as slow as) today's integrated graphics chipsets, and he thinks people will still pay for faster Nvidia GPUs. Regarding Larrabee, Huang says Nvidia is going to 'open a can of whoop-ass' on Intel, and that Intel's strategy of reinventing the wheel by ignoring years of graphics architecture R&D is fundamentally flawed. Nvidia also has some new hotness in the pipeline, such as its APX 2500 system-on-a-chip for handhelds and a new platform for VIA processors."
Intel? (Score:4, Funny)
In other news, Aston Martin makes better cars than Hyundai!
Ray tracing for the win (Score:5, Informative)
Ray vs. raster. The reason we have so much tech invested in raster is that processing power wasn't sufficient to do ray tracing. If it had been, we'd never have started down the raster branch of development, because it just doesn't work as well. The results are not as realistic with raster: shadows don't look right, you can't do CSG, and you get edge effects. There are a thousand work-arounds for things like reflections of reflections, lens effects and audio reflections. Raster is a hack, and when we have the CPU power to do real-time ray-traced rendering, raster composition will go away.
Raster was a way to make some fairly believable (if cartoonish) video games. They still require some deliberate suspension of disbelief. Only with ray tracing do you get the surreal is-it-live-or-is-it-Memorex feeling of not being able to tell a rendered scene from a photo, except for the fact that the realistic scene depicts something that might be physically impossible.
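To make the above concrete, here is a tiny, hypothetical sketch (not from any of the posters) of the per-pixel work a ray tracer does: one primary ray per pixel, a ray-sphere intersection test, and simple diffuse shading against a point light. Everything it omits (acceleration structures, materials, secondary rays) is exactly where the real cost and the realism live.

```cpp
// Toy illustration of the per-pixel work a ray tracer does: shoot one primary
// ray per pixel, intersect it against a single sphere, and shade with a point
// light. Hypothetical scene values; real tracers add acceleration structures,
// materials, and secondary rays.
#include <cmath>
#include <cstdio>

struct Vec { double x, y, z; };
static Vec sub(Vec a, Vec b)    { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec add(Vec a, Vec b)    { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec mul(Vec a, double s) { return {a.x * s, a.y * s, a.z * s}; }
static double dot(Vec a, Vec b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec norm(Vec a)          { return mul(a, 1.0 / std::sqrt(dot(a, a))); }

// Returns the distance along the ray to the nearest hit, or a negative value on miss.
static double hitSphere(Vec origin, Vec dir, Vec center, double radius) {
    Vec oc = sub(origin, center);
    double b = 2.0 * dot(oc, dir);
    double c = dot(oc, oc) - radius * radius;
    double disc = b * b - 4.0 * c;          // dir is unit length, so a == 1
    if (disc < 0.0) return -1.0;
    return (-b - std::sqrt(disc)) / 2.0;
}

int main() {
    const int W = 32, H = 32;
    Vec eye   = {0, 0, 0};
    Vec sph   = {0, 0, -3};
    Vec light = {2, 2, 0};

    for (int y = 0; y < H; ++y) {
        for (int x = 0; x < W; ++x) {
            // Primary ray through the pixel on a unit image plane at z = -1.
            Vec dir = norm({(x + 0.5) / W - 0.5, 0.5 - (y + 0.5) / H, -1.0});
            double t = hitSphere(eye, dir, sph, 1.0);
            char shade = ' ';
            if (t > 0.0) {
                Vec p = add(eye, mul(dir, t));          // hit point
                Vec n = norm(sub(p, sph));              // surface normal
                Vec l = norm(sub(light, p));            // direction to light
                double diff = dot(n, l);                // simple diffuse term
                shade = diff > 0.6 ? '#' : (diff > 0.2 ? '+' : '.');
            }
            std::putchar(shade);
        }
        std::putchar('\n');
    }
    return 0;
}
```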
Re: (Score:3, Interesting)
For instance, modern GUIs often use the 3D hardware to handle window transforms, blending and placement. These are fundamentally polygonal objects for which triangle transformation and rasterization are a perfectly appropriate tool, and ray tracing would be silly.
The current polygon model will never vanish completely, even if high-end graphics eventually go to ray tracing instead.
Re: (Score:2)
All bets are off if the intention is not photorealism. Some hybrid of the two may be best depending on the situation.
Re: (Score:2)
There were experimental systems (PixelPlanes and PixelFlow [unc.edu]) which investigated this problem.
memory, acceleration structures, big scenes (Score:2)
I see your 1 million triangles and raise you 349 million more [openrt.de]. (Disclaimer: I'm not affiliated with OpenRT, I just think this is a neat demonstration.)
Textures, vertices, and the like take up memory with ray tracing just like they do with rasterization. A ray tracer does have a bit more flexibility to store non-triangular primitives, though most fast real-time ray tracers just use triangles anyway.
As for acceleration structures, octrees have largely been superseded by kd-trees (preferably constructed using
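The comment above is cut off, but for readers unfamiliar with the acceleration structures it mentions, here is a toy, hypothetical kd-tree built over points with simple median splits. Real-time ray tracers build kd-trees (or BVHs) over triangles and choose split planes with cost heuristics such as the surface-area heuristic; this sketch only shows the recursive shape of the structure.

```cpp
// Toy kd-tree over 3D points with median splits, just to illustrate the kind
// of spatial hierarchy real-time ray tracers build over scene geometry.
// Production tracers split over triangles and pick split planes with cost
// heuristics; this sketch only shows the recursive structure.
#include <algorithm>
#include <memory>
#include <vector>

struct Point { float p[3]; };

struct KdNode {
    Point point;                       // point stored at this node
    int axis;                          // split axis: 0 = x, 1 = y, 2 = z
    std::unique_ptr<KdNode> left, right;
};

// Recursively builds a balanced kd-tree by splitting at the median of the
// current axis, cycling x -> y -> z with depth.
static std::unique_ptr<KdNode> build(std::vector<Point>& pts,
                                     int begin, int end, int depth) {
    if (begin >= end) return nullptr;
    int axis = depth % 3;
    int mid = begin + (end - begin) / 2;
    std::nth_element(pts.begin() + begin, pts.begin() + mid, pts.begin() + end,
                     [axis](const Point& a, const Point& b) {
                         return a.p[axis] < b.p[axis];
                     });
    auto node = std::make_unique<KdNode>();
    node->point = pts[mid];
    node->axis = axis;
    node->left  = build(pts, begin, mid, depth + 1);
    node->right = build(pts, mid + 1, end, depth + 1);
    return node;
}

std::unique_ptr<KdNode> buildKdTree(std::vector<Point> pts) {
    return build(pts, 0, static_cast<int>(pts.size()), 0);
}
```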
Re: (Score:2)
Photos, maybe. But we're still a loooooong way off for real time video when you consider that it is still relatively easy to tell CGI from live action in the highest budget prerendered movies. At close-ups anyway.
Re: (Score:3, Interesting)
Re:Ray tracing for the win (Score:4, Insightful)
Perhaps the limitation is in the ability of the humans to model the scene rather than the ability of the computer to render it.
Re:Ray tracing for the win (Score:5, Insightful)
The reason we can so easily tell the difference between CGI creatures and real creatures is not the photorealism of it, but the animation. Evaluate a screen cap of Lord of the Rings with Gollum in it, and then evaluate that entire scene in motion. The screen cap will look astonishingly realistic compared to the video.
Computers are catching up to the computational challenges of rendering scenes, but humans haven't quite yet figured out how to program every muscle movement living creatures make. Attempts for complete realism in 3D animation still fall somewhere in the Uncanny Valley [wikipedia.org].
Re: (Score:3, Insightful)
I don't buy the 'raytracing is so much better than raster' argument. I do agree that it makes it algorithmically simpler to create near-photorealistic renders, but that doesn't mean that raster's only redeeming quality is that it's less burdensome for simpler scenes.
"hacks like radiosity" (Score:2)
Hacks like radiosity? Indeed, a ray tracer needs to be supplemented with a global illumination algorithm in order to produce plausibly realistic images, but there are many global illumination algorithms to choose from, and radiosity is the only one I know of that can be implemented without tracing rays.
Photon mapping, for instance, is quite nice, and performs comparatively well. (Normal computers aren't fast enough to do it in real time yet, but they'll get th
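The post above is truncated, but as a rough, hypothetical illustration of the photon-mapping idea it refers to: photons are shot from a light, their hit points and power are stored in a map, and irradiance at a point is later estimated by gathering nearby photons and dividing by the gather area. This sketch uses a single ground plane and a flat photon list; a real photon mapper stores photons in a kd-tree and handles arbitrary geometry and multiple bounces.

```cpp
// Toy photon-mapping sketch: shoot photons from a point light, record where
// they land on a ground plane (y = 0), then estimate irradiance at a query
// point by gathering nearby photons and dividing their power by the gather
// disc area. Everything here is simplified for illustration.
#include <cmath>
#include <cstdio>
#include <random>
#include <vector>

struct Photon { float x, z, power; };   // photons landing on the y = 0 plane

int main() {
    std::mt19937 rng(42);
    std::uniform_real_distribution<float> uni(-1.0f, 1.0f);

    const int   N          = 100000;     // photons emitted
    const float lightPower = 100.0f;     // total light power (arbitrary units)
    const float lightY     = 4.0f;       // point light at (0, 4, 0)

    std::vector<Photon> map;
    for (int i = 0; i < N; ++i) {
        // Sample a random downward direction (rejection-sample a unit vector).
        float dx, dy, dz, len2;
        do {
            dx = uni(rng); dy = uni(rng); dz = uni(rng);
            len2 = dx * dx + dy * dy + dz * dz;
        } while (len2 > 1.0f || len2 < 1e-6f || dy >= 0.0f);
        float inv = 1.0f / std::sqrt(len2);
        dx *= inv; dy *= inv; dz *= inv;
        // Intersect the ray (0, lightY, 0) + t * d with the plane y = 0.
        float t = -lightY / dy;
        map.push_back({dx * t, dz * t, lightPower / N});
    }

    // Irradiance estimate at (qx, qz): sum photon power inside radius r,
    // divided by the disc area pi * r^2.
    const float qx = 0.5f, qz = 0.5f, r = 0.25f;
    float gathered = 0.0f;
    for (const Photon& p : map) {
        float ddx = p.x - qx, ddz = p.z - qz;
        if (ddx * ddx + ddz * ddz <= r * r) gathered += p.power;
    }
    const float PI = 3.14159265f;
    std::printf("estimated irradiance near (%.2f, %.2f): %f\n",
                qx, qz, gathered / (PI * r * r));
    return 0;
}
```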
Re: (Score:3, Interesting)
The fact is that "artificial" raster shadows, lighting and reflections typically look more impressive than the "more realistic" results of ray tracing. This alone explains why raster will maintain its dominance, and why ray tracing will not catch on.
Re:Ray tracing for the win (Score:5, Informative)
Raytracing doesn't magically get you better image quality. EXCEPT for shadows, the results look just like rasterization. As usual, people mix up raytracing with path tracing, photon mapping, radiosity, and other GI algorithms. Note: GI can be applied to rasterization as well.
So, which "benefits" are left? Refraction/reflection, haze, and any kind of ray distortion - SECONDARY ray effects. Primary rays can be fully modeled with rasterization, which gives you much better performance because of the trivial cache coherency and simpler calculations. (In a sense, rasterization can be seen as a cleverly optimized primary-ray pass.) This is why hybrid renderers make PERFECT sense. Yes, I know about ray bundles; they are hard to get right, and again: for primary rays, raytracing makes no sense.
"Suspension of disbelief" is necessary with raytracing too. You're confusing the rendering technique with lighting models, animation quality and so on. "Edge effects" is laughable; aliasing WILL occur with raytracing as well unless you shoot multiple rays per pixel (and guess what... rasterizers commonly HAVE MSAA).
Jeez, when will people stop believing all this BS about raytracing? As if it were a magical thingie capable of miraculously enhancing your image quality...
Raytracing has its place - as an ADDITION to a rasterizer, to ease implementation of the secondary ray effects (which are hard to simulate with pure rasterization). This is the future.
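As a hypothetical sketch of the hybrid approach the parent advocates (rasterize the primary visibility, trace only the secondary rays), the snippet below takes a per-pixel G-buffer sample that a rasterizer would have produced and adds a single traced mirror reflection on top. The names (GBufferSample, traceScene) are made up for illustration, and traceScene is stubbed out so the sketch compiles on its own.

```cpp
// Sketch of a hybrid renderer's secondary pass: the rasterizer has already
// written a G-buffer (position + normal per pixel); the ray tracer only
// handles the reflection ray. Names and structures are hypothetical.
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

static Vec3  sub(Vec3 a, Vec3 b)    { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(Vec3 a, Vec3 b)    { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3  scale(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }

// One pixel's worth of data produced by the rasterization (primary) pass.
struct GBufferSample {
    Vec3 position;      // world-space position of the visible surface
    Vec3 normal;        // world-space shading normal (unit length)
    Vec3 baseColor;     // result of the usual raster shading
    float reflectivity; // how much of the traced reflection to add
};

// Stub standing in for the renderer's real ray/scene query; returns a flat
// "sky" color so this sketch compiles on its own. A real renderer would also
// offset the ray origin slightly to avoid self-intersection.
Vec3 traceScene(Vec3 /*origin*/, Vec3 /*direction*/) { return {0.4f, 0.6f, 0.9f}; }

// Secondary pass: add a traced mirror reflection on top of the raster shading.
Vec3 shadeWithReflection(const GBufferSample& g, Vec3 cameraPos) {
    Vec3 view = sub(g.position, cameraPos);          // incident direction
    float len = std::sqrt(dot(view, view));
    view = scale(view, 1.0f / len);
    // Mirror reflection: r = v - 2 (v . n) n
    Vec3 refl = sub(view, scale(g.normal, 2.0f * dot(view, g.normal)));
    Vec3 bounce = traceScene(g.position, refl);      // the "secondary ray"
    return { g.baseColor.x + g.reflectivity * bounce.x,
             g.baseColor.y + g.reflectivity * bounce.y,
             g.baseColor.z + g.reflectivity * bounce.z };
}

int main() {
    // One made-up G-buffer sample: a surface 5 units in front of the camera.
    GBufferSample g{{0, 0, -5}, {0, 0, 1}, {0.2f, 0.2f, 0.2f}, 0.5f};
    Vec3 c = shadeWithReflection(g, {0, 0, 0});
    std::printf("shaded pixel: %.2f %.2f %.2f\n", c.x, c.y, c.z);
    return 0;
}
```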
Re: (Score:2)
Re: (Score:2)
Can't say that's necessarily a good thing, but I guess Ford wanted the money.
And yeah, Hyundais are better built than Astons. But Astons are better in many other regards of course.
Re: (Score:3, Insightful)
nVidia should be very careful that they don't make the same mistake a
More Details On The Call, Here (Score:2, Offtopic)
Multi Core GPUs (Score:2, Interesting)
Re: (Score:3, Insightful)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
I'm not sure that statement makes sense. Sure, you could talk about the GPU operating on 128 scalar values, but you could just as well talk about it operating on a single 128-dimensional vector. (Or any combination thereof: N vectors of dimension 128 / N.)
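A small, hypothetical illustration of the point: a batch of 128 floats can be described as 128 scalar operations or as N vectors of dimension 128 / N, and the arithmetic comes out identical either way; only the bookkeeping changes.

```cpp
// Illustration of the scalar-vs-vector view of a 128-wide batch: the same
// elementwise multiply described as 128 scalars or as N vectors of 128 / N
// components each. The numbers are arbitrary; the point is the equivalence.
#include <array>
#include <cassert>
#include <cstddef>

constexpr std::size_t LANES = 128;

// View 1: 128 independent scalar operations.
void scaleAsScalars(std::array<float, LANES>& data, float s) {
    for (std::size_t i = 0; i < LANES; ++i)
        data[i] *= s;
}

// View 2: N vectors of dimension LANES / N, processed one vector at a time.
template <std::size_t N>
void scaleAsVectors(std::array<float, LANES>& data, float s) {
    static_assert(LANES % N == 0, "N must divide the lane count");
    constexpr std::size_t DIM = LANES / N;
    for (std::size_t v = 0; v < N; ++v)           // for each vector
        for (std::size_t c = 0; c < DIM; ++c)     // for each component
            data[v * DIM + c] *= s;
}

int main() {
    std::array<float, LANES> a{}, b{};
    for (std::size_t i = 0; i < LANES; ++i) a[i] = b[i] = float(i);

    scaleAsScalars(a, 2.0f);
    scaleAsVectors<8>(b, 2.0f);      // 8 vectors of dimension 16

    assert(a == b);                  // same result, different description
    return 0;
}
```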
Re: (Score:3, Interesting)
Re: (Score:2)
Let's Face It (Score:3, Insightful)
NOTHING to do with existing games. (Score:5, Informative)
If Intel is right, there won't be much of an effect on existing games.
Intel is focusing on raytracers, something Crytek has specifically said that they will not do. Therefore, both Crysis and any sequels won't really see any improvement from Intel's approach.
If Intel is right, what we are talking about is the Crysis-killer -- a game that looks and plays much better than Crysis (and maybe with a plot that doesn't completely suck [penny-arcade.com]), and only on Intel hardware, not on nVidia.
Oh, and Beryl has been killed and merged. It's just Compiz now, and Compiz Fusion if you need more.
Re: (Score:2)
Re: (Score:2)
Very true, and the plot and dialog were horrible.
Re: (Score:3, Interesting)
ATI and Nvidia do not. I know who I'm rooting for to come up with good hardware...
Re: (Score:2, Informative)
Re: (Score:2)
I'm sure DEC engineers pooh-poohed Intel back in the early 90s, when the DEC Alpha blew away anything Intel made by a factor of four. But a few short years later, Chipzilla had drawn level. Now Alpha processors aren't even made and DEC is long deceased.
Nvidia ought not to rest on its laurels like DEC did, or Intel will crush them.
Interesting comments in the call (Score:2)
Did anyone expect him to surrender? (Score:5, Insightful)
Is Creative still around? Last I heard, they were making MP3 players...
Re: (Score:2)
Porsche, actually, if you're referring to profit
Porsche is one of the most profitable automakers. If you've ever looked at a Porsche options sheet, it will become clear why this is the case. They also have brilliant/lucky financial people.
http://www.bloomberg.com/apps/news?pid=20601087&sid=aYvaIoPRz4Vg&refer=home [bloomberg.com]
Re: (Score:2)
I wonder what portion of Porsche's profit comes from Volkswagen?
Re: (Score:2)
Re: (Score:2)
Plus, as an added bonus, I don't have to pretend that I'm hip or trendy while I listen to it; if people thi
Re: (Score:2)
ASUS M2N-VM DVI (with integrated video) + AMD64 BE-2300 (45W) plays true 1080p at a maximum of 80% CPU (and only one of the two cores is used!!). Tested on a trailer for Pirates of the Caribbean (since I have no Blu-ray player yet).
This is on Linux 2.6, with recent mplayer using the nvidia driver.
Quite happy with it!
He should be afraid (Score:5, Interesting)
My Mac mini has a maximum load of 110W. That's the Core 2 Duo CPU, the integrated GMA950, 3GB of RAM, a 2.5" drive and a DVD burner, not to mention FireWire 400 and four USB 2.0 ports under maximum load (the FW400 port being 8W alone).
Granted, the GMA950 sucks compared to nVidia's current offerings, but do they have any plans for low-power GPUs? I'm pretty sure the whole company can't survive on revenue from FPS-crazed gamers alone.
They should start thinking about asking Intel to integrate their (current) laptop GPUs into Intel CPUs.
Re:He should be afraid (Score:4, Informative)
Re: (Score:2)
Cool, does that mean I can use my 3dfx voodoo2's pass-through cable again?
Re: (Score:2)
For ultimate low power, there's the future VIA/Nvidia hookup: http://www.dailytech.com/NVIDIA%20Promises%20Powerful%20Sub45%20Processing%20Platform%20to%20Counter%20Intel/article11452.htm [dailytech.com]
Sigh (Score:3, Informative)
Re: (Score:2)
And before you say "FF XI is old, any current card will do", let me remind you that it's a MMORPG with more than a few dozen players on the screen at once (in Jeuno, for example), and my target is to have around 20 FPS in worst-case scenario.
I'm asking for PCI because I might try to dump my current AMD Athlon 2600+/KT6 Delta box (which is huge) with a fanless mini-
Re: (Score:2)
If you've got a system with PCIe, there are more options, but AGP is being phased out, so there are fewer cards available for it.
If you need more power, you'll have to get a
Re: (Score:2)
The PC is just a toy (Score:4, Insightful)
Re: (Score:2)
Nvidia's statement sounds like famous last words, too. I think their laurels are getting pressed too flat from resting on them. Just as Intel's CPUs eventually caught up and overtook the Alpha, the same might happen with their graphics chipsets.
Can't we all just get along? (Score:2)
Re: (Score:2)
Translation: "nVidia needs a better top manager." (Score:2, Funny)
This is a VERY SERIOUS problem for the entire world. There are apparently no people available who have both technical understanding and social sophistication.
Huang is obviously ethnic Chinese. It is likely he is imitating something he heard in a movie or TV show. He certainly did not realize that only ignorant angry people use that phrase.
Translating, that phrase, and the boasting in general, says to me: "Huang must be fired. nVid
Re: (Score:2)
Re:Translation: "nVidia needs a better top manager (Score:3, Insightful)
Maybe he was out of chairs?
Re: (Score:3, Informative)
Yeah, them slanty-eyed furriners just can't speak English right, can they?
Huang is over 40 years old and has lived in the US since he was a child. Idiot.
Re: (Score:2)
Re: (Score:2)
Okay, here is more detail about why he is foolish. (Score:4, Interesting)
Quote from the article: "Nvidia CEO Jen-Hsun Huang was quite vocal on those fronts, arguing hybrid chips that mix microprocessor and graphics processor cores will be no different from systems that include Intel or AMD integrated graphics today."
My opinion: There would be no need for all the talk if there were no chance of competition. Everyone knows there will be new competition from Intel's Larrabee and from AMD/ATI. Everyone knows that "no different" is a lie. Lying exposes the Nvidia CEO as a weak man.
"... he explained that Nvidia is continuously reinventing itself and that it will be two architectural refreshes beyond the current generation of chips before Larrabee launches."
The entire issue is that Intel+Larrabee and AMD+ATI will make Nvidia irrelevant for most users. The GPU will be on the motherboard. Nvidia will sell only to gamers who are willing to pay extra, a lot extra.
"Huang also raised the prospect of application and API-level compatibility problems with Larrabee. Intel has said Larrabee will support the DirectX 10 and OpenGL application programming interfaces just like current AMD and Nvidia GPUs, but Huang seemed dubious Intel could deliver on that front."
Intel, in this case, means Intel and Microsoft working together. Both are poorly managed companies in many ways, but both are managed well enough to ensure that the Microsoft product works with the Intel hardware. Sure, it is an easy guess that Microsoft will release several buggy versions, because Microsoft has a history of treating its customers as though they were beta testers, but eventually everything will work correctly.
'[NVidia VP] Tamasi went on to shoot down Intel's emphasis on ray tracing, which the chipmaker has called "the future for games." '
Ray tracing is certainly the future for games; there is no question about that. The question is when, because the processor power required is huge. It's my guess, but an easy guess, that Mr. Tamasi is lying; he is apparently trying to take advantage of the ignorance of financial analysts.
"Additionally, Tamasi believes rasterization is inherently more scalable than ray tracing. He said running a ray tracer on a cell phone is "hard to conceive."
This is apparently another attempt to confuse the financial analysts, who often have only a pretend interest in technical things. Anyone who understands the statement knows it is nonsense. No one is suggesting that there will be ray tracing on cell phones. My opinion is that this is another lie.
"We're gonna be highly focused on bringing a great experience to people who care about it," he explained, adding that Nvidia hardware simply isn't for everyone."
That was a foolish thing to say. That's the whole issue! In the future, Nvidia's sales will drop because "Nvidia hardware simply isn't for everyone." Most computers will not have separate video adapters, whereas they did before. Only powerful game machines will need to buy from Nvidia.
'Huang added, "I would build CPUs if I could change the world [in doing so]." ' Later in the article, it says, "Nvidia is readying a platform to accompany VIA's next-generation Isaiah processor, which should fight it out with Intel's Atom in the low-cost notebook and desktop arena"
Translation: Before, every desktop computer needed a video adapter, which came from a company other than the CPU maker, a company like Nvidia. Now, video adapters will mostly be supplied by the CPU makers. In response, Nvidia will start making low-end CPUs. It is questionable whether Nvidia can compete with Intel and AMD at making any kind of CPU.
Typo corrections (Score:2)
Typing too fast.
Re: (Score:2)
The entire issue is that Intel+Larrabee and AMD+ATI will make Nvidia irrelevant for most users. The GPU will be on the motherboard. Nvidia will sell only to gamers who are willing to pay extra, a lot extra.
AMD acquired ATI almost two years ago. How did this make Nvidia irrelevant? GPUs are ALREADY on motherboards, and have been for years. Plus, embedding the CPU+GPU is not much cheaper than the discrete solutions, since yield will be much lower. Moreover, CPU+GPU on one die has its risks. What if the GPU part fails? What if the CPU fails? You'll need to replace both.
Ray tracing is certainly the future for games; there is no question about that.
I beg to differ. Many questions surround this. Would you know more than John Carmack? [slashdot.org]
Before, every desktop computer needed a video adapter, which came from a company other than the CPU maker, a company like Nvidia. Now, video adapters will mostly be supplied by the CPU makers
Ok. You are showing your ignorance of the history
Re:Okay, here is more detail about why he is fooli (Score:2)
Don't generalize. "Everyone" doesn't think that. I don't think that. Hence you're a liar.
Wow. GPUs have been on
ouch (Score:3, Informative)
If you don't believe Intel will ever compete with Nvidia, now is probably a good time to buy. NVDA has a forward P/E of 14. That's a "value stock" price for a leading tech company... you don't get opportunities like that often. NVDA also has no debt on the books, so the credit crunch does not directly affect them.
I think AMD has a better plan (Score:5, Interesting)
Re: (Score:2)
Re: (Score:3, Informative)
And if you decide to bump it up a notch and buy a 3450, it operates in Hybrid CrossFire, so your onboard graphics aren't totally disabled. Explain to me how that isn't cool?
Can of Whoop Ass?? (Score:3, Interesting)
The risk for NVidia isn't that Intel will surpass them, or even necessarily approach their best performance. The risk is that Intel might start catching up, cutting (further) into NVidia's market share.
AMD's acquisition of ATI seems to imply that they see tight integration of graphics to be at least cheaper for a given level of performance, or higher performance for a given price. Apply that same reasoning to Intel, since they certainly aren't likely to let AMD have that advantage all to themselves.
Now try to apply that logic to NVidia - what are they going to do, merge with a distant-last-place x86 maker?
NVidia may just add a CPU. (Score:2)
NVidia may just put a CPU or two in their graphics chips. They already have more transistors in the GPU than Intel has in many of their CPUs. They could license a CPU design from AMD. A CPU design today is a file of Verilog, so integrating that onto a part with a GPU isn't that big a deal.
Just like the FPU (Score:5, Interesting)
Such FPUs do not exist today.
I think Nvidia should be worried about this.
Re: (Score:2)
Re: (Score:2)
The problem... (Score:3, Interesting)
Re: (Score:2)
Intel not only has its own fabrication plants, but they're high-end ones, not three or four process generations behind. Apple (of all companies) switched to Intel's CPUs for a reason: they have the capacity and the technology to produce lots of fast, low-power components and a market base large enough (CPUs and chips of all kinds) to keep their GPU technology ahead of the curve.
With the market tending towards lower power, mobile comput
CPU+GPU is mostly a cost-cutting measure (Score:3, Interesting)
Nobody expects the CPU+GPU to yield gaming performance worth a damn, because the two big companies that are looking into this amalgam both have underperforming graphics technology. Do they both make excellent budget solutions ? Yes they certainly do, but for those who crave extreme speed, the only option is NVidia.
That said, not everyone plays shooters. Back in my retail days, I'd say I moved 50 times more bottom-end GPUs than top-end ones. Those Radeon 9250s were $29.99 piles of alien poop, but cheap poop is enough for the average user. The only people who spent more than $100 on a video card were teenagers and comic book guys (and of course, my awesome self).
Best comment (Score:2)
When you start talking pre-emptively about your competitor's vapor, you're officially worried.
solution! (Score:2)
Ndnd: "Maybe they're saving it for sweeps."
Is there video of that? (Score:2)
Meanwhile, in the corporate world (Score:3, Interesting)
I have done CAD/CAM for ages, and my P3-750 with a Quadro4 700XGL isn't noticeably slower than a P4-3.4 with a Radeon X300SE running Unigraphics NX 5. I have a P3-500 with a QuadroFX-1000 card that freaking flies running CATIA V5. Again, in contrast, my 1.8GHz P4 laptop with integrated Intel graphics sucks balls running either UG or CATIA.
Speaking for the workstation users out there, please keep making high performance GPUs, Nvidia.
Re:Not scared... no kidding? (Score:5, Insightful)
Yes I know they got bought by AMD, but they still exist and they still make GPUs AFAIK.
And if your argument is that nVidia is better than ATI, let me remind you that ATI/nVidia and Intel/AMD keep leapfrogging each other every few years.
Re: (Score:3, Informative)
AMD is in a world of hurt right now, with Intel consistently maintaining a lead over them in the CPU segment, and NVIDIA maintaining a lead over them in the GPU segment. They're doing some interesting, synergistic things between the CPU and GPU sides, but who knows if that'll pan out. Meanwhile, they're being for
Re:Not scared... no kidding? (Score:4, Insightful)
Competitive enough anyway. As long as I'm still on AGP, I'm still getting ATI cards (nVidia's AGP offerings have classically been highly crippled beyond just running on AGP). But sure, I'm a niche, and truth be told, my next system will probably have nVidia.
But gamer video cards aren't everything, and I daresay not even the majority. If you have a flatscreen TV, chances are good it's got ATI parts in it. Then there's laptops and integrated video, nothing to sneeze at.
Re: (Score:2)
Re: (Score:2)
They sure are
Meantime I've purchased an AGP X1950 then a HD3850 that's fully functional, save for the more limited texture bandwidth from the AGP bus, and frankly that's just not noticeable.
Yeah, I probably could have upgraded mobos, but I'm a sucker for increme
Who would build a laptop with AGP? (Score:2)
Re: (Score:3, Insightful)
Quite true. With the 8500 and 8600 models, and now the 9500, nVidia trounces AMD even on budget cards.
But, nVidia got pummeled prior to their acquisition of Yahoo!^H^H^H^H^H^H Voodoo, and the two were quite neck and neck for a long time. So it's more of "the tables have turned (again)" rather than "they have no competition."
Until AMD completely quits making higher-end video cards, nVidia will have to keep on doing something to stay competitive. Same thing with Firefox - I don't think IE8 would have l
Re: (Score:2)
Re: (Score:2)
I don't know what they were smoking, but they pissed off Microsoft, too, and got left out of developing DirectX 8 IIRC. Because they didn't have their hands on the next version of DirectX, they were way behind the ball when the SDK proper was released.
But, I'm thrilled with the hardware they produce. And as long as AMD stays no more than one generation behind them, they won't be able to rest on their laurels, either.
Re: (Score:2, Informative)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Since the Voodoo 2, Nvidia has been making the best-performing video cards almost the whole time, except when they let the 3dfx guys make the FX5 series. The TNT, TNT2, GeForce (especially), GeForce2, GeForce3, and GeForce4 all had them on top. The FX5 was shit and the Radeon 9xxx was better, but Nvidia caught up again in the next generation. So yes, Nvidia lost one generation because they used other developers, but it's not like the game changes all the time. The X1950 was a nice card for the price, though.
And co
Re: (Score:2)
Re: (Score:2)
Re: (Score:3, Insightful)
I haven't, but I have heard of Carmack, and Carmack "seems to think that Intel's direction using traditional ray tracing methods is not going to work [slashdot.org]." I didn't understand anything in that article, but assuming the blurb was correct (and Carmack didn't seem to refute it in the 3 times he replied to that story), then I'd say they may not be "less and less interested" but maybe they are "less and less right about the direction to take." Take your pick.
And while my little blurb may have been fund
Re: (Score:2, Insightful)
Let's examine this statement:
"Bus and train integration is quite logical progression of technology. There are things the plane is not optimal and same goes to the bus. It seems that when combined, they prove successful. So let's put wings on a bus."
Now, I think there are plenty of good reasons why CPU/GPU integration is a
Re: (Score:2)
That's why I think AMD/ATI, rathe
Re: (Score:2)
Re: (Score:2)
Re: (Score:3, Insightful)
By the way, onboard video uses about as much RAM as a browser will use (And about as much as Win98 needs to boot in, but I digr
Re: (Score:2)
Upgrades? Just grab a new processor, tear up your thumbs removing the cheapshit heatsink you grabbed off eBay for $2, apply some thermal grease that absolutely refuses to come off your finger, replace the heatsink and