Larrabee Team Is Focused On Rasterization
Vigile writes "Tom Forsyth, a well-respected developer on Intel's Larrabee project, has spoken out to dispel rumors that the Larrabee architecture is ignoring rasterization, and in fact claims that the new GPU will perform very well with current DirectX and OpenGL titles. The recent debate between rasterization and ray tracing in the world of PC games has largely centered on the pending arrival of Intel's discrete Larrabee GPU technology. Game industry luminaries like John Carmack, Tim Sweeney and Cevat Yerli have chimed in on the discussion, saying that ray tracing is unlikely to be accepted as the primary rendering method for games within the next five years."
Re:Uh (Score:4, Funny)
*Sigh* (Score:5, Interesting)
Re: (Score:3, Interesting)
Re: (Score:2)
Re:*Sigh* (Score:4, Funny)
Funny, that's the same thing that happens when I buy ATI...
Re: (Score:2)
No need to take them seriously (Score:4, Insightful)
Re: (Score:2)
Re: (Score:2)
Re: (Score:1)
They have the no-idea-how, you mean.
Larrabee will be awesome ... for everybody! (Score:3, Interesting)
If your motherboard has Larrabee, you could use it for physics calculations while your add-in GPU does the graphics.
This makes a whole lot more sense than trying to get a single GPU to do both tasks.
Re:*Sigh* (Score:5, Insightful)
Hard to take them seriously? Are you kidding? The very low end is the massive majority of the market, and Intel has that well wrapped up. They are probably the #1 PC GPU manufacturer out there. If you want cheap or low power, you get an Intel GPU. Also, if you want 100% rock solid drivers that are supported out of the box and cream the competition in terms of stability (speaking about Linux here), you buy an Intel GPU.
So yeah, if you discount the market leader in driver stability and sales volume, and care only about speed, then Intel isn't competitive.
In my world, I will continue to take them seriously, since I always aim to buy Intel graphics if I can. If they get faster, that's a nice bonus.
Re: (Score:2)
Are Intel GPUs stronger than Wii GPU? (Score:2)
Re: (Score:2)
Re: (Score:2)
It is impossible to target [Intel's 3D graphics] hardware[, which is less powerful than that of NVIDIA or ATI,] when developing AAA titles, however.
Are you claiming that Intel GPUs are less powerful than the Hollywood GPU of Nintendo's Wii console, or are you claiming that too few AAA titles come out for Wii?
Who said anything about the Wii? I must have missed that.
I brought up the Wii. I was using it as an example of a platform for which the major video game publishers publish titles, but whose GPU is less powerful than today's low-end to mid-range 3D video cards for PCs. Now why do you think it's possible to develop games for Wii but not for PCs with Intel graphics?
About this 6200... (Score:3, Informative)
Yes, it's still nothing spectacular, but as long as I can play (with tweaked settings, of course) Orange Box titles, Hellgate: London, Sins of a Solar Empire and Mythos, I'm happy.
Re: (Score:1)
Re: (Score:2)
(again, it's nothing dramatic, but I guess it's enough for a lot of folks, including me - I'm in the market for a new ThinkPad R61 14", and the cheapest one, with the X3100, will do the job fine; plus I'm somehow under the, perhaps misjudged, impression that Intel graphics will give the longest battery life; anyway, Lenovo doesn't deal with AMD...)
Re: (Score:2)
About those drivers... (Score:2)
Can anyone at Intel confirm that this will be the case with the new drivers? Or will ATI beat them to it? Because more than anything else, this is what will determine my next video card purchase: Rock solid open source drivers that have all the features of the Windows drivers.
Re: (Score:3, Informative)
Also, if you want 100% rock solid drivers that are supported out of the box and cream the competition in terms of stability (speaking about Linux here), you buy an Intel GPU.
I wouldn't go that far. I've had stability issues with my intel graphics. Some OpenGL screensavers and some games running under Wine will crash or lock up X, regardless of what settings I use in my xorg.conf (XAA vs EXA, Composite on/off). Furthermore, several extensions (like Composite) that are fairly stable with NVidia drivers are still buggy as hell with the intel drivers.
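For reference, the kind of xorg.conf knobs being talked about look roughly like this (just a sketch of the xf86-video-intel options mentioned above, with example values, not a recommended configuration):

```
# Sketch only: the acceleration-method and Composite settings referenced above.
Section "Device"
    Identifier "Intel Graphics"
    Driver     "intel"
    Option     "AccelMethod" "EXA"    # or "XAA"
EndSection

Section "Extensions"
    Option     "Composite" "Disable"  # or "Enable"
EndSection
```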
I never had any stability issues whatsoever with the last NVidia card I bought. Then again, that card is now useless to me since NVid
Re: (Score:2)
Note to self: Must remember that this new form does not work in konqueror - 3rd post I've had to retype today.
Re: (Score:1)
I smell a fanboi...
Re:*Sigh* (Score:5, Interesting)
I happen to know a great many people that work at Intel. And I just happen to also do product testing and marketing focus groups for them. All centered around gaming.
This was a topic that intel did not take seriously 5-10 years ago. They take it deadly serious now.
I spoke with Paul Otellini on one occasion on the topic of Intel gaming. It went more or less like this.
Paul- Which Intel chip do you have in your machine at home?
Me- It's an AMD actually.
Paul- You work for Intel, your family works here and you buy an AMD?
Me- I run what gives me the highest performance in what I do. It also happened to be cheaper, but that's secondary.
Paul- They only beat us in gaming! Our chips are better at EVERYTHING else.
Me- Gaming leads the market.
Paul- No it doesn't.
Me- No one upgrades twice a year to keep up with MS office. We upgrade to keep up with Carmack.
Paul- If I offered to give you a couple of our next gen processors, would you use them?
Me- I'd try them out, but if they can't beat my current machine I won't use them. Even if they are free. Neither will anyone I know. We literally spend a couple thousand dollars a year keeping our machines state of the art so we can squeeze an extra frame per second out of our systems. We aren't going to use anything that isn't the best.
You want me and my market segment to take you seriously? Take us seriously. We make up a small segment, but we are fanatical.
___
A couple years later, I got an email from him.
It was actually sent as a response to several key divisions in intel, because several people had asked why we (intel) care about gamers when they make up less than 5% of the PC market (it's actually closer to 1%).
___
Paul- We care about gamers because gamers grow up. They grow up to work mainly in IT fields. The gamers from 5-10 years ago are now the IT professionals we most want to be on our side. They are the ones making purchasing decisions and recommendations and they do so based on what they know. They know AMD better than us because we ignored them for so long.
Why do we care about games? We don't. We care about the people playing them and we want them to identify with our products.
____
So now you have some insight as to where intel thinks this is all going. It's not that they care about gaming or graphics, because they really don't. They care about the people behind it, and getting them hooked into a brand that "supports" them.
Then there are the really obvious reasons for Intel getting into graphics: Vista and other next-gen OSes and GUIs are going to use a lot of hardware acceleration. Which means discrete graphics cards aren't just for the desktop anymore; they're for the server and the workstation too.
Add to that using the GPU to do certain types of parallel processing at much better throughput than you can get from a CPU.
The motivation should be obvious.
*Posted AC for my sake. I like my contacts at Intel. I'm hoping Paul doesn't remember talking to a PFY about his company's gaming culture.
Translation: "We want gamers to like us." (Score:2)
This sounds real to me. Intel CEO Paul Otellini [wikipedia.org] could have said that.
But it must be translated from corporate-speak. It doesn't necessarily mean anything, except that he wants to tell you something you want to hear. The translation is: "We want gamers to like us." You already knew that.
I don't intend this to indicate anything about whether I think Intel is serious this time about making competitive GPUs. I'm just commenting on the fact that CEOs often don't believe that what
Re: (Score:2)
Me- I'd try them out, but if they can't beat my current machine I won't use them. Even if they are free. Neither will anyone I know. We literally spend a couple thousand dollars a year keeping our machines state of the art so we can squeeze an extra frame per second out of our systems. We aren't going to use anything that isn't the best.
I figure by this time he called up the head of the CPU division and said "Build us the Core 2 Extremes! Those people are completely nuts and you could probably sell it for a thousand dollars as long as it thoroughly beats AMD". To me it seems fairly obvious where Intel is heading (though they're so large they can afford to go in multiple directions) and that is systems on a chip. It's already announced with Moorestown and in the meantime there's Atom for the low-cost fanless computers in the Nettops (not a
Re: (Score:1, Interesting)
GeForce 9800:
128 shader cores
1.7 GHz shader clock
128 x 1.7 = 217.6 GFLOPS
70.4 GB/s memory bandwidth

Larrabee (not released until Q1 2009):
16-24 cores
1.7 to 2.5 GHz clock
24 x 2.5 = 60 GFLOPS
DDR3 memory bandwidth far less (even the faster DDR3-1600 is only 12.8 GB/s)

This shows Larrabee's processing speed is at least 3.6 times slower, and its memory bandwidth around 6 times slower. Plus the GeForce 9800 isn't even the fastest; the GeForce 9800 GX2 is nearly twice as fast and available now.
Plus its also
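A quick sanity check of the parent's back-of-envelope figures (using only the numbers quoted above, i.e. a cores-times-clock model, not official peak-FLOPS specs):

```python
# Recompute the parent's ratios from the figures quoted in the comment.
gf9800_gflops = 128 * 1.7      # 128 shader cores at 1.7 GHz -> 217.6
larrabee_gflops = 24 * 2.5     # 24 cores at 2.5 GHz -> 60.0

gf9800_bw = 70.4               # GB/s
ddr3_1600_bw = 12.8            # GB/s for a single DDR3-1600 channel

print(f"compute ratio:   {gf9800_gflops / larrabee_gflops:.1f}x")  # ~3.6x
print(f"bandwidth ratio: {gf9800_bw / ddr3_1600_bw:.1f}x")         # ~5.5x
```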
Re: (Score:2)
It's not just intel either. Every promise to revolutionize graphics has failed. Anyone remember Microsoft Talisman? :)
I'm still waiting for intel to bring out PCI and PCI-E cards with open source drivers.
Duh (Score:4, Insightful)
Re: (Score:1, Insightful)
While I do agree that the endgame on this is that there will not be separate cards, I hardly think that it's a no brainer that the tasks won't be separated.
John Carmack's suggestion a while back that ray tracing and rasterization be combined in games is a good reason to consider the merits of specific GPUs for both. If they were designed to work together, having two chips on one card could be a significant advantage in terms of performance.
S
Re: (Score:2)
Re:Duh (Score:5, Insightful)
Re: (Score:2)
The theory that people were passing around was that it would be primarily targeted at raytracing and have a small rasterization engine that was decent but not high performance.
It was a stupid idea. I don't think even Intel could make raytracing parts competitive in the market at this point. If they wanted to do that with a new part, I would expect them to be showing MUCH more at this stage. If they were to just drop this on the world in the next few months or year, no one would be able to support it
Not for games? (Score:2)
Re: (Score:2)
Re: (Score:2)
Keep Vista in mind (Score:3, Insightful)
A given computer is more likely to be used with Solitaire than with a demanding 3D game.
More seriously: Intel has been king of the ultra-low-cost GPU segment because nearly every business desktop (almost any non-high-end Dell machine, for example) needs a graphics chip just to draw the desktop, with almost no 3D capability required. Thus it's hard to find a machine sold to a corporation that doesn't have an i8x0 or i9x0 embedded GPU (even if it is sometimes disabled because the buyer asked for a mid-range nVidia or ATI card).
Th
Re: (Score:1)
Re: (Score:2)
Max Smart (Score:2)
Confidence in the man... (Score:4, Informative)
I hope their GPU gets much, much better (Score:3, Insightful)
Lock the Target
Or one 3D game. Go ahead, just try to play Halo on a budget PC. Most say they're good for 2D games only. That's because an "integrated Intel graphics" chip steals power from the CPU and siphons off memory from system-level RAM. You'd have to buy an extra card to get the graphics performance of Mac mini, and some cheaper PCs don't even have an open slot to let you add one. - Apple Inc., Mac Mini G4 Graphics
In any case, what I'd really like is yesterday's technology with today's manufacturing capabilities. Imagine an old Radeon or GeForce GPU built at 45nm or lower. Would that result in a 5-10 watt GPU that could still beat whatever Intel is making?
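A crude way to reason about that (every number below is an assumption for illustration, using the usual dynamic-power rule of thumb P ∝ C·V²·f):

```python
# Dynamic power scales roughly with C * V^2 * f. All figures here are
# assumptions for the sake of the estimate, not real GPU specifications.
p_old = 30.0               # watts for a hypothetical older GPU on a 130 nm process
cap_scale = 45 / 130       # capacitance shrinks roughly with feature size
v_old, v_new = 1.5, 1.1    # assumed core voltages before/after the shrink
freq_scale = 1.0           # keep the clock unchanged

p_new = p_old * cap_scale * (v_new / v_old) ** 2 * freq_scale
print(f"{p_new:.1f} W")    # ~5.6 W under these assumptions
```

So under assumptions like these, a shrunk old GPU does land in the 5-10 watt range, though leakage and I/O power would eat into that in practice.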
Re: (Score:2)
Maybe, but nVidia will leapfrog ahead of you with better tech on that 45nm fab.
To be honest with you, I don't understand why people keep drooling over shaving 5-10 watts off their computers. When you are paying 6-12 cents per kilowatt-hour, you'd have to run that sucker 100-200 hours to save a dime. Are you really gaming that hard, where those hours add up? Don't
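The arithmetic behind that (rates and wattage taken from the ranges in this thread, nothing measured):

```python
# How long a 10 W saving takes to add up to ten cents of electricity.
watts_saved = 10          # the 5-10 W figure from the thread, high end
price_per_kwh = 0.10      # 6-12 cents/kWh quoted above, call it 10 cents

hours = 0.10 / (watts_saved / 1000 * price_per_kwh)
print(f"{hours:.0f} hours")   # 100 hours at 10 W and 10 cents/kWh
```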
Re: (Score:2)
Frying Eggs (Score:3, Funny)
Re: (Score:2)
It's about a more powerful but still quiet computer. Since I bought my Mac mini, I consider external hard drives to be extremely noisy.
An efficient GPU that only requires a few watts equals less cooling, meaning a more quiet computer (perhaps even fanless, see low-end mini-ITX boards).
May I remind you that while some new videocards sometime require their
Re: (Score:2)
But it's also about less heat, which means less cooling apparatus, which means a quieter machine.
And it means less power needed from a battery, if and when you need one. (Laptops, UPS, etc.)
And that's not "shaving 5-10 watts off your GPU", it's about making a 5-10 watt GPU, if I understand the grandparent -- instead of, say, a 20-30 watt GPU. Which still isn't a lot, but a little bit here, a little bit there, and it adds up -- CPUs are getting more efficient, too.
Re: (Score:2)
Re: (Score:2)
steals power from the CPU and siphons off memory from system-level RAM.
Either you know what you're talking about and are oversimplifying a lot, or you don't know at all.
If I have 2 gigs of RAM, and a game takes 1 gig, wouldn't it be better if my GPU could use the other gig? Most video cards don't come with a gig of RAM, they come with much less. And I can upgrade my system RAM -- most video cards, you only upgrade the RAM when you buy a new one.
And "steals power from the CPU"? WTF? That is physically impossible. You could say that more is done in software, because less is eve
Re: (Score:2)
Re: (Score:2)
Sorry about that.
Perhaps it's proper to join the competition (Score:2)
Stupid debate (Score:2, Insightful)
Re: (Score:2, Insightful)
Primary rays have no advantage whatsoever over rasterization. Secondary rays, now THIS is where it gets interesting. Primary rays can be done fully with rasterization; in fact, rasterization is nothing more than a clever first-ray optimization. Therefore, a hybrid is the way to go.
Your "precious scan conversion" and "those algorithms" blather shows a serious lack of knowledge about the actual algorithms. I suggest you do some in-depth studies of them before posting again.
Re: (Score:1)
My day job consists mostly of writing rendering code, although not in a gaming context. I am not at all "dissing" scan conversion. It's what I do every day. My point is, WHEN (and only when) the technology is fast enough for real time recursive ray tracing, it will be the end of rasterization in 3D applications.
Cache coherency problems can be fixed by making an enormous cache, or simply making the RAM itself so damn fast it doesn't matter anymore. Adaptive subdivision of pixels for antialiasing is not exa
Re: (Score:3, Interesting)
Then why did you call these graphic engine experts "old men" as if they are just set in their ways? It sounds to me like they know just what they are talking about.
It's been my experience in the software world that people are incredibly stubborn about dropping old, familiar technology when something better comes along. It's certainly not limited to these folks. But even the smartest people get blinded by the familiarity of their ways.
Re:Stupid debate (Score:4, Interesting)
Also, given that hybrids are a no-brainer, I bet both pure raytracers and rasterizers will be extinct in games.
2. Algorithmic complexity will always come back to haunt you. O(n²) will always be worse than O(n), unless you have small scenes. So you have your geforce19000 and can render
3. You could have said path tracing or photon mapping at least.
Finally, these people don't particularly favor raytracing simply because it does not pay off for games. Games usually don't feature fully shiny scenes, and games are expected to run at interactive framerates. In, say, five years, entirely new (and demanding) effects will be in vogue; if raytracing steals too much time, it will be dropped and its results faked. This is what the "old men" do all the time in their games: fake. In the offline world, things are wildly different, so don't compare them.
Re: (Score:2)
My day job consists mostly of writing rendering code, although not in a gaming context. I am not at all "dissing" scan conversion. It's what I do every day. My point is, WHEN (and only when) the technology is fast enough for real time recursive ray tracing, it will be the end of rasterization in 3D applications.
If it isn't recursive, then it's not ray tracing; it's ray casting.
Oblig. (Score:2)
Re: (Score:2)
Re: (Score:2)
Maybe among all these top 10 games lists, it's time for a top 10 game
Re: (Score:2)
Oh wait, I'm a dinosaur who advocates (and researches) ray tracing - damn blew that theory all to hell I guess.
And his post was ranked "4, Insightful" when I saw it - maybe understanding words like insightful should be required before people get to moderate. Yeah, yeah, I know... it's
Can it use its own RAM? The new AMD chipset can. (Score:1, Troll)
Free drivers? (Score:2, Offtopic)
Double-Plus Good! (Score:2)
Was it a week or two weeks ago that Intel's Larrabee was going to replace nVidia and ATI's raster graphics with ray tracing?
Resterilization (Score:1)
Solving yesterday's problems... tomorrow! (Score:1)
Pixels (Score:3, Insightful)
Long live Muckeyfoot! (Score:2)