Larrabee Team Is Focused On Rasterization
Vigile writes "Tom Forsyth, a well-respected developer on Intel's Larrabee project, has spoken out to dispel rumors that the Larrabee architecture is ignoring rasterization, and in fact claims that the new GPU will perform very well with current DirectX and OpenGL titles. The recent debate between rasterization and ray tracing in the world of PC games has centered on the pending arrival of Intel's discrete Larrabee GPU technology. Game industry luminaries like John Carmack, Tim Sweeney and Cevat Yerli have chimed in on the discussion, saying that ray tracing is unlikely to be accepted as the primary rendering method for games within the next five years."
Duh (Score:4, Insightful)
I hope their GPU gets much, much better (Score:3, Insightful)
Lock the Target
Or one 3D game. Go ahead, just try to play Halo on a budget PC. Most say they're good for 2D games only. That's because an "integrated Intel graphics" chip steals power from the CPU and siphons off memory from system-level RAM. You'd have to buy an extra card to get the graphics performance of Mac mini, and some cheaper PCs don't even have an open slot to let you add one. - Apple Inc., Mac Mini G4 Graphics
In any case, what I'd really like is yesterday's technology with today's manufacturing capabilities. Imagine an old Radeon or GeForce GPU built at 45nm or lower. Would that result in a 5-10 watt GPU that could still beat whatever Intel is making?
Stupid debate (Score:2, Insightful)
No need to take them seriously (Score:4, Insightful)
Re:Duh (Score:1, Insightful)
While I do agree that the endgame on this is that there will not be separate cards, I hardly think that it's a no brainer that the tasks won't be separated.
John Carmack's suggestion a while back that ray tracing and rasterization could be combined in games is a good reason to consider dedicated GPUs for both. If they were designed to work together, having two chips on one card could be a significant advantage in terms of performance.
Suggesting otherwise is a bit like asking who needed audio or graphics acceleration when those first started to appear. They weren't strictly necessary, but they added so much to the quality of the experience that nobody today would argue it was a mistake.
Re:Duh (Score:5, Insightful)
Re:Stupid debate (Score:2, Insightful)
Primary rays have no advantage whatsoever over rasterization. Secondary rays, now THIS is where it gets interesting. Primary rays can be done fully with rasterization, in fact rasterization is nothing more than a clever first-ray optimization. Therefore, a hybrid is the way to go.
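The parent's point that rasterization is "a clever first-ray optimization" can be shown concretely: for an orthographic camera, intersecting one primary ray per pixel with a triangle reduces to the same point-in-triangle test a rasterizer performs. A minimal Python sketch (triangle coordinates and grid size are made up for illustration; real rasterizers of course do this incrementally in hardware, not per-pixel in a loop):

```python
def edge(ax, ay, bx, by, px, py):
    # Signed area of (a, b, p): sign tells which side of edge a->b the point is on.
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def rasterize(tri, w, h):
    # Rasterizer view: test each pixel center against the three edge functions.
    (ax, ay), (bx, by), (cx, cy) = tri
    covered = set()
    for y in range(h):
        for x in range(w):
            px, py = x + 0.5, y + 0.5
            w0 = edge(ax, ay, bx, by, px, py)
            w1 = edge(bx, by, cx, cy, px, py)
            w2 = edge(cx, cy, ax, ay, px, py)
            if (w0 >= 0 and w1 >= 0 and w2 >= 0) or (w0 <= 0 and w1 <= 0 and w2 <= 0):
                covered.add((x, y))
    return covered

def raycast_primary(tri, w, h):
    # Raytracer view: one orthographic primary ray per pixel; the intersection
    # test is barycentric coordinates, which must agree with the edge functions.
    (ax, ay), (bx, by), (cx, cy) = tri
    det = (by - cy) * (ax - cx) + (cx - bx) * (ay - cy)
    covered = set()
    for y in range(h):
        for x in range(w):
            px, py = x + 0.5, y + 0.5
            l1 = ((by - cy) * (px - cx) + (cx - bx) * (py - cy)) / det
            l2 = ((cy - ay) * (px - cx) + (ax - cx) * (py - cy)) / det
            if l1 >= 0 and l2 >= 0 and (1 - l1 - l2) >= 0:
                covered.add((x, y))
    return covered
```

Both functions produce the identical coverage mask; the rasterizer just amortizes the work across pixels instead of firing independent rays.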
Your "precious scan conversion" and "those algorithms" blather shows a serious lack of knowledge about the actual algorithms. I suggest you do some in-depth study of them before posting again. You fail to mention issues like cache coherency (which you get practically for free in rasterization, while it is hard to achieve in a raytracer). Antialiasing is also not trivial, unless you go brute force and sample multiple rays per pixel. In rasterization, everybody uses multisampling, a very simple and efficient algorithm (not without drawbacks, but the performance benefits usually outweigh them).
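The "brute force" antialiasing mentioned above is just supersampling: fire a grid of sub-samples per pixel and average the coverage. A toy Python sketch (the `hit` predicate and the half-plane example are made up for illustration; a real raytracer would shade each sub-sample, which is exactly why it costs N times the work that multisampling avoids):

```python
def coverage(px, py, hit, n=4):
    # Average an n x n grid of sub-sample points inside pixel (px, py).
    # hit(x, y) -> bool: whether a sample point lands inside the shape.
    inside = 0
    for j in range(n):
        for i in range(n):
            sx = px + (i + 0.5) / n
            sy = py + (j + 0.5) / n
            if hit(sx, sy):
                inside += 1
    return inside / (n * n)

# Illustrative shape: everything left of the vertical line x = 2.5.
edge_hit = lambda x, y: x < 2.5
```

A pixel fully left of the line reports full coverage, while the pixel the line cuts through reports partial coverage, which is what produces the smoothed edge.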
Oh, before you mention lighting: lighting models are independent of the rendering method. Global Illumination has _nothing_ to do with raytracing as the frame rendering method.
So... without fancy secondary effects, where does this leave us? With the same output a rasterizer can give me, only with much better performance (largely thanks to cache coherency and the absence of intersection tests) and just as easily. Reflections, now those are hard with a rasterizer, but that's why a hybrid is wise: use rasterization by default, and secondary rays for those parts of the scene where these effects are visible.
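The hybrid loop being advocated can be sketched in a few lines of Python. This is purely schematic (the `Hit` structure, the toy per-pixel primary-visibility table, and `trace_secondary` are all made-up stand-ins, not any real engine's API): the rasterizer resolves primary visibility cheaply, and secondary rays are spawned only for pixels whose material actually needs them.

```python
from dataclasses import dataclass

@dataclass
class Hit:
    color: str
    reflective: bool

# Toy primary-visibility result, as a rasterizer's first pass would produce it
# (None = background; keys are pixel indices in a 1-pixel-tall "image").
PRIMARY = {0: Hit("red", False), 1: Hit("mirror", True), 2: None}

def trace_secondary(x):
    # Stand-in for a real secondary-ray trace; only invoked where needed.
    return "reflected:sky"

def render_hybrid(width):
    image = []
    for x in range(width):
        hit = PRIMARY.get(x)
        if hit is None:
            image.append("background")
        elif hit.reflective:
            image.append(trace_secondary(x))   # secondary ray, only here
        else:
            image.append(hit.color)            # plain rasterized shading
    return image
```

The point of the design is in the branch: the expensive ray work is gated per-pixel, so a mostly-diffuse scene pays almost nothing for its few reflective surfaces.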
People quickly get blinded by the apparent elegance of raytracing. Just try to raytrace fast
Re:*Sigh* (Score:5, Insightful)
Hard to take them seriously? Are you kidding? The very low end is the massive majority of the market, and Intel has that well wrapped up. They are probably the #1 PC GPU manufacturer out there. If you want cheap or low power, you get an Intel GPU. Also, if you want 100% rock solid drivers that are supported out of the box and cream the competition in terms of stability (speaking about Linux here), you buy an Intel GPU.
So yeah, if you discount the market leader in terms of driver stability and volume of sales, and care only about speed then yes, Intel isn't competitive.
In my world, I will continue to take them seriously, since I always aim to buy Intel graphics if I can. If they get faster, that's a nice bonus.
Keep Vista in mind (Score:3, Insightful)
The problem is that now thanks to Vista and its Aero, powerful 3D acceleration starts to matter not only to
Pixels (Score:3, Insightful)