Larrabee Team Is Focused On Rasterization

Vigile writes "Tom Forsyth, a well-respected developer on Intel's Larrabee project, has spoken out to dispel rumors that the Larrabee architecture is ignoring rasterization, and in fact claims that the new GPU will perform very well with current DirectX and OpenGL titles. The recent rasterization-versus-ray-tracing debate in the world of PC games has largely centered on the pending arrival of Intel's discrete Larrabee GPU technology. Game industry luminaries like John Carmack, Tim Sweeney and Cevat Yerli have chimed in, saying that ray tracing is unlikely to be accepted as the primary rendering method for games within the next five years."
This discussion has been archived. No new comments can be posted.

  • Duh (Score:4, Insightful)

    by Wesley Felter ( 138342 ) <wesley@felter.org> on Friday April 25, 2008 @05:01PM (#23202838) Homepage
    Creating a GPU that won't run existing games well (or at all) never made sense. Some people fantasized about forcing gamers to buy a rasterization GPU and a separate raytracing GPU, but those are probably the same fools who bought PPUs and Killer NICs.
  • by Yvan256 ( 722131 ) on Friday April 25, 2008 @05:15PM (#23202974) Homepage Journal
    As a Mac mini user, I'm forced to use whatever GPU Intel comes up with, unless Apple suddenly remembers its own words from when it introduced the Mac mini G4:

    Lock the Target

    Or one 3D game. Go ahead, just try to play Halo on a budget PC. Most say they're good for 2D games only. That's because an "integrated Intel graphics" chip steals power from the CPU and siphons off memory from system-level RAM. You'd have to buy an extra card to get the graphics performance of Mac mini, and some cheaper PCs don't even have an open slot to let you add one. - Apple Inc., Mac Mini G4 Graphics



    In any case, what I'd really like is yesterday's technology with today's manufacturing capabilities. Imagine an old Radeon or GeForce GPU built at 45nm or lower. Would that result in a 5-10 watt GPU that could still beat whatever Intel is making?
  • Stupid debate (Score:2, Insightful)

    by pclminion ( 145572 ) on Friday April 25, 2008 @05:24PM (#23203044)
    The whole damn debate is just a bunch of old men whining. Raytracing is obviously a superior rendering method, the question is simply when it will become fast enough. The dinosaurs don't want to let go of their precious scan conversion -- and who can blame them given the massive amount of work put into those algorithms over the last decades -- but the time of scan conversion is coming to an end.
  • by Sycraft-fu ( 314770 ) on Friday April 25, 2008 @05:27PM (#23203064)
    It isn't as though they're only going to sell to true believers. Just wait until it comes out, then evaluate it. At this point I don't really have an opinion one way or the other. Intel certainly has the know-how and the fabrication tech to make a good GPU, but they also have the ability to miss the boat. I'll simply wait until it's real silicon that I can purchase before I concern myself with it. It'll either be competitive or it won't; we won't know until it's out and real tests are done.
  • Re:Duh (Score:1, Insightful)

    by Anonymous Coward on Friday April 25, 2008 @05:35PM (#23203136)
    You say that as if there are no advantages to that approach.

    While I do agree that the endgame is that there won't be separate cards, I hardly think it's a no-brainer that the tasks won't be separated.

    John Carmack's suggestion a while back that ray tracing and rasterization could be combined in games is a good reason to consider the merits of dedicated GPUs for each. If they were designed to work together, having two chips on one card could be a significant performance advantage.

    Suggesting otherwise is a bit like asking who needed audio or graphics acceleration back when those first appeared. They weren't strictly necessary, but they added so much to the quality of the graphics that nobody today would argue they were a mistake.
  • Re:Duh (Score:5, Insightful)

    by frieko ( 855745 ) on Friday April 25, 2008 @05:36PM (#23203144)

    Creating a GPU that won't run existing games well (or at all) never made sense.
    Not to Intel, they've been doing exactly that for years!
  • Re:Stupid debate (Score:2, Insightful)

    by ardor ( 673957 ) on Friday April 25, 2008 @05:45PM (#23203220)
    A superior rendering method because ... ?

    Primary rays have no advantage whatsoever over rasterization. Secondary rays, now THIS is where it gets interesting. Primary rays can be handled entirely by rasterization; in fact, rasterization is nothing more than a clever first-ray optimization. Therefore, a hybrid is the way to go.

    Your "precious scan conversion" and "those algorithms" blather shows a serious lack of knowledge about the actual algorithms. I suggest you do some in-depth study of them before posting again. You fail to mention issues like cache coherency (practically free in rasterization, hard to achieve in a ray tracer). Antialiasing is also not trivial unless you go brute force and shoot multiple rays per pixel. In rasterization, everybody uses multisampling, a very simple and efficient algorithm (not without drawbacks, but the performance benefits usually outweigh them).

    Oh, before you mention lighting: lighting models are independent of the rendering method. Global Illumination has _nothing_ to do with raytracing as the frame rendering method.

    So... without fancy secondary effects, where does this leave us? With the same output a rasterizer can give me, only with much better performance (largely thanks to cache behavior and the absence of intersection tests) and just as easily. Reflections, now, are hard with a rasterizer, but that's why a hybrid is wise: use rasterization by default, and secondary rays for the parts of the scene where those effects are visible.

    People quickly get blinded by the apparent elegance of ray tracing. Just try to ray trace fast... it won't remain elegant for long. And just like many papers, at some point you'll fall back on rasterization, because it IS faster in the common cases, which usually make up 60-70% of the scene (opaque, (un)lit surfaces).
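    The claim above that rasterization is "nothing more than a clever first-ray optimization" can be made concrete with a toy sketch (illustrative code, not from any of the posters; the triangle coordinates and function names are made up). It covers one triangle two ways: with an edge-function rasterizer, and by shooting one orthographic primary ray per pixel through a Moller-Trumbore intersection test. Both produce exactly the same set of covered pixels:

    ```python
    WIDTH, HEIGHT = 16, 16

    # One triangle lying in the z = 1 plane, given as (x, y, z) vertices.
    TRI = [(2.2, 1.7, 1.0), (13.4, 4.3, 1.0), (6.1, 12.9, 1.0)]

    def edge(a, b, p):
        # 2D cross product: positive when p lies to the left of edge a -> b.
        return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

    def rasterize(tri):
        """Edge-function rasterizer: test each pixel center against all three edges."""
        hit = set()
        for y in range(HEIGHT):
            for x in range(WIDTH):
                p = (x + 0.5, y + 0.5)
                if (edge(tri[0], tri[1], p) >= 0 and
                        edge(tri[1], tri[2], p) >= 0 and
                        edge(tri[2], tri[0], p) >= 0):
                    hit.add((x, y))
        return hit

    def sub(a, b):
        return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

    def dot(a, b):
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

    def cross(a, b):
        return (a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0])

    def raycast(tri):
        """Primary rays only: one orthographic ray (0, 0, 1) through each pixel
        center, intersected with the triangle via Moller-Trumbore."""
        v0, v1, v2 = tri
        e1, e2 = sub(v1, v0), sub(v2, v0)
        d = (0.0, 0.0, 1.0)
        pvec = cross(d, e2)
        det = dot(e1, pvec)           # nonzero: ray is not parallel to the triangle
        hit = set()
        for y in range(HEIGHT):
            for x in range(WIDTH):
                orig = (x + 0.5, y + 0.5, 0.0)
                tvec = sub(orig, v0)
                u = dot(tvec, pvec) / det
                qvec = cross(tvec, e1)
                v = dot(d, qvec) / det
                t = dot(e2, qvec) / det
                if u >= 0 and v >= 0 and u + v <= 1 and t > 0:
                    hit.add((x, y))
        return hit

    # Identical coverage: the rasterizer is a batched primary-ray solver.
    assert rasterize(TRI) == raycast(TRI)
    ```

    The per-pixel costs differ, though: the rasterizer's edge functions can be evaluated incrementally across a scanline and touch memory coherently, while the ray caster repeats the full intersection arithmetic for every ray, which is why secondary rays (reflections, shadows) are where ray tracing earns its keep.
    
    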
  • Re:*Sigh* (Score:5, Insightful)

    by serviscope_minor ( 664417 ) on Friday April 25, 2008 @06:14PM (#23203464) Journal
    Intel has been saying with each and every iteration of graphics hardware it's created that it would be 'competitive'. None have been, except at the very, very low end. I like Intel's CPUs quite a bit, but I have heard the boy who cried wolf too many times from them with regard to GPUs to take them very seriously at this point.

    Hard to take them seriously? Are you kidding? The very low end is the massive majority of the market, and Intel has that well wrapped up. They are probably the #1 PC GPU manufacturer out there. If you want cheap or low power, you get an Intel GPU. Also, if you want 100% rock solid drivers that are supported out of the box and cream the competition in terms of stability (speaking about Linux here), you buy an Intel GPU.

    So yeah, if you discount the market leader in terms of driver stability and volume of sales, and care only about speed then yes, Intel isn't competitive.

    In my world, I will continue to take them seriously, since I always aim to buy Intel graphics if I can. If they get faster, that's a nice bonus.
  • Keep Vista in mind (Score:3, Insightful)

    by DrYak ( 748999 ) on Friday April 25, 2008 @08:22PM (#23204340) Homepage
    Keep Microsoft Windows Vista in mind and reconsider your last sentence:

    A given computer is more likely to be used with Solitaire than with a demanding 3D game.
    More seriously: Intel has been king of the ultra-low-cost GPU segment because nearly every business desktop (almost any non-high-end Dell machine, for example) needs a graphics card just to draw the desktop, with almost no 3D functionality required. Thus it's hard to find a machine sold to a corporation that doesn't have an i8x0 or i9x0 embedded GPU (even if it's sometimes disabled because the buyer asked for a mid-range nVidia or ATI part).

    The problem is that now, thanks to Vista and its Aero interface, powerful 3D acceleration starts to matter not only to /.er gamers but even to the secretary who only runs an OS, a browser, an office suite and the occasional minesweeper/solitaire.
  • Pixels (Score:3, Insightful)

    by Whiteox ( 919863 ) on Friday April 25, 2008 @08:22PM (#23204342) Journal
    All it does in the end is change pixels. After all, it's still a 2D screen displaying a bitmap. Sometimes a step backwards is valuable too.
