
Larrabee Team Is Focused On Rasterization

Vigile writes "Tom Forsyth, a well-respected developer on Intel's Larrabee project, has spoken out to dispel rumors that the Larrabee architecture is ignoring rasterization, and in fact claims that the new GPU will perform very well with current DirectX and OpenGL titles. The recent debate between rasterization and ray tracing in the world of PC games has been building toward the pending arrival of Intel's discrete Larrabee GPU technology. Game industry luminaries like John Carmack, Tim Sweeney and Cevat Yerli have chimed in on the discussion, saying that ray tracing is unlikely to be accepted as the primary rendering method for games within the next five years."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • *Sigh* (Score:5, Interesting)

    by dreamchaser ( 49529 ) on Friday April 25, 2008 @04:57PM (#23202796) Homepage Journal
    Intel has said with each and every iteration of graphics hardware it's created that it would be 'competitive'. None has been, except at the very, very low end. I like Intel's CPUs quite a bit, but I have heard the boy who cried wolf too many times from them with regard to GPUs to take them very seriously at this point.
    • Re: (Score:3, Interesting)

      by QuantumRiff ( 120817 )
      One would think that a company that could do a complete turnaround after it got its 64-bit ass handed to it (thanks, AMD) would be able to dedicate just a bit of brainpower to its graphics.
    • Considering their more aggressive stance against AMD right now, I'd say it's more likely that they're going to try to compete in the graphics arena. AMD now has the ability to bring strong integrated graphics to the table which could result in a net gain in spite of the loss in performance they've suffered recently. The more I think about it, the more I realize that buying ATI put AMD in a very good position (other than that whole "no money to spend on anything" problem).
      • Re:*Sigh* (Score:4, Funny)

        by Toonol ( 1057698 ) on Friday April 25, 2008 @05:50PM (#23203270)
        ...buying ATI put AMD in a very good position (other than that whole "no money to spend on anything" problem).

        Funny, that's the same thing that happens when I buy ATI...
      • I wouldn't say it put them in a good position so much as it enabled them to survive. They are currently losing the price/performance fight in a big, big way on the CPU side. Having a high-margin product like popular ATI GPUs can at least help keep their heads above water until they come up with a better CPU architecture.
    • by Sycraft-fu ( 314770 ) on Friday April 25, 2008 @05:27PM (#23203064)
      It isn't as though they are only going to sell to true believers or anything. Just wait until it comes out, then evaluate it. At this point I don't really have an opinion one way or the other. Intel certainly has the know how and the fabrication tech to make a good GPU, but they also have the ability to miss the boat. I'll simply wait until it is real silicon that I can purchase before I concern myself with it. It'll either be competitive or it won't; we won't know until it is out and real tests are done.
      • by geekoid ( 135745 )
        I just want it to look good and play well. The method makes no difference to the end user.
      • Oh I do tend to agree. I am just sick of hearing them talk about it. Show me a shipping product.
      • by coopaq ( 601975 )
        "Intel certainly has the know how and the fabrication tech to make a good GPU"

        They have the no-idea-how you mean.

      • Read the article - Larrabee is designed for general purpose programmability.

        If your motherboard has Larrabee you could use it for the physics calculation while your add-in GPU does the graphics.

        This makes a whole lot more sense than trying to get a single GPU to do both tasks.
    • Re:*Sigh* (Score:5, Insightful)

      by serviscope_minor ( 664417 ) on Friday April 25, 2008 @06:14PM (#23203464) Journal
      Intel has said with each and every iteration of graphics hardware it's created that it would be 'competitive'. None has been, except at the very, very low end. I like Intel's CPUs quite a bit, but I have heard the boy who cried wolf too many times from them with regard to GPUs to take them very seriously at this point.

      Hard to take them seriously? Are you kidding? The very low end is the massive majority of the market, and Intel has that well wrapped up. They are probably the #1 PC GPU manufacturer out there. If you want cheap or low power, you get an Intel GPU. Also, if you want 100% rock solid drivers that are supported out of the box and cream the competition in terms of stability (speaking about Linux here), you buy an Intel GPU.

      So yeah, if you discount the market leader in terms of driver stability and volume of sales, and care only about speed, then yes, Intel isn't competitive.

      In my world, I will continue to take them seriously, since I always aim to buy Intel graphics if I can. If they get faster, that's a nice bonus.
      • by ardor ( 673957 )
        It is impossible to target their hardware when developing AAA titles, however. It would require enormous scaling. If these abundant GMAs were equal to low-budget ATI/nVidia cards, things would be different. But even a 6200 gives GMAs a run for their money.
        • It is impossible to target their hardware when developing AAA titles, however. It would require enormous scaling.
          I don't understand the point you are trying to make. Are you claiming that Intel GPUs are less powerful than the Hollywood GPU of Nintendo's Wii console, or are you claiming that too few AAA titles come out for Wii?
          • Who said anything about the Wii? I must have missed that.
            • by tepples ( 727027 )

              It is impossible to target [Intel's 3D graphics] hardware[, which is less powerful than that of NVIDIA or ATI,] when developing AAA titles, however.

              Are you claiming that Intel GPUs are less powerful than the Hollywood GPU of Nintendo's Wii console, or are you claiming that too few AAA titles come out for Wii?

              Who said anything about the Wii? I must have missed that.

              I brought up the Wii. I was using it as an example of a platform for which the major video game publishers publish titles, but whose GPU is less powerful than today's low-end to mid-range 3D video cards for PCs. Now why do you think it's possible to develop games for Wii but not for PCs with Intel graphics?

        • About this 6200... (Score:3, Informative)

          by sznupi ( 719324 )
          The latest Intel offerings (the X3100 that is in all laptops here) are actually (finally) definitely faster...
          Yes, it's still nothing spectacular, but as long as I can play (with tweaked settings, of course) Orange Box titles, Hellgate: London, Sins of a Solar Empire and Mythos, I'm happy.
          • Comment removed based on user account deletion
            • by sznupi ( 719324 )
              Quite possible - I just replied to the parent's claim that Intel can't touch the GeForce 6200, while the X3100 actually surpasses it.

              (again, it's nothing dramatic, but I guess it's enough for a lot of folks, including me - I'm in the market for a new ThinkPad R61 14", and the cheapest one, with the X3100, will do the job fine; plus I'm somehow under the, perhaps misjudged, impression that Intel graphics will give the longest battery life; anyway, Lenovo doesn't deal with AMD...)
      • I meant with regard to statements like "it will play DX9 and DX10 games just fine." I've never seen an Intel GPU solution that plays a game with significant 3D acceleration needs 'fine'.
      • Yes, I love how there are absolutely rock solid, open drivers for just about every Intel card ever made (of any kind) on Linux.

        Can anyone at Intel confirm that this will be the case with the new drivers? Or will ATI beat them to it? Because more than anything else, this is what will determine my next video card purchase: Rock solid open source drivers that have all the features of the Windows drivers.
      • Re: (Score:3, Informative)

        by pavon ( 30274 )

        Also, if you want 100% rock solid drivers that are supported out of the box and cream the competition in terms of stability (speaking about Linux here), you buy an Intel GPU.

        I wouldn't go that far. I've had stability issues with my Intel graphics. Some OpenGL screensavers and some games running under Wine will crash or lock up X, regardless of what settings I use in my xorg.conf (XAA vs. EXA, Composite on/off; see the sample snippet below). Furthermore, several extensions (like Composite) that are fairly stable with the NVidia drivers are still buggy as hell with the Intel drivers.

        I never had any stability issues whatsoever with the last NVidia card I bought. Then again, that card is now useless to me since NVid
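        For reference, the settings the parent mentions live in xorg.conf along these lines. This is only a sketch for the Intel/X.org drivers of that era; option names and defaults vary with the driver version:

        Section "Device"
            Identifier "Intel Graphics"
            Driver     "intel"
            # Switch between the two 2D acceleration architectures mentioned above.
            Option     "AccelMethod" "EXA"    # or "XAA"
        EndSection

        Section "Extensions"
            # Toggle the Composite extension.
            Option     "Composite" "Disable"
        EndSection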

      • In my world, I will continue to take them seriously, since I always aim to buy Intel graphics if I can. If they get faster, that's a nice bonus.

        I smell a fanboi...

    • Re:*Sigh* (Score:5, Interesting)

      by Anonymous Coward on Friday April 25, 2008 @07:06PM (#23203874)
      Whether you take them seriously or not, this is a serious effort to be a major player in the discrete graphics market (a market that is not likely to disappear as soon as some seem to think).

      I happen to know a great many people that work at Intel. And I just happen to also do product testing and marketing focus groups for them. All centered around gaming.

      This was a topic that Intel did not take seriously 5-10 years ago. They take it deadly seriously now.

      I spoke with Paul Otellini on one occasion on the topic of Intel and gaming. It went more or less like this.

      Paul- Which Intel chip do you have in your machine at home?
      Me- It's an AMD actually.
      Paul- You work for Intel, your family works here and you buy an AMD?
      Me- I run what gives me the highest performance in what I do. It also happened to be cheaper, but that's secondary.
      Paul- They only beat us in gaming! Our chips are better at EVERYTHING else.
      Me- Gaming leads the market.
      Paul- No it doesn't.
      Me- No one upgrades twice a year to keep up with MS office. We upgrade to keep up with Carmack.
      Paul- If I offered to give you a couple of our next gen processors, would you use them?
      Me- I'd try them out, but if they can't beat my current machine I won't use them. Even if they are free. Neither will anyone I know. We literally spend a couple thousand dollars a year keeping our machines state of the art so we can squeeze an extra frame per second out of our systems. We aren't going to use anything that isn't the best.

      You want me and my market segment to take you seriously? Take us seriously. We make up a small segment, but we are fanatical.

      ___
      A couple years later, I got an email from him.
      It was actually sent as a response to several key divisions in Intel, because several people had asked why we (Intel) care about gamers when they make up less than 5% of the PC market (it's actually closer to 1%).
      ___
      Paul- We care about gamers because gamers grow up. They grow up to work mainly in IT fields. The gamers from 5-10 years ago are now the IT professionals we most want to be on our side. They are the ones making purchasing decisions and recommendations and they do so based on what they know. They know AMD better than us because we ignored them for so long.

      Why do we care about games? We don't. We care about the people playing them and we want them to identify with our products.

      ____

      So now you have some insight as to where intel thinks this is all going. It's not that they care about gaming or graphics, because they really don't. They care about the people behind it, and getting them hooked into a brand that "supports" them.
      Then there are the really obvious reasons for Intel getting into graphics: Vista and other next-gen OSes and GUIs are going to use a lot of hardware acceleration, which means discrete graphics cards aren't just for the desktop anymore; they are for the server and the workstation too.
      Add to that using the GPU to do certain types of parallel processing at much better throughput than you can get from a CPU.

      The motivation should be obvious.

      *Posted AC for my sake. I like my contacts at Intel. I'm hoping Paul doesn't remember talking to a PFY about his company's gaming culture.
      • PFY = Pimply Faced Youth

        This sounds real to me. Intel CEO Paul Otellini [wikipedia.org] could have said that.

        But it must be translated from corporate-speak. It doesn't necessarily mean anything, except that he wants to tell you something you want to hear. The translation is: "We want gamers to like us." You already knew that.

        I don't intend this to indicate anything about whether I think Intel is serious this time about making competitive GPUs. I'm just commenting on the fact that CEOs often don't believe that what
      • by Kjella ( 173770 )

        Me- I'd try them out, but if they can't beat my current machine I won't use them. Even if they are free. Neither will anyone I know. We literally spend a couple thousand dollars a year keeping our machines state of the art so we can squeeze an extra frame per second out of our systems. We aren't going to use anything that isn't the best.

        I figure by this time he called up the head of the CPU division and said "Build us the Core 2 Extremes! Those people are completely nuts and you could probably sell it for a thousand dollars as long as it thoroughly beats AMD". To me it seems fairly obvious where Intel is heading (though they're so large they can afford to go in multiple directions) and that is systems on a chip. It's already announced with Moorestown and in the meantime there's Atom for the low-cost fanless computers in the Nettops (not a

      • Re: (Score:1, Interesting)

        by Anonymous Coward
        GeForce 9800 (finished and can be bought now)
        Shader cores: 128
        Clock: 1.7 GHz
        128 x 1.7 = 217.6 GFLOPS
        70.4 GB/s bandwidth

        Larrabee (not released until Q1 2009)
        16-24 cores
        Clock: 1.7 to 2.5 GHz
        2.5 x 24 = 60 GFLOPS
        DDR3 memory bandwidth far less (even the faster DDR3-1600 has 12.8 GB/s per channel)

        This shows the Larrabee has at least 3.6 times slower processing speed, and memory bandwidth around six times slower. Plus the GeForce 9800 isn't even the fastest; the GeForce 9800 GX2 is nearly twice as fast and available now.

        Plus its also
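        A quick Python restatement of the comparison above, using the comment's own simplified one-operation-per-shader-per-clock model (real GPU shaders issue multiply-adds each clock, and Larrabee's 16-wide vector units are likewise ignored, so treat these as illustrative numbers rather than real throughput figures):

        geforce_9800_gflops = 128 * 1.7   # 128 shader cores x 1.7 GHz ~= 217.6
        larrabee_gflops = 24 * 2.5        # 24 cores x 2.5 GHz ~= 60 (scalar issue only)
        bandwidth_ratio = 70.4 / 12.8     # GB/s: GDDR3 card vs. one DDR3-1600 channel

        print(geforce_9800_gflops / larrabee_gflops)  # ~3.6x compute gap under this model
        print(bandwidth_ratio)                        # ~5.5x bandwidth gap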
    • It's not just Intel, either. Every promise to revolutionize graphics has failed. Anyone remember Microsoft Talisman? :)

      I'm still waiting for Intel to bring out PCI and PCI-E cards with open-source drivers.

  • Duh (Score:4, Insightful)

    by Wesley Felter ( 138342 ) <wesley@felter.org> on Friday April 25, 2008 @05:01PM (#23202838) Homepage
    Creating a GPU that won't run existing games well (or at all) never made sense. Some people fantasized about forcing gamers to buy a rasterization GPU and a separate raytracing GPU, but those are probably the same fools who bought PPUs and Killer NICs.
    • Re: (Score:1, Insightful)

      by Anonymous Coward
      You say that as if there are no advantages to that approach.

      While I do agree that the endgame on this is that there will not be separate cards, I hardly think it's a no-brainer that the tasks won't be separated.

      John Carmack's suggestion a while back that ray tracing and rasterization be combined in games is a good reason to consider the merits of specific GPUs for both. If they were designed to work together, having two chips on one card could be a significant performance advantage.

      S
      • But if the ray-tracing GPU doesn't have the speed to keep up with a raster GPU, they can't really be combined in a game; when you're moving you get worse graphics :S.
    • Re:Duh (Score:5, Insightful)

      by frieko ( 855745 ) on Friday April 25, 2008 @05:36PM (#23203144)

      Creating a GPU that won't run existing games well (or at all) never made sense.
      Not to Intel, they've been doing exactly that for years!
    • by MBCook ( 132727 )

      The theory that people were passing around was that it would be primarily targeted at raytracing and have a small rasterization engine that was decent but not high performance.

      It was a stupid idea. I don't think even Intel could make raytracing parts competitive in the market at this point. If they wanted to do that with a new part, I would expect them to be showing MUCH more at this stage. If they were to just drop this on the world in the next few months or year, no one would be able to support it

    • Has anybody stopped to think that they might be trying to go after a different market? Laptop gaming just isn't a very big market (if you're geeky enough to play games, you're probably not far off building your own desktop for about 1/3 of the price of a laptop). The main use of graphics cards on laptops is for graphic design and the like. Sure, there are people who want to play games on their laptop, but few people will spend $3000 to get a laptop capable of playing the latest games, when a desktop will cost th
      • by tepples ( 727027 )

        Laptop gaming just isn't a very big market (if you're geeky enough to play games, you're probably not far off building your own desktop for about 1/3 of the price of a laptop).
        Then what else sits between DS/PSP/GP2X and PC gaming?
    • Not all GPUs have to support high end games. There's little reason to have a chip that's as powerful as the nV 8000 series in every computer. It's not necessary, because most people don't play that kind of game. A given computer is more likely to be used with Solitaire than with a demanding 3D game.
      • Keep Vista in mind (Score:3, Insightful)

        by DrYak ( 748999 )
        Keep Microsoft Windows Vista in mind and reconsider your last sentence:

        A given computer is more likely to be used with Solitaire than with a demanding 3D game.

        More seriously: Intel has been king in the ultra-low-cost GPU segment because nearly every business desktop (almost any non-high-end Dell machine, for example) needs a graphics card just to draw the desktop, but almost no 3D functionality. Thus it's hard to find a machine sold to a corporation that doesn't have an i8x0 or i9x0 embedded GPU (even if, sometimes, it is disabled because the buyer asked for a mid-range nVidia or ATI card).

        Th

      • Except Larrabee is a big, complex, expensive GPU, so it has to be fast (on most games) to be economically viable. If Intel was content with the GMA they wouldn't have created Larrabee.
  • This information was based on someone inside a mailbox.
  • by ravyne ( 858869 ) on Friday April 25, 2008 @05:14PM (#23202968)
    Tom Forsyth is a lesser-known name in graphics, but having read his blog and exchanged emails with him on a couple of occasions, I assure you all that he really knows his stuff. He's been a graphics programmer on early game consoles, software engines, video codecs, and other more modern things. The man knows 3D and has mapped it to some low-end and oddball hardware. I'm sure he's gotten his head around Larrabee quite nicely.
  • by Yvan256 ( 722131 ) on Friday April 25, 2008 @05:15PM (#23202974) Homepage Journal
    As a Mac mini user, I'm forced to use whatever GPU intel comes up with, unless Apple suddenly remembers their own words when they introduced the Mac mini G4:

    Lock the Target

    Or one 3D game. Go ahead, just try to play Halo on a budget PC. Most say they're good for 2D games only. That's because an "integrated Intel graphics" chip steals power from the CPU and siphons off memory from system-level RAM. You'd have to buy an extra card to get the graphics performance of Mac mini, and some cheaper PCs don't even have an open slot to let you add one. - Apple Inc., Mac Mini G4 Graphics



    In any case, what I'd really like is yesterday's technology with today's manufacturing capabilities. Imagine an old Radeon or GeForce GPU built at 45nm or lower. Would that result in a 5-10 watt GPU that could still beat whatever Intel is making?
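    As a back-of-the-envelope sketch of that question: dynamic power scales roughly with capacitance x voltage squared x frequency, so a die shrink alone can plausibly land in that range. The scale factors below are illustrative assumptions, not measured Radeon/GeForce figures, and leakage power (which gets worse at small geometries) is ignored entirely:

    def shrunk_dynamic_power(old_watts, cap_scale, volt_scale, freq_scale):
        # P_dyn is roughly proportional to C * V^2 * f, so scale each factor.
        return old_watts * cap_scale * volt_scale**2 * freq_scale

    # Hypothetical ~130 nm-era GPU drawing ~30 W, shrunk to 45 nm at the same clock.
    print(shrunk_dynamic_power(30.0, cap_scale=0.4, volt_scale=0.85, freq_scale=1.0))
    # ~8.7 W -- inside the 5-10 W ballpark asked about, under these assumptions.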
    • Imagine an old Radeon or GeForce GPU built at 45nm or lower. Would that result in a 5-10 watt GPU that could still beat whatever Intel is making?

      Maybe, but nVidia will leapfrog ahead of you with better tech on that 45nm fab.

      To be honest with you, I don't understand why people keep drooling over shaving 5-10 watts off their computers when you are paying 6-12 cents per kilowatt-hour. You'd have to run that sucker 100-200 hours to save a dime. Are you really gaming that hard, where those times add up? Don't
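      Spelling out that arithmetic with the 5-10 W and 6-12 cents/kWh figures above (the 10 W rows roughly match the 100-200 hour estimate; saving only 5 W takes correspondingly longer):

      for watts_saved in (5, 10):
          for cents_per_kwh in (6, 12):
              kwh_per_dime = 10.0 / cents_per_kwh          # $0.10 buys this many kWh
              hours = kwh_per_dime * 1000.0 / watts_saved  # hours of runtime to save a dime
              print(watts_saved, "W at", cents_per_kwh, "c/kWh:", round(hours), "hours per dime")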
      • Every one of those watts that goes into your computer comes out as heat, though. Imagine a future where frying an egg on your CPU, GPU, or any other major hardware component is hyperbole again.
      • by Yvan256 ( 722131 )
        It's not about trimming cents from a power bill (and since I live in Mauricie, Quebec, hydro-electricity is the rule, meaning cheap and clean electricity).

        It's about a more powerful but still quiet computer. Since I bought my Mac mini, I consider external hard drives to be extremely noisy.

        An efficient GPU that only requires a few watts equals less cooling, meaning a quieter computer (perhaps even fanless; see low-end mini-ITX boards).

        May I remind you that while some new videocards sometime require their
      • It's about reducing power, yes...

        But it's also about less heat, which means less cooling apparatus, which means a quieter machine.

        And it means less power needed from a battery, if and when you need one. (Laptops, UPS, etc.)

        And that's not "shaving 5-10 watts off your GPU", it's about making a 5-10 watt GPU, if I understand the grandparent -- instead of, say, a 20-30 watt GPU. Which still isn't a lot, but a little bit here, a little bit there, and it adds up -- CPUs are getting more efficient, too.
    • Yeah, that was a funny quote, but I think Apple has always cared more about interfaces and video than games. The GMA 950 does what they really care about quite well, and they seem to think their Core Image and Core Animation stuff is important.
    • steals power from the CPU and siphons off memory from system-level RAM.

      Either you know what you're talking about and are oversimplifying a lot, or you don't know at all.

      If I have 2 gigs of RAM, and a game takes 1 gig, wouldn't it be better if my GPU could use the other gig? Most video cards don't come with a gig of RAM, they come with much less. And I can upgrade my system RAM -- most video cards, you only upgrade the RAM when you buy a new one.

      And "steals power from the CPU"? WTF? That is physically impossible. You could say that more is done in software, because less is eve

  • Seriously, I don't know why they don't just open-source the interface and let it be compatible with CrossFire and the hybrid technologies from NVIDIA and ATI. This would make far more logical sense than going to war with them. Plus, users would no longer need to plug into the video card for video. They could just plug into the motherboard interface and add more video cards.
  • Stupid debate (Score:2, Insightful)

    by pclminion ( 145572 )
    The whole damn debate is just a bunch of old men whining. Raytracing is obviously a superior rendering method; the question is simply when it will become fast enough. The dinosaurs don't want to let go of their precious scan conversion -- and who can blame them, given the massive amount of work put into those algorithms over the last decades -- but the time of scan conversion is coming to an end.
    • Re: (Score:2, Insightful)

      by ardor ( 673957 )
      A superior rendering method because ... ?

      Primary rays have no advantage whatsoever over rasterization. Secondary rays, now THIS is where it gets interesting. Primary rays can be done fully with rasterization; in fact, rasterization is nothing more than a clever first-ray optimization. Therefore, a hybrid is the way to go.

      Your "precious scan conversion" and "those algorithms" rhetoric shows a serious lack of knowledge about the actual algorithms. I suggest you do some in-depth study of them before posting again.
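      A minimal sketch of the hybrid described above, in Python-flavored pseudocode with hypothetical helpers (rasterize, shade_direct, trace_ray and the ray constructors are stand-ins, not any real engine's API): rasterization resolves primary visibility into a G-buffer, and rays are only spawned for the secondary effects that need them.

      def render_frame(scene, camera, width, height):
          # 1. Primary visibility via rasterization: per-pixel hit point, normal, material.
          gbuffer = rasterize(scene, camera, width, height)
          image = [[scene.background] * width for _ in range(height)]
          for y in range(height):
              for x in range(width):
                  hit = gbuffer[y][x]
                  if hit is None:
                      continue                      # no geometry covers this pixel
                  color = shade_direct(hit, scene.lights)
                  # 2. Secondary rays only where the material calls for them.
                  if hit.material.reflective:
                      color += trace_ray(scene, reflected_ray(camera, hit), depth=1)
                  if hit.material.refractive:
                      color += trace_ray(scene, refracted_ray(camera, hit), depth=1)
                  image[y][x] = color
          return image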
      • My day job consists mostly of writing rendering code, although not in a gaming context. I am not at all "dissing" scan conversion. It's what I do every day. My point is, WHEN (and only when) the technology is fast enough for real time recursive ray tracing, it will be the end of rasterization in 3D applications.

        Cache coherency problems can be fixed by making an enormous cache, or simply making the RAM itself so damn fast it doesn't matter anymore. Adaptive subdivision of pixels for antialiasing is not exa

        • Re:Stupid debate (Score:4, Interesting)

          by ardor ( 673957 ) on Friday April 25, 2008 @06:34PM (#23203606)

          WHEN (and only when) the technology is fast enough for real time recursive ray tracing, it will be the end of rasterization in 3D applications.
          Oh yes, the brute force solution. It will be a very long while until it is fully obsolete. Expect hybrids to stay for a long time. An example: many people claim terrain culling methods to be fully obsolete nowadays. Then they try to render really large terrains...

          Also, given that hybrids are a no-brainer, I bet both pure raytracers and rasterizers will be extinct in games.

          Cache coherency problems can be fixed by making an enormous cache, or simply making the RAM itself so damn fast it doesn't matter anymore.
          Ehrm... got any other wishes?! You do realize that this would be a revolution and might not be possible? One of the reasons the cache is fast is that the signal propagation delay can be much lower due to the proximity of the cache to the CPU. In other words, there will always be a cache. As for an enormous cache: cache memory is very expensive and is likely to stay that way because cache performance doesn't stand still (read: it will always use more expensive, specialized hardware), and huge caches have issues with cache misses.

          Adaptive subdivision of pixels for antialiasing is not exactly a first year student problem but not enormously difficult either.
          I didn't say it's enormously difficult, just not nearly as trivial as with rasterization.

          Honestly, I want to see the technology blow right past raytracing and go straight to radiosity
          1. Radiosity is not the ultimate. Just try doing specular stuff with radiosity.
          2. Algorithmic complexity will always come back to haunt you. O(n²) will always be worse than O(n), unless you have small scenes. So you have your geforce19000 and can render ... ONE room with realtime radiosity! Nice! Somebody else fakes it and renders an entire city. Guess what will be chosen for games.
          3. You could have said path tracing or photon mapping at least.

          Finally, these people don't particularly favor raytracing simply because it does not pay off for games. Games usually don't feature fully shiny scenes, and games are expected to run at interactive framerates. In, say, 5 years, entirely new (and demanding) effects will be en vogue; if raytracing steals too much time, it will be dropped and its results faked. This is what the "old men" do all the time in their games: fake. In the offline world, things are wildly different, so don't compare them.
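          To put a number on point 2 (a toy illustration only): classic radiosity couples every patch with every other patch, so the form-factor count grows quadratically with scene size.

          for patches in (1000, 10000, 100000):
              pairs = patches * (patches - 1) // 2   # O(n^2) patch-to-patch form factors
              print(patches, "patches ->", pairs, "pairwise interactions")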
        • My day job consists mostly of writing rendering code, although not in a gaming context. I am not at all "dissing" scan conversion. It's what I do every day. My point is, WHEN (and only when) the technology is fast enough for real time recursive ray tracing, it will be the end of rasterization in 3D applications.

          If it isn't recursive, then it's not ray tracing; it's ray casting.

    • It's just a bunch of noise rasterbation.
    • by Kjella ( 173770 )
      Perhaps. After looking at the latest 3D games/tech previews, as well as seeing movies like Beowulf, etc., which can spend forever rendering, I get the impression it's not the rendering technology that is the problem. Rasterization is good enough; it's that the models still act unnaturally, and that breaks the illusion. Maybe raytracing can do more with less once it's past the tipping point, but I don't think it'll be any major revelation. Conversely, if we do get natural movement going I think we're already at the "is t
      • by ardor ( 673957 )
        I agree. The shading can be done very, very well nowadays, but once things move, the illusion is dispelled. In Half-Life 2, I was instantly amazed by the movement of the Strider and the Hunter aliens. It made them believable, more than just a bunch of triangles. A level with multiple striders and hunters attacking, alongside these flying synths and the soldiers is very immersive because the behavior of the acting entities is so believable.

        Maybe among all these top 10 games lists, it's time for a top 10 game
    • Wow, don't try and hold back that ageism so much - tell us what you really think!

      Oh wait, I'm a dinosaur who advocates (and researches) ray tracing - damn blew that theory all to hell I guess.

      And his post was ranked "4, Insightful" when I saw it - maybe understanding words like insightful should be required before people get to moderate. Yeah, yeah, I know... it's /.
  • Can it use its own RAM? The new AMD chipset can. Also, will it be able to work with an add-in video card?
  • Free drivers? (Score:2, Offtopic)

    by CSMatt ( 1175471 )
    Off-topic, but will Intel provide drivers under a free license for their new GPU, similar to what they are doing now with their integrated GPUs?
  • Was it one week or two weeks ago that Intel's Larrabee was going to replace NVIDIA's and ATI's raster graphics with ray tracing?

  • I read that as resterilization. Thought it seemed a little too meta.
  • claims that the new GPU will perform very well with current DirectX and OpenGL titles
    well, there's your problem..
  • Pixels (Score:3, Insightful)

    by Whiteox ( 919863 ) on Friday April 25, 2008 @08:22PM (#23204342) Journal
    All it is is changing pixels. After all, it's still a 2D screen that displays as a bitmap. Sometimes a step backwards is valuable too.
  • Startopia is still one of my favorite games. Thanks again Tom.
