
101 3D Graphics Cards Tested

Posted by Zonk
from the comparison-shopping-for-the-anal-retentive dept.
Phantom69 writes "Ixbt Labs have spent a month testing 101 graphics cards in 14 different games. They used one reference system and tested at two resolutions (800x600 and 1024x768). The cards cover virtually all the major manufacturers: ATi, nVidia, 3dfx, Matrox, PowerVR and S3, from 1999-2004."
  • my dual voodoo2 SLI setup
  • Irony in action (Score:5, Interesting)

    by WormholeFiend (674934) on Friday October 15, 2004 @11:17AM (#10535546)
    Geforce beats Radeon in every game except Unreal Tournament, a game that opens with an Nvidia splash screen that says "the way it's meant to be played".
    • Re:Irony in action (Score:3, Insightful)

      by gl4ss (559668)
      .. the splash screens are just marketing.

      Tribes Vengeance has ATI's splash screen (Tribes Vengeance is based on the UT engine)..
    • Farcry has ATI as the dominating card, at least at the highest level. I didn't look at all the results, just a couple.
    • Re:Irony in action (Score:2, Insightful)

      by SkyWalk423 (661752)
      Geforce beats Radeon in every game...

      There are still people that dedicate fanboy time, thought, and effort to which company makes the faster 3D accelerators?? Didn't that die when 3DFX went under???

      They one-up each other every three months, people. The pattern is well documented now. Move on.

  • by vasqzr (619165)

    Tom's VGA Charts [about.com] are pretty similar.
  • Seriously (Score:3, Interesting)

    by fr0dicus (641320) on Friday October 15, 2004 @11:31AM (#10535692) Journal
    I'm all for freedom of choice, but isn't the sheer number of cards here overwhelming evidence of just how pointless the graphics-hardware upgrade cycle has become? I'm continually amazed at just how much people are willing to blow, time after time, on a single consumer component.
    • Re:Seriously (Score:3, Informative)

      by FortKnox (169099)
      The best review he did a year or two ago was which card gave you the best bang for your buck (it was something like fps/$). The answer was the Radeon 9600XT, which is the card I bought. Wasn't the best, wasn't the fanciest, but it was a lot cheaper than the other cards, and I don't notice the difference between 60fps and 75fps.
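The fps-per-dollar metric the comment describes is easy to sketch. The card names, prices, and frame rates below are made-up placeholders for illustration, not figures from the review:

```python
# Sketch of a fps-per-dollar ("bang for your buck") ranking.
# All prices and frame rates here are illustrative placeholders.
cards = {
    "Card A": {"price": 500, "fps": 75},
    "Card B": {"price": 200, "fps": 60},
    "Card C": {"price": 350, "fps": 70},
}

def fps_per_dollar(stats):
    """Value metric: frames per second divided by price."""
    return stats["fps"] / stats["price"]

# Rank by value rather than raw speed.
ranked = sorted(cards, key=lambda name: fps_per_dollar(cards[name]), reverse=True)
for name in ranked:
    print(f"{name}: {fps_per_dollar(cards[name]):.3f} fps/$")
```

With these placeholder numbers, the cheapest card wins on value even though it has the lowest raw frame rate, which is exactly the point the comment is making about the 9600XT.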
  • by Anonymous Coward
    It's about time they started putting hot anime chicks on the video cards! It makes them go faster!

    http://www.ixbt.com/video2/images/over2k4/x800xt.jpg

  • There is no such thing as a 2D card?

    Unless you're imagining it!
  • All the links give me a short paragraph in Russian and no graphs; does that mean it's slashdotted?

    I wanted to see how the GF6800/128MB compares to the 6800GT or Ultra and see if I'm really missing out on a big chunk of performance.
  • Highlight (Score:2, Interesting)

    by Mike Hawk (687615)
    The GeForce 3 Ti500 destroys all other cards at the Quake3 800x600 and RtCW 800x600. A difference between 1st and 2nd of >5fps. We need a whiskey tango foxtrot on that one.
    • Also, my radeon 8500 edges out the GeForce 6800 Ultra. I guess I don't need to upgrade for a while... :-)
    • The GF3Ti500 did kick ass when it came out. Lots of people didn't do their homework and bought GF4MX cards that were much suckier. But to do that well? Perhaps the games were written with some GF3 specifics in mind?
      • Well, the Q3 point release 1.17 came out around 9/2000, and the GeForce 3 appears to have shipped around 10/2001. Maybe it was the other way around and the GeForce 3 was optimized for Q3? But it didn't come out so impressively in CoD. Just a strange anomaly, I guess, but also a testament to the card.
  • The top performer was the GeForce 6800 Ultra 256MB, with the best price/performance going to the 6800 128MB. The 128MB version is $150 less than the 256MB Ultra and only did significantly worse in one category of one game - Farcry 1024x768 with 2.0 shaders.
  • They shoulda run 3dmark 05 on these cards you know. Woulda been funny to see them smoking.
  • As a desktop-only Linux user, I don't need the 3D powers.

    So does anyone know the best 2D choice for desktop Linux users?
    • IIRC, the Matrox cards have the best 2D quality. I don't know how the speed or Linux support is.
      • Any 2D card with enough RAM will be essentially interchangeable, with the exception of supported modes and RAMDAC quality. Matrox wins both of these categories hands down. Matrox cards also have such lovely features as the ability to drive sync-on-green monitors, and by Matrox cards I mean just about everything since the Millennium. Matrox's RAMDACs are second to none, and virtually eliminate the "bounce" effect in a rapidly-shifting waveform by controlling voltage to an additional extent beyond what a RAMDAC
        • I have an 8MB Matrox Millennium II around here somewhere, PCI, that I use when I need a 2D card for something I'll be sitting at regularly.

          I've got a pile of 4MB PCI Matrox Millennia - they're absolutely brilliant for when I want to add another monitor to a machine. Great 2D cards, and extremely well supported in Linux, - my only real complaint is that 4MB isn't quite enough to do 1280x960x32bpp. Still, they were really cheap. :-)
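The "4MB isn't quite enough for 1280x960x32bpp" remark above is straightforward framebuffer arithmetic. A quick check, assuming a plain linear framebuffer at 4 bytes per pixel for 32bpp:

```python
# Framebuffer size needed for 1280x960 at 32bpp (4 bytes per pixel),
# versus 4 MiB of video RAM.
width, height, bytes_per_pixel = 1280, 960, 4
needed = width * height * bytes_per_pixel   # bytes required for one frame
available = 4 * 1024 * 1024                 # 4 MiB of VRAM

print(needed)                # 4915200 bytes, about 4.69 MiB
print(needed > available)    # True: 4MB really isn't quite enough
```

Dropping to 16bpp (2 bytes per pixel) halves the requirement to about 2.34 MiB, which fits a 4MB card with room to spare.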
  • These benchmark shootouts always represent performance on very high-end systems, which isn't very useful for the average systems most people actually use.
  • The D3 benchmarks are pretty fucked up: the pre-FX nvidia cards perform slightly better than normal thanks to the less featureful but more optimized rendering paths, and the FX series kind of suffers because it uses the generic "ARB2" rendering path but doesn't really have the guts to handle it well.
