ATI Distributing Spurious HL2 Benchmarks
BatonRogue writes "ATI apparently provided a few Half Life 2 benchmarks to the press, and some websites are actually using those benchmarks for their Half Life 2 performance reviews. AnandTech and HardOCP seem to be the only reputable sources of Half Life 2 performance data, as both put together their own benchmarks representative of Half Life 2 gameplay. AnandTech went through every Half Life 2 level, compiled a list of the 11 most stressful ones, and created 5 demos from them, while HardOCP put together two long benchmarks for its review. AnandTech's and HardOCP's results appear to agree with each other, while the ATI-backed benchmarks show ATI with a huge performance lead in Half Life 2. According to the AnandTech article, ATI was allowed to create its demos while at Valve before Half Life 2 was released, whereas Valve would not let NVIDIA remove any data from its time at Valve until the game was released. Politics at work as usual."
Not at all surprising... (Score:5, Informative)
Not to say that, given the chance, NVIDIA wouldn't post absurdly inflated numbers. I still personally favor NVIDIA, mostly because their Linux drivers are of such high quality. And although ATI's Win32 drivers have improved greatly over the past 2 years, in my experience they aren't quite up to the level of NVIDIA's. Maybe another year and they'll get there. My biggest beef is the lack of support for older products -- the new Catalyst drivers are good, but the drivers for the original Radeon and All-in-Wonders suck. NVIDIA's Detonator drivers support everything they've ever made, other than the craptastic Riva 128ZX. I'm still using my trusty old TNT2 -- plays a mean game of Quake3 under Linux.
Re:ATi's results are spurious for not sucking? (Score:4, Informative)
Re:Not at all surprising... (Score:3, Informative)
1. Get your PCI interrupts in order. There are actually only 4 PCI interrupt lines (A, B, C, and D), which the motherboard assigns to PCI/AGP slots in hardware (some are also assigned to onboard devices like the IDE/USB controllers - check your motherboard manual). If it can possibly be helped, don't have anything sharing the AGP slot's PCI interrupt (this requires moving cards from slot to slot in the case). Especially avoid putting a sound card and the AGP port on the same PCI interrupt - sound cards like to open up long bus mastering sessions that the AGP port absolutely hates.
2. Cooling/power issues. An overheating card or underpowered PSU (or a dust-clogged intake) can cause a lot of flakiness with graphics cards that suck up obscene amounts of power. Make sure not only that your overall wattage is adequate, but that the 12V, 5V, and 3.3V rails are each capable of the load being placed on them individually. This information can be dug up on the benchmarking sites. One more thing - I once troubleshot a problem like this that came down to a failing fan inside the PSU - any funny noises in the case?
3. Try a total driver reinstallation using one of the 3rd-party driver removal utilities like Detonator Destroyer (now deprecated in favor of something else - I haven't needed to do this in a while). Oh, and never, never listen to Windows Update when it tells you there's an NVIDIA driver update - it'll roll the version back to something over a year old and probably hose the driver in the process.
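On Linux, the interrupt check in step 1 can be done without opening the case. A minimal sketch (assuming a standard Linux box with procfs mounted and the pciutils package installed):

```shell
#!/bin/sh
# /proc/interrupts has one row per IRQ; if several device names
# appear on the same row, those devices are sharing that interrupt.
cat /proc/interrupts

# lspci -v prints each PCI/AGP device's assigned IRQ; grepping for
# "IRQ" makes it easy to spot the graphics card sharing a line
# with, say, a sound card.
lspci -v 2>/dev/null | grep -i irq
```

If the video card and sound card show up on the same IRQ, moving one of them to a different physical slot (as described above) usually changes the routing.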
Still doesn't answer my question (Score:3, Informative)
I have a GeForce2 GTS w/64 MB of memory and an Athlon XP 2200+ w/512 MB of memory. I can play UT2004 fine with 32 players on any given map without frame loss (lowest detail settings, but the framerate's smooth, which is what's really important). Doom3 is a no-show. Would I have to fork over cash for a new GFX card for this game to play reasonably well or not?
Re:On the tit (Score:3, Informative)
Reviews for anything other than the top of the line are mostly "bang for buck" reviews, and in this generation the numbers from ATI and NVIDIA are highly competitive.
As I recall, the NVIDIA FX series got some well-deserved flak, but the last set of outright awful video cards has to have been the Cryo series chips - ack.