ATI Radeon 9800 Pro vs. NVidia GeForce 5900
HardcoreGamer writes "Today ATI shipped its Radeon 9800 Pro 256 MB DDR-2 card in time for E3 and nVidia announced the NV35-based GeForce 5900 which will be available in June. Early tests seem to say that while nVidia edges ahead of ATI in specific areas, overall ATI still has the better card. The caveat is that the next generation of DirectX 9-based games (like Doom 3 and Half-Life 2, demonstrated with ATI at E3) will truly determine which is the better card. Lots of coverage at PC Magazine, PC World, The Register (ATI) (nVidia), ExtremeTech, InternetNews, and Forbes/Reuters. Either way, at $450-$500, serious gamers are about to get another serious dent in their wallets."
this stuff is getting crazy (Score:3, Insightful)
My first freakin' PC had a 20 MB hard drive.
Anandtech (Score:1, Insightful)
Re:I really don't have a big choice between the tw (Score:2, Insightful)
This isn't great news.. (Score:1, Insightful)
By the time these games actually are released, there will be something bigger, and better. I think graphics are the single most important aspect of the gaming experience. But don't take out a loan to buy a video card - Save your money.
I upgraded my geforce256 SDR card a month ago, with a Geforce4 ti 4200 and I'm a happy camper. Probably won't need to upgrade this one for an even longer period of time.
Re:I really don't have a big choice between the tw (Score:2, Insightful)
I really don't care about Nvidia's drivers not being open source as long as they promptly release the official version of their drivers for all the major Linux distributions. Ease of installation matters, and full points to Nvidia for understanding that.
Re:I really don't have a big choice between the tw (Score:5, Insightful)
If the situation is like this (where the cards are pretty much neck and neck), the balance swings even farther towards buying NVidia. The only NVidia card I'd never have considered buying was the dustbuster...
Given that I'm running an (ancient) dual P3-450 bought three years ago, I guess this fall it might be time to upgrade.
Stay behind and save money (Score:4, Insightful)
There's all the free walkthroughs, hints, and cheat codes on the web by then, too.
Re:Minor annoyances (Score:3, Insightful)
Re:I really don't have a big choice between the tw (Score:4, Insightful)
I will never forget or forgive that blatant attempt to obsolete brand new hardware. The fact that they can't be bothered to stay current with Xfree doesn't help their case in my eyes.
The only windows box I have left is the one that I play most of my games on. Every machine I own runs only NVidia hardware. The fact that NVidia's drivers support every piece of hardware they've made back to the original GeForce (and I think the Riva) makes me much more comfortable in investing in hardware from them.
Video cards get faster... who cares anymore (Score:3, Insightful)
I'm not a hardcore gamer. I have a Radeon something-or-other I got with my current machine (PowerMac G4). It plays Wolfenstein and Quake 3 great at 1024x768 with lots of eye candy on. I think a lot of people get way too caught up in frame rates and technical specs.
These cards are just too expensive for most of us (Score:1, Insightful)
Coupled with an Athlon 2000+, 512M of generic PC2100 RAM and an inexpensive ECS motherboard (integrated crapola but good enough sound and Ethernet) it makes a really decent system for under $400 (no Windows tax). A 40G 7200RPM Maxtor drive, 16X DVD-ROM, 48x16x48 CDRW and a cheapo generic case/PS round it out.
I picked a card with an nVidia chipset because it was less expensive than any comparable ATI card, and it looks like nVidia's Linux drivers are better supported than ATI's. I wish nVidia were more open about their drivers, though...
It's the other way around. (Score:4, Insightful)
Re:Some better reviews (Score:3, Insightful)
Re:Someone explain the math to me... (Score:2, Insightful)
That sounds like a great plan to me.
Or they could be the same since it's a unified driver (Score:3, Insightful)
It's a unified driver. Has been for a LONG time. Obviously the kernel hooks etc. are different for Windows versus Linux, but the rest of the code is all the same. Claiming that the "Linux drivers are better" is clueless Linux zealotry.
Re:Minor annoyances (Score:5, Insightful)
The Unreal engine, and more generally the guys at Epic (Tim Sweeney), operate under a different philosophy than the guys at Id. The Unreal engine is quite modular. In fact, it was originally written with Glide as the preferred rendering API. Today, DirectX is the preferred method, even though the current engine (which, with all of its changes, has surely seen complete rewrites of components over the years) can trace itself all the way back to that Glide-era code.
Id, on the other hand, likes to start "from scratch". Between Unreal I and UT2K3/Unreal 2/Splinter Cell/Raven Shield/all of the other Unreal-based games out today, Id's gone through Quake 2, Quake 3, and is gearing up for Doom 3. Each one of those engines was different, and pretty much rewritten from the ground up each time (I'm sure there are some core components that theCarmack reuses, but essentially it's all new code).
Which approach is better? Depends. Epic's approach to incremental engine design lets third parties license their engine and benefit from on-going development, as well as getting the newer technology out there quicker. Id's approach caters to theCarmack's godlike abilities, and gives us something to look forward to with bated breath. The strength of theCarmack's code proves itself when the aging Q3 engine can still hold its own against the newest of Unreal-based games (for example, the upcoming Jedi Knight Academy game). I say let's keep 'em both.
Oh, and I'm pretty sure Unreal's audio engine is modular as well, supporting the proprietary Miles system, DirectSound, and probably also OpenAL. Same with the input engine (DirectInput, SDL).
Re:A Question (Score:1, Insightful)
You know, it's interesting... (Score:3, Insightful)
Re:It's the other way around. (Score:3, Insightful)
So obviously, for those of us who do game under Linux, drivers ARE an issue. So what was your point besides trolling?
Re:I wouldn't hold your breath (Score:5, Insightful)
Hmmm... (Score:3, Insightful)
Re:Hmmm... (Score:2, Insightful)
The fact that it takes two slots might annoy people, but in reality, on today's motherboards, with everything but the kitchen sink on them, there are usually far more slots than you need. Also, the first PCI slot is often unusable because it may share resources with the AGP port and cause stability problems.
There's no question that NVidia stumbled with the NV30 and that ATi still holds the performance crown for available hardware. I'm still going to wait for this card for the same reason as many others: my last three cards have been NVidia, and driver stability has been exemplary. ATi is getting better, but it ain't there just yet.
The next batch of cards should make things even more exciting. ATi has yet to move to a 0.13 micron process and could gain a lot from that, and the design of the NV30 compared to the NV35 suggests that NVidia has a lot more headroom for improving GPU and memory clock speeds.
I'm impressed by Carmack's ability to target a specific level of performance on the hardware that will be available when his next engine ships, but this time it seems like NVidia and ATi will exceed even his expectations, with Doom 3 being playable even at 1600x1200 on the latest crop of cards, and we might even see the next generation of cards out before Doom 3 ships.