
ATI Radeon 9800 Pro vs. NVidia GeForce 5900

HardcoreGamer writes "Today ATI shipped its Radeon 9800 Pro 256 MB DDR-2 card in time for E3, and nVidia announced the NV35-based GeForce 5900, which will be available in June. Early tests suggest that while nVidia edges ahead of ATI in specific areas, overall ATI still has the better card. The caveat is that the next generation of DirectX 9-based games (like Doom 3 and Half-Life 2, demonstrated with ATI at E3) will truly determine which is the better card. Lots of coverage at PC Magazine, PC World, The Register (ATI) (nVidia), ExtremeTech, InternetNews, and Forbes/Reuters. Either way, at $450-$500, serious gamers are about to get another serious dent in their wallets."
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward on Monday May 12, 2003 @09:39PM (#5941526)
    256 MB RAM???

    My first freakin' PC had a 20 meg HD.

  • Anandtech (Score:1, Insightful)

    by Anonymous Coward on Monday May 12, 2003 @09:39PM (#5941528)
    I read the reviews over at Anandtech, and I dunno, but it looks like the FX 5900 beat the pants off the Radeon 9800 256MB in all tests save one. And to top it off, the 256MB Radeon and the FX 5900 are the same price...
  • by millertime3250 ( 631993 ) on Monday May 12, 2003 @09:41PM (#5941540) Journal
    /agree I love the linux drivers for Nvidia.
  • by Anonymous Coward on Monday May 12, 2003 @09:49PM (#5941583)


    By the time these games are actually released, there will be something bigger and better. I think graphics are the single most important aspect of the gaming experience, but don't take out a loan to buy a video card - save your money.

    I replaced my GeForce256 SDR card a month ago with a GeForce4 Ti 4200, and I'm a happy camper. I probably won't need to upgrade this one for an even longer period of time.

  • by vivek7006 ( 585218 ) on Monday May 12, 2003 @09:50PM (#5941587) Homepage
    Absolutely right...

    I really don't care about Nvidia's drivers not being open-source as long as they promptly release the official version of their drivers for all the major Linux distributions. Ease of installation matters, and full points to Nvidia for understanding that.

  • by MarcoAtWork ( 28889 ) on Monday May 12, 2003 @09:50PM (#5941588)
    Definitely. I don't care if -any- ATI card has a 2%, 5%, or 10% performance advantage; having absolutely great drivers from NVidia (for Linux & Windows) far outweighs any small performance gains the ATI card might supposedly have.

    If the situation is like this (where the cards are pretty much neck & neck), the balance swings even further toward buying NVidia. The only NVidia card I'd never have considered buying would be the dustbuster...

    Given that I'm running an (ancient) dual p3-450 bought 3 years ago, I guess this Fall it might be time to upgrade :)
  • by YetAnotherName ( 168064 ) on Monday May 12, 2003 @09:54PM (#5941611) Homepage
    Whenever I've given in to hype, my wallet's regretted it. But buying the current way-cool game a year and a half or more later almost always guarantees it'll run just fine on my current hardware.

    There are all the free walkthroughs, hints, and cheat codes on the web by then, too.
  • by The Analog Kid ( 565327 ) on Monday May 12, 2003 @09:54PM (#5941619)
    You're right, but that will change as Linux's popularity grows: developers will find it easier to use OpenGL since it's cross-platform and DirectX isn't. SDL and OpenAL will come into play as well. People may say that OpenGL is lagging in progress, but games like DOOM3 make me somewhat skeptical of those people. Long live Carmack.
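
    As a rough sketch of that cross-platform point: a minimal SDL 1.2 + OpenGL setup like the one below builds unchanged on Linux and Windows (window size and colors here are arbitrary placeholders, and error handling is omitted).

        // Minimal cross-platform OpenGL window via SDL 1.2. The identical
        // source compiles on Linux and Windows; only the link flags differ.
        #include <SDL/SDL.h>
        #include <GL/gl.h>

        int main(int argc, char *argv[]) {
            SDL_Init(SDL_INIT_VIDEO);                    // start SDL's video subsystem
            SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1); // ask for double buffering
            SDL_SetVideoMode(640, 480, 32, SDL_OPENGL);  // OpenGL-capable window
            glClearColor(0.0f, 0.0f, 0.0f, 1.0f);        // clear color: black
            glClear(GL_COLOR_BUFFER_BIT);                // "render" a frame
            SDL_GL_SwapBuffers();                        // present it
            SDL_Delay(2000);                             // linger so the window is visible
            SDL_Quit();
            return 0;
        }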
  • ATI has never wanted to trouble themselves with drivers. Historically they have abandoned hardware as quickly as they thought they could get away with it. I got bitten by this back with the introduction of the "new" Windows driver model: a card less than two months old was "unsupported". I made the mistake of buying an ATI PCI TV Wonder while experimenting with HTPC setups. Fortunately that one is still quite useful in Linux, but ATI dropped Windows support for it over a year ago, shortly after I purchased one new. The ATI Windows apps still don't work right: every time they invoke the Windows scheduler to set up a scheduled show, they GPF.

    I will never forget or forgive that blatant attempt to obsolete brand-new hardware. The fact that they can't be bothered to stay current with XFree86 doesn't help their case in my eyes.

    The only Windows box I have left is the one that I play most of my games on. Every machine I own runs only NVidia hardware. The fact that NVidia's drivers support every piece of hardware they've made back to the original GeForce (and I think the Riva) makes me much more comfortable investing in hardware from them.

  • by acomj ( 20611 ) on Monday May 12, 2003 @10:11PM (#5941696) Homepage
    Why would anyone spend $400-500 on a video card, unless you really NEED to be cutting edge for the next six months or so, before the next batch comes out and the prices of these cards become more reasonable?

    I'm not a hard-core gamer. I have a Radeon something-or-other that I got with my current machine (a PowerMac G4). It plays Wolfenstein and Quake 3 great at 1024x768 with lots of eye candy on. I think a lot of people get way too caught up in frame rates and technical specs.

  • by Anonymous Coward on Monday May 12, 2003 @10:12PM (#5941707)
    I'm not a 'serious gamer', but my inexpensive GeForce 4 MX440 with 64M SDRAM works just fine with any game I've tried on it, and it was only $70 on sale at Fry's.

    Coupled with an Athlon 2000+, 512M of generic PC2100 RAM and an inexpensive ECS motherboard (integrated crapola but good enough sound and Ethernet) it makes a really decent system for under $400 (no Windows tax). A 40G 7200RPM Maxtor drive, 16X DVD-ROM, 48x16x48 CDRW and a cheapo generic case/PS round it out.

    I picked a card with an nVidia chipset because it was less expensive than any comparable ATI card, and it looks like nVidia's Linux drivers are supported better than ATI's. I wish nVidia were more open about their drivers, though...

  • by Trepidity ( 597 ) <delirium-slashdot@NosPam.hackish.org> on Monday May 12, 2003 @10:15PM (#5941723)
    Unless Linux suddenly gets a bunch of new latest-generation games, the issue of Linux drivers is a non-issue. 99% of gamers use Windows to play games, even those who use Linux for everything else (hell, CmdrTaco even reboots to Windows to play games).
  • by afidel ( 530433 ) on Monday May 12, 2003 @10:38PM (#5941845)
    You mentioned Tom's Hardware and "reputable" in the same paragraph, and it wasn't talking about the lack thereof; for shame. (Yep, it's flamebait, but I have karma to burn, and Tom has more bias than a CNN reporter; he just changes loyalties every so often to seem "fair and balanced".)
  • by TeraCo ( 410407 ) on Monday May 12, 2003 @10:46PM (#5941888) Homepage
    At least not until you can sell the engine to other developers a few months later at a couple of hundred grand a pop, plus a percentage of royalties.

    That sounds like a great plan to me.

  • by SuperBanana ( 662181 ) on Monday May 12, 2003 @10:48PM (#5941901)
    closed source or not, the fact is that the NVIDIA drivers on Linux are as good as or better than their win* counterparts

    It's a unified driver. Has been for a LONG time. Obviously the kernel hooks etc. are different for Windows versus Linux, but the rest of the code is all the same. Claiming "the Linux drivers are better" is clueless Linux zealotry (sp?).

  • by Osty ( 16825 ) on Monday May 12, 2003 @11:06PM (#5941977)

    While UT2003 uses DirectX by default on Windows platforms, it also has an OpenGL renderer. You can switch it to use OpenGL instead, and the Linux version (of course) uses OpenGL by default.
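
    (For reference, the renderer switch is typically made in UT2003.ini; the sketch below is from memory, so the exact section and class names may differ between versions.)

        ; UT2003.ini -- sketch, assuming the stock Direct3D renderer is active
        [Engine.Engine]
        ; default Direct3D renderer:
        ; RenderDevice=D3DDrv.D3DRenderDevice
        ; switch to the OpenGL renderer instead:
        RenderDevice=OpenGLDrv.OpenGLRenderDevice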

    The Unreal engine, and more generally the guys at Epic (Tim Sweeney), operate under a different philosophy than the guys at Id. The Unreal engine is quite modular. In fact, it was originally written with Glide as the preferred rendering method. Today, DX is the preferred method, even though the current engine (even with all of its changes, which have surely included complete rewrites of components over the years) can trace itself all the way back to that Glide-inspired code.


    Id, on the other hand, likes to start "from scratch". Between Unreal I and UT2K3/Unreal 2/Splinter Cell/Raven Shield/all of the other Unreal-based games out today, Id's gone through Quake 2, Quake 3, and is gearing up for Doom 3. Each one of those engines was different, and pretty much rewritten from the ground up each time (I'm sure there are some core components that theCarmack reuses, but essentially it's all new code).


    Which approach is better? Depends. Epic's approach to incremental engine design lets third parties license their engine and benefit from on-going development, as well as getting the newer technology out there quicker. Id's approach caters to theCarmack's godlike abilities, and gives us something to look forward to with bated breath. The strength of theCarmack's code proves itself when the aging Q3 engine can still hold its own against the newest of Unreal-based games (for example, the upcoming Jedi Knight Academy game). I say let's keep 'em both.


    Oh, and I'm pretty sure Unreal's audio engine is modular as well, supporting the proprietary Miles system, DirectSound, and probably also OpenAL. Same with the input engine (DirectInput, SDL).
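
    To make that "modular" idea concrete, the pattern described above is roughly an abstract device interface with pluggable backends. A hypothetical C++ sketch follows (class names are illustrative, not Epic's actual code):

        // Hypothetical sketch of a pluggable render-device interface, in the
        // spirit of the modular engine design described above.
        class RenderDevice {
        public:
            virtual ~RenderDevice() {}
            virtual bool Init(int width, int height) = 0; // set up the API
            virtual void DrawFrame() = 0;                 // render one frame
        };

        class D3DRenderDevice : public RenderDevice {
        public:
            bool Init(int width, int height) { /* Direct3D setup */ return true; }
            void DrawFrame() { /* issue Direct3D calls */ }
        };

        class GLRenderDevice : public RenderDevice {
        public:
            bool Init(int width, int height) { /* OpenGL setup */ return true; }
            void DrawFrame() { /* issue OpenGL calls */ }
        };

        // The engine codes against the interface; a config entry (like the
        // RenderDevice line in UT2003.ini) selects which backend to create.
        // The caller owns the returned object.
        RenderDevice *MakeRenderDevice(bool useOpenGL) {
            if (useOpenGL)
                return new GLRenderDevice;
            return new D3DRenderDevice;
        }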

  • Re:A Question (Score:1, Insightful)

    by Anonymous Coward on Monday May 12, 2003 @11:08PM (#5941987)
    But z buffer memory is not half of all memory used; due to Z culling and hidden surface removal it is probably a very small % of the total RAM used.
    Um, no. Hidden surface removal doesn't reduce the amount of z buffer memory required; you still need several (2-4) bytes per pixel. It just speeds things up by not rendering things that aren't seen.
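
    The arithmetic behind that reply, as a quick sketch (the resolution and depth format are just example numbers): a Z buffer costs width x height x bytes-per-pixel no matter how much culling is done.

        // Back-of-the-envelope Z-buffer size: width * height * bytes per pixel.
        // Culling saves fill work, not Z-buffer storage.
        #include <stdio.h>

        int main(void) {
            const long width = 1600, height = 1200;
            const long bytesPerPixel = 4;                 // e.g. 24-bit Z + 8-bit stencil
            long zBytes = width * height * bytesPerPixel; // 7,680,000 bytes
            printf("Z buffer: %.1f MB of a 256 MB card\n",
                   zBytes / (1024.0 * 1024.0));           // ~7.3 MB
            return 0;
        }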
  • by chameleon_skin ( 672881 ) on Monday May 12, 2003 @11:12PM (#5942000)
    ...because I love computer games, but I haven't owned a cutting-edge video card in about five years, and if anything my gaming experience has *improved.* Why? Because ninety percent of the time, games that are written to use the features of brand-spankin'-new video cards are so intent on milking the most out of the card's technology that they fail to concentrate on the most important aspect - gameplay. If a game is actually innovative, challenging, and involving, then it's still going to be enjoyable two years from now despite the fact that its graphics aren't quite up to par with the latest offerings.

    Because I've got a wimpy 10Mb video card, all of the games I can play on my machine are a year or two old. Sure, this means that I miss out on a lot of the online gaming experience - a lot of the multiplayer servers for a game are dead by the time I get around to playing it. But if those servers have disappeared inside of eighteen months, then how good was the game in the first place? Half-Life is pushing five years now, and there are still tons of places to play it.

    $450 for a freakin' video card? Sheesh. Give me a break. I'll wait until they're $100, by which time all the mediocre games will have disappeared into a much-deserved oblivion while I'll just be ready to tackle the top ten of the bunch. Sure, a year and a half is like an eon in computer gaming, but the ones that last the eons are the best anyway. Chess, anyone?
  • by bogie ( 31020 ) on Monday May 12, 2003 @11:32PM (#5942085) Journal
    Yea, most games are for Windows - so what? The parent said that under Linux, Nvidia kicks ATI's ass. This of course is true and has been for a while now. For people considering a video card for Linux, this is a fairly important piece of information.

    So obviously, for those of us who do game under Linux, drivers ARE an issue. So what was your point, besides trolling?
  • by Verity_Crux ( 523278 ) <countprimes @ g m a i l.com> on Tuesday May 13, 2003 @01:42AM (#5942638)
    "Really, John Carmack singlehandedly keeps OpenGL alive;" Uh, anybody purchased a nice CAD program lately that uses DirectX? Or any EDA tool? Or any math tool?
  • Hmmm... (Score:3, Insightful)

    by klui ( 457783 ) on Tuesday May 13, 2003 @03:55AM (#5942952)
    If both cards perform relatively the same but the nVidia card takes up an extra slot, my vote would go to ATi. I get the sense ATi and nVidia will just continue to one-up each other and produce products at a furious pace. Will they get enough revenue to keep up this rate of new product releases? $400, $500... $600... where will it end? Sure, they can push the state of the art, but if fewer people can justify buying these expensive parts, does it matter whose product is better?
  • Re:Hmmm... (Score:2, Insightful)

    by Rothron the Wise ( 171030 ) on Tuesday May 13, 2003 @07:42AM (#5943527)
    If both cards perform relatively the same but the nVidia card takes up an extra slot, my vote would go to ATi.

    The fact that it takes two slots might annoy people, but in reality, on today's motherboards, with everything but the kitchen sink on them, there are usually far more slots than you need. Also, the first PCI slot is often unusable because it may share resources with the AGP port and cause stability problems.

    There is no question that NVidia stumbled with the NV30 and that ATi still holds the performance crown for available hardware. I'm still going to wait for this card, for the same reason as many others: my last three cards have been NVidia, and driver stability has been exemplary. ATi is getting better, but it ain't there just yet.

    The next batch of cards should make things even more exciting. ATi has yet to move to a 0.13-micron process and could gain a lot from that, and the design of the NV30 compared to the NV35 suggests that NVidia has a lot more headroom for improving GPU and memory clock speeds.

    I'm impressed by Carmack's ability to target a specific level of performance on the hardware that will be available at the time his next engine ships, but this time it seems like NVidia and ATi will exceed even his expectations, with Doom3 being playable even at 1600x1200 on the latest crop of cards, and we might even see the next generation of cards out before Doom3 ships.
