GF FX 5900 Ultra vs. ATi Radeon 9800 Pro 336
Mack writes "OCAddiction.com has their GF FX 5900 Ultra vs. ATi Radeon 9800 Pro article online detailing which card is more powerful. Running a plethora of benchmarks we were anxious to see which card outperformed the other. Quite simple really. We take nVidia's top offering and pair it up against the current top offering from ATi and let them duke it out till the bitter end. Who will come out on top? Let's take a look."
Synthetic Benchmarks? Incredible... (Score:4, Informative)
If you haven't heard about the controversy with MadOnion/Futuremark/3dmark2003, check out this article [hardocp.com]. Kyle @ HardOCP suggests that if you give Futuremark more $$$, they will 'optimize' their benchmark to help out your video card's score.
Now, in this review, we see that the GeForceFX 5900 clearly dominates the hardware side of things: .13 vs. .15 micron process, 450/850 vs. 380/340 MHz (core/memory), 27.2 GB/sec vs. 21.8 GB/sec memory bandwidth, etc. Yet when we start looking at real-world scores, the 9800 keeps up pretty well and even beats the faster GeForceFX 5900 in most tests.
The big exception is the 3DMark2003 score - the GeForceFX 5900 wins, 3477 to 2837 (!).
This can be attributed to one of three things:
1. Speed isn't everything (e.g., AMD vs. Intel CPUs). And indeed, the Radeon 9800, slower on paper, *is* faster in the real-world tests, yet it still loses in 3DMark.
2. The GeForceFX used WHQL drivers... But despite these 'superior' drivers, the Radeon 9800 still reigned in all the real-world tests!
3. 3DMark2003 added unfair optimizations to their program to make the nVidia card seem better than ATi's.
Re:Synthetic Benchmarks? Incredible... (Score:4, Funny)
Yes, it's an evil satanic conspiracy.
Did you know that "Mature Furk" is an anagram for "Futuremark"? Google for it, and be enlightened.
You actually *believe* hardocp? (Score:4, Interesting)
Then, when the fix is posted [futuremark.com], they write "This is in response to the news item we posted last week."
And now, they're making unfounded accusations that 3DMark is taking bribes to skew the benchmark results? WTF? Why doesn't HardOCP just hire Jayson Blair to write their "articles"? At least then, they'd have fewer spelling errors.
Re:You actually *believe* hardocp? (Score:2)
Re:You actually *believe* hardocp? (Score:4, Insightful)
Re:You actually *believe* hardocp? (Score:3, Funny)
"spelling is arbitrary"
No, there are rules. Some people just don't understand them, expecially you.
Re:Synthetic Benchmarks? Incredible... (Score:5, Insightful)
Re:Synthetic Benchmarks? Incredible... (Score:4, Funny)
Re:Synthetic Benchmarks? Incredible... (Score:4, Insightful)
And what exactly differentiates a real benchmark from a synthetic benchmark? While Futuremark does report the fill rate (both single-texturing and multi-texturing), it is simply extraneous information, which is in no way used to determine the resulting 3DMark score; the score is determined by running four game demos, which use engines akin to those used in "real games." The individual game results are reported by 3DMark, multiplied by certain coefficients, and then added together, rendering the result (3DMarks).
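For what it's worth, the scoring the parent describes really is just a weighted sum, which is why a cheat that inflates any single game test inflates the headline number too. A minimal sketch of the arithmetic (the frame rates and coefficients below are made-up placeholders, not Futuremark's actual numbers):

    #include <stdio.h>

    int main(void) {
        /* hypothetical per-game-test frame rates and weighting coefficients --
           placeholders for the shape of the math, not Futuremark's real values */
        double fps[4]    = {150.0, 30.0, 28.0, 25.0};
        double weight[4] = {7.0, 35.0, 45.0, 40.0};
        double score = 0.0;

        for (int i = 0; i < 4; i++)
            score += fps[i] * weight[i];   /* each test's fps times its coefficient */

        printf("overall score: %.0f\n", score);
        return 0;
    }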
The reason 3DMark03 is invalid is not because it is a "synthetic" benchmark, but because nVidia mucked it up with their shenanigans. The frightful truth of the matter is, however, that the same illegitimate "optimizations" (i.e. static clip planes) that were used by nVidia in 3DMark can just as easily be used in any and all timedemos. Hence, your precious "real" benchmarks are just as susceptible, and may be just as compromised and invalid as 3DMark03. To make matters worse, unlike 3DMark03, which offers advanced diagnostic tools that allowed nVidia's dubious actions to be exposed, "real" benchmarks have no such tools. Therefore, exposing cheating in "real" benchmarks is much more difficult; however, just because something cannot be proven does not make it false.
Re:Synthetic Benchmarks? Incredible... (Score:3, Insightful)
Re:Synthetic Benchmarks? Incredible... (Score:2)
Re:Synthetic Benchmarks? Incredible... (Score:5, Informative)
WHQL doesn't mean they're better drivers, it just means that they passed some MSFT testing bits. If anything, non-WHQL drivers have the potential for higher performance (think a car engine that doesn't have to worry about passing emissions), since they don't have to worry so much about playing nice with -all- available hardware.
Re:Synthetic Benchmarks? Incredible... (Score:2)
3DMark2003 (or rather Futuremark, since I doubt the program is advanced enough to program itself) did no such thing. nVidia did it all by themselves [extremetech.com], and Futuremark conducted an investigation confirming the "optimizations" (cheats i.e. static clip planes inserted by nVidia) and denouncing them. Google for more.
Boo, hiss (Score:3, Insightful)
Great theory, except for the fact that nVidia dropped out of 3DMark's developer program last fall. I doubt they're ponying up anything.
I think it's also been firmly established that nVidia BS'd its way through build 330 by straight-up cheating, not by paying anyone off.
And your numbers are generally irrelevant. Smaller core means cheaper, means lower t
Benchmarks... (Score:5, Interesting)
Re:Benchmarks... (Score:2)
It doesn't rely heavily on synthetic benchmarks; it just "throws them in for whatever they're worth" (paraphrased) and specifically makes a point that the performance of the 5900 in 3DMark03 doesn't line up the way you'd expect with the real-world scores. That is, the 5900 spanks the 9800 in 3DMark03, even though the real-world tests (taken together) slightly favor the 9800 and the 5900 doesn't really all-out clobber the 9800 in any one bench
Who Won (Score:5, Funny)
Re:Who Won (Score:4, Insightful)
Re:Who Won (Score:5, Insightful)
Re:Who Won (Score:5, Informative)
Conclusion
Let's break down performance of both cards and see which one comes out on top.
UT2k3 - FX 5900 Ultra - While both cards perform well, the FX 5900 comes out on top
AquaMark - R9800 Pro - The R9800 takes home the gold in this real-world benchmark
Comanche 4 - R9800 Pro - The R9800 also edges out a win in this nearly obsolete benchmark
Specviewperf 7.0 - R9800 Pro - This one is really close but the #'s lean to the R9800
Code Creatures - FX 5900 Ultra - The 5900 beats up the R9800 pretty good in this intensive benchmark
Splinter Cell - R9800 Pro - Hands down, the R9800 takes it in this awesome game from UBISoft
ShaderMark - R9800 Pro - While the FX 5900 Ultra makes a good showing, the R9800 wins this one
3DMark 01 SE Build 330 - R9800 Pro - The R9800 takes top honors with this tried and true synthetic benchmark
3DMark 03 Build 320 - FX 5900 Ultra - Should we include this? Possibly not, however the FX 5900 wins with WHQL Det Drivers
3D Visual Quality - R9800 Pro hands down
And the winner is.........The FIC ATi Radeon 9800 Pro 128MB. We compared these cards in every category we could think of and in the end, we saw better performance overall from the ATI Radeon 9800 Pro. Did the FX 5900 fail to impress us? No, not at all. We believe both cards are worthy of any good system but we do have to tip our hats to the excellent performance that the Radeon 9800 Pro has shown us here today.
--------
Daniel
Re:Who Won (Score:2)
That's nice to know. I'll probably pick one up in a month or two when the price drops a little. I'd go for ATI anyway, even if NVidia was a little faster, just because ATI is better than NVidia about releasing tech specs.
But guess what I'll buy tomorrow, to drop into the AGP slot on my new Shuttle PC? A Matrox.
- Specs are totally public
- Runs cool
- Really cheap
- Superior rendering quality
Re:Who Won (Score:2)
I don't get this... I checked out the featured shots, and I honestly couldn't tell any difference. Is that really a "hands down" win?!?
(I'm buying a new pc a week before Half-Life 2 comes out, and whatever's the best equipment on the market at that point, that's what I'm getting. For my money, Nvidia has the edge because of their solid Linux support.)
Re:Who Won (Score:2)
Am I the only one.... (Score:5, Insightful)
Staring at graphs indicating a
Why, oh why, can't we get some interesting writing in the field of online hardware reviews?
Re:Am I the only one.... (Score:5, Interesting)
It's not the most interesting thing to read for pleasure, but I find it useful since I am currently looking for a new video card. I would like to decide for myself which one is better. It's nice to see tests done on several games, so you know it's not a single game that just happens to be optimized more for one card than the other. At least now they include things beyond frame rates, like image quality.
At least I now know (actually I knew before, since it is good to check several reviews) that I can get the ATI 9800 and know that the extra $100 for the 5900 would not have been worth it. I would still think this even if the 5900 was 1% faster on every test, which would likely cause the conclusion to be that the 5900 was better.
Besides, most reviews have a nice navigation thing at the bottom that lets you skip to the exact benchmark you want to see, or straight to the conclusion.
Re:Am I the only one.... (Score:5, Insightful)
1) If they just gave the conclusion, you'd be saying "But they just made that up!" All those pages of boring numbers are there to convince you they went through a fairly scientific process and when they say "It is 0.3% faster", they know what they're talking about. Compare to the RIAA's statistics about a 0.3% drop in piracy.
2) Some people buy these cards because their money is burning a hole in their pocket, but most people don't spend $500 on a gfx card for bragging rights; they do it because it will improve either their work or their gaming experience. These people want to know how much more time/better experience they'll get. They need to find the benchmark most relevant to them, rather than the 'overall' benchmark. For example, I have a program that runs faster on an 800MHz Duron than on a 2GHz Pentium 4. Why? Because it has lots of jumps. If I had just looked at the overall benchmark then I'd have 'upgraded' and I'd be feeling pretty stupid right now.
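A toy C sketch of the "lots of jumps" effect, for the curious: the two passes below do the same work, but in one of them the branch outcome is essentially random. On a long-pipeline chip like the Pentium 4, mispredicted branches are expensive. This is only a rough illustration (a compiler may turn the if into a conditional move and hide the effect, and the Duron-vs-P4 gap involves more than branch prediction):

    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    #define N 10000000

    /* sum every element >= 128; the if() is the "jump" in question */
    static long run(const unsigned char *data) {
        long sum = 0;
        for (int i = 0; i < N; i++)
            if (data[i] >= 128)
                sum += data[i];
        return sum;
    }

    int main(void) {
        static unsigned char random_data[N], predictable_data[N];

        for (int i = 0; i < N; i++)
            random_data[i] = (unsigned char)(rand() & 0xFF);   /* branch outcome ~random */
        for (int i = 0; i < N; i++)
            predictable_data[i] = (i < N / 2) ? 0 : 255;       /* branch outcome predictable */

        clock_t t0 = clock();
        long a = run(random_data);
        clock_t t1 = clock();
        long b = run(predictable_data);
        clock_t t2 = clock();

        printf("unpredictable branches: sum=%ld, %.2fs\n", a, (double)(t1 - t0) / CLOCKS_PER_SEC);
        printf("predictable branches:   sum=%ld, %.2fs\n", b, (double)(t2 - t1) / CLOCKS_PER_SEC);
        return 0;
    }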
Re:Am I the only one.... (Score:2)
Re:Am I the only one.... (Score:2)
Re:Am I the only one.... (Score:3, Insightful)
Re:Am I the only one.... (Score:2)
Re:Am I the only one.... (Score:2)
There isn't any reason why Doom 3 couldn't have a version that runs on anything and looks like junk... But who would want to pay to develop that?
Re:Am I the only one.... (Score:2)
Re:Am I the only one.... (Score:2)
John Carmack has never made a mistake about that in the past; why would you expect him to do so now?
Re:Am I the only one.... (Score:2)
What I look for as a consumer is this [tomshardware.com] - a head-to-head comparison of several generations of cards. That's where you can find the sweet spot.
Re:Am I the only one.... (Score:2)
Re:Am I the only one.... (Score:2)
I have to wonder what the point is of having a card that is any faster than the one the guys writing the games software use. Like what is the probability that someone is going to write a game that only works on a $400 card?
It was one thing when the issue was whether you could do 3D and run the monitor at 800x600 or 1280x1024, but I'm not exactly in a hurry to go beyond that...
Guess how much the card that was top of the line 2 years ago cost
Time to upgrade! (Score:5, Funny)
Re:Time to upgrade! (Score:2, Interesting)
Real Benchmarks? (Score:2)
From the article (Score:5, Interesting)
And the winner is.........The FIC ATi Radeon 9800 Pro 128MB. We compared these cards in every category we could think of and in the end, we saw better performance overall from the ATI Radeon 9800 Pro. Did the FX 5900 fail to impress us? No, not at all. We believe both cards are worthy of any good system but we do have to tip our hats to the excellent performance that the Radeon 9800 Pro has shown us here today.
But it looked pretty damn close in most of the benchmarks. Interesting that in 3DMARK, the FX 5900 ran away with it. Hmmmm.. Oh well, I doubt 5% of the people who post comments on this are going to buy one soon anyway. I know I'm not in the market.
Re:From the article (Score:2)
Pretty damn close doesn't seem to cut it if you're going to pay $100 more. Pretty damn close would be reasonable if the two cards were the same price, but the fact is that overall the 9800 outperformed the 5900 with half the memory and 80% of the cost. The way those benchmarks came out, I don't think I could understand anyone picking up the 5900.
Re:From the article (Score:3, Informative)
The FX 5900 ran away with nothing.
First, the Radeon won in 3DMark01 [ocaddiction.com].
Second, observe the origin as well as the scale of the 3DMark03 graphs: Graph 1 [ocaddiction.com], Graph 2 [ocaddiction.com]
The difference is grossly exaggerated by the graphs' peculiar origins (5700 and 3800 instead of 0) and stretched scale.
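To put a rough number on how much a truncated axis distorts things, here's a quick back-of-the-envelope sketch in C (the scores and the axis origin below are made up for illustration, not read off the review's actual graphs):

    #include <stdio.h>

    int main(void) {
        /* hypothetical scores and a hypothetical truncated-axis origin */
        double fx = 6100.0, radeon = 5900.0, origin = 5700.0;

        /* the real difference between the two scores */
        printf("true ratio:     %.2f\n", fx / radeon);                        /* ~1.03 */

        /* what the bars look like when they start at 5700 instead of 0 */
        printf("apparent ratio: %.2f\n", (fx - origin) / (radeon - origin));  /* 2.00 */
        return 0;
    }

A 3% lead can look like a 2x blowout once the bottom of the chart is lopped off.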
Third, 3DMark03 has been rendered a useless benchmark, since it is riddled with nVidia "optimizations" which Futuremark itself has deemed illegitimate. Even
Re:From the article (Score:2)
With all of the flap recently (referenced here [slashdot.org], here [slashdot.org], here [slashdot.org], here [slashdot.org], and here [slashdot.org]) regarding nVidia writing custom benchmark- and application-specific code into their drivers for the purpose of getting higher ratings, the value of benchmark scores for evaluating video card performance is diminishing, at least with benchmark software as it currently stands.
Perhaps what is needed is some kind of "drunkard's walk" scene traversal, where a scene is set up fo
I prefer the Radeon 9800 (Score:4, Insightful)
Re:I prefer the Radeon 9800 (Score:3, Funny)
> the FX 5900
Sorry, I can't hear you over all the fan noise from my Nvidia graphics card.
Thanks but... (Score:5, Insightful)
*pats his shiny new GF4 Ti 4200*
Sure, I have to upgrade more often, but it seems to be a lot less painful for me than for early adopters - and there are plenty of homes for older cards in my secondary and tertiary boxes, and then a final home put out to pasture in the render farm.
Re:Thanks but... (Score:2)
I'm not sure how it works out financially... buying the "best performance" every two years, as opposed to buying the "best value" every year. I suspect it's pretty close, and being on the bleeding edge for a little while makes some of the extra cost worth it.
Re:Thanks but... (Score:2)
It was starting to creak around the edges a bit by the end, but was playing UT2k3 perfectly well with some of the bells and whistles turned off.
But then I run Linux only, which seems to get some extra mileage out of video cards for the same games, and until recently there hasn't been a lot to make upgrading worthwhile. How thing
Re:Thanks but... (Score:3, Interesting)
My next card will probably be a full DX9 card, and I'll wait until it's about a hundred bucks. My DX8-capable card is probably enough until Longhorn comes
Who is better? NVIDIA of course (Linux)! (Score:2)
XFree ATI Radeon Support much better than Nvidia (Score:5, Informative)
This hasn't been true for quite some time.
I have owned numerous high end nvidia and radeon cards, and have never had anything resembling stability from the nvidia cards using the nvidia binary driver (and yes, I've tried all of the tweaks and suggestions Nvidia and others suggest vis-a-vis AGP settings, etc.). This has been true on numerous machines, both single and dual Intel P3 and Athlon XP/MP boxes, with a variety of motherboards, memory configurations, and Linux kernels.
ATI radeon cards on the other hand have been pretty solid, with excellent support via the xfree DRI drivers for most cards, and adequate, reasonably stable support from ATI via their firegl binary-only drivers for those not yet supported.
NVidia has not been king of the Linux hill for quite some time, and while I have had my gripes with ATI as well, the notorious instability of the Nvidia binary drivers and lackluster support via the xfree DRI drivers have placed me (and my employer) firmly in the ATI camp.
Re:XFree ATI Radeon Support much better than Nvidi (Score:2)
Re:XFree ATI Radeon Support much better than Nvidi (Score:2)
It probably depends on your distro, although in general I found them to be relatively easy to install. Whether using Red Hat's RPM, uncompressing and installing from a tarball, or using Gentoo's portage (the easiest approach I suspect, and the only one I've used personally: simply 'emerge ati-drivers'), once the software is installed configuration is easy. J
Re:XFree ATI Radeon Support much better than Nvidi (Score:2)
Maybe I will get an ATI card for my next card (GF4 Ti4200 is not that outdated yet).
Duke it out...? (Score:2, Funny)
Nvidia is dying... (Score:5, Funny)
The GeForce card has:
* Twice as much memory (256 MB vs. 128MB)
* More memory bandwidth (27 GB/s vs. 21 GB/s)
* Faster memory (3 ns vs. 3.8 ns chips)
And the GeForce still got its ass handed to it by the ATI Radeon 9800 Pro, which, by the way, doesn't even need a leaf-blower attachment just to keep it from overheating!
Is anyone still buying Nvidia cards these days (other than the blindly trusting fanboys, that is)?
Re:Nvidia is dying... (Score:5, Interesting)
Hi! Yes, we buy them at work all the time. We do a lot of 3D graphics work on Linux, and support for ATI cards under Linux was pretty pathetic until very recently. I'm told this has improved, but it's still not as easy as using the NVidia drivers, and we don't really trust ATI's software now. (Apparently the Radeon Mobility is not supported under Linux either - this has made my search for a new laptop very difficult.)
Re:Nvidia is dying... (Score:5, Informative)
This statement is false. The Mobility Radeon has been supported since XFree86 4.2.
I have been using this chipset with an IBM Thinkpad X22 for almost a year now, and that's with Debian GNU/Linux.
You want a great, cheap, superlight laptop with decent 3d support?
Please visit the IBM eBay Store [ebay.com]
Laptops are brand new in the box, with full warranty, at almost 50% of retail, and you are buying directly from Big Blue.
The catch? They're slightly behind the newest models, but hey, with Linux support, that's the best way to buy hardware.
Re:Nvidia is dying... (Score:2)
This is wrong. Support for all of ATI's mobility cards exists. Additionally, if you opt to use the open source driver, you may just get some nifty power management stuff from http://cpbotha.net/dri_resume.html .
Sunny Dubey
Re:Nvidia is dying... (Score:2)
ATI has poor Linux support (Score:3, Informative)
After buying a 7500 and tinkering with it for a few days, I decided that I didn't want to try anymore, and then traded it for a GeForce 4. It worked perfectly on the first try. I'm not a huge fan of either company, but yes, I still like to buy Nvidia cards.
Re:Nvidia is dying... (Score:2)
When ATI can start making drivers that don't lock up the machine, I'll consider buying their products. Of the last three ATI products I tried over the past two years, all three of them were unusable due to driver issues.
Re:Nvidia is dying... (Score:3, Informative)
Re:Nvidia is dying... (Score:2)
I am and I will for some time to come. I don't even CONSIDER ati cards as a possibility due to driver issues in the past. Have you ever worked with drivers for some obscure onboard ati? All the driver problems I've had with ati cards STILL affect me today. (more than 4 years after I last touched an ati card at home...) I voted with my wallet in favour of nvidia. *pats his purdy Asus GF4 Ti 44
Re:Nvidia is dying... (Score:3, Informative)
The ATI 'Pro' cards have DVI-D output, however it's incompatible with many monitors at 1600x1200 and higher. It's generally the monitor mfr's fault for not getting the standards quite right, but that's little consolation when you hook your $2000 Viewsonic VP201m or similar up to a Radeon and just get green snow. :-/
Drivers, drivers, drivers! (Score:2)
Re:Nvidia is dying... (Score:4, Informative)
There really is no clear winner between these two, and they cost the same. So why wouldn't people buy the FX? I prefer to support NVidia because they brought about all the recent great leaps in graphics technology (programmable vertex and pixel shaders, Cg, etc.) whereas ATI hasn't come up with anything particularly impressive.
NVidia is not 3dfx. Don't expect them to die anytime soon.
(I am a professional game programmer. Just thought I'd mention that.)
Re:NVIDIA is not going away anytime soon. (Score:2)
That's a splendid piece of FUD. The TNT2 has not been available in Dell machines, or those of any other major OEM, for quite a while now.
Why would it be, anyway, when current integrated graphics is way ahead of the TNT2 and also cheaper?
Who would buy this anyway (Score:5, Funny)
1/ Both cards can display current games at 2 quajillion fps, the winner beating the loser by 3fps
2/ The economy of, well, the world is in the dumps
3/ Quite a few cool and very demanding games (Doom3, Half-Life) will come out Soon(tm) but Definitely Not Yet(tm). (Personally I wouldn't be surprised if it would be @ christmas time
4/ At X-mas time (or whenever these demanding games start to come out) newer, faster cards will be out, and/or these cards will be cheaper.
5/ At X-mas time people will actually have some money set aside to buy rad new videocards for.. eh.. their girlfriends.
So who would buy this?
(No, I haven't actually -read- the article
Re:Who would buy this anyway (Score:2, Insightful)
30 something yr old clan gamers.. (Score:2)
Re:Who would buy this anyway (Score:2)
3dmark scores, GF FX IS SOO MUCH BETTER! (Score:5, Funny)
GF FX: 999999
Ati Radeon: 40394
Weird outcome! It was strange though, because during the gf fx test, it just flashed and gave me my score! Awesome speed!
Keep up the good work, NVIDIA!
9800 overclocks more (Score:3, Informative)
Copy & Paste (Score:3, Funny)
Right-oh.
Do I care? (Score:2)
Ok fine, but why is it stuff that matters? Tell me about recent advances in fabrication, or bandwidth to RAM or bus latching and I'll be thrilled. Show me someone's benchmark of the XYZ Foobar vs the ABC Barfoo one more time, and I'm going to start moderating up the goat-posts just to have something more informative to read!
Benchmarks are so .. blah! (Score:2, Insightful)
Anyways, I wouldn't buy an FX Ultra, because of the 2 slots you have to give it. Yeah, that's kinda BS and also a good sign of a design flaw. Aside from that minor detail, I would, like always, trust the products from Nvidia. I've never had
This weeks theme ingredient... (Score:5, Funny)
To quote the article:
For some reason I thought of "Iron Chef" when I read this.
ATI Good (Score:2)
Re:ATI Good (Score:3, Interesting)
On the other hand, ATI has really turned themselves around recently by all accounts, and started writing go
I call shenanigans on OCAddiction.com (Score:5, Interesting)
I guess I wouldn't be as pissed if it was a genuinely interesting article, rather than a collection of specs and benchmarks.
Article summary. (Score:5, Informative)
5900 has a higher fill rate, is slightly ahead at high resolutions.
Otherwise there are no real differences between the benchmarks and it all comes down to differences any layperson could understand:
The 5900 takes up 2 slots (WTF?) and the 9800 is $100 cheaper (although $399 for a graphics card is still nuts if you ask me).
BTW, the ATI 9800 won the "shootout".
More Benchmarks from both cards (Score:4, Interesting)
WBGG
Re:More Benchmarks from both cards (Score:2)
Personally I don't like either card (both too expensive, both more than what I'd need). The 9700 Pro is about as much vid c
ATI needs to look at Linux (Score:5, Insightful)
a better question (Score:2)
Re:a better question (Score:2)
OT: Mac Video Card Upgrade Advice? (Score:2)
Now I'm looking to upgrade, with UT2k3 in mind. Apple offers a Ti Geforce for $400 -- out of my budget right now. However, I can get an ATI Radeon 9000 for $169. Buying a PC version of a card and flashing it isn't really an option, since I need an ADC port.
Right now I'm thinking a Ti Geforce might be overkill, since my CPU is only
Re:OT: Mac Video Card Upgrade Advice? (Score:2)
Re:OT: Mac Video Card Upgrade Advice? (Score:2)
Re:OT: Mac Video Card Upgrade Advice? (Score:2)
the picture quality seals it (Score:2, Insightful)
Given that both these cards are going to be able to give a decent frame rate with whatever program is thrown at them, I would be looking at the picture quality, which after all is what we have to look at.
forget performance! (Score:4, Funny)
No Linux benchmarks in there? (Score:2)
About the only thing I can tell about this set of benchmarks is that OpenGL and Linux are ignored completely. At least
Home heating solution for the winter time (Score:3, Funny)
b) An nVidia FX 5900 gpu
c) 19 inch monitor
If you set it to turn on in the morning time, the FX 5900 also doubles as an alarm clock/wake-up service.
driver cracks and upgrades (Score:3, Informative)
Now, the real question is... (Score:4, Interesting)
"Brand loyalty" in video cards is a joke. It's like having brand loyalty on paper clips. This holy war between NVidia and ATI fans is retarded, it's like people are TRYING to find something to argue over. Neither company offers a product that really distinguishes itself from the other, so it's all a wash anyway. Can we please stop posting these "reviews," as they're all obviously biased in one way or another (based upon the "reviewer's" chosen side in the holy war.) It's just a goddamn video card, not the cure for cancer.
Tired of ATI and NVIDIA fanboys (Score:3, Insightful)
The "Compatibility Benchmark" (Score:3, Interesting)
A few nanoseconds in a game is all well and good, but if you plan on running two or more operating systems on a single machine, you might check into that aspect of your video card.
Just a thought.
Re:Seeing the difference between the 9 AA screensh (Score:3, Informative)