GF FX 5900 Ultra vs. ATi Radeon 9800 Pro

Mack writes "OCAddiction.com has their GF FX 5900 Ultra vs. ATi Radeon 9800 Pro article online detailing which card is more powerful. Running a plethora of benchmarks we were anxious to see which card outperformed the other. Quite simple really. We take nVidia's top offering and pair it up against the current top offering from ATi and let them duke it out till the bitter end. Who will come out on top? Let's take a look."
This discussion has been archived. No new comments can be posted.

Comments Filter:
  • by calebb ( 685461 ) * on Sunday June 29, 2003 @05:03PM (#6326842) Homepage Journal

    If you haven't heard about the controversy with MadOnion/Futuremark/3dmark2003, check out this article [hardocp.com]. Kyle @ HardOCP suggests that if you give Futuremark more $$$, they will 'optimize' their benchmark to help out your video card's score.

    Now, in this review, we see that the GeForceFX 5900 clearly dominates the hardware side of things: .13 vs .15 micron process, 450/850 vs. 380/340 (core/memory), 27.2 GB/sec vs. 21.8 GB/sec memory bandwidth, etc. Yet when we start looking at real-world scores, the 9800 keeps up pretty well & even beats the faster GeForceFX 5900 in most tests.
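
    Those bandwidth figures follow directly from memory clock and bus width. A quick back-of-the-envelope check, assuming both cards use a 256-bit memory bus, with 850 as the 5900's effective DDR rate and 340MHz as the 9800's physical clock (680MHz effective):

```python
# Back-of-the-envelope check of the quoted bandwidth figures, assuming a
# 256-bit memory bus on both cards, with 850 as the 5900's effective DDR
# rate and 340MHz as the 9800's physical clock (680MHz effective).

BUS_BYTES = 256 // 8  # a 256-bit bus moves 32 bytes per transfer

def bandwidth_gb_s(effective_mhz):
    """Peak memory bandwidth in GB/s for an effective clock in MHz."""
    return effective_mhz * 1e6 * BUS_BYTES / 1e9

print(bandwidth_gb_s(850))      # GeForceFX 5900: 27.2 GB/s
print(bandwidth_gb_s(340 * 2))  # Radeon 9800 Pro: ~21.8 GB/s
```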

    The big exception is the 3DMark2003 score: the GeForceFX 5900 wins, 3477 to 2837 (!).

    This can be attributed to one of three things:
    1. Speed isn't everything (e.g., AMD vs. Intel CPUs). But then the on-paper-slower Radeon 9800 *is* faster in all the real-world tests.
    2. The GeForceFX used WHQL drivers... but despite these 'superior' drivers, the Radeon 9800 still reigned in all the real-world tests!
    3. 3DMark2003 added unfair optimizations to their program to make the nVidia card seem better than ATi's.

    • by CausticWindow ( 632215 ) on Sunday June 29, 2003 @05:10PM (#6326888)

      Yes, it's an evil satanic conspiracy.

      Did you know that "Mature Furk" is an anagram for "Futuremark"? Google for it, and be enlightened.

    • by User 956 ( 568564 ) on Sunday June 29, 2003 @05:37PM (#6327026) Homepage
      HardOCP's coverage of all this is disgraceful. When Extremetech originally broke the story [extremetech.com], HardOCP practically accused them of making it up, and said they had "motives of their own" for writing the article outlining the problem. Instead of investigating on their own, apparently the procedure at HardOCP is to question the findings of the other, more competent, tech sites.

      Then, when the fix is posted [futuremark.com], they write "This is in response to the news item we posted last week."

      ... As if _they_ broke the story. As if _they_ are responsible for causing a patch to be posted. No apology to Extremetech, either (in fact, no mention of them at all).

      And now, they're making unfounded accusations that 3DMark is taking bribes to skew the benchmark results? WTF? Why doesn't HardOCP just hire Jayson Blair to write their "articles"? At least then, they'd have fewer spelling errors.
    • by be-fan ( 61476 ) on Sunday June 29, 2003 @06:22PM (#6327234)
      Am I the only one who automatically ignores any benchmark whose result isn't in FPS? I learned long ago, from PC Mag's 3D benchmarks, that synthetic benchmarks are absolutely useless!
      • by Qzukk ( 229616 ) on Sunday June 29, 2003 @07:27PM (#6327549) Journal
        They may be useless to you, since you're a gamer and you just want to know how fast your games will run, but when I need a card to run 3dmark as fast as possible, I know which test I'm looking for.
      • by Distinguished Hero ( 618385 ) on Sunday June 29, 2003 @08:12PM (#6327741) Homepage
        I learned long ago, from PC Mag's 3D benchmarks, that synthetic benchmarks are absolutely useless!

        And what exactly differentiates a real benchmark from a synthetic benchmark? While Futuremark does report the fill rate (both single-texturing and multi-texturing), it is simply extraneous information, which is in no way used to determine the resulting 3DMark score; the score is determined by running four game demos, which use engines akin to those used in "real games." The individual game results are reported by 3DMark, multiplied by certain coefficients, and then added together, yielding the final result (3DMarks).
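
        In other words, the composite score is a weighted sum of the per-game results. A minimal sketch of that scheme; the FPS numbers and weights are illustrative placeholders, not Futuremark's actual values:

```python
# 3DMark-style composite scoring: each game test's average FPS is
# multiplied by a fixed coefficient and the products are summed.
# FPS values and weights below are illustrative, not Futuremark's.

game_fps = {"GT1": 150.0, "GT2": 30.0, "GT3": 28.0, "GT4": 25.0}
weights  = {"GT1": 7.3,   "GT2": 37.0, "GT3": 47.1, "GT4": 38.7}

score = sum(game_fps[t] * weights[t] for t in game_fps)
print(f"{score:.0f} 3DMarks")
```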

        The reason 3DMark03 is invalid is not because it is a "synthetic" benchmark, but because nVidia mucked it up with their shenanigans. The frightful truth of the matter is, however, that the same illegitimate "optimizations" (i.e., static clip planes) that nVidia used in 3DMark can just as easily be used in any and all timedemos. Hence, your precious "real" benchmarks are just as susceptible, and may be just as compromised and invalid as 3DMark03. To make matters worse, unlike 3DMark03, which offers advanced diagnostic tools that allowed nVidia's dubious actions to be exposed, "real" benchmarks have no such tools. Therefore, exposing cheating in "real" benchmarks is much more difficult; however, just because something cannot be proven does not make it false.
      • I dunno: FPS benchmarks aren't all that helpful either, because they are inevitably averages of demo performance. What I want to know is the lowest FPS score: how bad it gets during the most intense action in a game. It's not the constant framerate throughout the game that I worry about, since I know pretty well that a given card can manage a given game at a certain level. It's the "hitches" that I worry about, and want to know if they are eliminated by the card.
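
        A small sketch of that point: the average barely moves when a few frames hitch, but the worst frame is what you actually feel (the frame-time data here is hypothetical):

```python
# Average FPS hides "hitches": a couple of slow frames barely move the
# mean, but they are exactly what the player feels. Hypothetical data.

frame_times_ms = [16, 17, 16, 18, 90, 16, 17, 120, 16, 17]

avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)  # ~29 fps
min_fps = 1000.0 / max(frame_times_ms)                          # ~8 fps

print(f"average: {avg_fps:.1f} fps, worst hitch: {min_fps:.1f} fps")
```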
    • Who cares about synthetic benchmarks? If it's like comparing penis sizes at LAN parties, great. But when you're smoking people because you have a better real-world graphics card, who will care if you have a little dick?
    • by ameoba ( 173803 ) on Sunday June 29, 2003 @07:04PM (#6327451)
      The GeForceFX used WHQL drivers... But despite these 'superior' drivers, the Radeon 9800 still reigned in all the real world tests


      WHQL doesn't mean they're better drivers, it just means that they passed some MSFT testing bits. If anything, non-WHQL drivers have the potential for higher performance (think of a car engine that doesn't have to worry about passing emissions), since they don't have to worry so much about playing nice with -all- available hardware.
    • 3DMark2003 added unfair optimizations to their program to make the nvidia card seem better than ATi's

      3DMark2003 (or rather Futuremark, since I doubt the program is advanced enough to program itself) did no such thing. nVidia did it all by themselves [extremetech.com], and Futuremark conducted an investigation confirming the "optimizations" (cheats, i.e. static clip planes, inserted by nVidia) and denouncing them. Google for more.
    • Boo, hiss (Score:3, Insightful)

      Kyle @ HardOCP suggests that if you give Futuremark more $$$, they will 'optimize' their benchmark to help out your video card's score.

      Great theory, except for the fact that nVidia dropped out of 3DMark's developer program last fall. I doubt they're ponying up anything.

      I think it's also been firmly established that nVidia BS'd its way through build 330 by way of straight-up cheating, not by paying anyone off.

      And your numbers are generally irrelevant. Smaller core means cheaper, means lower t

  • Benchmarks... (Score:5, Interesting)

    by mgcsinc ( 681597 ) on Sunday June 29, 2003 @05:03PM (#6326846)
    With the benchmark-favoring drivers fiasco, just how much can we be expected to trust a review which relies so heavily on this testing method?
    • This review? You must have read it wrong.

      It doesn't rely heavily on synthetic benchmarks, it just "throws them in for whatever they're worth" (paraphrased) and specifically makes a point that the performance of the 5900 in 3DMark03 doesn't line up quite the way you'd expect with the real-world performance scores. That is, the 5900 spanks the 9800 in 3DMark03 even though the real-world tests (taken together) slightly favor the 9800 and the 5900 doesn't really just all-out clobber the 9800 in any one bench
  • Who Won (Score:5, Funny)

    by bsharitt ( 580506 ) <bridget AT sharitt DOT com> on Sunday June 29, 2003 @05:03PM (#6326847) Journal
    Why didn't the poster tell who won? Now I have to actually read the article.

    • Re:Who Won (Score:4, Insightful)

      by AceJohnny ( 253840 ) <jlargentaye@NoSpam.gmail.com> on Sunday June 29, 2003 @05:06PM (#6326864) Journal
      That way, we are forced to have a look at the article, thus preventing uninformed rants. Yes, it requires a tad more effort, but I think Slashdotters need that =)
    • Re:Who Won (Score:5, Insightful)

      by anotherone ( 132088 ) on Sunday June 29, 2003 @05:11PM (#6326896)
      Because the guy who submitted the article is from the website that wrote the article... He obviously wants a billion slashbots to raise their ad revenue. If he gave away the ending, fewer people would read his article.
      • Re:Who Won (Score:5, Informative)

        by KDan ( 90353 ) on Sunday June 29, 2003 @05:33PM (#6327014) Homepage
        Well, the site is crawling by now (had several timeouts already), but I managed to get to the conclusions (wasn't really worth the effort tbh):

        Conclusion

        Let's break down performance of both cards and see which one comes out on top.

        UT2k3 - FX 5900 Ultra - While both cards perform well, the FX 5900 comes out on top

        AquaMark - R9800 Pro - The R9800 takes home the gold in this real-world benchmark

        Comanche 4 - R9800 Pro - The R9800 also wins out by an edge for this nearly obsolete benchmark

        Specviewperf 7.0 - R9800 Pro - This one is really close but the #'s lean to the R9800

        Code Creatures - FX 5900 Ultra - The 5900 beats up the R9800 pretty good in this intensive benchmark

        Splinter Cell - R9800 Pro - Hands down, the R9800 takes it in this awesome game from UBISoft

        ShaderMark - R9800 Pro - While the FX 5900 Ultra makes a good showing, the R9800 wins this one

        3DMark 01 SE Build 330 - R9800 Pro - The R9800 takes top honors with this tried and true synthetic benchmark

        3DMark 03 Build 320 - FX 5900 Ultra - Should we include this? Possibly not, however the FX 5900 wins with WHQL Det Drivers

        3D Visual Quality - R9800 Pro hands down

        And the winner is.........The FIC ATi Radeon 9800 Pro 128MB. We compared these cards in every category we could think of and in the end, we saw better performance overall from the ATI Radeon 9800 Pro. Did the FX 5900 fail to impress us? No, not at all. We believe both cards are worthy of any good system but we do have to tip our hats to the excellent performance that the Radeon 9800 Pro has shown us here today.

        --------

        Daniel
        • And the winner is.........The FIC ATi Radeon 9800 Pro 128MB.

          That's nice to know. I'll probably pick one up in a month or two when the price drops a little. I'd go for ATI anyway, even if NVidia were a little faster, just because ATI is better than NVidia about releasing tech specs.

          But guess what I'll buy tomorrow, to drop into the AGP slot on my new Shuttle PC? A Matrox.

          - Specs are totally public
          - Runs cool
          - Really cheap
          - Superior rendering quality
        • 3D Visual Quality - R9800 Pro hands down

          I don't get this... I checked out the featured shots, and I honestly couldn't tell any difference. Is that really a "hands down" win?!?

          (I'm buying a new pc a week before Half-Life 2 comes out, and whatever's the best equipment on the market at that point, that's what I'm getting. For my money, Nvidia has the edge because of their solid Linux support.)
          • I think the "hands down" part is in reference to the fact that on the UT2003 testing the 9800 can go to 6xAA and 16xAF and still beat the Nvidia in frame rates while the 5900 is at 8x8x and 4xS/8x (I don't know what the "S" refers to - some Nvidia-specific enhancement I assume).
  • by mrpuffypants ( 444598 ) * <mrpuffypants@@@gmail...com> on Sunday June 29, 2003 @05:04PM (#6326849)
    who finds these types of articles really, really, really boring?

    Staring at graphs indicating a .03% increase in one card over the other is just tearingly boring to me. I often find myself skipping right through to the end just to see the final "verdict"

    Why, oh why, can't we get some interesting writing in the field of online hardware reviews?
    • by atomicdragon ( 619181 ) on Sunday June 29, 2003 @05:29PM (#6327004)

      It's not the most interesting thing to read for pleasure, but I find it useful since I am currently looking for a new video card. I would like to decide for myself which one is better. It's nice to see tests done on several games, so you know it's not a single game that just happens to be optimized more for one card than the other. At least now they include things beyond frame rates, like image quality.

      At least I now know (actually I knew before, since it is good to check several reviews) that I can get the ATI 9800 and know that the extra $100 for the 5900 would not have been worth it. I would still think this even if the 5900 were 1% faster on every test, which would likely have led to the conclusion that the 5900 was better.

      Besides, most reviews have a nice navigation thing at the bottom that lets you skip to the exact benchmark you want to see, or straight to the conclusion.

    • by lakeland ( 218447 ) <lakeland@acm.org> on Sunday June 29, 2003 @05:37PM (#6327028) Homepage
      You're right that the sub-results are largely irrelevant, except for a couple points.

      1) If they just gave the conclusion, you'd be saying "But they just made that up!" All those pages of boring numbers are there to convince you they went through a fairly scientific process and when they say "It is 0.3% faster", they know what they're talking about. Compare to the RIAA's statistics about a 0.3% drop in piracy.

      2) Some people buy these cards because their money is burning a hole in their pocket, but most people don't spend $500 on a gfx card for bragging rights; they do it because it will improve either their work or their gaming experience. These people want to know how much more time/better experience they'll get. Those people need to find the benchmark most relevant to them, rather than the 'overall' benchmark. For example, I have a program that runs faster on an 800MHz Duron than on a 2GHz Pentium 4. Why? Because it has lots of jumps. If I had just looked at the overall benchmark then I'd have 'upgraded' and I'd be feeling pretty stupid right now.
    • Not to mention that they completely overlook the fact that ATI's Linux drivers provide only a fraction of the performance that the Windows ones do, while the nVidia drivers provide almost the exact same level of performance across the different platforms.
    • We received several of these 5900's in the office recently and are running some of our builds through it for compatibility testing. The feeling of everyone is that it runs pretty darn well even with all of the tricks turned up, but isn't worth $500 to anyone, including the programmer with a dual-Xeon box at home. It's just not that much better than the $300 and $400 cards available on the market to justify such a high price. The framerate on the previous 4800 is about the same if you drop two resolution
      • It still puzzles me that at QuakeCon last year the id guys said that D3 was going to be targeted at GeForce 3-level cards. However, after seeing everything that D3 was doing, I think that's crazy: on that hardware you're either playing at 640x480 or watching a slide show with scary monsters
        • Put in low-enough LOD models and a prioritized effects system and it could be done on a VooDoo 2. The difficulty is that nobody wants to pay artists to do 8-step LOD models, nor do the programmers want their beautiful particle systems scaled back to one particle per second.

          There isn't any reason why Doom 3 couldn't have a version that runs on anything and looks like junk... But who would want to pay to develop that?
        • It still puzzles me that at QuakeCon last year the id guys said that D3 was going to be targeted at GeForce 3-level cards. However, after seeing everything that D3 was doing, I think that's crazy: on that hardware you're either playing at 640x480 or watching a slide show with scary monsters

          John Carmack has never made a mistake about that in the past, why would you expect him to do so now?
    • I don't like the head-to-head tests either, particularly since I'm not in the high-end market.

      What I look for as a consumer is this [tomshardware.com] - a head-to-head comparison of several generations of cards. That's where you can find the sweet spot.

    • A lot of the verbosity has to do with the fact that cards are faster at different things. PC Magazine used to (still does?) review a graphics card by running one stupid synthetic benchmark and using it as the number. Running a whole suite of tests gives a prospective buyer a much better idea of which card will be faster for the games he plays.
    • who finds these types of articles really, really, really boring

      I have to wonder what the point is of having a card that is any faster than the one the guys writing the games software use. Like what is the probability that someone is going to write a game that only works on a $400 card?

      It was one thing when the issue was whether you could do 3D and run the monitor at 800x600 or 1280x1024, but I'm not exactly in a hurry to go beyond that...

      Guess how much the card that was top of the line 2 years ago cost

  • by Xeth ( 614132 ) on Sunday June 29, 2003 @05:05PM (#6326856) Journal
    I mean, damn! Four more FPS! For only $499 (plus tax and S&H)! Where's my credit card...
    • Re:Time to upgrade! (Score:2, Interesting)

      by Slack3r78 ( 596506 )
      I agree with you, hardware sites in general tend to make way too big a deal out of minimal increases in performance. That said, I'll probably end up buying an FX series card eventually for one reason - DX9 support. (Almost all the cool new features are also supported as OGL extensions. See also: Doom 3). It's gonna suck for my wallet, but when you're attempting to get into graphics development, hey, it happens. :)
  • Is this a matter of faster cards, or a matter of best-optimized software? [slashdot.org]

  • From the article (Score:5, Interesting)

    by gerf ( 532474 ) on Sunday June 29, 2003 @05:07PM (#6326870) Journal

    And the winner is.........The FIC ATi Radeon 9800 Pro 128MB. We compared these cards in every category we could think of and in the end, we saw better performance overall from the ATI Radeon 9800 Pro. Did the FX 5900 fail to impress us? No, not at all. We believe both cards are worthy of any good system but we do have to tip our hats to the excellent performance that the Radeon 9800 Pro has shown us here today.

    But it looked pretty damn close in most of the benchmarks. Interesting that in 3DMARK, the FX 5900 ran away with it. Hmmmm... Oh well, I doubt 5% of the people who post comments on this are going to buy one soon anyway. I know I'm not in the market.

    • But it looked pretty damn close in most of the benchmarks.

      Pretty damn close doesn't seem to cut it if you're going to pay $100 more. Pretty damn close would be reasonable if the two cards were the same price, but the fact is that overall the 9800 outperformed the 5900 with half the memory and 80% of the cost. The way those benchmarks came out, I don't think I could understand anyone picking up the 5900.

    • Interesting that in 3DMARK, the FX 5900 ran away with it.

      The FX 5900 ran away with nothing.

      First, the Radeon won in 3DMark01 [ocaddiction.com].

      Second, observe the origin as well as the scale of the 3DMark03 graphs: Graph 1 [ocaddiction.com], Graph 2 [ocaddiction.com]
      The difference is grossly exaggerated by the graph's peculiar origin (5700 and 3800 instead of 0) and large scale; the quick arithmetic sketch below shows how much a non-zero origin can inflate an apparent gap.

      Third, 3DMark03 has been rendered a useless benchmark since it is riddled with nVidia "optimizations," which Futuremark itself has deemed illegitimate. Even
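
      On the second point: a bar's length is proportional to (score - origin), not to the score itself, so a non-zero origin can make a modest lead look enormous. A quick sketch using the 3DMark03 scores quoted upthread and a hypothetical origin:

```python
# A non-zero graph origin inflates an apparent lead: bar length is
# proportional to (score - origin), not score. Scores are the 3DMark03
# results quoted upthread; the 2800 origin is hypothetical.

a, b, origin = 3477, 2837, 2800

actual = a / b                           # ~1.23x, i.e. about 23% faster
on_graph = (a - origin) / (b - origin)   # ~18x longer bar

print(f"actual lead: {actual:.2f}x, apparent lead on the graph: {on_graph:.1f}x")
```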
    • Interesting that in 3DMARK, the FX 5900 ran away with it. Hmmmm..

      With all of the flap recently (referenced here [slashdot.org], here [slashdot.org], here [slashdot.org], here [slashdot.org], and here [slashdot.org]) regarding nVidia writing custom benchmark- and application-specific code into their drivers for the purpose of getting higher ratings, benchmark ratings are becoming less and less useful for evaluating video card performance, at least as benchmark software currently stands.

      Perhaps what is needed is some kind of "drunkard's walk" scene traversal, where a scene is set up fo
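
      A minimal sketch of where that idea seems headed: derive the camera path from a seed chosen at run time, so a driver can't ship optimizations (such as static clip planes) tuned to a fixed, known path. Everything below is a hypothetical illustration, not a real benchmark API:

```python
# "Drunkard's walk" benchmarking: the camera path is generated from a
# fresh random seed each run, so per-path driver cheats can't be
# pre-baked. Hypothetical illustration only.

import random

def random_walk_path(seed, steps=5, step_size=1.0):
    """Yield (x, y, z) camera positions along a seeded random walk."""
    rng = random.Random(seed)
    x = y = z = 0.0
    for _ in range(steps):
        x += rng.uniform(-step_size, step_size)
        y += rng.uniform(-step_size, step_size)
        z += rng.uniform(-step_size, step_size)
        yield (x, y, z)

seed = random.randrange(2**32)  # fresh every run; publish it with the score
for pos in random_walk_path(seed):
    print(f"render frame from camera at {pos}")  # stand-in for the renderer
```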

  • by Travoltus ( 110240 ) on Sunday June 29, 2003 @05:08PM (#6326875) Journal
    It's cheaper, it apparently runs faster, and I also hear that it doesn't need TWO SLOTS like the FX 5900.
    • by Anonymous Coward
      > I also hear that it doesn't need TWO SLOTS like
      > the FX 5900

      Sorry, I can't hear you over all the fan noise from my Nvidia graphics card.
  • Thanks but... (Score:5, Insightful)

    by Realistic_Dragon ( 655151 ) on Sunday June 29, 2003 @05:11PM (#6326892) Homepage
    I'll do what I always do. Wait for my current card not to be able to keep up at the optimal resolution for my screens with the games I like, then pick a £100 card that does.

    *pats his shiny new GF4 Ti 4200*

    Sure, I have to upgrade more often, but it seems to be a lot less painful for me than for early adopters - and there are plenty of homes for older cards in my secondary and tertiary boxes, and then a final home, put out to pasture, in the render farm.
    • I go the opposite route. When my old card can't keep up, I spring the big bucks for the shiniest/newest/fastest video out there. It costs a little more up front, but it usually means I'm good for a couple of years before it's upgrade time again.

      I'm not sure how it works out financially... buying the "best performance" every two years, as opposed to buying the "best value" every year. I suspect it's pretty close, and being on the bleeding edge for a little while makes some of the extra cost worth it.
      • Well, the last card I bought was a GF2 MX 400 2 years ago at around £100 (IIRC), so £200 every 4 years for me compared to £300-£400 for a top end card.

        It was starting to creak around the edges a bit by the end, but was playing UT2k3 perfectly well with some of the bells and whistles turned off.

        But then I run Linux only, which seems to get some extra mileage out of video cards for the same games, and until recently there hasn't been a lot to make it worthwhile upgrading. How thing
    • Re:Thanks but... (Score:3, Interesting)

      by drinkypoo ( 153816 )
      Amen to that, brother. I have the same card. In a case with adequate cooling (I have an aluminum rackmount case with a bunch of fans) you can overclock it to the same rates as the Ti4400, further saving you ten bucks or so. I paid US$129 for mine a while back. My card before that was a GF3Ti200, at about the same price point, preceded by a GF2MX, US$99.

      My next card will probably be a full DX9 card, and I'll wait until it's about a hundred bucks. My DX8-capable card is probably enough until Longhorn comes

  • NVIDIA cards because ATI's Linux drivers are not very good compared to NVIDIA's. I won't be buying an ATI card until ATI supports Linux fully like NVIDIA. I do play Linux native-port games in Linux.
    • by FreeUser ( 11483 ) on Sunday June 29, 2003 @05:57PM (#6327117)
      NVIDIA cards because ATI's Linux drivers are not very good compared to NVIDIA's. I won't be buying an ATI card until ATI supports Linux fully like NVIDIA. I do play Linux native-port games in Linux.

      This hasn't been true for quite some time.

      I have owned numerous high end nvidia and radeon cards, and have never had anything resembling stability from the nvidia cards using the nvidia binary driver (and yes, I've tried all of the tweaks and suggestions Nvidia and others suggest vis-a-vis AGP settings, etc.). This has been true on numerous machines, both single and dual Intel P3 and Athlon XP/MP boxes, with a variety of motherboards, memory configurations, and Linux kernels.

      ATI radeon cards on the other hand have been pretty solid, with excellent support via the xfree DRI drivers for most cards, and adequate, reasonably stable support from ATI via their firegl binary-only drivers for those not yet supported.

      NVidia has not been king of the Linux hill for quite some time, and while I have had my gripes with ATI as well, the notorious instability of the Nvidia binary drivers and lackluster support via the xfree DRI drivers has placed me (and my employer) firmly in the ATI camp.
        • Hmm! It has been improved lately. I might be wanting an ATI card then. After reading http://slashdot.org/comments.pl?sid=69322&threshold=0&commentsort=3&tid=137&tid=152&tid=185&mode=thread&cid=6326912 ... it said that the drivers are hard to install. Is this true?
        • Hmm! It has been improved lately. I might be wanting an ATI card then. After reading ... it said that the drivers are hard to install. Is this true?

          It probably depends on your distro, although in general I found them to be relatively easy to install. Whether using Red Hat's RPM, uncompressing and installing from a tarball, or using Gentoo's portage (the easiest approach I suspect, and the only one I've used personally: simply 'emerge ati-drivers'), once the software is installed, configuration is easy. J
  • by Anonymous Coward
    Duke Nukem Forever was released?? Woo-hoo!!
  • by illumin8 ( 148082 ) on Sunday June 29, 2003 @05:13PM (#6326912) Journal
    Let's see here, they compare two cards that shouldn't compare in real life.

    The GeForce card has:

    * Twice as much memory (256 MB vs. 128MB)
    * More memory bandwidth (27 GB/s vs. 21 GB/s)
    * Faster memory (3 ns vs. 3.8 ns chips)

    And the GeForce still got its ass handed to it by the ATI Radeon 9800 Pro, which, by the way, doesn't even need a leaf-blower attachment just to keep it from overheating!

    Is anyone still buying Nvidia cards any more these days (other than the blindly trusting fanboys, that is)?
    • by the gnat ( 153162 ) on Sunday June 29, 2003 @05:23PM (#6326975)
      Is anyone still buying Nvidia cards any more these days

      Hi! Yes, we buy them at work all the time. We do a lot of 3D graphics work on Linux, and support for ATI cards under Linux was pretty pathetic until very recently. I'm told this has improved, but it's still not as easy as using the NVidia drivers, and we don't really trust ATI's software now. (Apparently the Radeon Mobility is not supported under Linux either - this has made my search for a new laptop very difficult.)
      • by niko9 ( 315647 ) on Sunday June 29, 2003 @06:17PM (#6327198)
        Apparently the Radeon Mobility is not supported under Linux either - this has made my search for a new laptop very difficult.

        This statement is false. The Mobility Radeon has been supported since XFree86 4.2.

        I have been using this chipset with an IBM ThinkPad X22 for almost a year now, and that's with Debian GNU/Linux. ;) People using more cutting-edge distros have been using it longer.

        You want a great, cheap, superlight laptop with decent 3D support?

        Please visit the IBM eBay Store [ebay.com]

        Laptops are brand new in the box, with full warranty, at almost 50% off retail, and you are buying directly from Big Blue.

        The catch? They're slightly behind the newest models, but hey, with Linux support, that's the best way to buy hardware.
      • Apparently the Radeon Mobility is not supported under Linux either - this has made my search for a new laptop very difficult.

        This is wrong. Support for all of ATI's mobility cards exists. Additionally, if you opt to use the open source driver you may just get some nifty power management stuff from http://cpbotha.net/dri_resume.html .

        Sunny Dubey

      • I don't have it installed on this box anymore, but Mandrake 9.0 ran beautifully on my ATI Radeon 8500 64MB card. Of course that's just my experience.
    • On the lower end side (GeForce 3/4 and Radeon 7500), I really still do prefer Nvidia, because ATI still doesn't provide commercial Linux drivers, and DRI just refuses to work, no matter what I try.

      After buying a 7500 and tinkering with it for a few days, I decided that I didn't want to try anymore, and then traded it for a GeForce 4. It worked perfectly on the first try. I'm not a huge fan of either company, but yes, I still like to buy Nvidia cards.

    • Is anyone still buying Nvidia cards any more these days

      When ATI can start making drivers that don't lock up the machine, I'll consider buying their products. Of the last three ATI products I tried over the past two years, all three of them were unusable due to driver issues.
    • Is anyone still buying Nvidia cards any more these days (other than the blindly trusting fanboys, that is)?

      I am and I will for some time to come. I don't even CONSIDER ati cards as a possibility due to driver issues in the past. Have you ever worked with drivers for some obscure onboard ati? All the driver problems I've had with ati cards STILL reflect on me today. (more than 4 years after I last touched an ati card at home...) I voted with my wallet in favour of nvidia. *pats his purdy Asus GF4 Ti 44

    • The GeForce cards still have one thing going for them: DVI-D at high resolutions.

      The ATI 'Pro' cards have DVI-D output, however it's incompatible with many monitors at 1600x1200 and higher. It's generally the monitor mfr's fault for not getting the standards quite right, but that's little consolation when you hook your $2000 Viewsonic VP201m or similar up to a Radeon and just get green snow. :-/
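
      The arithmetic behind the problem: single-link DVI TMDS is specified up to a 165MHz pixel clock, and 1600x1200@60Hz with standard VESA blanking needs nearly all of it, leaving marginal transmitters, cables, and receivers no headroom. A quick check (treating the 2160x1250 total timings of the standard VESA mode as an assumption):

```python
# Why 1600x1200 is the danger zone for single-link DVI: the required
# pixel clock nearly saturates the 165MHz TMDS budget. Total timings
# (active + blanking) taken as 2160 x 1250, the standard VESA mode.

total_h, total_v, refresh_hz = 2160, 1250, 60
pixel_clock_mhz = total_h * total_v * refresh_hz / 1e6

print(f"needed: {pixel_clock_mhz:.0f}MHz of a 165MHz single-link budget")
```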

    • Let's see some decent Linux drivers from ATi and a benchmark showdown at Linuxhardware.org. Till then, it might be wise not to make such sweeping remarks to the Slashdot crowd.
    • by Temporal ( 96070 ) on Monday June 30, 2003 @02:36AM (#6329044) Journal
      1. The cards are nearly identical in speed. In fact, benchmarks done by others (like Tom's Hardware and Anandtech) seem to show the FX5900 edging the Radeon in most tests. (You may be thinking of the 5800, which was, indeed, slower than the Radeon.)
      2. The FX5900s that you have seen benchmarked are all running at 450/850. The eVGA version of the FX5900 is clocked at 500/900, which is possible because they put 2ns VRAM on their card. Naturally, this means a 5%-10% performance boost, allowing it to edge out the Radeon in more tests.
      3. The best GeForce FX and the best Radeon cost the same at $500 (last I checked, about two weeks ago).
      4. The FX 5900 allows far more complex vertex and pixel shaders. Pixel shaders can be 1024 instructions long and may include branches. I think the Radeon's limit is, like, 16 or 32 instructions, with no branches, but don't quote me on that.
      5. The FX 5900 runs Doom 3 much faster [anandtech.com]. I know this isn't relevant now but it's an interesting point. Current games are going to run unbelievably fast on either card, but future games will run faster on the FX 5900.
      6. The GeForce FX 5900 fan is not loud. The infamous "dust buster" fan was on the 5800. The 5900 uses a more traditional fan. The only time you can even hear it is when you open a game, and it's really not loud at all. I don't even notice it unless I'm listening for it (yes, I own one).

      There really is no clear winner between these two, and they cost the same. So why wouldn't people buy the FX? I prefer to support NVidia because they brought about all the recent great leaps in graphics technology (programmable vertex and pixel shaders, Cg, etc.) whereas ATI hasn't come up with anything particularly impressive.

      NVidia is not 3dfx. Don't expect them to die anytime soon.

      (I am a professional game programmer. Just thought I'd mention that.)

  • by puntloos ( 673234 ) on Sunday June 29, 2003 @05:18PM (#6326947) Journal
    Let's see now.

    1/ Both cards can display current games at 2 quajillion fps, the winner beating the loser by 3fps
    2/ The economy of well, the world, is in the dumps
    3/ Quite a few cool and very demanding games (Doom3, Halflife) will come out Soon(tm) but Definitely Not Yet(tm). (Personally I wouldn't be surprised if it were around Christmas time.)
    4/ At X-mas time (or whenever these demanding games start to come out) newer, faster cards will be out, and/or these cards will be cheaper.
    5/ At X-mas time people will actually have some money set aside to buy rad new videocards for.. eh.. their girlfriends.

    So who would buy this?

    (No, I haven't actually -read- the article :)
    • I agree. Though the serious gamer with an Uber system would go on and on about which card is best for which game and exactly why, using a lot of terminology like FPS and vertex-shader, none of it would really apply to a casual gamer like me - nor would always being on the bleeding edge of graphics card technology. Especially with the prices. Honestly, both cards would blow away my crappy Matrox something-or-other from about 3 years ago and impress the socks off me, but realistically how many folks could afford
      • Seriously. And those with excess cash. I was involved in a Tribes 2 clan for quite a while, and some of the systems people put together for the competitive edge would really impress me. Players that don't lag frag. Just thought I'd throw that one out there too.
    • If you read the benchmarks, you'd see that these cards are just good enough to run current games (like UT2003) at 70fps at their highest detail levels. So among gamers with disposable income, a lot of people might buy these cards. If not these specific flagship (think advertising) models, then certainly the more affordable versions in the same product lines.
  • by Eric(b0mb)Dennis ( 629047 ) * on Sunday June 29, 2003 @05:20PM (#6326956)
    3dmark2003

    GF FX: 999999
    ATi Radeon: 40394

    Weird outcome! It was strange though, because during the GF FX test, it just flashed and gave me my score! Awesome speed!

    Keep up the good work, NVIDIA!

  • 9800 overclocks more (Score:3, Informative)

    by PhotoBoy ( 684898 ) on Sunday June 29, 2003 @05:22PM (#6326973)
    The 9800 is still the better purchase; the 5900 has little to no overclocking room and needs a massive heatsink to remain "cool". The 9800's manufacturing process, on the other hand, is more mature, and it usually clocks about 60MHz beyond stock for the GPU and about 20MHz for the RAM, giving 440/370, which makes it comfortably faster than the 5900.
  • by LadyLucky ( 546115 ) on Sunday June 29, 2003 @05:25PM (#6326982) Homepage
    OCAddiction.com has their GF FX 5900 Ultra vs. ATi Radeon 9800 Pro article online detailing which card is more powerful. Running a plethora of benchmarks we were anxious to see which card outperformed the other. Quite simple really. We take nVidia's top offering and pair it up against the current top offering from ATi and let them duke it out till the bitter end

    Right-oh.

  • Ok, so what we have here is a branding battle between two companies that want to make sure that when you think "I need a fast video card" you think of one of their products.

    Ok fine, but why is it stuff that matters? Tell me about recent advances in fabrication, or bandwidth to RAM or bus latching and I'll be thrilled. Show me someone's benchmark of the XYZ Foobar vs the ABC Barfoo one more time, and I'm going to start moderating up the goat-posts just to have something more informative to read!
  • Call it a troll if you want, I just trust benchmarks as much as I trust political surveys. IOW, both of those are only tools for the people who publish them, not for the people who are actually reading them. (Sadly, there was a time when that wasn't so true... not anymore.)

    Anyways, I wouldn't buy an FX Ultra, because of the 2 slots you have to give it. Yeah, that's kinda BS and also a good sign of a design flaw. Aside from that minor detail, I would, like always, trust the products from Nvidia. I've never had
  • by janda ( 572221 ) <janda@kali-tai.net> on Sunday June 29, 2003 @05:50PM (#6327079) Homepage

    To quote the article:

    We take nVidia's top offering and pair it up against the current top offering from ATi and let them duke it out till the bitter end. Who will come out on top?

    For some reason I thought of "Iron Chef" when I read this.

  • IMHO, I've always liked ATI more than nVidia. Do I have any particular reason? Not really. I have had both brands of cards in different machines over the years, and both have performed exceptionally well. The kicker for me was customer service. When I had an issue with the GeForce 2 I had at the time, nVidia was less than helpful, didn't respond to my initial e-mails, and, when they did, basically said try a different machine, motherboard, BIOS setting, all the essential busywork that companies do to try an
    • Re:ATI Good (Score:3, Interesting)

      by drinkypoo ( 153816 )
      Without considering the more recent ATI cards (7xxx and up), I've always felt that the offerings from ATI sucked. A bunch. Endless problems with everything from late-revision ATI Mach64 cards (which are not very compatible at all with earlier Mach64 cards) to assorted Rage cards led me away from ATI. The crappy state of ATI drivers on Windows - their bread and butter, mind you - just made me laugh.

      On the other hand, ATI has really turned themselves around recently by all accounts, and started writing go

  • by hackshack ( 218460 ) on Sunday June 29, 2003 @05:52PM (#6327090)
    This is the second blatantly karma-whoring article I've seen this weekend. The article submitter, Mack, also wrote the damn article.

    I guess I wouldn't be as pissed if it was a genuinely interesting article, rather than a collection of specs and benchmarks.

  • Article summary. (Score:5, Informative)

    by Anonymous Coward on Sunday June 29, 2003 @05:53PM (#6327092)
    9800 has a faster transform engine, is slightly ahead at lower resolutions.
    5900 has a higher fill rate, is slightly ahead at high resolutions.

    Otherwise there are no real differences between the benchmarks and it all comes down to differences any layperson could understand:

    The 5900 takes up 2 slots (WTF?) and the 9800 is $100 cheaper (although $399 for a graphics card is still nuts if you ask me).

    BTW, the ATI 9800 won the "shootout".
  • by WannaBeGeekGirl ( 461758 ) on Sunday June 29, 2003 @05:54PM (#6327099) Journal
    I like the reviews on Tom's Hardware Guide [tomshardware.com] too. There's a nice review of the GeForce FX 5900 [tomshardware.com] that includes comparisons to both the Radeon 9800s. There's also a comparison between the Radeon 9800 256 vs the Radeon 9800 128 [tomshardware.com] with some benchmarks and a little bit about previous comparisons to GeForce cards. Sounds like they favor the NVidia cards for now.

    WBGG

    • I personally think the review Tom's did a while back (that you cited) is set up better than the OCAddiction one. It also shows the FX 5900 in a better light and includes the 256 meg variant of the 9800 Pro as well. I don't see why the OCAddiction benchmark warranted any attention other than the fact that it is the only one I've seen that shows a 9800 Pro card beating a 5900 Ultra card.

      Personally I don't like either card (both too expensive, both more than what I'd need). The 9700 Pro is about as much vid c
  • by Anonymous Coward on Sunday June 29, 2003 @05:55PM (#6327108)
    Sorry ATI, but I use Linux... If ATI supported Linux as much as nvidia does, maybe I'd buy one. But till then I'll stick to nvidia, even if it's slower than ATI's card.
  • Radeon 9600 Pro or FX 5600? Both are around $150, and both have good performance, but which is the better card? I'm not spending more than $170 on a new card, so what should I get?
  • So, I've been pretty happy with the Radeon 7000 that came with my PowerMac. It played most of the games I like well, but then UT2k3 came out last week and changed everything.

    Now I'm looking to upgrade, with UT2k3 in mind. Apple offers a Ti Geforce for $400 -- out of my budget right now. However, I can get an ATI Radeon 9000 for $169. Buying a PC version of a card and flashing it isn't really an option, since I need an ADC port.

    Right now I'm thinking a Ti Geforce might be overkill, since my CPU is only
  • With such amazing performance from both cards, the ultimate benchmark has to be the picture quality - which OCAddiction gave to ATI hands down.

    Given that both these cards are going to be able to give a decent frame rate with whatever program is thrown at them, I would be looking at the picture quality - which, after all, is what we have to look at.
  • by irving47 ( 73147 ) on Sunday June 29, 2003 @07:25PM (#6327537) Homepage
    I NEED to know which one is cooler than the other. I can't be going to school tomorrow and know I have the wrong video card in my computer. If the other kids find out, I will be the laughing stock of the A/V club!
  • I don't see why this particular set of benchmarks is special enough to deserve attention on Slashdot. Okay, so they run through a nice variety of benchmarks and they're a fairly credible source. But it has nothing to do with Linux, nothing to do with Apple, and not much to do with OpenGL. The benchmarks are all done in Windows XP with DirectX 9. Even the UT2003 benchmarks were done in DX9.

    About the only thing I can tell about this set of benchmarks is that OpenGL and Linux are ignored completely. At least
  • by Travoltus ( 110240 ) on Sunday June 29, 2003 @07:49PM (#6327628) Journal
    a) A Pentium 4 >= 2500MHz
    b) An nVidia FX 5900 gpu
    c) 19 inch monitor

    If you set it to turn on in the morning time, the FX 5900 also doubles as an alarm clock/wake-up service. :)
  • by crisco ( 4669 ) on Sunday June 29, 2003 @08:45PM (#6327888) Homepage
    For those Windows / dual-boot users looking a little further down the performance and price curve, I just found this hacked driver page [maxdownloads.com] and this thread [hardforum.com] that basically turn certain Radeon 9500 cards (~$135) into 9700s (~$200) by unlocking 4 pixel pipelines on the chip. It doesn't work on all cards, producing visual artifacts on some (workarounds exist for some users), but given the right hardware, you might pull a good deal of performance out of a mid-priced piece of hardware.
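
    The appeal is easy to quantify: going from 4 to 8 pixel pipelines doubles peak fill rate at the same core clock. A rough sketch (the 275MHz core clock is an assumption about these particular boards, not from the linked thread):

```python
# Back-of-the-envelope for the 9500 -> 9700 softmod: peak fill rate
# scales with pipeline count at a fixed core clock. The 275MHz figure
# is an assumption, not from the linked thread.

core_mhz = 275
for pipes, name in [(4, "Radeon 9500"), (8, "softmodded '9700'")]:
    fill_mpix = core_mhz * pipes  # Mpixels/s = MHz x pipelines
    print(f"{name}: {fill_mpix} Mpix/s peak fill rate")
```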
  • by Ryan Amos ( 16972 ) on Sunday June 29, 2003 @09:06PM (#6327963)
    Who really cares? I'm not about to drop $500 on a video card, nor are most people on Slashdot. Honestly, the video card market is totally uninteresting these days. There aren't many games available right now that take advantage of the features of these cards. And when games really start appearing, the cards will be available for much less. NVidia vs. ATI, I mean seriously, who cares? Both companies are full of lying sleazeballs, both companies offer similar products at similar prices, and both companies pay off "hardware review" sites to give their products favorable reviews.

    "Brand loyalty" in video cards is a joke. It's like having brand loyalty on paper clips. This holy war between NVidia and ATI fans is retarded, it's like people are TRYING to find something to argue over. Neither company offers a product that really distinguishes itself from the other, so it's all a wash anyway. Can we please stop posting these "reviews," as they're all obviously biased in one way or another (based upon the "reviewer's" chosen side in the holy war.) It's just a goddamn video card, not the cure for cancer.
  • by GregoryD ( 646395 ) on Sunday June 29, 2003 @09:58PM (#6328124)
    Anyone else see a trend here? I have an ATI card, blah blah blah, it's so much better. I have a NVIDIA card, blah blah blah, it's so much better. Can't they just agree both are pretty much the same and both are good products? That the benchmarks are so close that it really doesn't matter who's on top? Just find whichever card is cheaper and buy it. And for the record, BOTH cheated on benchmarks. So saying one is better "cuz the other cheats in benchmarks" is retarded logic.
  • by Mark Ferguson ( 684950 ) <slashdot@stop-spam.org> on Sunday June 29, 2003 @11:50PM (#6328576)
    With all that testing, did anybody consider compatibility? I run Red Hat and Windows on the same dual-boot system, so I need compatible hardware that will run in both environments, and the Radeon 8500 does just that.

    A few nanoseconds in a game is well and good but if you plan on running two or more operating systems on a single machine you might check into that aspect of your video card.

    Just a thought.
