
120+ GeForce FX Reviews Collected

Peter writes "We just finished at 8Dimensional our list of GeForce FX reviews. It tries to show all reviews of these video cards currently online, 120+ are listed at the moment." Hmmm, time to upgrade from an Xpert@Play98 ...
This discussion has been archived. No new comments can be posted.

  • by Megor1 ( 621918 ) on Monday May 26, 2003 @05:42PM (#6042506) Homepage
    3DMark released a patch today that defeats Nvidia's cheating in their benchmark, so all the reviews that used 3DMark need to rerun their tests.
    • by Goalie_Ca ( 584234 ) on Monday May 26, 2003 @05:46PM (#6042538)
      It's true that a lot of reviewers have relied on Futuremark's benchmark. For me and others it means nothing, because it's "too" synthetic and uncharacteristic of current games. But for others this means everything; then again, they probably already knew that. It's not every day your "favorite" GPU drops 24% :P

      It's worth noting that ATI also "cheated," but they still correctly rendered the scene. All they had done was reorder the shader instructions so they were optimized for their architecture. It only boosted their performance by 3%, IIRC.
      • How is that cheating? Rewriting code to make it better vs. lowering precision to make it faster and hoping you don't notice! HMMMMMM
      • by Performer Guy ( 69820 ) on Monday May 26, 2003 @06:19PM (#6042728)
        ATI did not cheat (although they have cheated in the past): they reordered the instructions in a shader, but it remained mathematically and functionally identical. This is what optimizing compilers do all the time. It's called optimization, not cheating, and it is legitimate; they looked at the shader INSTRUCTIONS to see if they were suitable for this optimization. It made about a 2% difference, and IMHO ATI should leave this optimization in and broaden its scope if possible.

        If it was a really narrow path optimization then it's borderline, but far different from the wholesale cheating of NVIDIA, who rewrote shaders with completely different results, didn't clear the screen at certain points, and added hidden clip planes to eliminate pixel fill. All very underhanded, and why you would make excuses for this I don't know. Sure, it's a synthetic benchmark, but if it doesn't matter then don't cheat.
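To make the compiler analogy concrete: reordering two instructions that don't read each other's outputs cannot change the result, which is why it counts as optimization rather than cheating. A minimal sketch in Python (the "shader" here is invented purely for illustration, not taken from 3DMark):

```python
# Two versions of a tiny "shader": the same independent instructions,
# issued in a different order. Neither instruction reads the other's
# result before it is written, so the outputs are identical.

def shader_original(a, b, c):
    t0 = a * b      # instruction 1
    t1 = b + c      # instruction 2 (independent of t0)
    return t0 + t1

def shader_reordered(a, b, c):
    t1 = b + c      # instruction 2 issued first
    t0 = a * b      # instruction 1 issued second
    return t0 + t1  # mathematically identical output
```

By contrast, lowering precision or skipping work (hidden clip planes, omitted buffer clears) changes what actually gets rendered, which is the distinction being drawn above.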
        • No, ATI also cheated, though on a MUCH smaller scale than Nvidia. They also replaced the complete shader (recognizing it by name or something like that), even though the replacement shader delivered the same result (unlike Nvidia, whose shaders deliver different results, plus they have those really evil clip plane / buffer clear cheats).
          You could argue that replacing complete shaders is optimizing (as long as the shader delivers the same output), and you could be right if this were a game. However, the 3d
          • Well, it really gets into the realm of opinion here. I'm not going to debate this; like I say, it's borderline and your conclusion depends on your outlook. We agree at least that any infraction was minor compared to NVIDIA's, and that ATI are already tainted by earlier cheats. I'm no fanboy of either, although I love their work.
      • Heck yes, it is.

        You don't need to buy the top of the line card to play the current games well. So you want a benchmark that'll tell you how your card will deal with the features that are new to the card and will make their way down to the games later.

        If you're buying a top of the line card, either you just want the status of having a top of the line card, in which case you want it to pass all benchmarks with flying colors, or you want a card that you won't be replacing for a while...in which case you w

    • The thing that doesn't make sense to me is that, from the couple of reviews I have read, the new Nvidia card outperforms the Radeon card on almost all the game tests. Not by much, but still a little bit.

      And yet when it comes to the 3DMark test, ATI creams Nvidia when that new patch is applied.

      To me that makes the 3DMark benchmark very suspect. The two possibilities I can see are: Nvidia's original complaints against the benchmark are justified, and yes, they cheated to boost the score. OR 3DMark are bitter at NVidia a
    • Patches? Bah. The true test of video cards is glxgears.

      So here are my benchmark results for the GeForce 2 MX 64MB on a P4 @ 2.4GHz with Gentoo 1.4rc4:
      bash-2.05b$ glxgears
      3961 frames in 5.0 seconds = 792.200 FPS
      3971 frames in 5.0 seconds = 794.200 FPS
      4028 frames in 5.0 seconds = 805.600 FPS
      4386 frames in 5.0 seconds = 877.200 FPS
      Amazing isn't it?
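For what it's worth, the FPS column in that output is just the frame count divided by the five-second sample window; a quick Python check reproduces the figures above:

```python
# glxgears counts frames over a ~5 second window and reports
# frames / seconds as the FPS figure.
samples = [3961, 3971, 4028, 4386]  # frame counts from the run above
for frames in samples:
    print(f"{frames} frames in 5.0 seconds = {frames / 5.0:.3f} FPS")
# first line printed: 3961 frames in 5.0 seconds = 792.200 FPS
```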
  • That the GeForce FX sounds like a leaf blower [bjorn3d.com]...
  • by The Terrorists ( 619137 ) on Monday May 26, 2003 @05:44PM (#6042521)
    to us if it contained such review compendia in itself, rather than making me go to 100 different sites to see them. It'd also be a way to counter the various technology zealotries that arise here. I'm willing to see these ads if the value of this site goes up commensurately.
  • by User 956 ( 568564 ) on Monday May 26, 2003 @05:46PM (#6042533) Homepage
    Hmmm, time to upgrade from an Xpert@Play98

    Hmmm.. I don't know if a GeforceFX is actually an *upgrade* from the Xpert@play98. What benchmark are you using?
    • by Evil-G ( 529075 ) <g_r_a_m_2000@hotmail. c o m> on Monday May 26, 2003 @05:55PM (#6042592) Journal
      What benchmark are you using?

      The "decibels produced by the video card" benchmark.
    • by Anonymous Coward
      My benchmark is a long bench I put beside my computer. I make a mark on it for how far away I have to go before I can't hear my Xpert@play98. 0 meters; it has passive cooling. I make another mark on it that shows how far away I have to go before I can't hear the GeforceFX I am considering. 19 meters; it's a long bench.

      The GeforceFX has a higher score on my benchmark by 19 meters. Thus it truly is an upgrade for me.
    • What about upgrading from my Voodoo3 2000 PCI? Even if not perfect (as was said, why make 120 links?), it's still nice to see such initiatives.
  • uhhh (Score:5, Insightful)

    by Tweakmeister ( 638831 ) on Monday May 26, 2003 @05:54PM (#6042583) Homepage
    "We just finished at 8Dimensional our list of GeForce FX reviews." ...and? now what?

    While the idea of a site that shows all the reviews in one place is noble, unless you have huge amounts of content it's easier to search Google for the reviews. The good ones usually end up on top as well.
  • [snicker] (Score:3, Funny)

    by A_Non_Moose ( 413034 ) on Monday May 26, 2003 @06:00PM (#6042620) Homepage Journal
    /obligatory sniping

    quote: "...120+ are listed at the moment".

    So, production quotas went better than expected, huh? /end sniping

    I have to admit, I knew the horse-power game was one that Nvidia was going to trip over eventually when the Radeons were *really* pushing the Z-axis occlusion.

    Face it, Nvidia became top dog by pure horse-power, and to some extent, deeper color depths and kick ass drivers.

    ATI finally got their hardware on track not by being able to "outshow" Nvidia's muscle, but by outclassing them by being smarter with the bandwidth and 'getting a clue' with drivers.

    (witness the Dawn demo on a 9800 running *faster* than on the FX series.)

    Speaking of the Dawn demo, does anyone else remember that this was to showcase the power of the FX, yet the 5200 is a *SLIDESHOW*?

    Final thought: Did they use 3Dmark in all the benchmarks? (/low blow, sorry)

    .
    • I have to admit, I knew the horse-power game was one that Nvidia was going to trip over eventually when the Radeons were *really* pushing the Z-axis occlusion.

      That is faulty logic; just because ATI was using Z-occlusion and ATI beat Nvidia (which is also using Z-occlusion nowadays) does not mean that that is why they beat Nvidia. The Kyro II used tile-based deferred rendering, which was far superior to the Z-occlusion ATI and Nvidia were, and still are, doing, and look where PowerVR is today.

      ATI f
      • True, very true...I was being overly broad, but the simplistic/broad view of the Ati/Nvidia battle is:

        Nvidia used brute force and overtook 3dfx (barring 3dfx's stumbles and fall).

        ATI tried to "brute force" the Radeon series and could not keep up with Nvidia, until they (ATI) got smart with the hardware and software optimizations (i.e. finesse).

        The point is that Ati overtook Nvidia because of points we both brought up:
        Smarter with bandwidth, not overdrawing a scene, texture compression and a whole host o
        • Nvidia used brute force and overtook 3dfx (barring 3dfx's stumbles and fall).

          Not exactly. 3dfx (3Dfx, before they changed their name, bought STB, and went crazy, wasn't that bad at all) fucked themselves. If anything, 3dfx was guilty of brute-forcing. The Voodoo2 was not much more than a brute-force step from the Voodoo1. The Voodoo3 wasn't much better, with the addition of fairly poor 2D. "22bpp", blah. Gamers wanted 32bpp, nVidia gave them 32bpp. 3dfx had their eyes firmly closed. And let's n

  • time? (Score:5, Insightful)

    by buddha42 ( 539539 ) on Monday May 26, 2003 @06:04PM (#6042643)
    Hmmm, time to upgrade from an Xpert@Play98 ...

    Nope. The most stressful video application most users do is DVD playback, and even that is loooooong past the point where hardware-assistance is needed.

    The video card market has gotten absolutely ridiculous in the last 2 years. It's strange: when Intel and AMD fight it out, prices plummet, yet while nVidia and ATI have been fighting it out, prices have skyrocketed. Sure, so have features, but they're so random and game- or API-dependent that most people don't even know how to turn them on in different games or in the drivers' advanced settings.

    Up until UT2K3 it was completely absurd, because anything with DDR could play any game just fine. Now with the new crop it's even worse, because modern cards still can't play the very-new and upcoming games well. So buying a high-end card now is overkill for older games, and underpowered for upcoming games.

    • Please, this dead horse gets wheeled out every time a story that has to do with graphics cards gets posted. It's been rebutted a hundred times before; I'm not going to bother posting another one.
    • You CANNOT compare the CPU market (AMD and Intel) with the GPU market (ATI and NVidia).

      How often do CPU manufacturers release completely new chip designs? Once every couple of years, maybe. I know it has certainly been a few years since the Athlon was released, and it has been a while since the P4 was released, and there is no sign of the next CPU from Intel yet.

      Now think about how often new GPU's are released. Once a year at the most. The Radeon 9700 card hasn't even been out for a year, and ATI are already
    • Re:time? (Score:3, Interesting)

      by evilviper ( 135110 )

      While nVidia and ATI have been fighting it out prices have skyrocketed.

      Just a while ago, I was looking for an Nvidia card with TV-out (nothing else is even likely to work under Linux/FreeBSD). I searched Pricewatch and found one for $20... Do you really need a video card to be cheaper?

      There is PLENTY of blame to go around for videocard prices:

      Stores try to only stock the most expensive items, because that means higher margins. You don't walk into Best Buy/Circuit City and see SIS videocards, because the

      • Oh please, stores tend to stock the more expensive cards because the kind of person that wants a new video card is going to want something fast. Guess what? Fast video cards are expensive! There's no retail market for slow cards; people who don't care how fast their card is are happy with the one that came with their OEM system. There are the people building a new system from parts, but they tend to want a fast video card too, and they almost always buy all their parts online anyway.

        As for the great ga
      The video card market has gotten absolutely ridiculous in the last 2 years. It's strange: when Intel and AMD fight it out, prices plummet, yet while nVidia and ATI have been fighting it out, prices have skyrocketed.

      There are many quite logical reasons high-end video card prices rise, having to do with the differing economics of the businesses involved:

      1. First of all, Intel and AMD own their own fabs. A modern, 300mm, .13 micron chip fabrication plant costs in the neighborhood of 2.5 to 4 billion dollars. To mak
  • by Capt'n Hector ( 650760 ) on Monday May 26, 2003 @06:17PM (#6042709)
    Are you guys telling me that every single person with a geforce fx wrote an online review?
  • Hmmm, time to upgrade from an Xpert@Play98 ...

    Whoah, really? Probably time for me to upgrade to an Xpert@Play98...
  • 120+ reviews? (Score:3, Insightful)

    by earthforce_1 ( 454968 ) <earthforce_1@yaho[ ]om ['o.c' in gap]> on Monday May 26, 2003 @06:23PM (#6042745) Journal
    Nice piece of research, but I don't think I have time to read 120 reviews on anything, even the next vehicle I plan to buy. Can they just put up an executive summary?

  • But why... give it a month and they will be out of date... and for fuck's sake... hardware is _cheap_, just buy the thing and be done with it.
  • I'm a Linux user. My cheap Rage 128 is enough for tuxracer.
    • I'm a Linux user. My cheap geForce ti4200 is enough for Unreal Tournament 2003.
      Just because you run Linux doesn't mean you have to give up gaming. It usually means that you have a lot of heartache :)
  • by tunabomber ( 259585 ) on Monday May 26, 2003 @06:39PM (#6042824) Homepage
    I'm using an Xpert@Play98 and I love it. Yeah, it doesn't support OpenGL very well, but look how fast it can display B's in a /. post:

    BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB BB BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB BBBBBBBBBBBBBBBBBBBBBB

    Wow, that's some card!
  • by rmdyer ( 267137 ) on Monday May 26, 2003 @07:03PM (#6042964)
    Everyone nitpicks about a few percentage points here and there when comparing Nvidia and ATI cards. But for people with money to spend, there are more expensive pro-level CAD cards out there. I am wondering: what is the fastest card you can purchase for the PC AGP bus? Anybody know? And how much faster are they than the FX or Radeon?

    Another annoying thing... it looks like Nvidia and ATI are now price-tiering cards. Up until recently, the most you would pay retail for the best consumer-level card was around $400. Now it looks like Nvidia and ATI want to push us into the $500 card level. What's next year's card, $600?

    • The Radeon 9800 Pro... for game performance, that is...
    • Try 3Dlabs [3dlabs.com]; their Wildcat series is amazing for high-end apps that take advantage of it. I have many friends that use FireGLs and Quadros, and I hear that they are not the best gaming cards, but they destroy gaming cards where they need to, in DCC apps and other specialized apps. Features like hardware overlay planes and line antialiasing aren't needed by gamers, but developers would surely cry without them.

      I believe there are some sites that use gaming benchmarks to review these cards, try highend3

      • Seems a lot like the ATA vs. SCSI debates. You've got this class of people who must buy whatever the "best" cards are, as recognized by their peers for doing certain things. If their peers are gamers, then they'd probably buy an FX or Radeon. If their peers are engineers, then they wouldn't be caught dead not spending thousands of dollars for a Quadro or something. Interesting that the Quadros would be slower at rendering.

        The only thing you seem to be getting out of 3D labs
    • No. The add-in video card market is driven by Nvidia and ATI, Gamer card, CAD card, or otherwise.

      3dlabs is barely holding on to their tiny niche. While CAD cards need to do different things than gamer cards, this is basically how it works: Nvidia/ATI build the best cards they can. Then they tweak them and triple the price for the CAD market.
  • Nvidia recently announced they have shipped their 130th FX 5900 card.
  • by Xtro ( 113699 ) <xtro AT evem DOT org DOT au> on Monday May 26, 2003 @10:17PM (#6044318) Homepage Journal
    Who cares if someone found 120 reviews to link to, how is this helpful to anyone? Will anyone now go and read all 120? Go use Google, you'll get reviews of ANYTHING, and the best ones will probably rise to the top.

    Now if they had really 'collected' them they could have perhaps summarised all the conclusions into one short conclusion to give an overall 'world' view or something.

    This page could have been written by a dumb search robot just as easily as a human, where's the human value in it! Where's the humanity !?!?!? This is how the Matrix started you know.
  • My video card on this Dell really should be called an ATI Play@Work 98... hahaha, just kidding. I hope no one is monitoring my web browsing activity from the home office :)
  • Here's my review: It's better than my onboard geforce2. It's only 100 bones. It looks SO realistic when I am beating helpless pedestrians to death in GTA:Vice City. thank you...
  • Closed source drivers or not, nVidia's Linux performance rocks.

    Of course, that's only among those that actually reviewed Linux support...
  • (ripped from Dave H @ beyond3d - http://www.beyond3d.com)

    1. Anand posted benches that claimed the 5900U was platform-limited (at 223fps!) running Q3 at 1600x1200 with 4xAA and 8xAF.

    2. Lars at Tom's mislabeled the D3 Medium Quality + 4xAA benches as High Quality + 4xAA

    3. Kyle and Anand both ran D3 in Medium Quality with 8xAF set in the drivers, despite what seems to be the fact that Nvidia drivers interpret Medium Quality as forcing no AF, while ATI drivers do not

    5. ExtremeTech's 3dMark03 build 320 vs. b
  • I bought an ASUS V9520TD FX 5200 card last week and it wouldn't even install on Windows 98...:(

    Back it goes today!

    Jon Acheson
