Hardware

GeForce FX 5200 Reviewed

EconolineCrush writes "Tech Report has a great in-depth review of NVIDIA's budget GeForce FX 5200, which brings full DirectX 9 support down to an amazing sub-$70 price point. Any budget graphics card capable of running NVIDIA's gorgeous Dawn is impressive on its own, but when put under the microscope, the GeForce FX 5200 looks more like an exercise in marketing spin than a real revolution for budget graphics cards."
This discussion has been archived. No new comments can be posted.

  • Hovercraft (Score:3, Funny)

    by Anonymous Coward on Wednesday April 30, 2003 @07:56PM (#5849480)
    This thing sounds like a hovercraft when you turn on the PC.
    • Re:Hovercraft (Score:3, Informative)

      From this thread:
      This thing sounds like a hovercraft when you turn on the PC.

      From the article:
      However, for average consumers and business users, the GeForce FX 5200 offers better multimonitor software, more future-proof feature compatibility, and silent and reliable passive cooling.

      It's amazing what actually reading the review of a product you know nothing about will do for you.
  • FX (Score:4, Funny)

    by SugoiMonkey ( 648879 ) on Wednesday April 30, 2003 @07:58PM (#5849488) Homepage Journal
    I can only imagine what other useless computer paraphernalia I can waste $70 on. Hmmm. Maybe I'll get that USB toothbrush.

    The Monkey Pages [lazyslacker.com]: Not just another personal site...okay, so I lie.

    • by naes ( 560942 )
      I'm still waiting for the Bluetooth toothbrush.
    • Well, I have an extra $100 that I have been saving for the George Foreman USB iGrill... [thinkgeek.com]

      God, my friends will never let me live down falling for that one...
      • I was going to burn my mod points on this article, but that is truly a scary thing. As an "early adopter" who owns the original George Foreman, I have to speak up here.

        Back in my day we didn't have no fancy USB connections for our George Foreman. All we had was an orange light and a 120V power adapter. The light would go on and off randomly, but we didn't know why. However, we were confident that it served as an important indicator to someone somewhere. We didn't have no sissy computer-controlled timers and
  • by Tweakmeister ( 638831 ) on Wednesday April 30, 2003 @07:59PM (#5849499) Homepage
    As the poster states...looks like mostly marketing spin in terms of performance. "So, while the GeForce FX 5200 is technically capable of all sorts of DirectX 9-class eye candy, I have to question just how well the card will handle future DirectX 9 games and applications. After all, a slideshow filled with DirectX 9 eye candy is still a slide show." Throw some fancy "big boy" names on a box without the performance to back it up.
    • Tom's Hardware questions this as well:

      http://www6.tomshardware.com/graphic/20030311/geforcefx-5600-5200-26.html

      Their conclusion was that the 5200 was good for the price, but it was questionable whether it would really be able to keep up with DX9 games when they became available.
  • by AlabamaMike ( 657318 ) on Wednesday April 30, 2003 @07:59PM (#5849501) Journal
    Wonder if they dropped this on the market just to keep the steady stream of products rolling? Even if the performance isn't totally up to par, you've gotta give them this: $70 for a graphics accelerator that can perform this well is still an achievement. I can still remember paying $200 for my first MonsterFX. Now that seems as old as Hercules graphics.

    -A.M.

  • by Tyler Eaves ( 344284 ) on Wednesday April 30, 2003 @08:00PM (#5849505)
    Does this honestly surprise anyone in the least?

    THINK!

    If the low end was worth the PCB it was printed on, there goes the market for the higher-end (and higher-margin) stuff.
    • This is exactly it. This is why video card prices don't scale well. If the flagship $400 card were simply a faster version of a $100 card, nobody would buy it. Would you really pay $300 more for a few more MHz on your RAMDAC?
  • For God's sake (Score:4, Insightful)

    by Czernobog ( 588687 ) on Wednesday April 30, 2003 @08:04PM (#5849521) Journal
    You said it yourself.
    It's a budget card.
    No leaps and bounds in graphics card technology will be found here; otherwise, it wouldn't be a budget card.
    Besides, they have to put a product out, so that they keep customer awareness on their products and not on ATI's, considering how the latest NVIDIA flagship product performs...

    • Re:For God's sake (Score:3, Insightful)

      by damiam ( 409504 )
      It gets its ass kicked by the GeForce 4 MX in half the tests. There's not much you can do to defend that.
      • Indeed. And since I play a lot of old school games like the original Unreal Tournament, the GeForce4 MX with DDR is just fine for my LAN party box. It runs about $50 at Newegg from various manufacturers, and the 5200 is coming in at $75-$100 depending on which version you get.

        Of course, my flagship box has the Ti4200...mmmm, antialiasing 'til the cows come home...sweet....
  • by Ryu2 ( 89645 ) on Wednesday April 30, 2003 @08:04PM (#5849522) Homepage Journal
    Besides the lower memory bandwidth and other reasons in the article, it seems to me that the 5200 is implementing the fixed-function T&L pipeline as a vertex shader, saving transistors by forgoing a pure hardware implementation. That of course means it'll be slower (although still faster than doing it in software), and you'll incur a greater cost switching between shader and fixed-function rendering too. This trick was also used by Trident in their sub-$100 "DX9 compatible" chipsets.


    It's a good measure, but it invariably means that you'll get lagging performance with these low-end cards, so it's something to be careful of. Maybe in a year or so, once shaders become the norm in games, Moore's law^3 will have enabled them to put those transistors back on and still hit their price target, but definitely not now.
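    A rough sketch of what "fixed-function T&L as a vertex shader" amounts to: the driver has to generate a small program that does the modelview/projection transform and per-vertex lighting a dedicated T&L unit would otherwise do in hardware. The function and matrix names below are made up for illustration (plain numpy, not anything from NVIDIA's actual driver):

    import numpy as np

    def fixed_function_vertex_shader(position, normal, model_view, projection, light_dir):
        # Transform the vertex into clip space, as the old T&L unit would:
        # clip = Projection * ModelView * vertex
        eye_pos = model_view @ np.append(position, 1.0)
        clip_pos = projection @ eye_pos
        # Rotate the normal into eye space (assuming no non-uniform scaling,
        # so the plain upper-left 3x3 of the modelview is good enough here)
        eye_normal = model_view[:3, :3] @ np.asarray(normal, dtype=float)
        eye_normal /= np.linalg.norm(eye_normal)
        # Classic fixed-function per-vertex diffuse lighting term
        diffuse = max(0.0, float(eye_normal @ light_dir))
        return clip_pos, diffuse

    Every vertex of every frame goes through something like this, which is why running it on general shader units instead of dedicated hardware (and switching back and forth between the two modes) costs performance on a card this constrained.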

  • DirectX 9? (Score:4, Funny)

    by Trogre ( 513942 ) on Wednesday April 30, 2003 @08:05PM (#5849529) Homepage
    I trust when you say it has DirectX 9 support you mean it implements OpenGL 2.0?

  • by Rosco P. Coltrane ( 209368 ) on Wednesday April 30, 2003 @08:08PM (#5849548)
    I still use my old Matrox Millennium I bought in 1995 for $300 (if I remember correctly). Nowadays there are graphics adapters going for $70 that probably have more power and memory than the P200 that houses the Matrox Millennium. Moore's law never ceases to amaze me...

    • by Hanno ( 11981 ) on Wednesday April 30, 2003 @08:21PM (#5849611) Homepage
      Once my desktop's graphics card had more memory than my laptop's system memory, I knew graphics hardware development was going down a weird route.

      The other indicators were graphics cards that need external power plugs and graphics cards that need more than one slot for their cooler fans.

      Fans, anyway, are the work of the devil and the main reason why computers are driving me nuts these days.

    • I still use a 3Dfx Voodoo 5 5500, bought 3 months ago for $70.

      Interestingly, nVidia claimed that the new cards included 3Dfx technology, hence the FX moniker. I still like the Voodoo 5 better. ;)
  • Why? (Score:5, Funny)

    by mrklin ( 608689 ) <.ken.lin. .at. .gmail.com.> on Wednesday April 30, 2003 @08:09PM (#5849552)
    Why use all that unnecessary GPU processing to draw a semi-realistic, semi-naked chick (as in the linked Nvidia Dawn demo) when you can play pics and movies of real naked chicks that look tons better using the system's integrated GPU?
    • Re:Why? (Score:1, Funny)

      by Anonymous Coward
      Why use all that unnecessary GPU processing to draw a semi-realistic, semi-naked chick (as in the linked Nvidia Dawn demo) when you can play pics and movies of real naked chicks that look tons better using the system's integrated GPU?

      Even better: I can build a decent computer for my girlfriend who wants to do word processing and doesn't care about 3D graphics, using an even cheaper graphics card that provides even better signal quality and get laid for the favor. What porn movie can compete with that?
      • Re:Why? (Score:2, Funny)

        by Anonymous Coward
        I can build a decent computer for my girlfriend who wants to do word processing and doesn't care about 3D graphics, using an even cheaper graphics card that provides even better signal quality and get laid for the favor.

        You have to build your girlfriend a computer to get laid?
    • well us geeks need to code ourselves girlfriends we can --interact-- with geeze, we aren't total recluseses... me have good spelling :D
    • You're right of course. I was dumb enough to download the whole thing without reading the fine print (I only have a Ti4200).

      On that note, would anyone mind converting this vid to MPEG? All that fancy hardware FX, DX9 stuff is great...but ultimately it's still just a movie: 30 frames per second at some standard resolution and pixel depth. So could someone do this, and maybe link an FTP download?

      I'd just like to see the demo...cause it...looks nice :)
    • by CvD ( 94050 )
      You can *program* this girl and tell her what you want her to do. :-)
  • by LittleLebowskiUrbanA ( 619114 ) on Wednesday April 30, 2003 @08:10PM (#5849558) Homepage Journal
    Tom's Hardware is currently recommending [tomshardware.com] the GeForce Ti 4200 for those looking for a mid-range card with good performance.
    • Tom's recommendation changes frequently, depending on which company gave him the last shiny toy. Read THG for the articles, but don't take his advice [google.com].
    • by Anonymous Coward
      Not really. The full quote from tom's hardware is:

      If you don't need the very best in image quality enhancements, can do without DirectX 9 and are happy with 2x FSAA, you'll be glad to hear that there's a very affordable card out there for you - the GeForce 4 Ti 4200
    • by poopie ( 35416 ) on Wednesday April 30, 2003 @08:57PM (#5849797) Journal
      If anything will be the downfall of NVIDIA, it will be the fact that nobody but a hardware weenie can figure out what card is better based on the age/name without a secret decoder ring.

      Seriously... what average person would know that a GeForce 3 Ti 200 was better than a GeForce 4 MX 400? I mean, GeForce 4 sounds better, right?

      Likewise, who would think that an "old" GeForce 4 Ti 4200 is way better than a new GeForce FX 5200?

      Please, NVIDIA, can you come up with some names that actually convey to people whether they're buying the 'Value' version of your graphics card or the 'Professional/Platinum' version?
      • by Anonymous Coward
        I think that was the whole point of the new naming convention on the GeForce FX (i.e. GeForce 5) series of GPUs; there is no longer such a thing as the MX line; the FX 5200 is the low-end version of the line (think Celeron/Duron vs. P4/Athlon; they burn out a few components and otherwise limit the chip, but the core is still fundamentally the same).

        The FX 5200 being slower than the GF4 Ti series, well... that's not too surprising; the lowest end of the new technology doesn't need to be faster than the highest end of
      • >Please, NVIDIA, can you come up with some names that actually convey to people whether they're buying

        I could forgive this if the gaming industry would include suggested resolution, bit depth, etc. for each game. Say I have a GF3 Ti 200 and want to play some new game. I don't want to screw around with the settings to get the game to an FPS/color combo that is usable; the game should tell me this by looking at my GPU and CPU combo.

        Now release this information publicly and people shopping around for
        • I could forgive this if the gaming industry would include suggested resolution, bit depth, etc. for each game.

          Some games do; at least, they do some quick benchmarks and suggest that. If nothing else, they usually offer "Fast, Better, Best" quality settings so you don't have to do too much tweaking.

          A wizard that did some simple tradeoffs wouldn't be too hard. Make it downloadable and people could get an idea of how well the game would run on their system. Of course, a lot of times the game companies wa
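          That kind of wizard really is simple; here's a toy sketch (the scores, thresholds, and presets are invented for illustration, not taken from any real game):

          def suggest_settings(gpu_score, cpu_score):
              # The slower of the two parts is what actually limits the game
              score = min(gpu_score, cpu_score)
              if score >= 80:
                  return {"resolution": (1600, 1200), "color_depth": 32, "detail": "Best"}
              if score >= 50:
                  return {"resolution": (1024, 768), "color_depth": 32, "detail": "Better"}
              return {"resolution": (800, 600), "color_depth": 16, "detail": "Fast"}

          # A hypothetical mid-range GPU paired with a faster CPU:
          print(suggest_settings(gpu_score=45, cpu_score=70))
          # {'resolution': (800, 600), 'color_depth': 16, 'detail': 'Fast'}

          The hard part, as the grandparent says, is getting honest per-card and per-CPU scores published in the first place.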

    • While Tom's Hardware's recommendation of boards that use the nVidia GeForce4 Ti 4200-8x may be fine for current games, it's going to end up being a wasted expenditure when games that use the full DirectX 9.0 functionality start arriving later this year. Given that ATI's Radeon 9500, 9600, 9700 and 9800 support DX9 functionality in hardware, it's small wonder that ATI sales have gone up quite a lot recently.

      Chances are pretty good that Doom III, EverQuest II, and a good number of other "hot" games coming out for the
      • Chances are pretty good that Doom III, EverQuest II, and a good number of other "hot" games coming out for the next few years will implement DX9 support

        Errr, actually,
        Mr. Carmack himself is a big OpenGL fan. Additionally, he openly questions those who rely on MSFT's DirectX too much. This is probably the reason why most (if not all) of his games have native Linux ports.

        Sunny Dubey

    • The 5200 has the benefit of not having any fan, and thus not producing any noise.

      I found that those small GPU fans produce the most noise (and at the highest frequency) in my work PCs, so I am not going to buy any fan-cooled GPU. The FX 5200 is the only modern option I know of that does not have a fan, so it is on my list if I ever decide to upgrade my GeForce MX 200 (which works fine given that I don't play games much).

  • by rzbx ( 236929 ) <slashdot@@@rzbx...org> on Wednesday April 30, 2003 @08:13PM (#5849572) Homepage
    Not really worth it. For just a little more you can buy a decent Geforce 4 4200/4400/4600 that runs better than this card. It only seems worth it if you want or need those DX9 features.
    Btw, I am selling my GF 4 4200 card. I am happy with my GF2 MX. I stopped playing games, no really, I did.
    • how much do you want for it and where do you live?
    • The GF2 MX still runs a lot of games very well. I should know, my Linux box has one in it... safety net for you?
    • I have a 1st gen Nforce mobo with built in video. The built-in video is supposed to have about the same general capabilities as a GeForce 2. The reason why I bought that mobo in the first place? Cause I didn't want to pay for a crappy video card, and didn't want to shell out $200+ for a GeForce 3 back then, let alone a GeForce 4.

      I knew I was planning on upgrading. Just didn't know when.

      Anyways, several months pass by, and I'm happy with my built-in video card. That is, until I see my cousin's card, a

      • I've got the new nForce2 chipset in my new system (Asus A7N8X mobo) and it's cohabitating quite well with my new ATI Radeon 9700 Pro. It's a great chipset and makes for a very fast machine. For the money, the 9700 Pro was a better deal than the various nVidia video cards I looked at, and since I've been using ATI for the past year plus, I felt good about their driver situation and they kept my business.
  • Wow! (Score:5, Funny)

    by flatface ( 611167 ) on Wednesday April 30, 2003 @08:26PM (#5849630)
    Thanks for the update! I've been waiting ages for a video card that will play Nethack at 10,000fps! Who cares about 3-D games when you can go dungeon hacking?
  • by ElGuapoGolf ( 600734 ) on Wednesday April 30, 2003 @08:38PM (#5849691) Homepage
    Here's the deal. It's cheap. But will it play Doom III and Half Life 2 acceptably when they're released? If it can, then it's worth buying. If it can't, it's nothing more than a card for the IBMs, Compaqs, Dells, etc. who want to list "Graphics by NVidia" as one of their bulletpoints.
    • The reason Dell is on top is that they know what they are doing and put the best into their computers. I knew ATI was king of the hill when Dell started putting their cards into its boxes instead of Nvidia's.

      Check out this gaming machine:
      http://www.dell.com/us/en/gen/topics/segtopic_dimxps.htm [dell.com]

      Brian
    • Actually, from the looks of it, that card is quite good in workstation apps. So, if you want to build a cheap Maya computer (yeah, right!), then this card is for you.

      Of course, if you are going to be buying Maya in the first place, you might as well get the high-end Quadro, since it only costs a few hundred dollars as opposed to Maya at $2000 or $7000.
      • by swankypimp ( 542486 ) on Thursday May 01, 2003 @04:23AM (#5851170) Homepage
        if you are going to be buying Maya in the first place, you might as well get the high-end Quadro, since it only costs a few hundred dollars as opposed to Maya at $2000 or $7000.

        I am setting up a l33t Maya workstation on my parent's Compaq Presario, but performance sux0rs. Where can I download teh warez version of this "Quadro"?

        • Do a search for RivaTuner or SoftQuadro.

          Yes, if you have a Ti-series card (Actually some Quadros were MX-based), you can "warez" a Quadro out of it with a driver hack.
          • Well, it makes it act like a Quadro, but it isn't quite as powerful. Since the Quadro hardware features (such as wireframe anti-aliasing) aren't there, the card emulates them.

            I would also think that the Quadro devices are designed much more robustly. ATA drive is to SCSI drive as Ti card is to Quadro.

            I could be completely wrong on both points. If I am, please correct me.
            • by Andy Dodd ( 701 )
              The silicon of the Quadros is EXACTLY the same as the GeForces, although in many cases the Quadros also were higher-clocked samples. But the wireframe AA is the main thing that SoftQuadro gives you - GeForce hardware supports wireframe AA since it's the same silicon, but because the PCI ID identifies it as a GeForce, the driver disables Quadro features.

              It's the same deal as with the ATI 9500-to-9700 hack, except that the success rate is 100%. (ATI 9500s and 9700s are the same chip, just lower clockrates
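              For the curious, the PCI ID the driver keys off is easy to see for yourself; on Linux it's just files under sysfs. A minimal sketch (0x10de is NVIDIA's PCI vendor ID; how SoftQuadro then patches the driver's handling of the device ID is not shown here):

              import glob, os

              NVIDIA_VENDOR = "0x10de"  # NVIDIA's PCI vendor ID

              for dev in sorted(glob.glob("/sys/bus/pci/devices/*")):
                  with open(os.path.join(dev, "vendor")) as f:
                      vendor = f.read().strip()
                  with open(os.path.join(dev, "device")) as f:
                      device = f.read().strip()
                  if vendor == NVIDIA_VENDOR:
                      # This device ID is what the driver uses to decide between
                      # "GeForce" and "Quadro" feature sets
                      print(os.path.basename(dev), "NVIDIA device", device)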
  • by brer_rabbit ( 195413 ) on Wednesday April 30, 2003 @08:40PM (#5849698) Journal
    Looks like PNY has made a PCI [newegg.com] version of this card. Before you l33t gam3rs start laughing that it's PCI, a number of us have server-type or older motherboards that don't have AGP slots. The lowest price I found for it on Pricewatch was $139, so it's quite a bit more than the AGP version.
    • After RTFA'ing, maybe those of us who are PCI-bound should opt for the GeForce4 MX, which is also available in a PCI version...
      • Does the PCI bus even have enough bandwidth for this kind of video card to make a difference, though? I was under the impression that it wasn't worth getting a high end video card for PCI, because the bus just wasn't fast enough for it to be effective.

        Meh... looking at that Dawn, I'm thinking that my GeForce3 64MB video card just isn't good enough... :)
        • Does the PCI bus even have enough bandwidth for this kind of video card to make a difference, though?

          When I upgraded from a GeForce2 MX to a GeForce4 MX I tried to grab some benchmarks:

          test                  geforce2       geforce4
          --------------------  -------------  ----------
          xengine               237-238 rpm    60xxxx rpm
          gears -fps            39-45 fps      45 solid
          gears -fps -delay 1   83 fps         90 solid
          gears -fps -delay 0   120-160 fps    300-500

          so yeah, it makes a difference. Return to Castle Wolfenstein seemed a bit snappier with the GeForce4 than the 2, but I didn't
        • 33 MHz x 32-bit = 1056 Mbit/s = 132 MB/s
          66 MHz x 32-bit = 2112 Mbit/s = 264 MB/s
          33 MHz x 64-bit = 2112 Mbit/s = 264 MB/s
          66 MHz x 64-bit = 4224 Mbit/s = 528 MB/s

          PCI holds up just fine in my opinion. Granted, AGP 8x has up to five times the bandwidth of the fastest PCI card, but most graphics boards don't really make use of what AGP can do.
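          Spelling the arithmetic out (MB/s = clock in MHz x bus width in bits / 8; the AGP figure below uses the rounded 66 MHz clock and 8 transfers per clock, so it comes out slightly under the official ~2.1 GB/s):

          def bus_bandwidth_mb_s(clock_mhz, width_bits, transfers_per_clock=1):
              return clock_mhz * width_bits * transfers_per_clock / 8

          for clock, width in [(33, 32), (66, 32), (33, 64), (66, 64)]:
              print(clock, "MHz x", width, "bit PCI:", bus_bandwidth_mb_s(clock, width), "MB/s")
          print("AGP 8x:", bus_bandwidth_mb_s(66, 32, transfers_per_clock=8), "MB/s")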
    • Also, a PCI version is a necessity if you're hooking heaps of monitors off one PC. Maybe if I can dig up some old 32MB SIMMs for my Olivetti Envision I can use this card to get Diablo II running on it... Now I just have to get the BIOS running again. Hmm, a project. If only the maximum amount of RAM the system can handle wasn't less than the minimum required to run NWN.
    • Based on the image, it looks like it may also support 66 MHz PCI, which would be extra sweet.
  • by gatzke ( 2977 ) on Wednesday April 30, 2003 @08:54PM (#5849779) Homepage Journal
    Check out the nude patch:

    http://www.digital-daily.com/news/?view_options=by_message&message_id=202

    HA HA HA. I need a new card...
  • Err... (Score:5, Insightful)

    by BHearsum ( 325814 ) on Wednesday April 30, 2003 @08:58PM (#5849807) Homepage
    Is it just me, or does anyone else think there's something wrong with the ti4200 beating out the FX 5200 in every test?

    Or is the FX the new MX line?
    • Re:Err... (Score:5, Informative)

      by k_187 ( 61692 ) on Wednesday April 30, 2003 @09:25PM (#5849959) Journal
      Or is the FX the new MX line?

      No, they've dropped the MX moniker and are doing everything by model number like ATI has been. The high-end GeForce FX is the FX 5800. There are 5800s, 5600s, and 5200s, and the pricing scales the same way.
    • Why should every card in a totally new range perform better than every card in the previous range?

      This card is the extreme budget end of the market... the 4200 was the midrange.

      Model numbers mean nothing and age means nothing.

      There are only three things that matter:
      - features, performance, and price.
      People looking for a video card will put one of those as their priority.
      Those looking for performance won't be looking at this card, but that's OK, because those looking for price and features will.

      If you're looki
  • Water Cooled (Score:2, Informative)

    by fidget42 ( 538823 )
    Tom's Hardware has an article [tomshardware.com] on the Gainward version of the card. It is water cooled.
  • by caitsith01 ( 606117 ) on Wednesday April 30, 2003 @09:11PM (#5849884) Journal
    Is anyone else completely fed up with nvidia's moronic naming conventions?

    First we had the original GeForce 1+2 series, and things were good. Then the GeForce 3 Ti kicked it up a notch performance-wise. Following this, the GeForce 4 *Ti* series continued the improvement in performance, but the GeForce 4 *MX* series was also introduced and performed like a piece of overcooked dog-doo. In benchmarking, my old GeForce 2 GTS card easily beats a GF4 MX in 3D games and benchmarks.

    But nvidia's marketing fools weren't done yet. Not content with ripping off kids who thought they would be getting a cool, up to date graphics card for a bargain price, they then introduced the following naming convention to the GeForce 4 Ti series:

    GF4-Ti 4200 - Entry level
    GF4-Ti 4400 - Mainstream
    GF4-Ti 4600 - High performance
    GF4-Ti 4800 - Either a 4200 or 4600 with an 8x AGP bus (read: no performance increase), depending on which version you happen to buy

    So, we have a GeForce 2 that kicks the ass of a GeForce 4 in 3D games, and now a GeForce 4 4400 that kicks the ass of some GeForce 4 4800s but will always be slower than a GeForce 4 4600, which in turn will always be at least as fast as a 4800.

    With the FX series, who the hell knows? All I know is that there is now absolutely no connection between the family number (Geforce 1,2,3,4,FX) and actual performance, and no connection between the model number (4200, 4400, 4600, 4800) and actual performance. Given that ATI is currently whupping nvidia in performance and output quality it seems to me that the marketing people at nvidia need to think *really* hard about their naming conventions. Amazingly adding a higher number to a piece of crap does not make it a faster piece of crap, although it may wreck your reputation with consumers.
    • by Ryu2 ( 89645 ) on Wednesday April 30, 2003 @09:21PM (#5849937) Homepage Journal
      In another fine showing of developer humor, Tim Sweeney, Epic's 3D mastermind behind the Unreal Tournament engine and current Unreal technologies, was seen at the show running amok with a pad of Post-It notes, poking some light fun at the GeForce 4 MX. Tim could be seen labeling a VW Beetle as a "Porsche MX", a staircase as "Elevator MX", and finally, turning the joke inward, he labeled himself "Carmack MX" in deference to the industry's most famous 3D programmer.
    • by Klox ( 29985 ) * <matt...w1@@@klox...net> on Wednesday April 30, 2003 @10:02PM (#5850116)
      But wait! You made a mistake: there never was a GeForce 1. It was the GeForce 256 (with both SDR and DDR versions). So the GeForce 4 is better than the GeForce 256...

      Yeah, yeah, we know what you meant, but NO! I know two GeForce 256 owners who are confused by this. Their answer to this: "whatever, I'll just buy whatever you tell me to when Doom comes out".
      • It's OK for those of us who can do a little digging and work out what's going on from hard benchmarks etc... I just have this horrible mental image of some poor kid whose parents have sprung for his first gaming rig. He thinks he's getting a top-of-the-line card, a GeForce 4, but when he loads Doom III and tries to play, all his hopes and dreams come crashing down around him. He eventually drops out of high school because he is so disillusioned about the state of society. He becomes a drug addict and all aroun
    • I agree the naming scheme is stupid. But you won't go far wrong simply going by price. For instance, the cheapest 4-MX is $10 cheaper than the cheapest 3-Ti on pricewatch.

      And unlike a benchmark, the price also reflects other factors like visual quality, fan noise, expected resale value, etc.

    • Is anyone else completely fed up with nvidia's moronic naming conventions?
      What about ATi's? The 9000 Pro is slower than the 8500 (nor is it DX9-compliant, as its name could imply). The 9600 is slower than the 9500. New isn't necessarily better, at least for GPUs. All those benchmarks are certainly getting on the marketing droids' nerves!
      When you think that most of the 9500 non-Pro boards are, for all intents and purposes, 9700 Pros sold for half the price of the latter:
      -the 4 disabled pipelines can be reenabled
      -9
    • Overcooked dog-doo

      Ohhhh as far as I'm concerned, you just can't cook dog-doo enough.
  • The Gigi FX5200P's blue board should nicely match Albatron's most recent motherboards, which sport the same color scheme.

    *breath of relief* What would I have done if my video card and motherboard didn't match?!

    • The Gigi FX5200P's blue board should nicely match Albatron's most recent motherboards, which sport the same color scheme. *breath of relief* What would I have done if my video card and motherboard didn't match?!

      This is really funny - until you realize that people actually care about that stuff. If you had told me 2 years ago that it matters what color lighted fan you put in your case, I would have laughed at you. Hmm, come to think of it, I would laugh at you today.

      I know someone who was

  • by grolschie ( 610666 ) on Wednesday April 30, 2003 @10:31PM (#5850281)
    Hey, c'mon, this is normal. The budget NVidia cards have always supported advanced features, but when you actually use them they run like crap. I still have a GeForce 2 MX200 (a gift from a friend who got duped by a retailer). It supports 4x AA, but when this feature (and others, e.g. 32-bit color at resolutions higher than 400x300) are activated, it craps out.

    The thing overclocks nicely, and when running in "best performance" mode in 16-bit, it flies, uh, well, kinda. The key with all NVidia budget cards is to run 'em without all the technically advanced features. The reviewer enabled all kinds of crap that the card only just supported. Perhaps NVidia would do well to not let their budget cards support these advanced features. Benchies would be higher, and I guess more realistic. Most gamers (or would-be gamers with crappy MX200s like me) try to squeeze as much juice from their cards as they can. ;-)
  • by Ianworld ( 557858 )
    And the big secret is...

    The FX 5200 was being compared to an old budget card. The 9000 Pro has been replaced (for a while now) by the 9100 and 9200 cards, which are faster! Not to mention that you can get a 128MB version for $74, just $5 more than that card (at gameve.com).
  • The FX 5800 requires an extra PCI slot, and I use all my PCI slots (one for video capture, one for USB 2.0, one for the Audigy, one for video capture, one for the NIC, I don't even have a slot available for my SCSI stuff!!).

    I absolutely refuse to give up dead necessary peripheral cards to add in a video card when they can just as easily make one that doesn't take up that extra PCI slot.

    I'll wait.
    (And no, moderators, jeez, this ain't a troll or flamebait, it's an honest opinion..)
  • Submitter summary (Score:2, Insightful)

    by Anonymous Coward

    The submitter said: the GeForce FX 5200 looks more like an exercise in marketing spin than a real revolution for budget graphics cards.

    Tech-report said: The GeForce FX 5200 isn't as capable a performer as its feature list might suggest, but that doesn't mean cards based on the chip aren't worth picking up... The GeForce FX 5200 is a great feature-rich card for anyone that's not looking for the best budget gaming option.

    Sheesh, why not let the article speak for itself and spare us the lame and inaccurate

  • by Fulg0re- ( 119573 ) on Wednesday April 30, 2003 @11:36PM (#5850509)
    Welcome to the real world. nVidia simply cannot compete with similar offerings from ATI at this point in time. Although the GeForce FX 5200 may be DX9 aimed at the masses, the performance isn't there. Personally, I'd be more inclined to get an ATI-based card, namely a 9000/9100/9200 series card, even though they are "only" DX8.1.

    In terms of DX9, the only smart thing would be to get a 9500/9600 Pro if you're looking for something in the midrange, and a 9700/9800 Pro if you're looking high-end.

    I'm on a 9700 Pro right now myself, and there's no way that I'd consider any nVidia product at this moment in time. Maybe sometime in the future (and no, I am not an nVidiot or a fanATIc).
    • A few months ago I got a Radeon 9100 for $70, and I'm happy with it. In a year or 18 months when DX9 features get used more, I'll grab a 9700 Pro (or GeForce 5600 or whatever) for a hundred bucks. That makes more sense to me than spending $300 - $400 for a card that might run the games I want in 2004. I learned my lesson five years back, when I got a just released 12 meg Voodoo2 for three hundred bucks. Eighteen months later I traded it to my roommate for the $50 I owed in utility bills*.

      *$50 was the

    • NVidia is still the only option for those who are in any way concerned with driver reliability. While ATI has supposedly shaped up, Catalyst drivers are still far worse than NVidia's Detonator series in terms of quality and reliability. Take as an example the recent release of HDTV tuner cards that use software decoding: NVidia cards are currently the recommended ones with the DVICO FusionHDTV, because the ATI drivers are so buggy. If you do have ATI, you are forced to use an *older* driver because ATI i
      • I bought an ATI TV Wonder VE card and had all kinds of hell with my crappy onboard video card, so I bought a medium-end ATI AGP card which I could *never* get to work with my system. After two days of booting in VGA and green screens, installing and uninstalling drivers, I returned it and bought a GeForce card that worked great the first time and works great with the ATI TV card.

        ATI may have the hardware, but, I agree, their drivers are *BUGGY*.
  • Sounds like... (Score:2, Interesting)

    by MaestroRC ( 190789 )
    They were just antsy to talk about their Radeon 9600 again. They start out the article telling how the 9600 is a much more expensive and more capable card, and that it is not really in the same bracket that the FX 5200 is in, yet the entire thing seems to brag about how much better the 9600 is doing. If they wanted to put the 9600 into the review, they should have at least included an NVidia card that was comparable to it, if only to not make NVidia look bad. If a potential buyer were looking over th
  • I went down to Best Buy the other day and picked one of these up to replace my venerable first-generation GeForce 3. It does seem just slightly faster (to be honest, my little Athlon 1600 is likely the main bottleneck), but it also has problems with texture mapping in some games that the GF3 handles fine (Shadowbane and Battlefield 1942).

    I took it back; not enough improvement to warrant suffering through another round of software updates until all the kinks are worked out.
