


Tom's 46 Video Card Roundup

Hoagie writes "Tom's Hardware has posted (12/29) a huge 46 video card roundup. Included are a few generations of nVidia and ATI chipsets, along with newcomer XGI and the returning Parhelia and S3."
This discussion has been archived. No new comments can be posted.


  • So? (Score:5, Interesting)

    by rafael_es_son ( 669255 ) <rafaelNO@SPAMhuman-assisted.info> on Tuesday December 30, 2003 @10:04AM (#7834513) Homepage

    My GeForce2MX (64 MB) runs Max Payne 2 and Tron 2.0 reasonably well. Why should I upgrade?

    • Re:So? (Score:5, Insightful)

      by Trashman ( 3003 ) on Tuesday December 30, 2003 @10:15AM (#7834581)
      Simple answer: If your current hardware does what you want acceptably, then there's no need to upgrade.
      • by Channard ( 693317 ) on Tuesday December 30, 2003 @10:22AM (#7834632) Journal
        .. was that working with PCs every day, coupled with the hassle of upgrading my own PC to play the latest games, got on my nerves. Currently, my PC does all I want it to do: it can be used to go on the internet, play most older games, and so forth. I may upgrade at some point when I'm not dealing with PCs every day, but at the moment I prefer just being able to get a game, slap it in my console and know it'll run at a decent rate.
        • I prefer just being able to get a game, slap it in my console and know it'll run at a decent rate.

          Let's hope that the increasing sophistication/availability/popularity of console games doesn't decrease what's available for the PC. I don't use my machine for gaming very often, but when I do, it's fun to install, run and modify some of the mods that are out there (TA [rakrent.com], C&C [moddb.com], for example). With console gaming, that wouldn't be possible anymore.
          PS: Know of any good sites for newbies to Slashcode?
    • Re:So? (Score:4, Funny)

      by Anonymous Coward on Tuesday December 30, 2003 @10:30AM (#7834678)
      My Atari VCS runs Asteroids and Lunar Lander reasonably well. Why should I upgrade?
    • For HalfLife2, Doom3, and other new games. Also to make Max Payne2 look better.

      Don't get me wrong, I'm using a GeForce2 MX400 64MB PCI, but if I had the money I'd upgrade. (Personally, it's more for having a constant FPS in Counter-Strike, as I play matches where I need to be able to stand in 3+ smoke grenades and still be able to aim.)
  • by sharkey ( 16670 ) on Tuesday December 30, 2003 @10:17AM (#7834598)
    At an estimated 7 pages per card, plus 4 pages of exposition on the front and 3 on the back, plus a big chart: a whopping 330 pages of ads estimated! Go Tom!
  • Prices (Score:4, Insightful)

    by Via_Patrino ( 702161 ) on Tuesday December 30, 2003 @10:18AM (#7834601)
    I think if those benchmarks had prices in them, the boards would look much less attractive :)

    When will VGA board makers compete on price, like AMD started to do a few years ago, and not on hundreds of FPS that no one uses (because they're over the human eye's limits)?
    • Re:Prices (Score:3, Interesting)

      by Tim C ( 15259 )
      It probably doesn't make a great deal of difference if a card can "only" run Q3 at 200fps rather than 300.

      The card that can run it at 300fps, though, stands a better chance of running a new game at an acceptable frame rate than the slower card does. That's the point, really - chances are if you're a gamer, the last card you bought was benchmarked against Q3, so when shopping for a new one, you can do some comparisons based on that. Of course, the system used now is completely different, so you can't really compare
    • Read the article! (Score:2, Informative)

      by poge ( 678298 )
      He does include prices - and the last two pages of the article include a comparison of FPS per dollar - much more useful than a straight performance comparison, IMHO...
    • when Doom 3 limits the max frame rate to 60fps (I think that is what was said).
      • That is not going to have much of an effect on gamers. It will not handicap those with a better system.

        It's not going to make the game run any differently; it'll just cap the max frames at 60. Older cards will still bottom out when there is too much on the screen at the same time, while the more powerful systems will stay parked at 60fps.

      • higher resolution.

        more detailed effects.

        better image quality.

        never going below 60 FPS.
    • Re:Prices (Score:5, Interesting)

      by Zathrus ( 232140 ) on Tuesday December 30, 2003 @11:19AM (#7835057) Homepage
      When will VGA board makers compete on price

      They already do. Both nVidia and ATI have high end and low end chipsets, and they're very price competitive. They also segment them for sub-$100, sub-$200, and high-end (for which the price limit keeps going up).

      not on hundreds of FPS that no one uses (because they're over the human eye's limits)

      I'm sorry you have such poor eyesight. Have you considered seeing a doctor about it? I doubt they can do anything though -- it's probably neurological. Did you stare into the sun as a child?

      I wish people would quit spouting the crap about "above human eye limits". There is no such thing. We don't know what the maximum frame rate the eye can see is. Don't go talking about movies or TV -- they're not the same. All video capture methods (be it film or digital) capture motion blur, which our brains happily interpret when shown at a somewhat adequate frame rate. But that doesn't help a bit for some things -- like fast pans (moving the camera horizontally). Throw in some vertical definition (like, oh say, a white picket fence) and you'll wind up with a headache, because what comes out on video does not look good. It's doubtful that it even looks like a white picket fence.

      Games don't render motion blur (3Dfx was working on this when they went tits up, but nobody has revived the work -- it wasn't well received at the time either). They render individual frames with static content. You CAN tell the difference between 30 fps and 60 fps. You can tell the difference between 60 fps and 120 fps too.
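The difference between blurred film frames and sharp rendered frames can be put in rough numbers. A toy sketch of how far the image jumps per frame during a fast pan (the pan speed and screen width are made-up, illustrative values):

```python
# Without motion blur, each rendered frame is a sharp snapshot, so the
# per-frame jump of the image is what your eye actually sees during a pan.

def pixels_per_frame(pan_speed_px_per_s: float, fps: float) -> float:
    """Displacement between two consecutive frames, in pixels."""
    return pan_speed_px_per_s / fps

# Assume a pan that sweeps a 1024-pixel-wide screen in half a second.
pan_speed = 1024 / 0.5  # 2048 px/s

for fps in (30, 60, 120):
    jump = pixels_per_frame(pan_speed, fps)
    print(f"{fps:3d} fps -> {jump:5.1f} px jump between frames")
```

At 30 fps that picket fence skips roughly 68 pixels between frames with nothing drawn in between, which is why sharp rendered frames need much higher rates than blurred film does.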

      And, of course, this doesn't address the minor issue that what the card is rendering still isn't photorealistic. Or truly 3D. When we get to ~300 fps of photorealistic 3D holograms, then we can start talking about where to go next.

      Hey, go check out the benchmarks for the high end cards on HL2 or people's impressions of Doom3. IIRC, none of the cards were breaking 60 fps in HL2 at 1024x768. And those weren't even in intense firefights.
      • Care to explain how one's graphics card can display more than 60 frames per second when the refresh rate is 60Hz?

        Even if the refresh rate is 100Hz (!!), you're going to top out at 100fps. If the display is updated 100 times per second, you simply can't display more than 100fps.
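The cap described above is just a minimum of two rates; the usual counterargument is that rendering faster than the refresh rate still shortens the time each displayed frame represents. A small sketch of both quantities (vsync behavior simplified):

```python
def displayed_fps(render_fps: float, refresh_hz: float) -> float:
    """With vsync, the monitor can never show more frames than it refreshes."""
    return min(render_fps, refresh_hz)

def frame_time_ms(render_fps: float) -> float:
    """How stale a frame is when shown; higher render rates still shrink
    this even once the refresh rate is saturated."""
    return 1000.0 / render_fps

print(displayed_fps(300.0, 60.0))  # capped at the 60 Hz refresh
print(frame_time_ms(300.0))        # but each frame is only a few ms old
```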

    • When will VGA board makers compete on price, like AMD started to do a few years ago, and not on hundreds of FPS that no one uses (because they're over the human eye's limits)?

      The limit of the human eye does not limit the desire for higher frame rates. Any hardcore gamer can tell you what an extra 30 fps (say... 30-60) can do for their game. Frankly, the higher the frame rate, the more continuous the image looks (and hence, more like real life). I agree that after a certain point the mind does not notice anything,
    • Some others have mentioned that Nvidia et al have lower end $100 cards. Don't expect those prices to get much lower. Do expect onboard video on motherboards to get better though. The market for "cheap" video belongs to the onboard chipsets - they don't necessarily have to be pathetically slow either. There are a few motherboards out with separate memory channels for the video, which brings overall performance reasonably close, all other things being equal. Once it's all added up, it's pretty tough for
  • Fbucks (Score:2, Informative)

    by Anonymous Coward
    That Fbucks chart at the end is fantastic, I hope they make extensive use of it in the future.

    But be careful when you type Fbucks!
  • long in the tooth (Score:5, Insightful)

    by GerbilSocks ( 713781 ) on Tuesday December 30, 2003 @10:18AM (#7834609)
    6 years ago I could get excited about these roundups, but lately it's become a real yawn. Who cares anymore? Fine, give me a -100 flamebait, but although I hardly play games now that I'm in my late teens, my old nVidia GeForce 2MX with 32MB RAM is more than enough for my daily computing. My enthusiasm for 3D video accelerators died about the same time as 3dfx.
    • Not even 20 years old and bitter... You must be a blast at parties, bow tie and all.
  • by Anonymous Coward on Tuesday December 30, 2003 @10:19AM (#7834612)
    go out and buy a Dell with an ATi 9800 Pro in it.

    That's what I did. Buying a full machine from a supplier brought down the price of the LCD screen and the GFX card enough to make it worthwhile. The reason it's a Dell is because they seem to be the only mainstream supplier that gives you a decent choice in the matter. There's no way I'd ever buy a GFX card for 250 or an LCD for 500, but when I can get them included in a PC for 1000, that's too much of a bargain to pass up.

    Generally, I find I can get through a PC every 2-3 years. If I'm buying machines with cutting edge stuff in them, why should I ever need to buy a GFX card upgrade? I'll just wait that extra 6-12 months and upgrade the whole caboodle...
    • We're not just talking about upgrades here. If you have the intelligence to open a door, you'll want to build your own (personal) computer yourself.
      • that's not a given you know.

        I've built my fair share of PCs for Windows, Linux and FreeBSD, and frankly I've gotten bored of the hassle. Yes, you can save money doing so, but I'm not a student anymore. I've got a decent job with a decent salary, I don't need to penny-pinch anymore, and I'm coming to the conclusion that my time is worth more to me than the money I might save by hand-building a machine, especially considering all the hassle you can get.

        yes, all the individual bits have their own warranties but that's litt
        • It depends on your level of dedication. For example, I know full well your issue with the high temps was because of a bad BIOS, because I had the same problem. I'm up there in age but actually enjoy building my own systems by hand. I get a sick thrill out of doing research, finding deals, and putting together exactly what I want. Recently that included:

          Antec Sonata quiet case (incredible deal btw)
          Extra 120mm case fan
          wd 2000jb drive
          black NEC 17 inch monitor
          ti4600 (ebay, 60 bucks)
          Optorite DVD -rw +rw -r +r

          • That looks like a nice system you've set up there, very nice, but I'm afraid I took the easy (but more expensive) way out and bought an Apple, which so far I've been very happy with. But then, I might add, I'm not a dedicated gamer. I used to play a few games, but the only game I've played with any regularity is Angband :)

        • by juhaz ( 110830 ) on Tuesday December 30, 2003 @12:33PM (#7835660) Homepage
          You may have a decent job with a decent salary, but some of us are still students.

          I'd get distressed by the mere thought of spending thousands on a new Dell when I can upgrade a few-years-old system for 400e or so into a relatively modern beast. That's a helluva lot of beer and pizza.

          And some people may actually like the very tinkering and tweaking you've gotten so tired of.
          • oh, I'm not saying that it's not right for others, and when I was a student, I did my fair share of it. it's just that there have been lots of posts recently implying that anyone who is a geek and doesn't put together their own machine is stoopid. I'm just trying to say that there are very valid reasons for even a geek to buy a premade machine.

            I used to like the tweaking, but frankly I've gone off it :)

    • One thing that you should be aware of is that you are NOT (usually) getting the same 9800 Pro that you'd get from ATI.

      Dell has the power to bulk order graphics cards to their own specifications. They can say "leave off this IC" or "use this cheaper (i.e. slower) memory". It is standard practice to do this. They may actually just license the design and have the cards built by their own fabrication contractors, using their modifications to cut costs.

      Either way, it is RARELY the same card. You are frequently limited
    • by Medievalist ( 16032 ) on Tuesday December 30, 2003 @12:01PM (#7835378)
      I agree with you 100%, and hope everyone else does too!

      I get nearly all my hardware from dumpsters and recycling bins, so the faster you upgrade the better my stuff is.

    • Come now, you buy from Dell? A Slashdotter who does not build his own PC is like a Jedi who does not build his own lightsaber.
  • by Rosco P. Coltrane ( 209368 ) on Tuesday December 30, 2003 @10:21AM (#7834624)
    is that normal ones, the cheapo ones with 8MB of RAM and no 3D-XYZ and hyper-acme rendering, that work just dandy for word processing, spreadsheeting and other forms of work (oh the dirty word!), are disappearing.

    Pricewise, that's not a problem in itself; I don't care if I have a super vidboard for dirt cheap and underuse it. But with all those bells and whistles that I won't use, manufacturers don't release their specs anymore, and so I have to install shitty binary drivers instead of using kernel-compiled ones.

    In short, with my old Matrox Millennium, I could do 1600x1200x16 just like I do now, but I didn't have to fight with the nVidia drivers that belch on me each time I change something with libc, modutils or the kernel. And I suppose I could try out 2.6, while with the proprietary driver, I can't.

    I reckon there should be a market for sub-$10 basic video cards with open specs, for those who care more about low-cost, driver support and not having headaches to do real work, than playing games.
    • Then just buy a Radeon 9200 (the fastest ATI which is supported by the open source drivers). Vote with your dollars...
      • Vote with your dollars...

        there is a problem with this. They see you buying the 9200, but they don't know why. They probably think you're buying it because it's cheap, and this gives them cause to bring out a cheaper version of the bigger cards, still with a proprietary driver. They don't know that the reason you're getting it is that it has OS drivers.

        • Except, downloads of the OS drivers will increase with the purchases if people are buying them solely for that reason.

          Marketeers are good at that crap. If the "9200 series" sales increase and the "9200 series drivers" downloads increase in tandem and at a relatively similar rate, they'll put two and two together. Whether they'll care is a whole different story.

    • what happens is that the market for motherboards with built-in video increases.

      eventually nearly all expansion cards will exist as outboard boxes or built-in motherboard components anyway
      • Well, judging by the current standard of onboard graphics chipsets, the graphics card makers needn't worry for a while yet. I just inherited a new P4 2.6GHz machine at work with an Intel 830 (I think) chipset, and it absolutely sucks for 3D (running Linux). It works, just barely, but try turning any options up in any of the GL screensavers and it really starts stuttering. I have an old Athlon 1.4 with a GF2MX that's used as a headless server that can easily double or triple the fps that the Intel chipset can manage.

        • Well, I meant more like 5-10 years in the future. This is what I'm thinking.

          -Your case has a motherboard, which has a processor, RAM, FireWire 400 and 800, USB2, LAN (maybe), and video. Maybe a boot drive as well. The board costs about the same as a new video board does now, dependent on chipset quality.
          -audio interfaces, hard disks, optical storage, and any other peripherals connect by FireWire or USB or some other yet-undeveloped bus
          -all your external thingies are stackable

          As long as you can boot off you
    • The problem is that with the low-end 3D cards so cheap, there really isn't a market for people like you. Don't get me wrong, I actually agree with what you say - there are plenty of machines, from desktops that are never used for 3D stuff to servers, that have no need of anything as remotely powerful as even the cheapest of currently-available cards.

      But the thing is that those cards are so cheap already that the profit on them must be next to nothing. Making cheaper cards probably wouldn't be cost effective
    • I reckon there should be a market for sub-$10 basic video cards with open specs,

      Considering that the company would not only have to make the video card, but also support the video card, market them, distribute them, etc., it's hard to imagine that any market could exist for a peripheral card under 10 bucks - people might only pay 10 bucks for them, but it would cost more than 10 bucks to get them into the hands of the consumer. Name me one other peripheral card which is marketed, new, under 10 dollars.
    • ... which takes care of the misery. While I understand your point about paying for features you don't use, it's YOUR problem that your OS doesn't do what you want. See, it's software and a free world. If you use an OS that sucks in an area you don't want it to suck, perhaps you should move on and install another OS which doesn't have these problems.
    • I'm not sure where you got the idea that nVidia cards need their binary drivers if you only do 2D, because they don't. Never have (or at least not for a loooong time).

      And the same is probably true for ATI, even though I'm not quite as sure, due to a lack of first-hand experience.
  • Is it me or... (Score:4, Insightful)

    by JFMulder ( 59706 ) on Tuesday December 30, 2003 @10:23AM (#7834635)
    ... has video card technology become pretty much uninteresting in the last year or two? I mean, I can remember being in awe when the first GeForce came out, reading everything about what made it great and how it worked. It introduced us to a whole new world of possibilities. Then came the GF2: boring. The GF3 raised my interest for a while with its vertex and fragment shaders, but it dissipated pretty quickly. Then the GF4 and GF5 FX. I don't even look at card comparisons anymore. It's been a while since I've anticipated new video card technology. Am I the only one?
    • Come to think of it, the same thing is true for CPUs too. I should be excited about the different AMD 64-bit offerings, but since it mostly involves more memory and bigger data types, I fail to be excited when I read about them. Itanium was very interesting to read about, but sadly it failed to deliver.

      I'll patiently wait until quantum computing comes out I guess.
    • In addition, I'm noticing a "slowdown" trend with CPUs and video cards. They seem to remain higher-priced for longer.

      I've got a spare mobo and would like to get some pieces and parts to complete another machine. Of course, this means mildly upgrading my current machine, and then moving most of the parts I'm currently using (Athlon XP 2100+, GF4 Ti4200, memory) to the currently unused mobo.

      My problem is that I could buy the same parts I bought when I first built my primary machine, and only pay
    • Re:Is it me or... (Score:4, Interesting)

      by sklib ( 26440 ) on Tuesday December 30, 2003 @11:06AM (#7834958)
      You are the only one.

      Recent advances in video card technology may not be blatantly obvious from the gaming side, although certainly the difference between half-life 2 and half-life will make all of that clear.

      The real changes are from the programming side. Pixel and vertex shaders allow a programmer to use the hardware in unforeseen ways, unlike the fixed-pipeline cards of the past. A lot of graphics programming on the fixed pipeline (GF1) came down to playing with parameters that OpenGL or Direct3D would expose to you -- as in how to look up textures, how to transform your geometry, etc. You say the GF2 came out, and it was "boring". In fact, it was the first generation of slightly programmable video hardware, because it supported hardware bump mapping -- a huge feature used by every modern game, although at the time it was still playing with pre-existing settings.

      Nowadays (since the geforce3), a programmer can invent his own parameters to tweak -- a huge step. You say things "dissipated" after that -- completely untrue! With every new generation of video card, the vertex and fragment programs can be longer and more complicated. The next-generation games (hl2, doom3) already use all of this technology, and next-generation consoles (xb2, gc2, ps3) will undoubtedly integrate all of it.
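To make "a programmer can invent his own parameters" concrete, here is a toy software imitation of a per-pixel (fragment) program: the per-pixel Lambertian lighting that bump mapping builds on. This is illustrative Python with made-up inputs, not real shader code:

```python
# A toy "fragment shader": on a fixed pipeline this math was baked into
# hardware; programmable cards let the game supply the per-pixel function.

def normalize(v):
    mag = sum(x * x for x in v) ** 0.5
    return tuple(x / mag for x in v)

def diffuse_shader(normal, light_dir, albedo):
    """Per-pixel Lambertian lighting: color = albedo * max(N . L, 0)."""
    n = normalize(normal)
    l = normalize(light_dir)
    n_dot_l = max(sum(a * b for a, b in zip(n, l)), 0.0)
    return tuple(c * n_dot_l for c in albedo)

# Run the "shader" for one pixel whose normal came from a bump map.
print(diffuse_shader((0.0, 0.0, 1.0), (0.0, 0.0, 1.0), (1.0, 0.5, 0.25)))
```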

      • Software developed for these cards is really cool, of course; I never meant to say otherwise. What I meant is that hardware-wise, since the GF3, it seems the only thing nVidia and ATI have been doing to the rendering pipeline is making it faster and allowing bigger shader programs. So basically they've upped the clockrate and bolted more memory for the shaders.

        That's why I said I wasn't really interested in hardware advances anymore. It ceased to be revolutionary to a certain extent; it's just faster.
        • Software developed for these cards is really cool, of course; I never meant to say otherwise.

          His post was talking about pixel and vertex shaders, which are enabled thanks to the huge advances in the hardware of the video cards. The old cards simply ran through 3d calculations and texture mapping, the new cards are completely programmable, with instructions, registers, etc. -- you run actual programs on them that are used to produce the amazing new effects you see in DX9 games.

          If you read up on it a bit,
        • So basically they've upped the clockrate and bolted more memory for the shaders.

          And added a lot of new instructions (like conditional statements and looping) and data types that didn't used to be there (like half, single, full, and maybe double-precision floats). There's also some support for 10 or more bits per color channel nowadays.
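The "half" and "single" types mentioned here are 16-bit and 32-bit floats. Python's struct module can round-trip a value through IEEE half precision, which shows what the smaller type costs (a sketch; real shader hardware differs in details):

```python
import struct

def to_half(x: float) -> float:
    """Round-trip a Python float through 16-bit (half precision) storage."""
    return struct.unpack('e', struct.pack('e', x))[0]

print(to_half(1.0))  # powers of two survive exactly
print(to_half(0.1))  # only ~3 decimal digits: comes back slightly off
```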
  • Laptop Video Cards (Score:5, Interesting)

    by AssClown2520 ( 695423 ) on Tuesday December 30, 2003 @10:31AM (#7834685)
    It would be nice to see a review of this extent at least mention a few laptop video cards. Laptop video cards have really progressed, but how do they compare to their desktop counterparts?

    Obviously, the desktop cards are always going to be ahead of the curve considerably, but does the 4200 Go perform similarly to the 4200 cards? For everything I do, this seems to be a pretty solid card, but I always wonder what kind of power I am giving up by going to a laptop-only setup.

  • by MicroBerto ( 91055 ) on Tuesday December 30, 2003 @10:33AM (#7834695)
    The biggest mistake I ever made when building a new computer was buying a $300 graphics card. Unless you game 24 hours a day, don't waste your money.

    Instead, money is best sunk in a good set of speakers and monitor -- these things depreciate way less. Along with that $300 graphics card, I also bought a 19" Sony monitor and Klipsch Promedia v400 speakers with my athlon 550 back in dec 99 (yep, still using it!). While that graphics card has long been in the graveyard, the speakers and monitor are still rockin along.

    My graphics card, however, was a 2nd rate GeForce2 for about 60 dollars that performs excellently for what I do.

    My opinion? Look for a good price gap on graphics cards and processors, and go with something a bit older than the newest. But splurge on the stuff that won't depreciate as quickly.... unless you game 24 hours a day.

    • Or unless you make 3D engines :-)

    • $300 graphics card....don't waste your money.

      It's really all relative, isn't it? There are subjective values to things like video quality which cannot just be measured in terms of dollars, because they differ from person to person. Some people might only game for an hour a week, but when they do so, they really want to have the best quality video they can.

      Who is anyone to say they're "wasting" their money, especially when it is undeniable that the more expensive video cards ARE measurably better in several areas
    • I agree.

      The reason monitors and speakers don't depreciate so much is because the technology can't improve very much.

      While silicon processes and transistor design might allow for a performance doubling every year or two, that simply doesn't apply to most other industries because it is much easier to max out any particular technology and improvements can only be incremental at best in comparison.
  • by Anonymous Coward
    When I want facts on graphics, I go to beyond3d.

    P.S. NV3x architectures can't do everything in 8x1 mode [beyond3d.com]. They have to drop to 4 ops/clock with color operations.
    • That article is way outdated.

      The GeforceFX 5800 sucked. It only had a 128-bit memory bus. etc. We all know this.

      None of that is true anymore. The 5800 isn't for sale anymore. The 5950 has as much memory bandwidth as ATI's fastest card.
  • A bit off-subject... (Score:5, Interesting)

    by evilviper ( 135110 ) on Tuesday December 30, 2003 @10:44AM (#7834757) Journal
    A bit off the subject, but interesting news for sure:

    MPlayer has XvMC support (with MPEG-1/2). That means any video card with an XFree86 driver that supports XvMC can now do MPEG-1/MPEG-2 playback entirely on the card's processor, so no CPU load at all.

    NVidia's binary drivers support it on the GeForce4, and Intel 810/815 cards have open-source X drivers that support it as well. ATI's drivers don't support XvMC just yet, even though the hardware has the capability.
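For the curious, playing a file through XvMC in MPlayer looked something like the line below. Treat the exact flag values as an assumption about builds of that era; `mplayer -vo help` and `-vc help` list what your build actually supports:

```shell
# Use the XvMC video output with the MPEG-1/2 decoder that offloads
# motion compensation to the card (flag names may vary by build):
mplayer -vo xvmc -vc ffmpeg12mc movie.mpg
```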
  • Old chipsets are reviewed too. I took their charts, went to Newegg, and found their #2 for under $100. Very useful if you ask me.
  • Flawed results (Score:5, Interesting)

    by Call Me Black Cloud ( 616282 ) on Tuesday December 30, 2003 @10:49AM (#7834780)
    The GF4 Ti4600 comes out at or near the top of their "Fbucks" rating, which is fps/$. They show a price of $65 for the card, based on what bizrate.com reports. If you go to bizrate.com and look at the Ti4600's available it does appear there are some for $55-65.

    If you dig a little deeper and follow the link for the Jaton 3DForce4 Ti4600 for $54 you'll find all the retailers listed are actually selling the MX440, a lesser card.

    If you follow an $89 link (still a great price) you'll find half.com is offering the PNY Verto GEFORCE4 TI 4600 for that price (according to bizrate). Click the link to half.com and hey! you can get a new one for $319 or a used one for $180. No $89.

    While I respect Tom's Hardware, I think fact-checking is a much larger task in these bulk reviews, and it's something they need to pay a little more attention to.
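The "Fbucks" metric is just frames per second divided by price, which makes it obvious how a wrong price inflates a card's score. A sketch with purely illustrative numbers (not figures from the article):

```python
def fbucks(avg_fps: float, price_usd: float) -> float:
    """Frames per second per dollar: the fps/$ figure of merit."""
    return avg_fps / price_usd

fps = 120.0
listed_price, street_price = 65.0, 180.0  # hypothetical mispriced listing

# The same card looks nearly 3x better per dollar at the bogus price.
print(f"listed: {fbucks(fps, listed_price):.2f} fps/$")
print(f"street: {fbucks(fps, street_price):.2f} fps/$")
```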
  • As I look through the benchmarks, I see the Parhelia consistently in the basement. For a 256bit card, it's a pig. And this raises an interesting question -- do any gamers actually use Matrox anymore?
    • Re:Matrox (Score:3, Informative)

      by jandrese ( 485 ) *
      The Parhelia isn't a gamers card though. It's optimized for professional 3D uses (whatever those are). That's what Matrox claims at least.
  • by TrollBridge ( 550878 ) on Tuesday December 30, 2003 @10:54AM (#7834832) Homepage Journal
    I don't know about you guys, but I thought that at ~$140, the Radeon 9800 scored pretty well for such a reasonably priced card. That was the first non-NVIDIA card I've bought since '99, and believe me it's worth every penny. No need to spend $400 on the PRO models or the latest NVIDIA offering. Ya can't beat the price/performance of the 9800, IMHO!
    • Most likely, you have the Radeon 9800 SE, a grossly underpowered card that marketing named to cause confusion. There is, however, a real Radeon 9800, built only by ATI, which was never priced below $200. The real Radeon 9800 was killed by ATI because it presented too much of a threat to their Pro lines. The Radeon 9800 SE is priced in the $120 - $160 range.
  • The fbucks feature was pretty interesting to me. Being able to see just how much performance you can get for the price for all these cards definitely helps to narrow the field somewhat. These newer cards just don't seem to be worth the money they're asking for if that's all the performance you'll get, not counting the quality cut from not enabling all the latest "features" like FSAA and Anisotropic filtering.
  • by faust13 ( 535994 )
    Um, the Parhelia is no newcomer; it's been on the market for well over a year.
  • by B5_geek ( 638928 ) on Tuesday December 30, 2003 @11:08AM (#7834973)
    #1) New cards are faster than old cards
    #2) Old cards are cheaper than new cards
    #3) Best bang for your buck = older cards

  • ATI 9700 pro (Score:4, Informative)

    by BrookHarty ( 9119 ) on Tuesday December 30, 2003 @11:08AM (#7834975) Homepage Journal
    I upgraded from a GF3 Ti500 [tomshardware.de] (which was almost as fast as the GF4-4200 or ATI 8500) to an ATI 9700 Pro. At the time (2002) it was the king.

    I first tried the Nvidia GF4-4600 for $199, and it didn't even feel faster (took it back). The ATI 9700 Pro, ATI's main comeback into the game, really was impressive. It was worth every penny (39,900 of 'em).

    Anti-aliasing was the new kid on the block, and the ATI 9700 Pro allowed all games at the time (and most now) to run with AA turned on. Tom's benchmarks show the ATI 9700 Pro still to be in the top 10. With video cards not doubling in speed every 6 months anymore (I miss you, 3dfx), I don't expect to see the speeds jump like they used to. This card might just last me another year, and given the last 6 years of gfx card releases, that's amazing.

    The only problem I've seen so far is that Nvidia's Cg code really messes with ATI's textures and shaders, and lots of developers love Nvidia's SDKs. ATI has been good about fixing most bugs with every new Catalyst release, but I'm still waiting for SecondLife [secondlife.com] to get patched (Nvidia Cg bugs). Such a workhorse of an engine (Havok); it should be interesting to see the Havok2 [havok.com] engine used. (Also used in Max Payne 2)

    The benchmark had me wondering: why only a 3.2GHz P4? I'd like to see them also include a high-end AMD, and both mid-range options (2.6GHz P4, AMD 2600+) to round it out. I always wonder how many more FPS a faster CPU will give me, so I can judge if it's worth the cost. BTW, save those pics from Tom's Hardware so you can compare hardware later. I had to search tomshardware.de for the 2002 benchmarks I was looking for.

    Hey, lucky they didn't use a P4EE ;)

    • They didn't use an Athlon 64 in the reviews because some Radeon cards were showing artifacts. The article states that the artifacts may have been due to overheating caused by an incompatible motherboard. I think they were using a crappy case without enough cooling.
  • Fan Noise (Score:3, Insightful)

    by Slider451 ( 514881 ) <slider451@@@hotmail...com> on Tuesday December 30, 2003 @11:33AM (#7835149)
    They really need a column for fan noise. My Gainward FX5600 Ultra Flip Chip has the noisiest fan by far of the eight in my case. And it's a high-pitched whiny noise, the worst kind, because it reminds me of my wife after I've been playing on the PC too much.
  • It boils down to whether you want to play Doom 3 with all the bells and whistles (provided you have the loot for it, natch). If so, you don't need the latest screamer videocard like the 9800 XT, just a reasonably good VPU which can exploit DirectX 9's more advanced features. That's really key: it's got to be DirectX 9 compliant for the new crop of ubergames. I've tried the Radeon 9700 (which is several hundred dollars cheaper than the latest generation) and it killed.
    That way you're not shooting your whole wad
  • So I go out and buy the Radeon 9200 128... and a day later this review comes out. Anyone know Best Buy's restocking policy?


    p.s. Anyone offer a guess as to why the 128 performs no better than the 64?
    • The extra memory allows the card to store more textures on it. The reason it doesn't seem to matter a whole lot is that older games simply don't need that much memory, and newer games really don't seem to take advantage of it very well (my guess is they don't want to break things for the 64MB and 32MB cards out there).

      Then there is the extra overhead of managing twice the memory, which makes some chipsets slower (like the Radeon 8500) in many tests.

      Finally, in many systems the system ram and AGP bus is s
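A back-of-the-envelope calculation shows how quickly texture memory adds up, and why 128 MB only pays off once games actually ship bigger textures. The sizes below are illustrative; a full mipmap chain adds roughly one third on top of the base level:

```python
# Memory used by one square 32-bit texture, optionally with its mipmap chain.

def texture_bytes(size: int, bytes_per_texel: int = 4, mipmaps: bool = True) -> int:
    total = size * size * bytes_per_texel
    while mipmaps and size > 1:
        size //= 2
        total += size * size * bytes_per_texel
    return total

for size in (256, 512, 1024):
    mb = texture_bytes(size) / (1024 * 1024)
    print(f"{size}x{size}: {mb:.2f} MB with mipmaps")
```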
  • It may seem a bit unusual to ask about fast AGP 2x cards, (especially since we're up to 8x now) but I have an older motherboard and that's the most it'll take :-/. It's an Asus K7M [asus.com] -- one of the first Slot A boards -- and it only supports the AGP 1.0 spec (AGP 1x/2x).

    You'd think that AGP would be backwards compatible, but that doesn't appear to be the case. Due to voltage changes as the spec evolved [ertyu.org], my motherboard will only supply 3.3v (as opposed to the 1.5v or even 0.8v of some of the other AGP versions)
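The voltage story above can be summarized as a small compatibility table. The sets below are a simplification (many real cards and boards were "universal" and supported more than one voltage), so treat them as illustrative rather than authoritative:

```python
# Signaling voltages by AGP spec revision, per the parent's description.
AGP_VOLTAGES = {
    "AGP 1.0 (1x/2x)": {3.3},
    "AGP 2.0 (4x)":    {1.5, 3.3},  # many 2.0 parts handled both
    "AGP 3.0 (8x)":    {0.8, 1.5},
}

def compatible(card_spec: str, slot_spec: str) -> bool:
    """A card works in a slot only if they share a signaling voltage."""
    return bool(AGP_VOLTAGES[card_spec] & AGP_VOLTAGES[slot_spec])

print(compatible("AGP 2.0 (4x)", "AGP 1.0 (1x/2x)"))  # both can do 3.3v
print(compatible("AGP 3.0 (8x)", "AGP 1.0 (1x/2x)"))  # no shared voltage
```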

  • Ok, I bought a Sapphire Atlantis ATI Radeon 9800 Pro just a little before Christmas, and it took a massive amount of effort to get everything working. These cards IMHO are excellent assuming you know what you are doing, but definitely not ready for the masses. Simple games like RTCW: Enemy Territory, Call of Duty, Delta Force BHD, and Battlefield 1942 all took an insane amount of tweaking and research to get working perfectly.

    Things to do to reach Nirvana

    1.) catalyst 3.10 driver was the best there is, and I had t
    • I have had similar experience with every ATI card I've ever owned (flaky drivers, overheating, promised patches that never come, etc.).

      In fact, I foolishly tried ATI again when building my nephews a new computer during my visit to Florida this Xmas. I picked up an ATI 9600 because it seemed like a good deal compared to the Nvidia's I saw in the stores around there. Plus, I had read so many /. posts over the last year or so saying how much better the Catalyst drivers were than the old ATI drivers, and
  • by Jagasian ( 129329 ) on Tuesday December 30, 2003 @02:17PM (#7836897)
    Tom's charts list such things as DirectX version support... but it doesn't list Linux support. Anyone want to slap together an addition to Tom's chart that lists Linux support?
  • With cards being as powerful as they are now, are programmers getting sloppy with their coding?

    I just got Halo for the PC. I have a Celeron 2.4GHz machine with an ASUS P4R800-VM (integrated ATI 9100 IGP graphics).

    It sucks. The VPU supposedly keeps crashing while I try to play it, I get about 3fps at 640x480, and it doesn't LOOK (at least as far as I've been able to get) much better than Half-Life, which can run with SOFTWARE rendering on a 166MHz Pentium at 320x240 and ~15fps!

    (For th

"I prefer the blunted cudgels of the followers of the Serpent God." -- Sean Doran the Younger