
ATI Radeon 9800 Pro vs. NVidia GeForce 5900

HardcoreGamer writes "Today ATI shipped its Radeon 9800 Pro 256 MB DDR-2 card in time for E3, and nVidia announced the NV35-based GeForce 5900, which will be available in June. Early tests seem to say that while nVidia edges ahead of ATI in specific areas, overall ATI still has the better card. The caveat is that the next generation of DirectX 9-based games (like Doom 3 and Half-Life 2, demonstrated with ATI at E3) will truly determine which is the better card. Lots of coverage at PC Magazine, PC World, The Register (ATI) (nVidia), ExtremeTech, InternetNews, and Forbes/Reuters. Either way, at $450-$500, serious gamers are about to get another serious dent in their wallets."
  • Minor annoyances (Score:5, Interesting)

    by DetrimentalFiend ( 233753 ) * on Monday May 12, 2003 @09:32PM (#5941489)
    Just a small note, but one that's been bothering me with all of these reviews: not all 'next generation' games are 'dx9.' Though the new cards are dx9, many games (coincidentally, most of the best games) use OpenGL. Unfortunately, it's much easier to incorrectly call Doom3 a dx9 game than to cite the OpenGL extensions (like shaders) that are used.

    (Also, I'll note that Doom3 may technically be a DirectX9 game because its sound and input MAY use it, but in the sense in which people have been talking about dx9 games, it is still incorrect.)
    • by Anonymous Coward on Monday May 12, 2003 @09:37PM (#5941518)
      I'd like to add to this. At $400-500, serious gamers had better get used to eating Ramen noodles.
    • Re:Minor annoyances (Score:5, Informative)

      by Anonymous Coward on Monday May 12, 2003 @09:46PM (#5941569)
      It's really quite simple. A small subset of 3D games are OpenGL games. These hardware accelerators are incidentally designed to adhere to standards defined by DirectX. They simply expose this functionality as part of their OpenGL implementation, either as vendor-specific extensions or otherwise. Doom 3 will make use of features standardized across DirectX 8 and DirectX 9 3D hardware. No one is going to enumerate every possible OpenGL extension the engine can be run with, as there are numerous render paths. They're not incorrect for using DirectX as a benchmark for the functionality the engine will make use of, even if it doesn't use the API. Most 3D engines, though, actually do use DirectX. Source and Unreal both do, for instance. (A minimal extension-query sketch follows at the end of this thread.)
      • You're right, but that will change as Linux's popularity grows: developers will find it easier to use OpenGL, as it's cross-platform and DirectX isn't. SDL and OpenAL will come into play as well. People may say that OpenGL is lagging in progress, but games like DOOM3 make me somewhat skeptical of those people. Long live Carmack.
        • by Trepidity ( 597 ) <[gro.hsikcah] [ta] [todhsals-muiriled]> on Monday May 12, 2003 @10:13PM (#5941710)
          As much as I'd like that to happen, it doesn't seem likely anytime soon. Really, John Carmack single-handedly keeps OpenGL alive; if he didn't have such a strong preference for it, DirectX would have just about all the major games out there and hardware support would be significantly worse.
    • Yes, but unfortunately my S3 Trio64V+ doesn't support either OpenGL or DirectX9, so I guess I won't be playing Doom3. Sigh. And I was so worked up over it!
    • It doesn't matter, because the feature sets of DX9 and OpenGL 2.0 are about 99% overlapping. A card that can implement one can very easily implement the other.
    • Doom3 will not use Direct3D, but it does use other components of DirectX. Of course you don't need a DX9 3D card to drive those, but still, Doom3 WILL use parts of DirectX (unless you are running on Linux).
    • by Taco Cowboy ( 5327 ) on Tuesday May 13, 2003 @05:08AM (#5943166) Journal


      Just a very curious question:

      Does Linux do DirectX?

      If Linux doesn't do DirectX, then ....

      How can we know which one runs better under Linux?
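
A note on the point made in the replies above, that DirectX-generation features show up on the OpenGL side as extensions: this is easy to check directly. Below is a minimal sketch that uses GLUT to create a GL context and asks the driver whether it advertises ARB_vertex_program and ARB_fragment_program, which roughly correspond to DX8- and DX9-class shading. The helper function has_extension() is illustrative, and a GLUT implementation such as freeglut is assumed.

    /* Minimal sketch: report which shader-related OpenGL extensions a driver
     * exposes. Assumes a GLUT implementation (e.g. freeglut) is available to
     * create the required GL context; has_extension() is an illustrative
     * helper, not a standard API. */
    #include <GL/glut.h>
    #include <stdio.h>
    #include <string.h>

    /* Return 1 if `name` appears as a whole token in the GL_EXTENSIONS string. */
    static int has_extension(const char *name)
    {
        const char *all = (const char *)glGetString(GL_EXTENSIONS);
        const char *p = all;
        size_t len = strlen(name);

        while (p != NULL && (p = strstr(p, name)) != NULL) {
            int starts_ok = (p == all) || (p[-1] == ' ');
            int ends_ok = (p[len] == ' ') || (p[len] == '\0');
            if (starts_ok && ends_ok)
                return 1;
            p += len;
        }
        return 0;
    }

    int main(int argc, char **argv)
    {
        glutInit(&argc, argv);
        glutCreateWindow("extension check");  /* a current GL context is required */

        printf("Renderer: %s\n", (const char *)glGetString(GL_RENDERER));
        /* Roughly DX8-class programmability: */
        printf("GL_ARB_vertex_program:   %s\n",
               has_extension("GL_ARB_vertex_program") ? "yes" : "no");
        /* Roughly DX9-class fragment shading: */
        printf("GL_ARB_fragment_program: %s\n",
               has_extension("GL_ARB_fragment_program") ? "yes" : "no");
        return 0;
    }

On a typical Linux setup this builds with something like cc ext_check.c -lglut -lGL, and the extension list it consults is exactly what the comments above mean by a card "exposing" its DX-level features through OpenGL.
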

  • by Anonymous Coward
    ... but will some smart /.er out there create a way for me to pirate hardware?

    That'd be really nice. Thanks!
  • by Anonymous Coward on Monday May 12, 2003 @09:35PM (#5941502)
    My basis is being in a zone with about 20 other people, each with a high-GHz, high-memory machine, and seeing if the card handles the graphics without slowing down the game.
  • by drwhite ( 456200 ) on Monday May 12, 2003 @09:35PM (#5941505)
    "...serious dent in their wallets."

    Don't you mean 'hole'?
    • by Anonymous Coward
      or perhaps 'crater'
    • My 80 column video card [slashdot.org] says that it's top of the line, and it isn't nearly as expensive as this stuff.

      The hole in my wallet seems to be filled. I don't play games, so I don't need more modern hardware. I keep movies on disc, so I don't need a larger hard disk. My Zaurus covers all my portable media needs. About all I have left to get is a SCSI tape drive, but discs cover my backup needs.

      Is there any "must have" Linux hardware?
    • nVidia shot up $5.39 per share to $21.37 on Friday alone when news of the upcoming chip release became mainstream. That's a HUGE increase in share value for any company of that size. It's almost unheard of. I sold all my shares before today. Now I can purchase my new Radeon and still have cash left over from the $2000 profit on only 300 shares bought a week ago for about $4500. =) The stock market kicks ass!
      • by Anonymous Coward
        Yeah, and then at the end of the year your taxes will force you to sell off that Radeon and you'll still end up owing money. Don't count that money as free just yet...
  • Easy choice (Score:2, Funny)

    by Anonymous Coward
    The choice is easy: go for the hardware with the higher number after its name.
  • Please clarify... (Score:3, Interesting)

    by jpt.d ( 444929 ) <.abfall. .at. .rogers.com.> on Monday May 12, 2003 @09:37PM (#5941517)
    Is DetrimentalFiend [slashdot.org] correct when he says that only parts of Doom3 may be dx9? The rest would in fact be OpenGL, correct?
    • Re:Please clarify... (Score:5, Informative)

      by DetrimentalFiend ( 233753 ) * on Monday May 12, 2003 @09:59PM (#5941646)
      Every graphics engine John Carmack has made since Quake 1 has used OpenGL. In his latest .plan update he makes many comments about using OpenGL, though the most obvious is this: "Trying to keep boneheaded-ideas-that-will-haunt-us-for-years out of Direct-X is the primary reason I have been attending the Windows Graphics Summit for the past three years, even though I still code for OpenGL." Another interesting read is his .plan update from when he was first experimenting with OpenGL in Quake. Basically, there are not as many problems with DirectX anymore, but he still uses OpenGL. Personally I like OpenGL better because of its design philosophy and because it's cross-platform. Anyway, some links are below for those interested.

      http://www.bluesnews.com/plans/1/ [bluesnews.com]

      http://www.exaflop.org/docs/d3dogl/d3dogl_jc_plan.html [exaflop.org]
    • Re:Please clarify... (Score:5, Informative)

      by sjelkjd ( 541324 ) on Monday May 12, 2003 @09:59PM (#5941647)
      People call games "DX9 games" because the various DirectX revisions give a rough delineation of the different generations of graphics hardware. Roughly, they are:

      DirectX 6: Software transform and lighting. Most games from this category use lightmaps for lighting, rather than Gouraud (per-vertex) shading.

      DirectX 7: Hardware T&L. All those new T&L-enabled games you heard about belong here. The OpenGL equivalent is calling glTranslate, glRotate, etc. to do transformations, and using glLight to do lighting.

      DirectX 8: Vertex and pixel shaders. Lets you program the vertex transform and lighting part, and to a lesser extent the pixel processing part, of the graphics pipeline. Corresponds to the OpenGL extensions NV_VERTEX_PROGRAM, NV_TEXTURE_SHADER, and NV_REGISTER_COMBINERS (for nvidia; similar extensions exist for ATI).

      DirectX 9: Highly programmable vertex and pixel shaders. The old pixel shader model let you do something like 8 operations max, while the new model greatly extends this number. The OpenGL extensions are ARB_VERTEX_PROGRAM and ARB_FRAGMENT_PROGRAM.

      This is really only a brief overview; there are many, many more OpenGL extensions (which you can see here [sgi.com]), some of which have no DirectX counterparts. It's easier to tell non-graphics programmers "It's a DX9 game" than "Oh, it uses OpenGL 1.4, ARB_VERTEX_PROGRAM, ARB_FRAGMENT_PROGRAM, etc.", especially since DirectX is a well-known name. People generally aren't as aware of the various revisions of OpenGL (which are mainly exposed through extensions).
      Doom 3 uses OpenGL for its graphics. In fact, the basic tech required is really DirectX 8 level (bump mapping and stencil buffer), but it looks better on DirectX 9 hardware (due to the higher programmability). It likely uses other DirectX APIs for sound, networking, etc. on Windows. (A short fixed-function example of the "DirectX 7 level" path follows at the end of this thread.)
    • I guess my above post may not have exactly answered your question. DirectX is a suite of media components including networking, sound, input, and graphics. Most developers use it for sound and input. In fact, many libraries, like SDL, are simply a layer over DirectX on Windows. I don't know of anyone who uses the network component (DirectPlay), but many people do choose to use Direct3D (the graphics component). Quake 2 and 3 used DirectX for input and sound, if I remember correctly, and used OpenGL for graphics
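
The generational table in the thread above maps "DirectX 7 level" onto OpenGL's fixed-function calls (glTranslate, glRotate, glLight). For reference, here is a minimal sketch of that fixed-function path, assuming a GLUT implementation such as freeglut; the scene and values are illustrative. On DX8/DX9-class hardware, this transform-and-lighting stage is exactly what ARB_vertex_program and ARB_fragment_program shaders replace.

    /* Minimal sketch of the fixed-function ("DX7-level") path described above:
     * transforms via glTranslatef/glRotatef, lighting via glLightfv.
     * Assumes GLUT (e.g. freeglut); geometry and values are illustrative. */
    #include <GL/glut.h>
    #include <GL/glu.h>

    static void display(void)
    {
        static const GLfloat light_pos[] = { 1.0f, 1.0f, 2.0f, 0.0f };

        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

        glMatrixMode(GL_MODELVIEW);
        glLoadIdentity();
        glLightfv(GL_LIGHT0, GL_POSITION, light_pos);  /* fixed-function lighting */

        glTranslatef(0.0f, 0.0f, -3.0f);               /* fixed-function transform */
        glRotatef(30.0f, 0.0f, 1.0f, 0.0f);
        glutSolidTeapot(0.8);                          /* any lit geometry */
        glutSwapBuffers();
    }

    int main(int argc, char **argv)
    {
        glutInit(&argc, argv);
        glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
        glutCreateWindow("fixed-function T&L");

        glEnable(GL_DEPTH_TEST);
        glEnable(GL_LIGHTING);
        glEnable(GL_LIGHT0);

        glMatrixMode(GL_PROJECTION);
        gluPerspective(45.0, 1.0, 0.1, 10.0);

        /* On DX8/DX9-class hardware the transform-and-lighting stage above can
         * instead be replaced by ARB_vertex_program / ARB_fragment_program. */
        glutDisplayFunc(display);
        glutMainLoop();
        return 0;
    }
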
  • by ejaw5 ( 570071 ) on Monday May 12, 2003 @09:37PM (#5941521)
    when a new video card has more memory than what you have in system memory
    • by geeber ( 520231 ) on Monday May 12, 2003 @09:41PM (#5941539)
      When a new video card costs more than your entire system is worth.
        • Funnily enough, I just built a pretty decent system for significantly less than the cost of either of these cards. It's an AMD Athlon XP 2100+-based system using an SIS746 mobo, 512MB of DDR333 RAM, a CDRW, a 400W server-class case, a low-end graphics card, and a pair of 20GB HDDs I had lying around using software RAID1. Total cost: $350, around $100 less than just these cards. Sure it can't play the latest games because it has no 3D accelerator, but that could be remedied for around $120 with a GeForce 4 Ti4200 128MB
  • by Anonymous Coward on Monday May 12, 2003 @09:39PM (#5941526)
    256 MB RAM???

    My first freakin' PC had a 20 MB HD.

  • by Anonymous Coward on Monday May 12, 2003 @09:39PM (#5941531)
    I'm afraid until ATI starts producing better Linux drivers, I'll have to stick with nVidia's cards for the time being. nVidia has really gotten their act in gear as of late and their latest drivers work great for me under Linux. I see on ATI's website that their drivers don't even support XFree 4.3 yet. Weeeeakk! :)
    • /agree I love the linux drivers for Nvidia.
    • by jimbobborg ( 128330 ) on Monday May 12, 2003 @09:48PM (#5941576)
      ATI's drivers were given to the X crew; they didn't commit them. Check out their archives for more info.
    • Absolutely right...

      I really don't care about Nvidia's drivers not being open-source as long as they promptly release the official version of their drivers for all the major Linux distributions. Ease of installation matters, and full points to Nvidia for understanding that.

      • I have to agree. As long as they're not shit, don't complain. I'm not an Open Source fanatic. Sure, I use an Open Source OS. Doesn't mean everything has to be open source. I'm sure most of you can agree: which one would you choose, the one that works, or the one that works well? Though I don't know why they can't release the specs to the card so Open Source drivers can be made.
      • It would be nice to not have to recompile Nvidia's modules every time you changed your kernel, though.
        Or to not have a downloadable one with some wacked-out latest beta kernel.

        Small complaint, I know....

        Later!
        -b
    • by MarcoAtWork ( 28889 ) on Monday May 12, 2003 @09:50PM (#5941588)
      Definitely. I don't care if -any- ATI card has a 2%-5%-10% performance advantage; having absolutely great drivers from NVidia (for Linux & Windows) far outweighs any small performance gains the ATI card might supposedly have.

      If the situation is like this (where the cards are pretty much neck & neck) the balance swings even farther towards buying NVidia. The only NVidia card I'd have never ever considered buying would have been the dustbuster...

      Given that I'm running an (ancient) dual p3-450 bought 3 years ago, I guess this Fall it might be time to upgrade :)
    • ATI has never wanted to trouble themselves with drivers. Historically they have abandoned hardware as quickly as they thought they could get away with. I got bit by this back with the introduction of the "new" Windows driver model. A card less than two months old was "unsupported". I made the mistake of buying an ATI PCI TV Wonder while experimenting with HTPC setups. Fortunately that one is still quite useful in Linux. ATI dropped Windows support for it over a year ago, shortly after I purchased one NEW. The ATI Windows apps still don't work right. Every time they invoke the Windows scheduler to set up a scheduled show, they GPF.

      I will never forget or forgive that blatant attempt to obsolete brand new hardware. The fact that they can't be bothered to stay current with Xfree doesn't help their case in my eyes.

      The only windows box I have left is the one that I play most of my games on. Every machine I own runs only NVidia hardware. The fact that NVidia's drivers support every piece of hardware they've made back to the original GeForce (and I think the Riva) makes me much more comfortable in investing in hardware from them.

    • by Anonymous Coward
      dude. i won't even look at the responses, because you are probably getting flamed.

      my response: forgetting about 3d (both will acceptably play ut2003, sof, quake, etc...both will do good opengl)

      the real problem? 2d.

      take anyone new to linux (but with xp or os-x experience) and put them on a gnome/kde desktop. their first experience just clicking around will be vastly different.

      1. xfree86 nv or radeon driver...the interface feels "laggy", and not quite as snappy as your typical os-x, 2k, xp desktop (all hardwa
    • Re:Yes, but... (Score:4, Informative)

      by Sesse ( 5616 ) * <(sgunderson) (at) (bigfoot.com)> on Tuesday May 13, 2003 @07:35AM (#5943509) Homepage

      Try this link [schneider-digital.de] -- they have later drivers, and they work quite well for me (though nVidia's offerings still are a lot more stable).

      /* Steinar */

  • decisions (Score:5, Funny)

    by DanThe1Man ( 46872 ) on Monday May 12, 2003 @09:42PM (#5941546)
    Hmm, spend $500 for a video card or eat this month. Video card or food, video card or food. Hmm...
  • Some better reviews (Score:4, Informative)

    by sjelkjd ( 541324 ) on Monday May 12, 2003 @09:43PM (#5941551)
    Anandtech and Tom's Hardware [tomshardware.com] are more reputable sites than the ones the story poster mentioned. They also perform more comprehensive benchmarks, including Doom 3 and Unreal 2, at multiple resolutions, with and without anisotropic filtering. The other reviews just seem shallow by comparison.
  • So?! (Score:5, Funny)

    by Ignorant Aardvark ( 632408 ) * <cydeweys@noSpAm.gmail.com> on Monday May 12, 2003 @09:43PM (#5941552) Homepage Journal
    I'm not impressed with the Radeon 9800 Pro. What I really want is the Radeon 9500 ASC [bbspot.com]. The price is steadily coming down. Mmmmm, I can't wait to play Nethack in full 3D :-)
  • Don't forget... (Score:4, Informative)

    by Anonymous Coward on Monday May 12, 2003 @09:46PM (#5941563)
    This is using unoptimized nVidia drivers on a pre-release card. I saw benchmarks that were pulled due to NDA that showed that with the Detonator 50.xx drivers, the NV35 chip performs SO much better than with the current ones. I say wait before judging the performance of the NV35.
    • Actually, no. (Score:3, Interesting)

      by voxel ( 70407 )
      Actually, a lot of times the "beta" hardware with the "beta" drivers runs FASTER than the final product.

      Hardware: The problem is that the "beta" hardware is carefully crafted and selected so that it lies in a very high yield of the manufacturing build. Later on, when mass production starts, you have to clock things down and tone things down in general so you get a nice output yield. Otherwise you will run into the problem Nvidia already did with the 5800 Ultra: they tried to make the cards run like the
  • by DataShark ( 25965 ) on Monday May 12, 2003 @09:46PM (#5941564) Homepage
    closed source or not, the fact is that the NVIDIA drivers on Linux are as good as or better than their win* counterparts ...

    ATI is starting to try, but has anyone tried ATI's drivers and compared them, feature-wise, performance-wise, and stability-wise, with the NVidia ones?

    so unless /. starts covering HW 99% focused on MS platforms, the duel is a non-issue :-) Nvidia wins by K.O. under Linux, and even under BSD :-) ...

    • ATI's drivers are by all accounts fine, as long as you've got a single screen. Dual head - that's their Achilles heel. Xinerama disables all 3D extensions on the ATI driver, and AFAIK they have nothing like NVidia's TwinView. Not sure about the GATOS project drivers either, since I dumped my Radeon 9000PRO for my current GeForce Ti 4200 in order to run 2 screens. (UT2K3 demo runs sweet in a window on the second screen BTW)

      Hopefully they'll get their act together. Competition is nice.

      Soko
    • But since I don't play games under Linux, the question's moot. In fact, I have two Redhat machines here and one *BSD machine, none of which even have X installed.

      Samba, dhcpd, apache, squid, and the rest don't run any better with X installed, so why bother?

      I also don't game much under Windows, so I'm asking out of ignorance rather than malice whether there are enough recent Linux games to justify the hassle. Is Linux a reasonable alternative gaming platform to Windows?
    • by Trepidity ( 597 ) <[gro.hsikcah] [ta] [todhsals-muiriled]> on Monday May 12, 2003 @10:15PM (#5941723)
      Unless Linux suddenly gets a bunch of new latest-generation games, the issue of Linux drivers is a non-issue. 99% of gamers use Windows to play games, even those who use Linux for everything else (hell, CmdrTaco even reboots to Windows to play games).
      • You know, some of us *DO* use OpenGL for things other than games. (Hint: it was originally designed by SGI for use in engineering apps)
      • Yeah, most games are for Windows, so what? The parent said that under Linux Nvidia kicks ATI's ass. This of course is true and has been for a while now. For people considering a video card for Linux this is a fairly important piece of information.

        So obviously for those of us who do game under Linux, drivers ARE an issue. So what was your point besides trolling?
    • closed source or not, the fact is that the NVIDIA drivers on Linux are as good as or better than their win* counterparts

      It's a unified driver. Has been for a LONG time. Obviously the kernel hooks etc. are different for Windows versus Linux, but the rest of the code is all the same. Claiming the "Linux drivers are better" is clueless Linux zealotry (sp?)

  • Nice to see (Score:4, Funny)

    by bobbozzo ( 622815 ) on Monday May 12, 2003 @09:46PM (#5941566)
    Nice to see they got rid of the leaf blower that was on the 5800.
  • A Question (Score:2, Interesting)

    by nate nice ( 672391 )
    I know a Z-buffer demands that you double the memory used, so I was wondering if anyone knows whether that doubles the video memory or if there is a special memory unit for hidden surface removal that the z-buffer makes use of. In this case, it would mean that you actually have 128MB of video memory and a 128MB z-buffer. Anyone know?

    • The Z-buffer doesn't have to have the same depth as the frame buffer. Depending on the application, 8 bits per pixel in the Z-buffer is sufficient (as opposed to 24 bits per pixel in the true color frame buffer). To be honest, I'm not exactly sure what depth the Z-buffers on commodity cards commonly have. Anyone know?
      • An 8 bit z-buffer? If you're that sure your objects will never touch, just draw them from back to front. If they do touch, you'll get a big jaggy ol' line along the intersection.

        I had a card that allowed you to select a 16 vs 32 bit z buffer in the setup panel, and even then it did make a difference on some (poorly implemented?) games.

    • Re:A Question (Score:3, Informative)

      by Magila ( 138485 )
      Every consumer graphics card for several generations has had a unified memory architecture. Everything from the z-buffer to shader programs gets thrown in the same heap until all the onboard memory is occupied and things start being swapped to main system memory (a situation to be avoided). And the z-buffer doesn't double memory usage; it uses the same amount as the primary framebuffer (well, not necessarily, but nowadays that's usually the case). (A back-of-the-envelope sketch of the numbers follows at the end of this thread.)
    • Modern consumer-level cards do not have a separate Z-buffer memory area. The z-buffer depth for the ATI 9700 Pro is 24 bits; for the Nvidia GeForce FX line it is 32 bits. But z-buffer memory is not half of all memory used; due to Z culling and hidden surface removal it is probably a very small % of the total RAM used. The majority in most DX8-level games is probably texture space, and in DX9-level games it is probably split between texture and scratch space (need to store the results of the shader and pixel programs
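
To put rough numbers on the z-buffer discussion above: the depth/stencil buffer is sized per pixel just like the color buffer, so at a given resolution it is nowhere near half of a 128 MB card's memory. The sketch below assumes 32-bit color and a packed 24-bit depth / 8-bit stencil format, which are typical for cards of this era rather than quoted specs.

    /* Back-of-the-envelope buffer sizes at 1600x1200, assuming 32-bit color and
     * a packed 24-bit depth + 8-bit stencil buffer (typical, not vendor specs). */
    #include <stdio.h>

    int main(void)
    {
        const long pixels = 1600L * 1200L;
        const double mb = 1024.0 * 1024.0;

        double color = pixels * 4 / mb;     /* one 32-bit color buffer      */
        double depth = pixels * 4 / mb;     /* 24-bit depth + 8-bit stencil */
        double total = 2 * color + depth;   /* front + back + depth/stencil */

        printf("color buffer : %4.1f MB each\n", color);
        printf("depth/stencil: %4.1f MB\n", depth);
        printf("total        : %4.1f MB of a 128 MB card\n", total);
        return 0;
    }

The roughly 22 MB this prints leaves the bulk of a 128 MB card for textures (and, on DX9-class parts, shader scratch surfaces), which is the point the last reply makes.
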
  • by Anonymous Coward
    I just need a graphics adapter - not a hot and noisy nuclear power source.
  • by YetAnotherName ( 168064 ) on Monday May 12, 2003 @09:54PM (#5941611) Homepage
    Whenever I've given in to hype, my wallet's regretted it. But buying the current way-cool game a year and a half or more later almost always guarantees it'll run just fine on my current hardware.

    There's all the free walkthroughs, hints, and cheat codes on the web by then, too.
  • by Anonymous Coward on Monday May 12, 2003 @09:55PM (#5941622)
    I bought a 9700 All-in-Wonder and it produces 'waves' at any resolution under 85 Hz. Judging from Google Groups searches, this seems to be a common problem with the 9700. Is this common with all ATI cards?
    • Yes, I have observed this with my 9700 AIW. Check Rage3D [rage3d.com] and search through their forums. I didn't have any problems with mine until I got a new hard drive, reinstalled XP and decided to use Service Pack 1. All the common problems, waves, some games crashing, TV stuttering...

      A small proggie from Rage3D fixed some problems but I think I am going to have to reinstall.

      Card works fine in Redhat 9 btw, and is otherwise stable.
  • Canopus (Score:3, Interesting)

    by zoid.com ( 311775 ) on Monday May 12, 2003 @10:01PM (#5941654) Homepage Journal
    I bought my first 3D card from Canopus because it had 6 meg. It was the absolute best 3DFX card available. It cost around $250 at the time. It was a sweet card, but within 6 months a better and cheaper card came out and I decided I would never buy the latest and greatest card again. My rule of thumb is to stay 2 generations behind the best and you will have a card that can play any game out there. This may change as soon as a DX9 game comes out, but I really can't see a game company "requiring" anything greater than a DX7 card, or they will really limit their audience....
    • Re:Canopus (Score:5, Funny)

      by guacamolefoo ( 577448 ) on Monday May 12, 2003 @10:25PM (#5941774) Homepage Journal
      I bought my first 3D card from Canopus because it had 6 meg. It was the absolute best 3DFX card available. It cost around $250 at the time. It was a sweet card, but within 6 months a better and cheaper card came out and I decided I would never buy the latest and greatest card again. My rule of thumb is to stay 2 generations behind the best and you will have a card that can play any game out there. This may change as soon as a DX9 game comes out, but I really can't see a game company "requiring" anything greater than a DX7 card, or they will really limit their audience....

      This is the cardinal rule of technology -- buy the newest and the best, only do it 12 to 18 months later. Works for lots of things -- Games, computers, HDTV, processors, cell phones, OSes, PDAs, and video cards. Heck, even cars.

      Let some other schmuck take the depreciation. Take your cue from me, and you can't go wrong. As soon as the prices come down on those swanky new 286s, I can finally get rid of my PCjr.

      GF.
    • I agree with you here. I ordered a new computer with a GeForce 4 Ti 4600 card. While this is high end, it is not cutting edge. Often, people get caught up in all the benchmarking (I know that it's easy for me to do as well) and think that their system will suck if it doesn't perform the best. What you have to realize here is that these benchmarks are often in the 100s of fps range (with the differences between cards being a handful of fps). To me, I'd rather save some $$ and have 40-60 fps. I think that a
      • I've heard that you're fine over 30 fps and won't notice a difference over 45, even with 2 machines to compare between. I think the reason they test like that is that no games need all that grafx power, so they can't use reasonable tests. This actually shows you don't need a new card, but reviews spin it so you want a new card from their advertisers.
  • This is obviously the mid-range chip of the 35 series. The article (the "ATI is better" one) states it can go higher. Why do you think it's lower? So they can jack it up and call it Ultra. This is like the BMW M5 of the bunch; the Ultra is like your Ferrari Barchetta (yes, you know what I'm talking about, you Rush fans out there).
  • by acomj ( 20611 ) on Monday May 12, 2003 @10:11PM (#5941696) Homepage
    Why would anyone spend $400-500 on a video card, unless you really NEED to be cutting edge for the next 6 months or so before the next batch comes out and the price of these cards becomes more reasonable?

    I'm not a hard core gamer. I have a Radeon something or other I got with my current machine (a PowerMac G4). It plays Wolfenstein and Quake 3 great at 1024x768 with lots of eye candy on. I think a lot of people get way too caught up in frame rates and technical specs..

    • These reviews are like teaser trailers for LOTR or Reloaded that came out six months before the movie. Except if you pay $100 a ticket you can see it early. Most people will wait, but it's still interesting to see what we're waiting for.
    • Why would anyone spend $400-500 on a video card, unless you really NEED to be cutting edge for the next 6 months or so before the next batch comes out and the price of these cards becomes more reasonable?

      You're assuming that everyone uses these cards to game on. Certainly there are lots of people and even industries who absolutely need to be on the cutting edge. One example would be animators who work for special effects companies like Industrial Light & Magic or Weta. Time is money to these companies
  • from reading timothy's "article" that had more hyperlinked text than actual text! My graphics card can't render all that.
  • Doom 3 benchs (Score:4, Informative)

    by jwdeff ( 629221 ) on Monday May 12, 2003 @10:13PM (#5941711) Homepage
    Anandtech [anandtech.com] and Tom's Hardware [tomshardware.com] have much better hardware reviews than the ZD reviews linked in the story. They also have Doom 3 bench [tomshardware.com] marks [anandtech.com], which put the new NVidia card significantly ahead of its ATI counterpart.
  • by swordgeek ( 112599 ) on Monday May 12, 2003 @10:16PM (#5941728) Journal
    I have had a handful of video cards since my original Trident 8900. Pretty much every time, I plug the card in, boot to VGA resolution, install the drivers, and reboot, and everything is done.

    I just got an ATI 9500 Pro--my first ATI card. The driver installation was a five-hour nightmare of crashing Windows, exception errors, hangs, and black screens. When I was done, I couldn't set the refresh rate. Nothing I did (including installing the latest drivers, and trying to use the 'secret' max refresh setting in the ATI display controls--it wasn't there at all) could get me off of 60 Hz.

    Games crashed. Windows hung. Horridness. I talked to the manufacturer, and they said it was a bad card--get an RMA, and ship it back. This I can believe.

    The problem is, I can no longer set the refresh rate on my OLD video card anymore! These damned drivers screwed up my system substantially! Removing them didn't help at all. I'm going to have to dig into the registry most likely.

    If the replacement ATI card doesn't work any better (hardware AND software), then I'll be going back to nVidia permanently, or at least for another two generations. At least their stuff works.
    • Let me add my own tale of woe. I got a Radeon 8500 card recently (last month). Installed it, installed the drivers from the CD, found I could not play Metal Gear Solid 2 anymore (nice when you are halfway through).

      Downloaded latest drivers from ATI. Uninstalled previous drivers, as required. Installed new drivers - setup crashes during installation. Repeat, same result.

      Filed a bug report with ATI. First they want to know everything about you, then they give you a google-eye view of their problem database

  • by MoeMoe ( 659154 ) on Monday May 12, 2003 @10:22PM (#5941751)
    What it comes down to isn't which one is more powerful, but which one can become the most powerful... In this case I would take nVidia since a few registry mods will open up an overclocking menu in the video properties...

    That's just one of the many "secrets" I know, let me tell you about Area 51, if you really want to fi-

    Just a sec, someone's at the door...
  • Is it really worth paying an extra 200% for an improvement in performance (over, say, a GF2 or 3) that will amount to maybe 50%?

    The smart thing to do is to find reliable benchmarks on the graphics cards for a taxing game (e.g., Quake 3 Arena at 1600x1200 with all the goodies). Then divide the price of the card by the average benchmark score. The one with the best price/performance ratio is the best card to buy; on all the others, you're getting fucked over. (A quick sketch of this arithmetic follows at the end of this thread.)
    • 50% is generous. Until the next-gen games come out you won't see an advantage. My GF3 can push 1600x1200x32 in anything I've thrown at it. My monitor does 1280x1024. So except for FPS (which doesn't matter past a point, and my GF3 easily does that, at least for what I can test it with) there is no appreciable difference. Now you won't be able to play Doom3 on your older card at full everything, but I bet it'll still be playable.
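
As a companion to the price/performance suggestion a few comments up, here is a tiny sketch of the dollars-per-average-frame calculation. The card names, prices, and fps figures are made-up placeholders, not benchmark results.

    /* Sketch of the price/performance calculation suggested above: price divided
     * by average benchmark fps. All numbers below are made-up placeholders. */
    #include <stdio.h>

    int main(void)
    {
        struct card { const char *name; double price; double avg_fps; };
        const struct card cards[] = {
            { "hypothetical high-end card", 499.0, 180.0 },
            { "hypothetical budget card",   129.0,  95.0 },
        };
        int i;

        for (i = 0; i < 2; i++)
            printf("%-26s $%6.2f / %5.1f fps = $%.2f per average frame\n",
                   cards[i].name, cards[i].price, cards[i].avg_fps,
                   cards[i].price / cards[i].avg_fps);
        return 0;
    }
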
  • I was using a TNT2 video card for the longest time until I found a Geforce2 card that someone abandoned. Do a lot of people actually keep up with all the new video card technology by buying every new product release?
  • by chameleon_skin ( 672881 ) on Monday May 12, 2003 @11:12PM (#5942000)
    ...because I love computer games, but I haven't owned a cutting-edge video card in about five years, and if anything my gaming experience has *improved.* Why? Because ninety percent of the time games that are written to use the features of brand-spankin'-new video cards are so intent on milking the most out of the card's technology that they fail to concentrate on the most important aspect - gameplay. If a game is actually innovative, challenging, and involving, then it's still going to be enjoyable two years from now despite the fact that its graphics aren't quite up to par with the latest offerings. Because I've got a wimpy 10Mb video card, all of the games I can play on my machine are a year or two old. Sure, this means that I miss out on a lot of the online gaming experience - a lot of the multiplayer servers for a game are dead by the time I get around to playing it. But if those servers have disappeared inside of eighteen months, then how good was the game in the first place? Half-Life is pushing five years now, and there are still tons of places to play it. $450 for a freakin' video card? Sheesh. Give me a break. I'll wait until they're $100, by which time all the mediocre games will have disappeared into a much-deserved oblivion while I'll just be ready to tackle the top ten of the bunch. Sure, a year and a half is like an eon in computer gaming, but the ones that last the eons are the best anyway. Chess, anyone?
    • Give me a break. I'll wait until they're $100, by which time all the mediocre games will have disappeared into a much- deserved oblivion while I'll just be ready to tackle the top ten of the bunch.

      Some games have a lot of replay value - but other games, I find, are to be played once, enjoyed, then shelved, particularly single-player games. Not because they're bad, but because playing something where I already know the plot twists and the solutions to puzzles and riddles is boring. Not to ment
  • by VoidEngineer ( 633446 ) on Monday May 12, 2003 @11:28PM (#5942070)
    Yeah, but does either of them have 3 Gbit DDR SDRAM for 360 degree autostereoscopic 3D viewing? I think not...

    I quote the Resolution / Color / Performance / Memory specifications [actuality-systems.com] of the Perspecta 3D [actuality-systems.com], which is available from Actuality Systems [actuality-systems.com].

    - Volume comprised of 198 2-D slices (1.1 slices / degree)
    - Approximately 768 x 768 pixel slice resolution
    - 24 Hz volume refresh
    - Full color (21-bit hardware-based stippling)
    - 8 colors at highest resolution
    - Polygons / sec.: To be announced
    - Dual volume buffers
    - TI(TM) 1600 MIPS DSP high-performance embedded processor
    - 3 Gbit DDR SDRAM (100 Mvoxels x 3 colors x 2 buffers)

    Granted, there are only 8 colors available at high resolution, but it points out the fact that 3D graphics cards and monitors have a long way to go yet. I don't mean to be a troll, but I get rather pissed off when these video card manufacturers, with their planned obsolescence, talk about their latest-and-greatest "3D" video cards. Please; these are pseudo-3D video cards, and if you've worked with a stereoscopic video system (virtual reality system) or an autostereoscopic video system (3D television system), you'll know what I mean...

    (Granted, I only got to work with this kind of technology for a couple of months in college, so I'm not an expert on this stuff... still, I know stereo3D from pseudo3D when I see it...)
  • Hacked 9500's (Score:3, Informative)

    by Myriad ( 89793 ) <myriad@the[ ]d.com ['bso' in gap]> on Tuesday May 13, 2003 @12:12AM (#5942243) Homepage
    As most of you are probably aware, it is possible to mod a 9500 (non-pro) into a 9700 card (great guide about it here [designtechnica.com]). You can do either hard or soft mods.

    I've successfully used the software only method and doubled the performance of my 9500.

    Now there is an upgrade to 9800 (grab it here [maxdownloads.com]). Well, not entirely a true 9800, as there are hardware differences between them. But I am able to run the ATI 9800 Gargoyle and Caves tech demos [ati.com] perfectly on a non-overclocked card! Haven't gotten the Chimp Demo working yet, but maybe if I OC. Maybe not. It also runs all the 9700 tech demos perfectly.

    Still, damned good performance for a card that costs waaaaaaaaaaaaaaaaaaaaaay less. I'd recommend the Sapphire 9500 non-pro if anyone is interested.

    Funny timing on the article, I was just screwing with it this evening...

    Blockwars [blockwars.com]: a real-time, multiplayer game similar to Tetris.

  • by AliasMoze ( 623272 ) on Tuesday May 13, 2003 @01:24AM (#5942566)
    I see the Radeon 9800 on the shelf. I see the GeForce 5900 on the shelf. They're comparable in speed. Each supports next generation games. But I think the biggest feature, the thing that makes the choice for me, is the size of the box. That's what determines which one I steal.
  • Hmmm... (Score:3, Insightful)

    by klui ( 457783 ) on Tuesday May 13, 2003 @03:55AM (#5942952)
    If both cards perform about the same but the nVidia card takes up an extra slot, my vote would go to ATi. I get the sense ATi and nVidia will just continue to one-up each other and produce products at a furious pace. Will they get enough revenue to continue with their new product releases? $400, $500... $600... where will it end? Sure, they can push the state of the art, but if fewer people can justify buying these expensive parts, does it matter whose product is better?

"May your future be limited only by your dreams." -- Christa McAuliffe

Working...