ATI vs. NVIDIA: The Next Generation

doppler writes: "There's a killer graphics card round-up at TR today that compares the new GeForce4 and Radeon 8500 128MB cards against each other in extensive testing. Very good stuff. Most interesting: a visual representation of a texture upload problem in OpenGL on the Radeon 8500 chip."
  • Sweet (Score:1, Troll)

    by k_d3 ( 559373 )
    It sure as hell beats my motherboard graphics... Now if only I could get some cash...
    • I'm with you, but this time around I HAVE the cash, so I might finally upgrade. (Hell, for only $75, how could I NOT have the cash?) My jaw just about hit the floor (I would quote the article, but see below).

      On a completely unrelated note, or maybe not so unrelated after all, I can no longer read the article! Perhaps my internet connection is screwy, or perhaps The Tech Report was slashdotted in less than 10 minutes. That would be somewhere deliciously between really amazing and really scary. Well, /. DID take out Apple's servers, so I suppose anything is possible...

      --Anyone downing on .sigs just can't think of a good one
  • by splume ( 560873 )
    That has more memory than my web server running FreeBSD (64MB)! Sheesh.

    • Raise your hand if you remember a time when one company would make fun of the other for adding more and more memory because "You'll never need 32MB of video memory!"

      • /me raises hand
      • Re:128! Wowzers (Score:4, Interesting)

        by Com2Kid ( 142006 ) <com2kidSPAMLESS@gmail.com> on Thursday April 04, 2002 @06:41PM (#3287442) Homepage Journal
        Note that having over 32MB of memory has proven to be of NO benefit in benchmarks outside of the occasional 1 or 2 FPS difference (and when you are getting over 100FPS anyway. . . .).

        Texture size is REALLY not a problem. Do you realize how fr*gin big textures can be byte wise before you get to being just plain old silly?

        It is NOT the size of the textures, people; it is how COMPLICATED those textures can be.

        Currently LOD is used to keep video cards from having to render full 256x256 textures when an object, say, only appears as 25 pixels in its entirety on the screen. You know, that sniper across the street with that gun? Yeah, that one (duck).
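
        As a rough illustration of the LOD idea above (a sketch of my own, not from the original post; the helper name is hypothetical and the 256/25 figures are just the example numbers used here), picking a mip level so that one texel covers roughly one screen pixel could look like this:

```cpp
// A minimal sketch of level-of-detail (mipmap) selection: use a smaller
// prefiltered copy of a texture when the object covers only a few pixels.
// chooseMipLevel() is a hypothetical helper written for this illustration.
#include <algorithm>
#include <cmath>
#include <cstdio>

// Pick the mip level at which one texel maps to roughly one screen pixel.
// baseSize: width of the full texture (e.g. 256); screenPixels: how wide
// the object appears on screen.
int chooseMipLevel(int baseSize, int screenPixels) {
    if (screenPixels < 1) screenPixels = 1;
    double texelsPerPixel = static_cast<double>(baseSize) / screenPixels;
    int level = static_cast<int>(std::floor(std::log2(std::max(texelsPerPixel, 1.0))));
    int maxLevel = static_cast<int>(std::log2(static_cast<double>(baseSize))); // down to 1x1
    return std::min(level, maxLevel);
}

int main() {
    // A 256x256 texture on that 25-pixel-wide sniper across the street:
    int level = chooseMipLevel(256, 25);
    std::printf("use mip level %d (%dx%d texels)\n",
                level, 256 >> level, 256 >> level);   // prints level 3, 32x32
}
```

        With the numbers from the example above (a 256x256 texture covering about 25 pixels), this picks mip level 3, i.e. a 32x32 copy of the texture.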

        This works quite well, until you get up close to the object. Shoddy, unrealistic bumpmapping (I highly disapprove of bumpmapping; more on this later) can come into play at really close distances, and games like Serious Sam even make this look halfway decent, but it still is not real, or realistic.

        The ONLY way to get good texturing done is to DISPENSE with the concept of textures altogether. Polygons do not make this easy in themselves, and competing technologies can even make it worse. Some technologies like vertex coloring are a bit useful, but not much, and they are just the texturing model relabeled.

        But once you DO dispense with textures, ooh yah.

        Now for bumpmaps.

        Bumpmaps are oftentimes just a cheap shortcut to REAL modeling. Geometry deformation texturing is the next step, but until we get some video cards that can model each little crack and bump of an object, we are not going to get anything near 100% photo-realism. Not to mention characters with actual nostrils. Yes, there is a level of diminishing returns, but quite frankly, until I can model every last little crack, bump, and lump in a model and have it render in real time on a home user's computer, bumpmapping is what we are stuck with, and I don't like it.

        But I repeat, I REPEAT, larger textures (and bumpmaps) are just a cheap, low-quality shortcut. They DEFINITELY have a point of diminishing returns, and it is one that HAS ALREADY BEEN REACHED. Most new games do NOT do just plain old texturing any more, and a lot of what is happening nowadays in relation to textures (bilinear filtering and such) is in fact just a set of ways to correct errors in the original texturing model of thinking. Or at least to further refine the mathematical model used to show those textures.

        But why do games look better you ask?

        Mostly because video cards have any number of fancy TnL units that can independently create some rather nifty effects while working AROUND or OUTSIDE of the plain old texturing model. At the very least the texturing model of thinking has some. . . rather funky. . . math applied to it in an artistic manner with the results rendered to the screen.

        Look at Nvidia's werewolf model as an example.

        The HAIRS on it look great.

        The actual model, though?

        Hell, it looks like shit.

        No, it really does.

        Notice the face, people. Horrid. The textures too. It is not the modeler's or the textures' fault; it is just a fact that, well hell, you CANNOT do realistic skin textures without Pixar-level technology.

        Actually, I recently read an article from a while back that was an interview with someone at Pixar. They were describing the INSANE level of work that was necessary to even get something that SORT OF looked like a skin texture to render. The FF movie had kinda-sorta-maybe-ifyousquint real-looking skin; it was nice, but it took a lot of work and it still was not perfect. Once again, diminishing returns.

        While NVIDIA is doing good work in relation to getting various funky technologies out on the market that work around the texturing problem, as long as we rely on textures as our main source of coloring objects, we as a community of people who love to Blow Things Up are going to have problems.

        Hell the very idea of textures themselves is exactly opposite to how existence works. Objects are not gray by default with colors added later. Objects are. . . . real. They exist. More or less. The color is an INTEGRAL PART of what an object is. You cannot separate the two.

        In other words

        I want molecular modeling please. :)
        • Re:128! Wowzers (Score:2, Insightful)

          by scot4875 ( 542869 )
          You seem to be upset with the current technologies available to developers simply because they aren't photo-realistic. Or, in the case of bump mapping, it's a "cheap shortcut to REAL modeling." Well, yeah. That's exactly what it is. And, used correctly, it's extremely effective. (See Star Wars: Rogue Squadron 2 for an example.)

          But as for photo-realism, who cares? I, personally, think that the solutions people have come up with to maximize the hardware's potential are fascinating. And playing non-photo-realistic games has never been a problem for me, any more than watching a non-photo-realistic episode of the Simpsons.

          Not that this has anything to do with nVidia vs. ATi. My $.02: Buy nVidia's high end cards if you're rich or like to waste money. Buy ATi's high end cards if you just want to play games and don't care about an extra 2% on your framerate.

          --Jeremy
          • You seem to be upset with the current technologies available to developers simply because they aren't photo-realistic. Or, in the case of bump mapping, it's a "cheap shortcut to REAL modeling." Well, yeah. That's exactly what it is. And, used correctly, it's extremely effective. (See Star Wars: Rogue Squadron 2 for an example.)

            When used correctly, then yes, it CAN help, but when used IN PLACE of proper modeling. . . .

            See ears on almost ANY character in any game. Horrid. Bumpmapping is used instead of putting an actual HOLE in the ear. . . . And most games have no geometric detail to the ears at all!

            And playing non-photo-realistic games has never been a problem for me, any more than watching a non-photo-realistic episode of the Simpsons.

            Which is a 2d scenario anyways. :)

            But yah, photo-realism is just one example.

            Hell, how about some realistic watercolors, eh? Current systems use some VERY high-end pseudo-reality modeling algorithms to make something that kinda-sorta-maybe looks like watercolor, but they sure as hell are not watercolors.

            Same with acrylics. Hell, does anybody even know of a program that DOES simulate acrylics? Painter 6 doesn't do it, and it generally tends to have one of the more advanced (consumer-level) paint simulation systems out there.

            Of course, someday computers WILL have the power to completely model the physical world (ray tracing is NOT a model of the real world; raytracing is actually backwards, light is projected from the virtual viewpoint back to the light source. VERY weird, and it leaves some ... glitches behind), and this will all be a moot point as all of us will be sitting in our VR chairs growing fat and old as we party on Internet3. :)
        • Re:128! Wowzers (Score:4, Informative)

          by UnknownSoldier ( 67820 ) on Thursday April 04, 2002 @07:40PM (#3287778)
          > Note that having over 32MB of memory has proven to be of NO benefit in benchmarks outside of the occasional 1 or 2 FPS difference (and when you are getting over 100FPS anyway. . . .).

          I agree.

          > Texture size is REALLY not a problem.

          It IS when your PC game is being ported to consoles and you ONLY have ~2.5 megs of VRAM, say, like on a PS2! (Yes, the PS2 has 4 megs of VRAM, but you need space for the framebuffer and the zbuffer.)

          Now, consoles make up for the lack of video memory by having high bandwidth (i.e. the PS2 can DMA ~20 megs of textures per frame), but I'd rather upload my textures ONCE, not every bloody frame. Yes, you can be more efficient about texture uploads (draw the last model from the previous frame first in the new frame, etc.), but you're still tying up the BUS.

          > The ONLY way to get good texturing done is to DISPENSE with the concept of textures altogether.

          I don't completely agree, but you raise an interesting point, because textures are a form of (color) compression. If we take this to its logical conclusion, we should be able to have a triangle PER pixel, and that would negate the need for textures. Unfortunately that has its own problems -- there's no way we can send a million vertices across, because we'd saturate the bus! Doh! (Give a reward to the person in the back who said, "well, let's move to parametric surfaces then!")

          In the "Real World" (TM) we have a *unique* texture per pixel (ala ray tracing) however we don't have the memory to store that, unless we calculate them parametricaly. Sure we can get nice "marble" ala Perlin Noise, but it's going to be a while before we can mathmatically generate EVERY texture !

          > But why do games look better you ask?

          > Mostly because video cards have any number of fancy TnL units that can independently create some rather nifty effects while working AROUND or OUTSIDE of the plain old texturing model.

          You'd be amazed at what multitexturing and multipass rendering can do. Even a simple repeatable base texture with a "random" noise texture overlaid with a bump-map looks OK.

          > The color is an INTEGRAL PART of what an object is. You cannot separate the two.

          You *can* get away with this, but you have to be aware of the tradeoffs. One common "solution" is to crank up the bit-depth.

          I.e., if you use 16-bit color channels (64 bits per pixel), then you don't have to throw out your whole rendering functionality -- you just extend it. Not a perfect solution by any means, but "it's good enough."

          Take a look at "Titanic" The ship was rendered via tradional textures, and it looks pretty good. The hard part is getting that quality in real-time with so little memory ;-)

          Cheers

          --

          "The issue today is the same as it has been throughout all history, whether man shall be allowed to govern himself or be ruled by a small elite." - Thomas Jefferson

        • Re:128! Wowzers (Score:2, Informative)

          by GrfxGuru ( 571310 )
          Note that having over 32MB of memory has proven to be of NO benefit in benchmarks outside of the occasional 1 or 2 FPS difference (and when you are getting over 100FPS anyway. . . .)...blah...blah...

          This isn't true. Textures are not the only things stored in local video memory (i.e. the 32 MB you are talking about). Vertex buffers can also be stored in local video memory. It is quicker for a video card to fetch vertex data from a buffer in local video memory than it is to read it from system or AGP memory and feed it to the chip. Simply put, bandwidth is less of a bottleneck with local video memory than it is with AGP.

          Don't believe me? Try it yourself! There are many example OpenGL and Direct3D apps out there that you could hack (just check out the DevRel sites for both nVidia and ATI). Throw a frame counter in there, and measure it when you create a vertex buffer in system memory, in AGP memory, and in local video memory.
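
          A hedged sketch of that experiment (my own, not from the post): it assumes GLUT and GLEW are installed and uses OpenGL buffer objects, where the GL_STATIC_DRAW usage hint merely suggests that the driver keep the data in video memory, so the frame counter below is indicative rather than proof.

```cpp
// Crude vertex-placement test: draw the same point cloud every frame either
// from a buffer object (driver may keep it in video memory) or streamed from
// system memory, and print a once-per-second frame counter.
#include <GL/glew.h>
#include <GL/glut.h>
#include <cstdio>
#include <vector>

const int kVerts = 100000;
std::vector<float> verts;           // x,y,z per vertex, kept in system memory
GLuint vbo = 0;
bool useVbo = true;                 // flip to compare VBO vs client arrays
int frames = 0, lastMs = 0;

void display() {
    glClear(GL_COLOR_BUFFER_BIT);
    glEnableClientState(GL_VERTEX_ARRAY);
    if (useVbo) {
        glBindBuffer(GL_ARRAY_BUFFER, vbo);       // data was uploaded once
        glVertexPointer(3, GL_FLOAT, 0, nullptr); // offset 0 into the buffer
    } else {
        glBindBuffer(GL_ARRAY_BUFFER, 0);         // stream from system memory
        glVertexPointer(3, GL_FLOAT, 0, verts.data());
    }
    glDrawArrays(GL_POINTS, 0, kVerts);
    glDisableClientState(GL_VERTEX_ARRAY);
    glutSwapBuffers();

    ++frames;
    int now = glutGet(GLUT_ELAPSED_TIME);
    if (now - lastMs >= 1000) {                   // crude frame counter
        std::printf("%s: %d fps\n", useVbo ? "VBO" : "client array", frames);
        frames = 0;
        lastMs = now;
    }
    glutPostRedisplay();
}

int main(int argc, char** argv) {
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
    glutCreateWindow("vertex buffer placement test");
    glewInit();

    for (int i = 0; i < kVerts * 3; ++i)
        verts.push_back((i % 1000) / 1000.0f - 0.5f);   // arbitrary point cloud

    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, verts.size() * sizeof(float),
                 verts.data(), GL_STATIC_DRAW);          // hint: keep on the card

    glutDisplayFunc(display);
    glutMainLoop();
}
```

          Toggling useVbo between runs and comparing the printed numbers is the crude version of the experiment described above.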

          Now, you're correct that with "older" apps (pre-2000... heh, only in CS can you call something that's two years old, old) there will be no difference in performance... but that's because those apps were written with 32MB boards as the high-performance parts, so they didn't try to use much more than that out of fear of running too slow on what was then current hardware. There will be a difference on any graphics-intensive app that was made in the last couple of years.

          As games begin to use more complex models (and larger textures), even 128 MB will someday be too small.

          • Re:128! Wowzers (Score:3, Interesting)

            by Com2Kid ( 142006 )
            A modeler has to be a complete nitwit to fill up 128MB with vertices, heh. Or even 12MB with vertices. . . .

            It is not like vertices have to be loaded THAT often, and when they are, they can oftentimes be predicted. Granted, how Messiah did things sucked (wow, look at that! SERIOUS texturing problems AND ass-end load times AND the scripts get fucked up! Bah), but a GOOD loader can load a level dynamically and Not Suck.

            I would MUCH rather have 64*2 megabytes of RAM with a half-clock separation between them (in other words, fast-ass access. :) ) than 128MB of RAM that, err, uh, costs an arm and a leg and MAY provide some future performance, but by that time the texturing units on the video card will be old hat anyway and 'everybody' will have moved up to the next best thing.

            I myself will likely still be using my Matrox G400 MAX. :)

            I cannot believe that some complete IDIOTS credit ATI with first having dual desktop displays. . . . grrr. Idiots. :(
  • yes! yes! (Score:3, Funny)

    by Anonymous Coward on Thursday April 04, 2002 @06:02PM (#3287192)
    YES! Now I can have an expensive video card that I can use for displaying xterms, emacs, and mozilla. Where do I sign?
  • ATI and drivers (Score:3, Interesting)

    by NovaScorpio ( 127710 ) on Thursday April 04, 2002 @06:02PM (#3287193)
    I have an original Radeon - I've always felt that ATI makes crap drivers... Their chipsets, if you ask me, are on par with NVIDIA's; it's just that their driver support is crap... If only they actually let 3rd parties develop drivers like they said they would...
    • Are the nVidia drivers really better? With these two companies I can't help feeling that their hardware departments are much better than their driver developers. Maybe they are not supposed to make optimized drivers. After all, when you buy one of their top-of-the-line boards you know that it's fast. If it doesn't really perform that well, you will have to upgrade. Because the next one is really going to rock...
    • Re:ATI and drivers (Score:5, Insightful)

      by dimator ( 71399 ) on Thursday April 04, 2002 @06:13PM (#3287277) Homepage Journal
      NVidia deserves a lot of credit -- especially from Linux folks -- for their top-notch drivers. Installation is a snap (two tarballs, sudo make install), and once they're up and running, they're very stable and quick. And they're maintained. New versions are released fairly often, and the very latest cards are supported as well. I tried a Radeon card once, but promptly returned it because there were no drivers, and only after a while did they finally appear.

      I've sent NVidia some mail stating that because of their support for my OS, I plan to continue buying their products. It's good to give them that kind of feedback, I think.

      • Re:ATI and drivers (Score:4, Insightful)

        by realdpk ( 116490 ) on Thursday April 04, 2002 @06:44PM (#3287460) Homepage Journal
        Probably not a bad idea to register your hardware with those mail-in cards they include in the box (if nVidia doesn't include them, forgive my mistake :)), making sure to select Linux as your OS. That way the people counting the real numbers get the message. :)
      • At a previous job we ran a mix of Linux and NT4 boxen, with a mix of ATI and nVidia cards. While the Linux boxen never locked up, the NT4 ones would lock up at least a couple of times a week.

        On the blue screen of death was the module stack. And always, always at the top of the list would be the offending culprit: atirage.dll.

        Need I say more?
      • Re:ATI and drivers (Score:3, Insightful)

        by xercist ( 161422 )
        Personally, I've become very frustrated with nVidia's Linux drivers. They cause my machine to crash randomly. This is not just me, either. All of my friends who have nVidia cards under Linux seem to experience the exact same problem. All my friends using different video cards are stable as hell. Coincidence?

        I'll give you that when the drivers work, they work quite well. They look good, and run fast. But part of the reason I started using linux in the first place was to avoid the constant rebooting that comes with the alternative. Being totally closed eliminates the possibility of someone else coming in and fixing the problem, so all I can do is wait and hope they fix it on their own....and so far, they haven't.
      • Re:ATI and drivers (Score:3, Insightful)

        by Ogerman ( 136333 )
        NVidia's Linux drivers are available only as closed-source binaries. That is 100% unacceptable, and anyone with a brain will not use their hardware until they change their policies on giving us developers documentation. Of course, ATI doesn't exactly have the best record for releasing timely or complete documentation either...

        You know why 3D still generally sucks on the PC? Because the market has been ground to a halt by patents and restrictive licensing. Imagine if the Internet developed this way. Stupid greed.
        • That is 100% unacceptable and anyone with a brain will not use their hardware until they change their policies on giving us developers documentation.

          Well, zealots can stick to "open" crappy, out-dated, and unmaintained stuff [ati.com], and non-zealots can stick with working, timely, stable stuff [nvidia.com]. Usually, I let practicality choose instead of politics.

          You know why 3D still generally sucks on the PC?

          Interesting, I've never noticed that. Compared to what? Game consoles that are built solely for churning polygons, or the latest flick from Pixar?

          Stupid greed.

          Imagine that! A company wanting to protect the hard work it has put into their products! And on top of that, they want to turn a profit! How disgusting!

      • I tried a Radeon card once, but promptly returned it because there were no drivers, and only after a while did they finally appear.

        I have a Radeon gathering dust for similar reasons. Its capabilities are much beyond the (now fairly crappy) GeForce 2 MX I'm using currently, but the drivers just aren't there for the Radeon. I spent days (yes, days) getting DRI working properly with the Radeon, and what did I find?

        • Tribes 2 (my then-main 3D game) ran at a whopping 5fps indoors and around 2 outdoors. Not acceptable.
        • If I started X, then exited, and started X once more at any time afterwards, the system would die. Not just the graphics, the entire freakin' system. It died so fast, nothing even made it into the logs. Sorry, but even Windows would offer better stability than that.
        • Random crashes. I used to have such problems with the NV drivers, but they've been gone for a loooong time.

        I'd prefer NVidia's drivers were open source (some fixes just take far too long), but they do a good job so far. Considering this is a desktop-oriented company supporting an OS with such an insanely small portion of the desktop market, I'm very pleased. Everybody should stop bitching about NVidia, and praise them for investing their resources in making things possible for us. As long as they do a good job, we should support them (yes, including with dollars).

  • by gmarceau ( 119282 ) <dnys2v4dq1001@sneakemail.com> on Thursday April 04, 2002 @06:07PM (#3287225) Homepage
    I just wish one benchmarking site would release the raw data in some kind of ASCII-based table. I would love wasting countless hours of gnuplotting, generating variations on plots like those.

    Does anybody have a pool of varied CPU & motherboard machines, new and old? There are a couple of statistical tools I would like to throw at the benchmarking problem - if only I had the data.

  • Game Programming (Score:5, Insightful)

    by saveth ( 416302 ) <cww&denterprises,org> on Thursday April 04, 2002 @06:08PM (#3287233)
    As an amateur game programmer, I must say I prefer NVIDIA-based cards to ATI-based cards, simply because NVIDIA takes care of their customers.

    I've used the latest flavours of the ATI Radeon series, and the drivers always seem to be a bit unstable. Downloading updated drivers doesn't always fix the problem, either; sometimes, it makes the problems worse. It's hard to tell whether they're even trying. It seems ATI, at this point, is just trying to keep up with NVIDIA in terms of speed alone, rather than in speed, quality, and stability.

    NVIDIA, on the other hand, fixes bugs properly *the first time*. They don't really produce many bugs, either, which means they can put forth more effort toward making everything more featureful.

    There's no contest, in my opinion. NVIDIA wins, hands down. It will take quite a bit for ATI to change my mind, or the minds of my game programming colleagues, about this one.
    • Newest drivers (Score:2, Informative)

      by Hamshrew ( 20248 )

      I agree, for the most part... when the 8500 came out, it was months before ATI released official, updated drivers. When they did, they were an improvement, but still had some stability issues. I was disappointed that after all that time, they still hadn't gotten it right. Especially after they kept talking about their "new commitment"

      But then they released newer drivers pretty quickly. Fixed some rendering bugs, seem much more stable... I'll wait and see a little longer before recommending them to anyone else, but it looks like they may be getting their act together.

    • Re:Game Programming (Score:4, Interesting)

      by brer_rabbit ( 195413 ) on Thursday April 04, 2002 @06:28PM (#3287375) Journal
      I know you're talking about software, but I can confirm on the other side of the fence that Nvidia's chip designers are absolutely picky when it comes to their work. I used to work for a standard cell library vendor a couple of years back. Nvidia tore apart our 0.25um library when it came to timing characterization. Those guys were pushing the envelope -- they needed timing on the cells accurate to better than a couple percent. I'm not talking simple propagation delays; these were setup and hold times of flip-flops and latches. We ended up giving them tables of setup and hold times, not just the couple of numbers most of our customers were happy with. Real interesting job for a just-out-of-college EE.

    • I have to disagree with this idea. I used to have random crashes under Linux when using the nVidia drivers, and also when I dual-booted to Windows. When I replaced the nVidia card with an ATI card, all of the crashes went away. I had tried lots of nVidia drivers and they all had the same problems.

      Even more annoying was that I found the nVidia drivers would break up the audio on my SBLive. I would get all these crackles in the audio, which were very annoying. When I replaced the nVidia GeForce with a Radeon, all of those problems went away. Overall I am not impressed with nVidia quality at all. ATI is better, but for a really stable video card I would go with Matrox. On a box I have with a G200, X has NEVER crashed. I have never had a single issue with that card.
  • by sgtsanity ( 568914 ) on Thursday April 04, 2002 @06:08PM (#3287240)
    Firingsquad just posted a report [gamers.com] about the new GeForce Ti 4200. They're coming out with two separate versions, one with 64MB of faster memory, and one with 128MB of slower memory. The 64MB one was faster in the benchmarks that they ran, even though it was $20 cheaper than the other variant. Plus, it even beat their comparison Ti 4400 in some of the benchmarks.

    But it gets better. The Ti 4200 can be overclocked to speeds comparable to the Ti 4600, Nvidia's fastest card. Get the fastest performance available for half the cost!
  • by brer_rabbit ( 195413 ) on Thursday April 04, 2002 @06:09PM (#3287247) Journal
    Is anyone doing decent PCI cards these days? I realize I'm behind the times here, but my motherboard (Asus CUR-DLS) has no AGP slot, leaving me with a GeForce2. Still, my dual P3-1.26 GHz setup isn't far enough behind the game to warrant buying a whole new setup. I do have a couple of 66MHz 64-bit PCI slots going unused in the motherboard; do any graphics cards go that route?

    • PCI video card (Score:1, Informative)

      by Anonymous Coward
      Because of the bandwidth limitations of PCI (compared to AGP), not many manufacturers (read: none) are making PCI video cards any more.

      I know that ATI still does occasionally put out a batch of 32MB PCI Rage 128 Pro cards, and Matrox has some PCI cards designed for multiple-monitor configs... but overall there are no other PCI cards.

      Compare PCI (33MHz, or 66MHz if you are lucky, shared between all your PCI devices) to AGP (133MHz+ on a dedicated channel) and see which one you would rather have :) (rough numbers in the sketch below)

      Also, most GPUs (graphics processing units) these days are too fast for the PCI bus to feed them data. That is why you will never find a PCI Radeon, or a PCI GeForce2 GTS or better. Heck, AFAIK even the GeForce2 MX-400 cards are not PCI.

      So that is what happened to PCI cards.

      Please mod me up as I am not logging in :)
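
      Rough numbers behind that PCI vs. AGP comparison (my own back-of-the-envelope assumptions, not the poster's: a 32-bit PCI bus, and a 32-bit/66 MHz AGP interface doing 1, 2, or 4 transfers per clock; real-world throughput is lower):

```cpp
// Peak theoretical bus bandwidth: clock * bus width * transfers per clock.
#include <cstdio>

double mbPerSec(double clockHz, int bytesPerTransfer, int transfersPerClock) {
    return clockHz * bytesPerTransfer * transfersPerClock / 1e6;
}

int main() {
    std::printf("PCI 32-bit/33 MHz (shared):  %6.0f MB/s\n", mbPerSec(33.33e6, 4, 1));
    std::printf("PCI 32-bit/66 MHz (shared):  %6.0f MB/s\n", mbPerSec(66.66e6, 4, 1));
    std::printf("AGP 1x (dedicated):          %6.0f MB/s\n", mbPerSec(66.66e6, 4, 1));
    std::printf("AGP 2x (dedicated):          %6.0f MB/s\n", mbPerSec(66.66e6, 4, 2));
    std::printf("AGP 4x (dedicated):          %6.0f MB/s\n", mbPerSec(66.66e6, 4, 4));
}
```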
    • I feel your pain, man. I took an old web server and turned it into my desktop. The thing that's keeping me from upgrading is the onboard SCSI (P2B-S, BX chipset baby! The last stable chipset in some time...). I've yet to find an AMD motherboard with onboard SCSI (granted, I have only looked in a few places, but... you'd think they'd be common from ASUS). 'Course, I'm still running a P2-350, so I guess I don't feel all of your pain. ;)
      • Re:Even more OT (Score:3, Informative)

        by ncc74656 ( 45571 )
        I've yet to find an AMD motherboard with onboard SCSI (granted, I have only looked in a few places but.. you'd think they'd be common from ASUS).

        The Tyan Thunder K7 [tyan.com] includes dual-channel Adaptec Ultra160 SCSI, dual 3Com Fast Ethernet NICs, an AGP Pro 50 slot, 64-bit PCI, and a bunch of other stuff. It's also a dual-processor board, so you get twice the Athlon goodness. :-)

    • by bbk ( 33798 ) on Thursday April 04, 2002 @06:26PM (#3287354) Homepage
      Try and find a Voodoo 4 or 5. They've got decent (GeForce2-ish) 3D capabilities, will work at 66MHz in a PCI slot that supports it, and have quite decent Linux drivers.

      They're also dirt cheap on eBay, as WinXP and MacOS X don't support Voodoo cards, and people are selling them off for better cards.

      You may also look for Mac cards - for the longest time, there was no AGP slot on the Mac, and I think you can get a PCI Radeon with Mac ROMs. Flash it to be x86 compatible, and there you go.

      BBK
      • Keep in mind that there's no decent official driver support for Voodoo cards now that 3dfx is gone. There are already some games that are DirectX 8.1-only, and the list keeps growing. Many of these games won't run properly on Voodoo cards because there are no updated drivers.

        They do have PCI GeForce2 boards, though for obvious reasons they suffer a performance hit when compared to the AGP versions... That's your best bet until you upgrade your mobo.

        • There may not be official drivers, but there are a number of people creating new drivers by themselves. See x3dfx.com [x3dfx.com] for some of them, or just go to the old 3dfx site.

          So far they appear to work OK in Win98, but since they're not certified, I couldn't get them to install properly in XP. Many of the new drivers at least claim to be DirectX 8.1 compliant.
  • What's the point? (Score:3, Interesting)

    by Jucius Maximus ( 229128 ) on Thursday April 04, 2002 @06:12PM (#3287266) Journal
    These things can already render graphics at insanely high resolutions and refresh rates with framerates above the refresh rate of the monitor.

    I don't see a reason for most people to upgrade to one of these things unless they are developing 3D technology.

    • Sure, you can get your frame rate up higher than the refresh rate, and of course higher than your vision can follow. But it's about keeping the framerate above them with 20 people shooting each other on the screen.
    • by barjam ( 37372 )
      I have a GeForce3 Ti200 and there are many games that will push this card well below 15fps.

      Everquest
      RealFlight
      Wolfenstein (everything turned to max)
      Microsoft Flight Simulator

      James
    • by foobar104 ( 206452 ) on Thursday April 04, 2002 @07:41PM (#3287785) Journal
      ...refresh rates with framerates above the refresh rate of the monitor.

      Strangely, most people don't seem to realize that this is a BAD THING, unless your app is running at an integer multiple of your monitor refresh rate.

      To make it simple, imagine your monitor scans at 60 Hz. So every 60th of a second (16.67 msec) you get a whole new frame drawn on the screen. Assume, for sake of argument, that drawing the screen takes zero time. It's just instantaneous.

      To achieve smooth motion, the same amount of time must pass between each frame. This is guaranteed if your application renders 60 frames per second. Unless you drop a frame somewhere, you'll see one rendered frame for every screen refresh, and you'll perceive smooth motion.

      But what if you drive your screen at 60 Hz, while your application renders 95 frames per second? (Assume that it's exactly 95 fps all the time, rather than a variable frame rate, just to make the math work out for this example.)

      When you run your game or whatever, the clock starts at zero. The first frame from the graphics pipeline is in the display buffer, so when the monitor gets ready to draw the screen, it draws frame zero.

      10.53 msec later, the application has drawn the second frame, so it swaps buffers. The display buffer now has frame 1 in it. The application now starts drawing frame 2.

      But the graphics card isn't ready to draw frame 1 on the monitor until a little over 6 msec later, at t = 16.67. At that time, though, the application hasn't finished drawing frame 2 yet, so frame 1 is still in the display buffer. The monitor draws frame 1. Game frame 1 comes after game frame 0, so we're still in sync.

      During this time, the application has been working on frame 2. It finishes frame 2 at t = 21.05 msec and swaps buffers. Frame 2 is now in the display buffer, and the application starts drawing on frame 3.

      The monitor is ready to draw frame 2 at t = 33.33 msec. So it reaches for the frame from the display buffer... but what's this? The frame in the display buffer isn't frame 2. It's frame 3! We dropped a frame somehow!

      In the meantime, at t = 31.58 msec, the application had finished drawing frame 3. It swapped buffers again, before the graphics card got a chance to display frame 2. Frame 2 disappeared from the display buffer, never having been shown on the monitor. That's a dropped frame, and it's a bad thing.

      Games aren't hard-real-time applications, of course. They run freely, sometimes drawing frames more quickly, and sometimes less quickly, depending on the load. This is okay. But don't just assume that because your game runs consistently at a rate higher than your monitor, you won't be dropping frames. In fact, you'll drop frames like crazy, at a rate determined by how far your game frame rate is from your monitor rate, in modulo arithmetic.
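
      The arithmetic above is easy to reproduce; here is a tiny simulation (my own sketch, not part of the original comment) that assumes a steady 95 fps render rate against a 60 Hz scanout, where each refresh shows whatever frame finished most recently, and counts how many rendered frames never reach the screen:

```cpp
// Replay the walkthrough above: double-buffered rendering at a fixed 95 fps
// against a 60 Hz scanout. Frame i is assumed finished at i * (1000/95) ms,
// so each refresh displays the most recently completed frame.
#include <cmath>
#include <cstdio>

int main() {
    const double renderHz = 95.0;    // assumed constant application frame rate
    const double refreshHz = 60.0;   // monitor refresh rate
    int lastShown = 0, dropped = 0;

    for (int scan = 0; scan < 60; ++scan) {                // one second of refreshes
        double t = scan * 1000.0 / refreshHz;              // scanout time in msec
        // Latest frame completed by time t (frame 0 is ready at t = 0).
        int shown = static_cast<int>(std::floor(t * renderHz / 1000.0));
        if (shown > lastShown + 1)
            dropped += shown - lastShown - 1;              // frames never displayed
        if (scan < 4)                                      // echo the worked example
            std::printf("refresh %d at t = %6.2f ms shows frame %d\n", scan, t, shown);
        lastShown = shown;
    }
    std::printf("roughly %d of ~95 rendered frames were never shown\n", dropped);
}
```

      With the 95 fps / 60 Hz numbers above, the very first dropped frame appears at the third scanout (t = 33.33 ms shows frame 3, skipping frame 2), exactly as the walkthrough describes.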
      • In the meantime, at t = 31.58 msec, the application had finished drawing frame 3. It swapped buffers again, before the graphics card got a chance to display frame 2. Frame 2 disappeared from the display buffer, never having been shown on the monitor. That's a dropped frame, and it's a bad thing.
        ...
        But don't just assume that because your game runs consistently at a rate higher than your monitor, you won't be dropping frames.

        Isn't that what the vsync setting is for??? (to only render a new frame each time the monitor is refreshed, so you don't get "tearing"). It's turned on by default with the few cards I've installed, but they always need to turn it off to run these benchmarks.

        • I don't think so. Most of the consumer 3D apps I'm familiar with (games and whatnot) have a free-running internal clock. If they can run hard-real-time off the vertical refresh, then that's cool and all. But it's news to me.
      • It is possible to configure most games to ignore the vertical synch wait, and swap buffers while the screen is being refreshed. Of course this isn't really a solution because it leads to tearing, where the swaps become visible as horizontal discontinuities on the screen.

        Just adding to the information pool...
    • These new cards would be of interest to people who want dual monitor and/or DVI support that haven't heard of Matrox cards. It's easier to find ATI and nVidia cards at your local computer shop than Matrox cards. In fact, I don't recall ever seeing a Matrox card at a retail store.
  • I'm Glad (Score:4, Insightful)

    by BiggestPOS ( 139071 ) on Thursday April 04, 2002 @06:13PM (#3287274) Homepage
    That ATI was able to emerge as an actual competitor for Nvidia. Once 3dfx finally died, it looked as though Nvidia might have a stranglehold on the market. The first couple of offerings from ATI were crap, and didn't look too promising, but the 8500 is a perfectly decent chipset, and some of the ViVO features put it way ahead of the GeForce 4 for some people.

    I have a DV cam with RCA inputs, and firewire, so my video card doesn't need to be able to capture, just a nice S-Video out for watching downloaded southparks on my Wega in the living room.

    • Once 3dfx finally died, it looked as though Nvidia might have a stranglehold on the market.

      Actually, Nvidia purchased most of the intellectual property and rights belonging to 3Dfx back in December of 2000.

  • ATI and NVidia (Score:1, Flamebait)

    by Anonymous Coward
    Personally, I favor nVidia. I just threw out my ATI Radeon 32MB DDR card because it was causing too much grief. With the latest ATI drivers and the latest VIA chipset drivers, I couldn't get the card to work unless I set the motherboard to use 2x AGP! Even then, most 3D games would crash after 15 minutes of play. Just a couple of weeks ago I replaced it with a GeForce 3, and not a problem since!

    So now it's nVidia all the way for me.
  • by Anonymous Coward on Thursday April 04, 2002 @06:18PM (#3287311)
    Instead of brute-forcing polygons the MAN'S way, ATI decided to be a bunch of sissies and implement HyperZ technology. 'Discard unseen pixels'? BAH! I'd much rather have these unseen pixels rendered than let them go to waste. Their proprietary TRUFORM technology is good, if you like seeing rounding errors (see Serious Sam SE's shotgun model). Moreover, their names are misleading. 'Pixel Tapestry', 'Charisma Engine' - what do these names mean? How can a pixel have tapestry?

    Meanwhile, NVIDIA continues its dedication to their customers by giving them 128MB of VRAM; conveniently providing the customer with 32 extra MB of VRAM to use as a RAMdrive. Instead of fudging around with names like ATI does, they've simply decided to follow 3DFX's naming scheme and simply name their cards GeForce(n + 1). I look forward to the day when the GeForce requires an input from the +5V power supply.
    • ATI decided to be a bunch of sissies and implement HyperZ technology. 'Discard unseen pixels'? BAH! I'd much rather have these unseen pixels rendered than let them go to waste.

      Nvidia does this too. They call it Lightspeed Memory Architecture II [nvidia.com].

      Instead of fudging around with names like ATI does, they've simply decided to follow 3DFX's naming scheme and simply name their cards GeForce(n + 1).

      But they do this so badly. 1 -> 2 was just an increase in clock rate. 3 was a new generation. 4 is a clock rate increase -- except for the G4 MX, which is SLOWER than any of the G3 cards. Stupid.

  • by chronos2266 ( 514349 ) on Thursday April 04, 2002 @06:19PM (#3287316)
    NVidia's developer site [nvidia.com] is why they will win the GPU war, if only because they help developers by providing an extensive forum in which they can educate themselves about NVidia's technologies. I recently started researching vertex programming; I went to NVidia's site and they had an entire SDK dedicated just to it. I haven't seen anything like that on ATI's site. Keeping the people who develop for your hardware informed is the only way to win support, and ATI hasn't realized that yet.
  • nvidia vs. ati (Score:4, Interesting)

    by soap.xml ( 469053 ) <ryanNO@SPAMpcdominion.net> on Thursday April 04, 2002 @06:22PM (#3287337) Homepage

    When it's all said and done, I have to place my vote for nVidia, hands down. There are many reasons for this... however, this is the most compelling...

    nVidia Drivers page link [nvidia.com]

    • Windows 95/98/Me Drivers
    • Windows XP/2000 Drivers
    • Windows XP 64-bit Drivers
    • Windows NT Drivers
    • Linux Drivers

    ATI Drivers page link [ati.com]

    • Windows XP
    • Windows ME
    • Windows 2000
    • Windows NT

    At home I run about 7 computers, a mix of Linux, WinXP, 2K, and 98. The fact that my GeForce cards can and will run great in all of the above OSes using proper driver support is all I need to buy from nVidia. Good customer support, and good OS support. That will bring in my dollars...

    • Re:nvidia vs. ati (Score:2, Insightful)

      by Hamshrew ( 20248 )

      True... but the ATI cards do run in Linux, and ATI does provide a link to the drivers. And ATI provides the specs to do open-source drivers. For me, it isn't that much of a concern that the GeForce drivers are proprietary... but it is a concern. And for purists, it's a major concern.

      It wasn't a concern at all, until I ran into a situation where I actually wanted to look at the source...

      • I have not done much more than a cursory look through the ATI drivers pages. After not finding them quickly I, just as quickly, ditched ATI in favor of nVidia. If they do provide them, it would be quite helpful to make them more easily accessible.

        And yes, I'm aware that if I really wanted to find them I could quickly google for the driver also.... ;)

        -ryan
      • Re:nvidia vs. ati (Score:3, Informative)

        by HeUnique ( 187 )
        Oh really? Is ATI that great?

        Then you wouldn't mind showing me the specs where I can switch the Macrovision part on and off, would you?

        Oh, how about giving me very fast 3D drivers? Oh, that will only be available in June...

        What about using both of the Rage 128 MAXX's processors in Linux? No support...

        Maybe I can get full support for both TV-out and VGA without Xinerama (a la nVidia's TwinView)? Nope, not supported...

        Yes, nVidia doesn't give out the source or specs, but I can use ALL the features of my GeForce card - top to bottom - while with the ATI Radeon I can't (currently), not to mention the Matrox G450/G550...

        I don't give a damn about the source - I give a damn about a full-featured driver (which comes with some nice extras - true dual-head without needing Xinerama, a shadowed mouse cursor, 2 versions of AGP support for compatibility, and tons of other features) which I don't get with the others...

        Sad, but true...
        • I don't give a damn about the source

          Good for you. Some of us do give a damn about the source (and not for purely philosophical reasons, for practical reasons).

          Dinivin
      • Actually, NVidia does worse than just keeping its drivers closed as hell. Reportedly, when they bought up 3DFX, they had the XFree developers give them back all the stuff 3DFX had given them to play with and develop a driver.

        As a result, the XFree guys had to stop developing for the Voodoo series, and I find myself with a card that won't ever be totally supported, nor will the current driver ever be debugged. The only way I can get a stable X server now, without my current weekly or so weird crash, is by buying a new card. Needless to say, it will not be an NVidia, trust me on that one.
    • Re:nvidia vs. ati (Score:4, Informative)

      by felipeal ( 177452 ) on Thursday April 04, 2002 @06:39PM (#3287430) Homepage
      So maybe they are just missing a link to:

      http://www.ati.com/support/faq/linux.html

      I have 2 computers at home, one with a nVidia TNT2 card and the other with an ATI Rage Pro 128, and I can tell you, I'm much happier with the ATI one (the nVidia one sometimes freezes the whole system, for instance).

      The overall situation (If I'm not wrong) is that even though nVidia provides the drivers (and even the source), they don't disclose technical information about the cards, while ATI does the opposite.

      • Good point. I would say that from an end-user standpoint, as somebody who doesn't always want to spend hours tinkering with my boxen, having that link readily available from the drivers area would be very helpful.

        Granted, if I were really interested in finding out more about the ATI cards and whether they have Linux support, I would have spent much more time during my decision process. However, I wanted a quick and simple solution for multiple machines without a hassle. nVidia made it clear to me as an "end-user" that they supported all of my OSes right from the download page. ATI didn't. This is pretty much just a "bad marketing" or "bad website" issue, but nonetheless, it was enough for me to buy 7 nVidia cards ;)

        -ryan
        • I have the feeling that they assume a Linux user is smart enough to find the link (as you mentioned, you would have spent more time in your decision process), which is a perfect example of what these companies think about Linux (i.e., that it's not used by your average Joe AOL user).

          But you're right, it wouldn't hurt for them to have such a link.

          This is pretty much just a "bad marketing" or "bad website" issue, but none the less, it was enough for me to buy 7 nVidia cards ;)

          Maybe you should tell them that (as NOT having the link actually hurt them :)
      • The last time I bought an ATI card I was shocked by the fact that the Linux driver had incredibly bad 2D performance. ATI had paid the Precision Insight folks to write the 3D driver, but didn't include any money for accelerated 2D performance.

        I don't know if this has been remedied since then, this was the status 6 months ago.

        thad
    • Well, nVidia isn't ALL good. The nForce drivers are not available for 95/98. Having to buy XP added about 50% to the cost of my parents' new machine, more than negating the cost advantage of the integrated motherboard.
      • Gee, let's see... nForce came out when? Oh yes... about a month before Win95 was officially decommissioned.

        Not having Win98/ME drivers is vaguely surprising, but not too much so.

        And, of course, like everyone else, I have to question just how smart you were to buy the nForce board when there's no driver support for the OS you wanted to run. If you bought the board first and figured it out second, well, that should damn well teach you to do your homework next time.

        You do realize that there are highly integrated KT266A and KT333 motherboards out there, right? The only thing the nForce 420 has that they don't is integrated video.

        Finally... uh... added 50%? What exactly did you build? WinXP Home OEM is $88. OEM Pro is $140. The home edition is already less than the cost of the motherboard, the Pro is pretty close to the cost. Once you add a hard drive, memory, CPU, keyboard, mouse, and monitor there's no way in hell that it's 50% more. Yeah, you were probably upgrading piecemeal. Again, you fucked up and didn't do your research, but want to blame someone else for it instead of accepting your own screwup.
    • You went to the OEM driver section (Powered by ATI), not the Retail driver section (Built by ATI) [atitech.com].

      If you go to the retail section, there is an OS menu with Windows, MacOS, BeOS (!), and Linux.

      • Ouch, I stand heavily corrected ;) Thanks for pointing that one out. I have done that multiple times. Until you pointed it out, I had not even noticed the "Built by ATI" part of the website. Must be the way that my eyes move along the home page or something. Once again, thanks for pointing that out. I will have to investigate an ATI card for the new box I am putting together ;)

        -ryan
    • At home I run about 7 computers, a mix of Linux, WinXP, 2K, and 98. The fact that my GeForce cards can and will run great in all of the above OSes using proper driver support is all I need to buy from nVidia.

      I am happy to tell you that you can use ATI cards in Linux. I am sorry to tell you that nVidia's drivers are much worse than any other drivers for Linux, because they are closed source. It means that you get a stable system if you are lucky, and Russian roulette if you are not.
    • Don't be such a mental midget. Read ATI's FAQ about Linux drivers. They actually give out source code and are helpful to the folks over at XFree86.org. NVidia? Hell no, they want you to use those shitty closed drivers... I'll stick to the stuff that will still be available years from now.
  • by cREW oNE ( 445594 ) on Thursday April 04, 2002 @06:27PM (#3287360)
    The benchmarks were performed with the latest "official" ATi drivers (6037). They are, of course, right to do that, but let me point out two things:

    1) Other manufacturers that produce ATi-based cards have released more recent drivers (6043, or even 6052 [rage3d.com]).

    2) In version 6043 a very large bug was fixed that increased OpenGL performance a lot in some cases (in extreme cases from 55 to 170fps).

    As a result some of their 8500 results could improve if they used more recent drivers.

    • good service... (Score:2, Informative)

      by dollargonzo ( 519030 )
      What bothers me is that although I think ATI drivers are good because they are open source et al, the basic response people give to a benchmark in which ATI loses is "you should have used the most recent drivers." They ARE correct, because in most cases it DOES fix the problem, and I am NOT calling this bad sportsmanship, but no one ever says this type of thing about NVIDIA cards. Basically, although ATI does make good stuff, they need to get the service end of their deal together. The Radeon drivers (default, hehe) in the kernel are great, because that means that for people who have them, and whom I am helping install, I don't have to tell them how to obtain nVidia drivers, which makes installation SOOO much easier,
      but:

      when I am thinking of getting a new video card, and wondering which will have Linux support, I KNOW that nVidia (although proprietary) will already have drivers to download. Although ATI provides people with the specs to write drivers, for the 8500, for example, it took a while for drivers to come out... which, at least for me, is a setback.

      QED
      • by himi ( 29186 )
        There's a simple reason for people rooting for ATI and booing NVidia: NVidia is, and has been for the last several years, the 800lb gorilla in the graphics card market. Yes, they make good chips, yes they make good drivers, and yes, they support Linux well, but they're still really big and really powerful and it's really nice to see them being beaten . . .

        Also, I imagine there's still a lot of residual feeling from the days when NVidia said they were going to release open source drivers for their cards. I spent $360 AU on a TNT2 Ultra on the strength of that promise, and for six months I played Q3demo at about 12fps on a system that could have done 35 at least. I /hated/ NVidia for that, and vowed never to get one of their cards ever again. I still don't like the way they support Linux - I really don't like binary only modules, particularly given I tend to track the latest kernel releases and pre-releases. I don't want to have to reboot to remove the tainted flag when I have a bug to report.

        To top that off, ATI, although they don't write Linux drivers themselves, /do/ release the specs for their cards - I can currently run my original Radeon under Linux with hardware t&l support, thanks to this, with drivers that are open source. They're great, even though they're not as complete as they could be, or as bug free. Given there are probably about three or four people working on them, they're /amazingly/ good.

        So yeah, it's kind of dodgy to say "use the most recent drivers!" and discount any performance differences, but many people have reason for being less than happy with NVidia's dominance.

        himi
  • Why?

    Many reasons:
    1. They produce the chipset, others make the video cards, thus each company is trying to outdo the others on features/price. For example there was a video card i found for 30 dollars cheaper, just because it did not have svideo out, but otherwise the same card

    2. MOST IMPORTANT. nvidia seems to care. Although they do not release all their 3d specs, they released enough for xfree/whoever guys to create the nv driver. But their own driver is really great. I think that is the only 3rd party that is actually writing the drivers for linux. My only disappointment with their driver is that it failed to work with the kernel framebuffer, and caused a hardware freeze, when running X. (Could be an AGP problem) Does anyone know if that has been fixed?

    That said, I will probably not buy any video cards for the next year. Damn it but I do not need that much juice to run CS.
  • by Phoenix ( 2762 ) on Thursday April 04, 2002 @06:29PM (#3287378)
    Really, I don't care who is the best one on the market. I know that there are those who think that nVidia is the best (and from what I've seen I'd have to agree), and others who think that ATI rocks.

    Honestly, I don't want to see any one company sitting unchallenged at the top. M$ is sitting there with the desktop OS market and look at the 'quality' product that they bash out.

    The fact that nVidia and ATI are fighting over the same bone means that there will be continued innovation by one to out class the other.

    Result? A better product, since one company can't afford to sit on their laurels and must keep striving to better themselves.

    Who's best? Does it matter? Considering that next year/month/week someone is going to outshine the rest and make the others scramble to keep up and/or beat it.

    Just my two cents worth.
  • I have no loyalty (Score:2, Interesting)

    by moankey ( 142715 )
    I was first a Voodoo man, then they died. So I went Nvidia, and so far I am still GeForce. If for some reason one day Nvidia slips and ATI swoops in with a better product, I will move to ATI. As it stands now, ATI hasn't been able to convince me yet.
    I don't care who has what; what ultimately motivates me is whether the card can do what I want it to do, and for now that's making my games look good and smooth at a price I am happy with.
  • There are hardware issues with nVidia cards. I had an ASUS GeForce 2 MX with 64MB which I had to return because of a RAMDAC problem: the screen was getting garbled over time, and the memory was becoming corrupted. I got a refund and bought a flawless Radeon.

    Now, in the latest PowerMac G4s (the first to ship those cards, even before they were officially announced), the GeForce4 MX has a very nasty-looking problem that appears to be electrical. People are freaking out; the problem looks like it could damage the screens attached to the card.
    The Apple discussion board discusses this at length. Of course, neither Apple nor nVidia acknowledges the problem.
    Apple discussion thread [apple.com]
    Mixed in that discussion you can also see there are OTHER issues, reminiscent of the RAMDAC problems of the 2 MX, popping up as well.

    I truly wish I had selected the ATI 7500 when I bought the G4. I would have spent the money I saved on an 8500!
    • nVidia makes the GPU. nVidia is not Asus. nVidia is not Apple. If Apple or Asus have issues implementing nVidia's GPU, that is their problem. I have personally used and sold MANY GeForce2MX cards.... MX, MX-200, MX-400.... all of the above, and I have never seen an overwhelming problem with *any* of them.
      I'm not sure if Asus uses the "reference" model or not, but I can almost guarantee that Apple does not. If there is a flaw with one of ATI's products, there is no-one to blame but ATI.

      -kwishot
      • We just got some G4's with the GF4MX at work, and they are definitely an NVidia design, although because of the ADC they are not the 'reference' card. One possible issue with it would be that the power for the display is passed along the ADC, and if there was some design fault there, the consequences might be bad.
  • by Steveftoth ( 78419 ) on Thursday April 04, 2002 @06:48PM (#3287485) Homepage
    About PC hardware: reading people's responses to this article just reinforces my belief that PC hardware is really bad because the standards are not strict enough. I've had problems with so many systems, and you never know where to begin debugging a computer that doesn't work correctly. Sometimes a problem that seems like a 'video card issue' turns out to be a problem with your main memory. Even when using 'high quality' components, one low-quality component or slightly defective card can bring a whole system down.
    Hell, just not having a PCI card plugged in correctly can totally trash a computer with a low-quality motherboard. Ever pulled out a PCI card while the system is running? Sometimes it reboots, sometimes it doesn't.
    The point of this diatribe is that people seem very polarized on the subject of video cards, mostly due to the other guy's card not working for them, when probably in many cases it wasn't the video card causing the problem at all, but rather an incompatibility in their system that was brought out by the video card.
    Guess it's the price we pay for getting such cheap, bleeding edge systems.
    • This is definitely true. I wonder if there's a generic troubleshooting guide on the internet anywhere. It would be very cool to write one. I could easily make one that's better than the Windows hardware troubleshooter.

      I'd recommend doing the following, if you're having trouble with system stability: replace your memory, power supply, video card, sound card, and motherboard, in that order. If you do it in that order, it will minimize the cost, while maximizing your chances of success.

      Also, don't underestimate the problem-solving ability of moving PCI cards to different slots. It really does work sometimes.
  • by BrookHarty ( 9119 ) on Thursday April 04, 2002 @07:04PM (#3287599) Journal
    I personally use the Nvidia chipset. If I want to use video in, I use an MPEG2 capture card that does better resolution and doesn't skip frames. For output, I do get nVidia cards (Asus) with video out, but I prefer ATI's video out. ATI displays a better picture on TV-out; I can display 1024x768 (about 500 lines over S-VHS out) and it's clear. It's visible that ATI has better compression and output to TV/S-VHS. ATI also polishes their driver tools; they look better and have more functions. Nvidia is lean and mean with their tools.

    I picked up a PNY GF4 4600 with 128 megs and VIVO (video in/video out). I'm not impressed with it over a GF3 Ti500; check the benchmarks out and see what I mean. I can't tell the difference between 80 and 90 FPS. The big selling point of the GF4 was running 1600x1200 with 4x AA, which the GF3 can't do. 2x looks good enough for now.

    In case anyone cares about some benchmarks on GeForces and CPUs: I tested three video cards (a GF2 MX, a GF3 Ti500, and a GF4 4600 with 128 MB) and two CPUs (a P3-800 and an AMD 1800). I could swear I had GF3 benchmarks on the P3-800; guess I'll need to do that when I get home. I wanted to show how a slower CPU can play newer games with just an updated GPU.

    AMD 1800 + GF4 4600 - 9697 3DMarks - http://service.madonion.com/compare?2k1=3157957 [madonion.com]
    AMD 1800 + GF3 Ti500 - 8204 3DMarks - http://service.madonion.com/compare?2k1=2777031 [madonion.com]
    P3-800 + GF4 4600 - 6170 3DMarks - http://service.madonion.com/compare?2k1=3167224 [madonion.com]
    P3-800 + GF2 MX - 2368 3DMarks - http://service.madonion.com/compare?2k1=2929648 [madonion.com]

    There was no overclocking done in these tests, but I did hit over 12000 3DMarks with minor overclocking. A quick comparison of these numbers follows below.
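
    As a quick back-of-the-envelope check, here's a small Python sketch using only the 3DMark scores listed above (the numbers are the ones in this comment, nothing from the article). It prints each combo relative to the slowest one, plus the frame-time gap behind "80 vs. 90 FPS":

      # 3DMark 2001 scores copied from the results listed above (no overclocking).
      scores = {
          ("AMD 1800", "GF4 4600"):  9697,
          ("AMD 1800", "GF3 Ti500"): 8204,
          ("P3-800",   "GF4 4600"):  6170,
          ("P3-800",   "GF2 MX"):    2368,
      }

      baseline = scores[("P3-800", "GF2 MX")]
      for (cpu, gpu), score in scores.items():
          print(f"{cpu} + {gpu}: {score} 3DMarks ({score / baseline:.1f}x the P3-800/GF2 MX)")

      # The frame-time gap behind "80 vs. 90 FPS" is only about 1.4 ms per frame.
      for fps in (80, 90):
          print(f"{fps} FPS = {1000 / fps:.1f} ms per frame")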
  • I thought Slashdot said ATI and NVIDIA were merging? What happened to that? :)
  • ATI makes good products, but I still have to give the nod to NVIDIA because of their all-in-one drivers that still support older cards. I installed the latest drivers on a 3-year-old TNT chipset (Diamond card) and actually noticed a performance gain. So if you are using an older (NVIDIA TNT/VANTA on up) video card, try the latest drivers [nvidia.com] (Detonator 28.32); they offer improvements across the board.
  • by Screaming Lunatic ( 526975 ) on Thursday April 04, 2002 @09:34PM (#3288261) Homepage
    That is one of the main reasons I choose nVidia. Whenever I have an OpenGL question, the nVidia driver writers are right there to answer it. It is not hard to find them on the discussion boards at opengl.org and the OpenGL game developers mailing list. There are also tons of OpenGL demos on the developer site.

    Secondly, their Linux drivers are quite good. I don't care too much that they are not open source; at least they work well.

    Btw, here's the reason why nVidia's drivers are not open source: nVidia wanted one driver for all cards under their Unified Driver Architecture model. The open source community (XFree86, I believe, but correct me if I'm wrong) wanted the specs to the actual hardware. nVidia was willing to give the community exactly what their Windows driver-writing team has, and the community did not agree.

    Some agree with nVidia's point of view, others agree with the community. It doesn't really matter, the end result is closed source drivers.

    PK

    • The open source community (XFree86, I believe, but correct me if I'm wrong) wanted the specs to the actual hardware. nVidia was willing to give the community exactly what their Windows driver-writing team has, and the community did not agree.

      I keep hearing this, but I've never actually seen anything to suggest that this is the case. Does anyone have any proof of this?

      Dinivin
      • I googled and googled and googled, but I couldn't find any direct proof. I was able to find this [216.239.37.100] in the Google cache, though. It seems to be the only statement nVidia has made with respect to Linux and open source drivers. If anyone else has more info, that would definitely be cool.

        PK

        • I googled and googled and googled, but I couldn't find any direct proof.

          If you have no proof, then why the Hell did you say it in the first place? Why not just admit that you have no idea if nVidia did, in fact, make that offer to the XFree86 or DRI developers? Now, unfortunately, with your +3 (Insightful) moderation, you've probably even convinced people that that load of crap is actually true.

          Dinivin
  • Summary: everybody's current generation card is within about 25% of the same performance. Nothing exciting. It's not like the days when there were 10x differentials.
  • Until I see native support for Linux with ATI technology, I don't care what the benchmark results are when comparing these two technologies. I'll continue to purchase nVidia-based products.

    I love the Linux support that nVidia provides via updated drivers. Other hardware manufacturers should take note.
  • So many card-owning zealots on both sides have expressed their views, and now I, the confused consumer, find myself attempting to interpret and hence pick and purchase.

    Initially I was in the market for two new cards: one to play my current favorite FPS and one to record TV to MPEG-2 (at a decent resolution). Both had to work under Linux and both had to be within my budget.

    Now, initially the choice seemed simple: an nVidia card for gaming and some other card for TV. Then the Radeon 8500/7500, with all of their seemingly nice PVR options, came out, and suddenly my initial options seemed a bit broader.

    I have yet to see someone lay out the pros and cons of the 8500/7500 DV in an unbiased way.
    There are plenty of reviews of the cards' performance under Windows, but I have yet to see a review of how they perform under Linux. This, I presume, is due to the ATI Linux driver situation which so many nVidia users have gleefully pointed out.

    So many ATI fans laud the fact that nVidia's drivers are closed (?), and so many nVidia fans point out that ATI's Linux performance is less than amazing and that ATI's drivers are only slightly more open than nVidia's (???). To me this whole situation is confusing.

    Basically, from what I can decipher at the moment, ATI's Radeon 8500/7500 DV drivers for Linux do not fulfill what I want them to do (capture TV and play games under Linux), and therefore these two cards are not for me right now.

    I would prefer to support ATI over nVidia, as they 'seem' the more open of the two companies. However, their performance, or rather the performance of the third parties who develop their drivers, seems somewhat under par.

    So, in conclusion, I think I am going to stick with an nVidia card for gaming under Linux and shop around for a different capture card. I am really looking for suggestions as to what I should buy and would be happy to listen to any advice anyone has to offer on this matter.
  • from Conclusions: "...Right now, there's a gaping hole in the middle of NVIDIA's product lineup, because the GF4 MX 460 is apparently stillborn (I challenge you to find a GF4 MX 460 for sale anywhere)..."

    Right. Well, it's 9:02 in the AM. What's there to do anyway?

    Parlez-vous français? (Do you speak French?)

    Leadtek - WinFast GeForce 4 MX 460 [materiel.net]

    MSI - G4 MX460 VT [materiel.net] (looks sexy in red!)

    MSI G4MX460-VT - GeForce4 MX460 64MB DDR with TV out [eurisko.fr], if Materiel isn't good enough...

    They're there. ;) But there aren't many...
