ATI vs. NVIDIA: The Next Generation 239
doppler writes: "There's a killer graphics card round-up at TR today that compares the new GeForce4 and Radeon 8500 128MB cards against each other in extensive testing. Very good stuff. Most interesting: a visual representation of a texture upload problem in OpenGL on the Radeon 8500 chip."
Sweet (Score:1, Troll)
Re:Sweet (Score:1)
On a completely unrelated note, or maybe not so unrelated after all, I can no longer read the article! Perhaps my internet is screwy, or perhaps the tech report was slashdotted in less than 10 minutes. That would be somewhere deliciously between really amazing and really scary. Well,
--Anyone downing on
128! Wowzers (Score:2, Funny)
Re:128! Wowzers (Score:3, Funny)
Re:128! Wowzers (Score:1)
Re:128! Wowzers (Score:4, Interesting)
Texture size is REALLY not a problem. Do you realize how fr*gin big textures can get, byte-wise, before you're just being plain old silly?
It is NOT the size of the textures, people, it is how COMPLICATED those textures can be.
Currently LOD is used to keep video cards from having to render full 256x256 textures when an object, say, only appears as 25 pixels in its entirety on the screen. You know, that sniper across the street with that gun? Yah, that one (duck).
This works quite well, until you get up close to the object. Shoddy, unrealistic bumpmapping (I highly disapprove of bumpmapping, more on this later) can come into play at really close distances, and games like Serious Sam even make this look halfway decent, but it still is not real, or realistic.
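(A minimal sketch of the LOD idea above, not from the original post: real hardware picks mip levels per pixel from texture-coordinate derivatives, but the principle is just matching texture resolution to on-screen size. The function and numbers here are illustrative only.)

#include <math.h>
#include <stdio.h>

/* base_size: full texture resolution (e.g. 256 for a 256x256 texture)
 * screen_px: rough on-screen size of the textured surface, in pixels  */
static int pick_mip_level(int base_size, double screen_px)
{
    double lod = log2((double)base_size / screen_px);  /* texels per pixel, in powers of two */
    if (lod < 0.0) lod = 0.0;                          /* never sharper than the base texture */
    return (int)(lod + 0.5);
}

int main(void)
{
    /* The sniper across the street: a 256x256 texture on something covering
     * only ~5 pixels across ends up sampled from a tiny 4x4 mip level. */
    printf("mip level up close (256 px wide):    %d\n", pick_mip_level(256, 256.0));
    printf("mip level across the street (5 px):  %d\n", pick_mip_level(256, 5.0));
    return 0;
}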
The ONLY way to get good texturing done is to DISPENSE with the concept of textures altogether. Polygons do not make this easy in themselves, and competing technologies can even make it worse. Some technologies like vertex coloring are a bit useful, but not much, and they are just the texturing model relabeled.
But once you DO dispense with textures, ooh yah.
Now for bumpmaps.
Bumpmaps are oftentimes just a cheap shortcut around REAL modeling. Geometry-deformation texturing is the next step, but until we get some video cards that can model each little crack and bump of an object, we are not going to get anything near 100% photo-realism. Not to mention characters with actual nostrils. Yes, there is a level of diminishing returns, but quite frankly, until I can model every last little crack, bump, and lump in a model and have it render in real time on a home user's computer, bumpmapping is what we are stuck with, and I don't like it.
But I repeat, I REPEAT, larger textures (and bumpmaps) are just a cheap, low-quality shortcut. They DEFINITELY have a point of diminishing returns, and it is one that HAS ALREADY BEEN REACHED. Most new games do NOT do just plain old texturing any more, and a lot of what is happening nowadays in relation to textures (bilinear filtering and such) is in fact just ways to correct errors in the original texturing model of thinking, or at least to further refine the mathematical model used to show those textures.
But why do games look better you ask?
Mostly because video cards have any number of fancy T&L units that can independently create some rather nifty effects while working AROUND or OUTSIDE of the plain old texturing model. At the very least, the texturing model of thinking has some... rather funky... math applied to it in an artistic manner, with the results rendered to the screen.
Look at Nvidia's werewolf model as an example.
The HAIRS on it look great.
The actual model though?
Hell, it looks like shit.
No, it does.
Notice the face, people. Horrid. The textures. It is not the modeler's or the textures' fault; it is just a fact that, well hell, you CANNOT do realistic skin textures without using Pixar-level technology.
Actually, I recently read an article from a while back that was an interview with someone at Pixar. They were describing the INSANE level of work that was necessary to even get something that SORT OF looked like a skin texture to render. The FF movie had kinda-sorta-maybe-ifyousquint real-looking skin; it was nice, but it took a lot of work and it still was not perfect. Once again, diminishing returns.
While NVIDIA is doing good work in relation to getting various funky technologies out on the market that move around the texturing problem, as long as we rely on textures as our main source of coloring objects, we as a community of people who love to Blow Things Up are going to have problems.
Hell, the very idea of textures is exactly opposite to how existence works. Objects are not gray by default with colors added later. Objects are... real. They exist. More or less. The color is an INTEGRAL PART of what an object is. You cannot separate the two.
In other words
I want molecular modeling please.
Re:128! Wowzers (Score:2, Insightful)
But as for photo-realism, who cares? I, personally, think that the solutions people have come up with to maximize the hardware's potential are fascinating. And playing non-photo-realistic games has never been a problem for me, any more than watching a non-photo-realistic episode of the Simpsons.
Not that this has anything to do with nVidia vs. ATi. My $.02: Buy nVidia's high end cards if you're rich or like to waste money. Buy ATi's high end cards if you just want to play games and don't care about an extra 2% on your framerate.
--Jeremy
Re:128! Wowzers (Score:2)
When used correctly, then yes, it CAN help, but when used IN PLACE of proper modeling. . .
See ears on almost ANY character in any game. Horrid. Bumpmapping is used instead of putting an actual HOLE in the ear. . . . And most games have no geometric detail to the ears at all!
And playing non-photo-realistic games has never been a problem for me, any more than watching a non-photo-realistic episode of the Simpsons.
Which is a 2d scenario anyways.
But yah, photo-realism is just one example.
Hell, how about some realistic watercolors, eh? Current systems use some VERY high-end pseudo-reality modeling algorithms to make something that kindasortamaybe looks like watercolor, but they sure as hell are not watercolors.
Same with acrylics. Hell, does anybody even know of a program that DOES simulate acrylics? Painter 6 doesn't do it, and it generally tends to have one of the more advanced (consumer-level) paint simulation systems out there.
Of course some day computers WILL have the power to completely model the physical world (Ray tracing is NOT a model of the real world. Ray tracing is actually backwards: light is projected from the virtual viewpoint back to the light source. VERY weird, and it leaves some
Re:128! Wowzers (Score:4, Informative)
I agree.
> Texture size is REALLY not a problem.
It IS when your PC game is being ported to consoles and you ONLY have ~2.5 megs of VRAM, say like on a PS2! (Yes, the PS2 has 4 megs of VRAM, but you need space for the framebuffer and zbuffer.)
Now, consoles make up for the lack of video memory by having high bandwidth (i.e. the PS2 can DMA ~20 megs of textures per frame), but I'd rather upload my textures ONCE, not every bloody frame. Yes, you can be more efficient at texture uploads (draw the model you drew last in the old frame first in the new frame, etc.), but you're still tying up the BUS.
> The ONLY way to get good texturing done is to DISPENSE with the concept of textures all together.
I don't completely agree, but you raise an interesting point, because textures are a form of (color) compression. If we take this to its logical conclusion, we should be able to have a triangle PER pixel, and that would negate the need for textures. Unfortunately that has its own problems -- there's no way we can send a million vertices across, because we'd saturate the bus! Doh! (Give a reward to the person in the back who said, well, let's move to parametric surfaces then!)
In the "Real World" (TM) we have a *unique* texture per pixel (a la ray tracing); however, we don't have the memory to store that, unless we calculate the textures parametrically. Sure, we can get nice "marble" a la Perlin noise, but it's going to be a while before we can mathematically generate EVERY texture!
> But why do games look better you ask?
> Mostly because video cards have any number of fancy TnL units that can independently create some rather nifty effects while working AROUND or OUTSIDE of the plain old texturing model.
You'd be amazed at what multitexturing and multipass rendering can do. Even a simple repeatable base texture with a "random" noise texture overlaid, plus a bump map, looks OK.
> The color is an INTEGRAL PART of what an object is. You cannot separate the two.
You *can* get away with this, but you have to be aware of the tradeoffs. One common "solution" is to crank up the bit-depth.
i.e. If you use 16-bit color channels, a la 64 bits per pixel, then you don't have to throw out your whole rendering functionality -- you just extend it. Not a perfect solution by any means, but "it's good enough."
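(A tiny sketch of what "just extend it" can mean, mine rather than the poster's: the pixel gets wider, but the blending math keeps the same shape, only with far more headroom before banding shows up.)

#include <stdint.h>

typedef struct { uint16_t r, g, b, a; } pixel64;    /* 16 bits per channel = 64 bpp */

/* blend a toward b by t (0..65535), staying inside 32-bit intermediates */
static uint16_t lerp16(uint16_t a, uint16_t b, uint16_t t)
{
    return (uint16_t)(((uint32_t)a * (65535u - t) + (uint32_t)b * t) / 65535u);
}

/* simple non-premultiplied alpha blend of src over dst */
static pixel64 blend(pixel64 src, pixel64 dst)
{
    pixel64 out;
    out.r = lerp16(dst.r, src.r, src.a);
    out.g = lerp16(dst.g, src.g, src.a);
    out.b = lerp16(dst.b, src.b, src.a);
    out.a = lerp16(dst.a, 65535u, src.a);            /* usual "over" alpha */
    return out;
}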
Take a look at "Titanic". The ship was rendered via traditional textures, and it looks pretty good. The hard part is getting that quality in real time with so little memory ;-)
Cheers
--
"The issue today is the same as it has been throughout all history, whether man shall be allowed to govern himself or be ruled by a small elite." - Thomas Jefferson
Re:128! Wowzers (Score:2, Informative)
This isn't true. Textures are not the only things stored in local video memory (i.e. the 32 MB you are talking about). Vertex buffers can also be stored in local video memory. It is quicker for a video card to fetch the vertex data from a buffer in local video memory than it is to read it from system or AGP memory and feed it to the chip. Simply put, bandwidth is less of a bottleneck with local video memory than it is with AGP.
Don't believe me? Try it yourself! There are many example OpenGL and Direct3D apps out there that you could hack (just check out the DevRel sites for both nVidia and ATI). Throw a frame counter in there, and measure it when you create a vertex buffer in system memory, in AGP and local video memory.
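(A rough sketch of that experiment, not the poster's code: it uses the OpenGL vertex buffer object API, core since GL 1.5, as a stand-in for the vendor-specific extensions of the day, and assumes a header or loader that exposes it. The usage hint is the knob that nudges the driver to place the data in local video memory versus AGP/system memory; time your draw loop with a frame counter, as suggested above, to see the difference.)

#define GL_GLEXT_PROTOTYPES      /* with Mesa headers; otherwise use an extension loader */
#include <GL/gl.h>
#include <GL/glext.h>

/* Upload one vertex buffer; the usage hint steers where the driver keeps it. */
GLuint upload_vertices(const float *verts, GLsizeiptr bytes, int want_video_mem)
{
    GLuint vbo;
    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    /* GL_STATIC_DRAW: "written once, drawn many times" -- drivers usually keep
     * this in local video memory.  GL_DYNAMIC_DRAW: rewritten frequently, so it
     * typically ends up in AGP or system memory instead. */
    glBufferData(GL_ARRAY_BUFFER, bytes, verts,
                 want_video_mem ? GL_STATIC_DRAW : GL_DYNAMIC_DRAW);
    return vbo;
}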
Now, you're correct that with "older" apps (pre-2000...heh, only in CS can you call something that's 2-years-old, old), there will be no difference in performance...but that's because those apps were written with 32MB boards as the high performance parts...so they didn't try to use much more than that out of fear of running too slow on what was current hardware. There will be a difference on any graphics-intensive app that was made in the last couple of years.
As games begin to use more complex models (and larger textures), even 128 MB will someday be too small.
Re:128! Wowzers (Score:3, Interesting)
It is not like vertices have to be loaded THAT often, and when they are, they can often be predicted, though how Messiah did things sucked (wow, look at that! SERIOUS texturing problems AND ass-end load times AND the scripts get fucked up! Bah). A GOOD loader can load a level dynamically and Not Suck.
I would MUCH rather have 64*2 megabytes of RAM with a half-clock separation between them (in other words, fast-ass access).
I myself will likely still be using my Matrox G400 MAX.
I cannot believe that some complete IDIOTS credit ATI with first having dual desktop displays. . . . grrr. Idiots.
yes! yes! (Score:3, Funny)
ATI and drivers (Score:3, Interesting)
Re:ATI and drivers (Score:1)
Re:ATI and drivers (Score:5, Insightful)
I've sent NVidia some mail stating that because of their support for my OS, I plan to continue buying their products. It's good to give them that kind of feedback, I think.
Re:ATI and drivers (Score:4, Insightful)
Re:ATI and drivers (Score:2, Interesting)
On the blue screen of death was the module stack. And always, always at the top of the list would be the offending culprit, atirage.dll
Need I say more?
Re:ATI and drivers (Score:3, Insightful)
I'll give you that when the drivers work, they work quite well. They look good, and run fast. But part of the reason I started using linux in the first place was to avoid the constant rebooting that comes with the alternative. Being totally closed eliminates the possibility of someone else coming in and fixing the problem, so all I can do is wait and hope they fix it on their own....and so far, they haven't.
Re:ATI and drivers (Score:3, Insightful)
You know why 3D still generally sucks on the PC? Because the market has been ground to a halt by patents and restrictive licensing. Imagine if the Internet developed this way. Stupid greed.
Re:ATI and drivers (Score:2)
Well, zealots can stick to "open" crappy, out-dated, and unmaintained stuff [ati.com], and non-zealots can stick with working, timely, stable stuff [nvidia.com]. Usually, I let practicality choose instead of politics.
You know why 3D still generally sucks on the PC?
Interesting, I've never noticed that. Compared to what? Game consoles that are built solely for churning polygons, or the latest flick from Pixar?
Stupid greed.
Imagine that! A company wanting to protect the hard work it has put into their products! And on top of that, they want to turn a profit! How disgusting!
Re:ATI and drivers (Score:2)
I tried a Radeon card once, but promptly returned it because there were no drivers; only after a while did they finally appear.
I have a Radeon gathering dust for similar reasons. Its capabilities are much beyond the (now fairly crappy) GeForce 2 MX I'm using currently, but the drivers just aren't there for the Radeon. I spent days (yes, days) getting DRI working properly with the Radeon, and what did I find?
I'd prefer NVidia's drivers were open source (some fixes just take far too long), but they do a good job so far. Considering this is a desktop-oriented company supporting an OS with such an insanely small portion of the desktop market, I'm very pleased. Everybody should stop bitching about NVidia, and praise them for investing their resources in making things possible for us. As long as they do a good job, we should support them (yes, including with dollars).
Re:ATI and drivers (Score:2, Informative)
Re:ATI and drivers (Score:2)
Incompatible? That shouldn't be happening. Remember that the nVidia drivers are kernel modules, so when you upgrade your kernel, you need to reinstall the driver. Just go back into NVIDIA_GLX-1.0-NNNN (if you got the tarballs, which I recommend) and su; make install.
Re:ATI and drivers (Score:2)
I have never had this happen with the 2.4 kernel series and I upgrade regularly. From the nvidia documentation
"
Or if you have a kernel crash and discover that you now get NO support from anyone since they can't debug that driver.
Once again I have never had a kernel crash with an nvidia driver. What makes you think it was at fault?
I've actually banned my employer's supplier from including nvidia cards so I can be more flexible about what OS I want to put on the machines later.
Your and your employer's loss. Besides, if you don't like the accelerated drivers, just use the XFree86 one. It's not like you'll get good 3D for linux with any other card.
Re:ATI and drivers (Score:2)
So what? They have no obligation to open up their development, and they might have good reasons for not doing so. The fact remains that they provide linux drivers out of goodwill (I'm pretty sure they're not making enough money from linux users to justify development on them), and they deserve kudos for doing so. ATI doesn't provide squat for linux.
No GL or DRI support unless you are running a distribution that they support
I use Debian. It's not 'supported'. Both GL and DRI work fine.
Release the raw data (Score:4, Insightful)
Does anybody have a pool of varied CPU & motherboard machines, new and old? There are a couple of statistical tools I would like to throw at the benchmarking problem - if only I had the data.
Game Programming (Score:5, Insightful)
I've used the latest flavours of the ATI Radeon series, and the drivers always seem to be a bit unstable. Downloading updated drivers doesn't always fix the problem, either; sometimes, it makes the problems worse. It's hard to tell whether they're even trying. It seems ATI, at this point, is just trying to keep up with NVIDIA in terms of speed, rather than in speed, quality, and stability.
NVIDIA, on the other hand, fixes bugs properly *the first time*. They don't really produce many bugs, either, which means they can put forth more effort toward making everything more featureful.
There's no contest, in my opinion. NVIDIA wins, hands down. It will take quite a bit for ATI to change my mind, or the minds of my game programming colleagues, about this one.
Newest drivers (Score:2, Informative)
I agree, for the most part... when the 8500 came out, it was months before ATI released official, updated drivers. When they did, they were an improvement, but still had some stability issues. I was disappointed that after all that time, they still hadn't gotten it right. Especially after they kept talking about their "new commitment"
But then they released newer drivers pretty quickly. Fixed some rendering bugs, seem much more stable... I'll wait and see a little longer before recommending them to anyone else, but it looks like they may be getting their act together.
Re:Game Programming (Score:4, Interesting)
Re:Game Programming (Score:3, Interesting)
Even more annoying was that I found the nvidia drivers would break up the audio on my SBLive. I would get all these crackles in the audio, which were very annoying. When I replaced the nvidia GeForce with a Radeon, all of those problems went away. Overall I am not impressed with nvidia quality at all. ATI is better, but for a really stable video card I would go with Matrox. On a box I have with a G200, X has NEVER crashed. I have never had a single issue with that card.
Re:Game Programming (Score:2)
Full drivers with TV-out. If you run XFree86 4.2.0 you get g550 with dual-head support out of the box..
Re:Game Programming (Score:2)
Re:Game Programming (Score:2)
Re:Game Programming (Score:1)
Re:Game Programming (Score:2)
For what it's worth, I recently downloaded the latest drivers for my old TNT2 card off of NVIDIA's web site (for JK/II), and it absolutely screwed my system. Then I went back and downloaded them off my manufacturer's web site (Guillemot), and they worked great.
Moral is that it's better to get them off your manufacturer's web site.
The GeForce4 TI 4200 is the best (Score:5, Interesting)
But it gets better. The TI 4200 can be overclocked to speeds comparable to the TI 4600, Nvidia's fastest card. Get the fastest performance available for half the cost!
OT: non-AGP graphics card? (Score:4, Interesting)
PCI video card (Score:1, Informative)
I know that ATI still does occasionally put out a batch of 32MB PCI Rage 128 Pro cards, and Matrox has some PCI cards designed for multiple-monitor configs... but overall there are no other PCI cards.
Compare PCI (33MHz, or 66MHz if you are lucky, shared between your PCI devices) to AGP (133MHz+ on a dedicated channel) and see which one you would rather have.
Also, the speed of most GPUs (graphics processing units) these days is too fast for the PCI bus to feed them data. That is why you will never find a PCI Radeon or a PCI GeForce2 GTS or better. Heck, AFAIK the GeForce2 MX-400 cards are not PCI.
So that is what happened to PCI cards.
Please mod me up as I am not logging in
Even more OT (Score:1)
Re:Even more OT (Score:3, Informative)
The Tyan Thunder K7 [tyan.com] includes dual-channel Adaptec Ultra160 SCSI, dual 3Com Fast Ethernet NICs, an AGP Pro 50 slot, 64-bit PCI, and a bunch of other stuff. It's also a dual-processor board, so you get twice the Athlon goodness. :-)
Re:Even more OT (Score:2)
Not according to this [slashdot.org], which links to an article [hardwarezone.com] on modding newer Athlon XPs so they'll work in multiprocessor configurations. (Older Athlon XPs have been said to work without this mod.)
Voodoo 4/5 might do for you (Score:4, Informative)
They're also dirt cheap on ebay, as WinXP and MacOSX don't support Voodoo cards, and people are selling them off for better cards.
You may also look for Mac cards - for the longest time there was no AGP slot on the Mac, and I think you can get a PCI Radeon with Mac ROMs. Flash it to be x86 compatible, and there you go.
BBK
Re:Voodoo 4/5 might do for you (Score:3, Informative)
They do have PCI GeForce 2 boards though for obvious reasons they suffer a performance hit when compared to the AGP versions... That's your best bet until you upgrade your mobo.
Re:Voodoo 4/5 might do for you (Score:2)
So far they appear to work ok in Win98, but since they're not certified, I couldn't get them to install properly in XP. Many of the new drivers at least claim to be Directx 8.1 compliant.
What's the point? (Score:3, Interesting)
I don't see a reason for most people to upgrade to one of these things unless they are developing 3D technology.
Re:What's the point? (Score:2)
Re:What's the point? (Score:2, Insightful)
Everquest
RealFlight
Wolfenstein (everything turned to max)
Microsoft Flight Simulator
James
Re:What's the point? (Score:5, Insightful)
Strangely, most people don't seem to realize that this is a BAD THING, unless your app is running at an integer multiple of your monitor refresh rate.
To make it simple, imagine your monitor scans at 60 Hz. So every 60th of a second (16.67 msec) you get a whole new frame drawn on the screen. Assume, for sake of argument, that drawing the screen takes zero time. It's just instantaneous.
To achieve smooth motion, the same amount of time must pass between each frame. This is guaranteed if your application renders 60 frames per second. Unless you drop a frame somewhere, you'll see one rendered frame for every screen refresh, and you'll perceive smooth motion.
But what if you drive your screen at 60 Hz, and your application renders 95 frames per second? (Assume that it's exactly 95 fps all the time, rather than a variable frame rate, just to make the math work out for this example.)
When you run your game or whatever, the clock starts at zero. The first frame from the graphics pipeline is in the display buffer, so when the monitor gets ready to draw the screen, it draws frame zero.
10.53 msec later, the application has drawn the second frame, so it swaps buffers. The display buffer now has frame 1 in it. The application now starts drawing frame 2.
But the graphics card isn't ready to draw frame 1 on the monitor until a little over 6 msec later, at t = 16.67. At that time, though, the application hasn't finished drawing frame 2 yet, so frame 1 is still in the display buffer. The monitor draws frame 1. Game frame 1 comes after game frame 0, so we're still in sync.
During this time, the application has been working on frame 2. It finishes frame 2 at t = 21.05 msec and swaps buffers. Frame 2 is now in the display buffer, and the application starts drawing on frame 3.
The monitor is ready to draw frame 2 at t = 33.33 msec. So it reaches for the frame from the display buffer... but what's this? The frame in the display buffer isn't frame 2. It's frame 3! We dropped a frame somehow!
In the meantime, at t = 31.58 msec, the application had finished drawing frame 3. It swapped buffers again, before the graphics card got a chance to display frame 2. Frame 2 disappeared from the display buffer, never having been shown on the monitor. That's a dropped frame, and it's a bad thing.
Games aren't hard-real-time applications, of course. They run freely, sometimes drawing frames more quickly, and sometimes less quickly, depending on the load. This is okay. But don't just assume that because your game runs consistently at a rate higher than your monitor, you won't be dropping frames. In fact, you'll drop frames like crazy, at a rate determined by how far your game frame rate is from your monitor rate, in modulo arithmetic.
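(A quick sketch of the same scenario in code, not from the parent post: a 60 Hz monitor, a steady 95 fps renderer, double buffering with vsync off. It counts how many rendered frames get overwritten before the monitor ever shows them; the numbers are illustrative only.)

#include <stdio.h>

int main(void)
{
    const double refresh = 1.0 / 60.0;   /* monitor period, ~16.67 ms */
    const double render  = 1.0 / 95.0;   /* app frame time, ~10.53 ms */
    int last_shown = -1;                 /* index of the last frame the monitor displayed */
    int shown = 0, dropped = 0;

    for (int scan = 0; scan < 600; scan++) {             /* ten seconds of refreshes */
        double t = scan * refresh;
        int in_buffer = (int)(t / render);               /* newest completed frame at time t */
        if (in_buffer > last_shown + 1)
            dropped += in_buffer - (last_shown + 1);     /* frames overwritten, never displayed */
        if (in_buffer != last_shown)
            shown++;
        last_shown = in_buffer;
    }
    printf("refreshes: 600, new frames shown: %d, frames dropped: %d\n", shown, dropped);
    return 0;
}

Over those ten simulated seconds it shows about 600 frames and drops roughly 350 -- which is just the 95 - 60 = 35 frames per second that the modulo argument above predicts.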
Re:What's the point? (Score:2)
But don't just assume that because your game runs consistently at a rate higher than your monitor, you won't be dropping frames.
Isn't that what the vsync setting is for??? (to only render a new frame each time the monitor is refreshed, so you don't get "tearing"). It's turned on by default with the few cards I've installed, but they always need to turn it off to run these benchmarks.
Re:What's the point? (Score:2)
Re:What's the point? (Score:2)
Just adding to the information pool...
Re:What's the point? (Score:2)
I'm Glad (Score:4, Insightful)
I have a DV cam with RCA inputs, and firewire, so my video card doesn't need to be able to capture, just a nice S-Video out for watching downloaded southparks on my Wega in the living room.
Re:I'm Glad (Score:2)
Actually, Nvidia purchased most of the intellectual property and rights belonging to 3Dfx back in December of 2000.
ATI and NVidia (Score:1, Flamebait)
So now it nVidia all the way for me.
Why ATI are a bunch of sissies (Score:5, Funny)
Meanwhile, NVIDIA continues its dedication to their customers by giving them 128MB of VRAM; conveniently providing the customer with 32 extra MB of VRAM to use as a RAMdrive. Instead of fudging around with names like ATI does, they've simply decided to follow 3DFX's naming scheme and simply name their cards GeForce(n + 1). I look forward to the day when the GeForce requires an input from the +5V power supply.
Re:Why ATI are a bunch of sissies (Score:2)
Nvidia does this too. They call it Lightspeed Memory Architecture II [nvidia.com].
Instead of fudging around with names like ATI does, they've simply decided to follow 3DFX's naming scheme and simply name their cards GeForce(n + 1).
But they do this so badly. 1 -> 2 was just an increase in clock rate. 3 was a new generation. 4 is a clock rate increase -- except for the G4 MX, which is SLOWER than any of the G3 cards. Stupid.
Re:Why ATI are a bunch of sissies (Score:3, Informative)
Developer Relations (Score:4, Insightful)
nvidia vs. ati (Score:4, Interesting)
When it's all said and done, I have to place my vote for nVidia, hands down. There are many reasons for this... however, this is the most compelling...
nVidia Drivers page link [nvidia.com]
ATI Drivers page link [ati.com]
At home I run about 7 computers, a mix of Linux, WinXP, 2K, and 98. The fact that my geforceX cards can and will run great in all of the above OS's with proper driver support is all I need to buy from nVidia. Good customer support, and good OS support. That will bring in my dollars...
Re:nvidia vs. ati (Score:2, Insightful)
True... but the ATI cards do run in linux, and ATI does provide a link to the drivers. And ATI provides the specs to do open-source drivers. For me, it isn't that much of a concern that the geforce drivers are proprietary... but it is a concern. And for purists, it's a major concern.
It wasn't a concern at all, until I ran into a situation where I actually wanted to look at the source...
Re:nvidia vs. ati (Score:2)
I have not done much more than a cursory look through the ATI drivers pages. After not finding them quickly, I just as quickly ditched ATI in favor of nVidia. If they do provide them, it would be quite helpful to make them more easily accessible.
And yes, I'm aware that if I really wanted to find them I could quickly google for the driver also.... ;)
-ryan
Re:nvidia vs. ati (Score:3, Informative)
Then you wouldn't mind showing me the specs where I can switch on/off the Macrovision part, would you?
Oh, how about giving me very fast 3D drivers? Oh, that will only be available in June...
What about using both processors of the Rage 128 MAXX in Linux? No support...
Maybe I can get full support for both TV-out and VGA without Xinerama (a la Nvidia's TwinView)? Nope, not supported...
Yes, nVidia doesn't give out the source or specs, but I can use ALL the features of my GeForce card -- top to bottom -- while with the ATI Radeon I (currently) can't, not to mention the Matrox G450/G550.
I don't give a damn about the source -- I give a damn about a full-featured driver (which has some nice extras -- true dual head without the need for Xinerama, shadow mouse, 2 versions of AGP support for compatibility, and tons of other features) which I don't get with the others...
Sad, but true..
Re:nvidia vs. ati (Score:2)
Good for you. Some of us do give a damn about the source (and not for purely philosophical reasons, for practical reasons).
Dinivin
Why I won't buy a NVidia (Score:2)
As a result, the XFree guys had to stop developing for the Voodoo series, and I find myself with a card that won't ever be totally supported, nor will the current driver ever be debugged. The only way I can get a stable X server now, without my current weekly or so weird crash, is by buying a new card. Needless to say, it will not be an NVidia; trust me on that one.
Re:nvidia vs. ati (Score:4, Informative)
http://www.ati.com/support/faq/linux.html
I have 2 computers at home, one with an nVidia TNT2 card and the other with an ATI Rage 128 Pro, and I can tell you, I'm much happier with the ATI one (the nVidia one sometimes freezes the whole system, for instance).
The overall situation (If I'm not wrong) is that even though nVidia provides the drivers (and even the source), they don't disclose technical information about the cards, while ATI does the opposite.
Re:nvidia vs. ati (Score:2)
Good point. I would say that from an end user standpoint and somebody that doesn't always want to spend hours tinkering with my boxen, having that link readily available from the drivers area would be very helpful.
Granted, if I was really interested in finding out more about the ATI cards and whether they have linux support, I would have spent much more time during my decision process. However, I wanted a quick and simple solution for multiple machines without a hassle. nVidia made it clear to me as an "end-user" that they supported all of my OS's from the download page. ATI didn't. This is pretty much just a "bad marketing" or "bad website" issue, but nonetheless it was enough for me to buy 7 nVidia cards ;)
-ryan
Re:nvidia vs. ati (Score:2)
But you're right, it wouldn't hurt for them to have such a link.
This is pretty much just a "bad marketing" or "bad website" issue, but none the less, it was enough for me to buy 7 nVidia cards
Maybe you should tell them that (as NOT having the link actually hurt them).
Re:nvidia vs. ati (Score:2)
I don't know if this has been remedied since then, this was the status 6 months ago.
thad
Re:nvidia vs. ati (Score:2)
Re:nvidia vs. ati (Score:2)
Not having Win98/ME drivers is vaguely surprising, but not too much so.
And, of course, like everyone else, I have to question just how smart you were to buy the nForce board when there's no driver support for the OS you wanted to run. If you bought the board first and figured it out second, well, that should damn well teach you to do your homework next time.
You do realize that there are highly integrated KT266A and KT333 motherboards out there, right? The only thing the nForce 420 has that they don't is integrated video.
Finally... uh... added 50%? What exactly did you build? WinXP Home OEM is $88. OEM Pro is $140. The home edition is already less than the cost of the motherboard, the Pro is pretty close to the cost. Once you add a hard drive, memory, CPU, keyboard, mouse, and monitor there's no way in hell that it's 50% more. Yeah, you were probably upgrading piecemeal. Again, you fucked up and didn't do your research, but want to blame someone else for it instead of accepting your own screwup.
Where the Linux drivers link is (Score:3, Informative)
If you go to the retail section, there is an OS menu with Windows, MacOS, BeOS (!), and Linux.
Re:Where the Linux drivers link is (Score:2)
-ryan
Re:nvidia vs. ati (Score:2)
I am happy to tell you that you can use ATI cards in Linux. I am sorry to tell you that the nVidia drivers are much worse than any other drivers for Linux, because they are closed source. It means you get a stable system if you have luck, or Russian roulette if you don't.
Source... (Score:2)
They're using old drivers for the 8500... (Score:4, Informative)
1) Other manufacturers that produce ATi based cards have released more recent drivers. (6043 or even 6052 [rage3d.com])
2) In version 6043 a very large bug was fixed that increased OpenGL performance a lot in some cases (in extreme cases from 55 to 170 fps).
As a result some of their 8500 results could improve if they used more recent drivers.
good service... (Score:2, Informative)
but:
When I am thinking of getting a new video card and wondering which will have Linux support, I KNOW that nvidia (although proprietary) will already have drivers to download. Although ATI provides people with the specs to write drivers, for the 8500, for example, it took a while for drivers to come out... which, at least for me, is a setback.
QED
363kg gorilla . . . (Score:3, Insightful)
Also, I imagine there's still a lot of residual feeling from the days when NVidia said they were going to release open source drivers for their cards. I spent $360 AU on a TNT2 Ultra on the strength of that promise, and for six months I played Q3demo at about 12fps on a system that could have done 35 at least. I
To top that off, ATI, although they don't write Linux drivers themselves,
So yeah, it's kind of dodgy to say "use the most recent drivers!" and discount any performance differences, but many people have reason for being less than happy with NVidia's dominance.
himi
I am an NVIDIA customer for now (Score:2, Informative)
Many reasons:
1. They produce the chipset, and others make the video cards, so each company is trying to outdo the others on features/price. For example, there was a video card I found for 30 dollars cheaper just because it did not have S-Video out, but it was otherwise the same card.
2. MOST IMPORTANT: nvidia seems to care. Although they do not release all their 3D specs, they released enough for the XFree/whoever guys to create the nv driver. But their own driver is really great. I think they are the only 3rd party that is actually writing drivers for Linux. My only disappointment with their driver is that it failed to work with the kernel framebuffer and caused a hardware freeze when running X. (Could be an AGP problem.) Does anyone know if that has been fixed?
That said, I will probably not buy any video cards for the next year. Damn it but I do not need that much juice to run CS.
Who cares who is the best (please read b4 flaming) (Score:4, Insightful)
Honestly, I don't want to see any one company sitting unchallenged at the top. M$ is sitting there with the desktop OS market and look at the 'quality' product that they bash out.
The fact that nVidia and ATI are fighting over the same bone means that there will be continued innovation by one to outclass the other.
Result? Better products, since one company can't afford to sit on their laurels and must keep striving to better themselves.
Who's best? Does it matter? Considering that next year/month/week someone is going to outshine the rest and make the others scramble to keep up and/or beat them.
Just my two cents worth.
I have no loyalty (Score:2, Interesting)
I don't care who has what; what ultimately motivates me is whether the card can do what I want it to do, and for now that's making my games look good and smooth at a price I am happy with.
Hardware issues with nVidia (Score:2, Informative)
Now, in the latest PowerMac G4s (the first machines to ship those cards, even before they were officially announced), the GeForce4 MX has a very nasty-looking problem that appears to be electrical. People are freaking out; the problem looks like it could damage the screens attached to the card.
The Apple discussion board discusses this at length. Of course, neither Apple nor nVidia acknowledges the problem.
Apple discussion thread [apple.com]
Mixed in that discussion you can also see there are OTHER issues, reminiscent of the RAMDAC problems of the 2MX, that pop up as well.
I truly wish I had selected the ATI 7500 when I bought the G4. I would have spent the money I saved on an 8500!
Re:Hardware issues with nVidia (Score:2)
I'm not sure if Asus uses the "reference" model or not, but I can almost guarantee that Apple does not. If there is a flaw with one of ATI's products, there is no-one to blame but ATI.
-kwishot
Re:Hardware issues with nVidia (Score:2, Informative)
One thing I've noticed.. (OT) (Score:3, Interesting)
Hell, just not having a PCI card plugged in correctly can totally trash a computer with a low-quality MB. Ever pulled out a PCI card while the system is running? Sometimes it reboots, sometimes it doesn't.
The point of this diatribe is that people seem very polarized on the subject of video cards, mostly due to the other guy's card not working for them, when in many cases it probably wasn't the video card causing the problem at all, but rather an incompatibility in their system that was brought out by the video card.
Guess it's the price we pay for getting such cheap, bleeding edge systems.
Re:One thing I've noticed.. (OT) (Score:2)
I'd recommend doing the following, if you're having trouble with system stability: replace your memory, power supply, video card, sound card, and motherboard, in that order. If you do it in that order, it will minimize the cost, while maximizing your chances of success.
Also, don't underestimate the problem-solving ability of moving PCI cards to different slots. It really does work sometimes.
Nvidia Chipsets vs Nvidia TV out (Score:3, Informative)
I picked up a PNY GF4 4600, 128 megs, VIVO (video in/video out). Not impressed with it over a GF3 Ti500. Check the benchmarks out and see what I mean. I can't tell the difference between 80 and 90 FPS. The big selling point of the GF4 was running at 1600x1200 with 4x AA, which the GF3 can't do. 2X looks good enough for now.
If anyone cares about some benchmarks on GeForces and CPUs: I tested 3 video cards and 2 CPUs. GF2 MX, GF3 Ti500, GF4 4600 (128 meg), a P3-800 and an AMD 1800. I could swear I had GF3 benchmarks on the P3-800; guess I'll need to do that when I get home. I wanted to show how a slower CPU can play newer games with just an updated GPU.
AMD 1800 + GF4 4600 - 9697 3D marks - http://service.madonion.com/compare?2k1=3157957 [madonion.com]
AMD 1800 + GF3 Ti500 - 8204 3D marks - http://service.madonion.com/compare?2k1=2777031 [madonion.com]
P3 800 + GF4 4600 - 6170 3D marks http://service.madonion.com/compare?2k1=3167224 [madonion.com]
P3-800 + GF2 MX - 2368 3D marks http://service.madonion.com/compare?2k1=2929648 [madonion.com]
There is no overclocking done on these tests, but I did hit over 12000 3Dmark with minor overclocking.
Wait a sec... (Score:2, Funny)
What's old seems new again (Score:2)
nVidia supports OpenGL (Score:3, Insightful)
Secondly, their Linux drivers are quite good. I don't care too much if they are not open source, at least they work well.
Btw, here's the reason why the nVidia drivers are not open source: nVidia wanted one driver for all cards under their Unified Driver Architecture model. The open source community (XFree I believe, but correct me if I'm wrong) wanted the specs to the actual hardware. nVidia was willing to give the community exactly what their Windows driver-writing team has, and the community did not agree.
Some agree with nVidia's point of view, others agree with the community. It doesn't really matter, the end result is closed source drivers.
PK
Re:nVidia supports OpenGL (Score:2)
I keep hearing this, but I've never actually seen anything to suggest that this is the case. Does anyone have any proof of this?
Dinivin
Re:nVidia supports OpenGL (Score:2, Informative)
PK
Re:nVidia supports OpenGL (Score:2)
If you have no proof, then why the Hell did you say it in the first place? Why not just admit that you have no idea if nVidia did, in fact, make that offer to the XFree86 or DRI developers? Now, unfortunately, with your +3 (Insightful) moderation, you've probably even convinced people that that load of crap is actually true.
Dinivin
Quick summary (Score:2)
linux driver support?? (Score:2, Interesting)
I love the Linux support that nvidia provides via updated drivers. Other hardware manufacturers should take note.
Do I sense a little anger in the air? (Score:2, Interesting)
Initially I was in the market for 2 new cards: one to play my current favorite FPS, and one to record TV to MPEG-2 (at a decent resolution), both of which had to work under Linux and both of which had to be within my budget.
Now, initially the choice seemed simple: an nvidia card for gaming and some other card for TV. Then the 8500/7500 Radeons, with all of their seemingly nice PVR options, came out, and suddenly my initial options seemed a bit broader.
I have yet to see someone lay out the pros and cons of the 8500/7500DV in an unbiased way.
There are plenty of reviews of the cards' performance under Windows; I have yet to see a review of how the cards perform under Linux. This I presume is due to the ATI Linux driver situation, which so many nvidia users have gleefully pointed out.
So many ATI fans laud the fact that nvidia's drivers are closed (?), and so many nvidia fans point out that ATI's Linux performance is less than amazing and that ATI's drivers are only slightly more open than nvidia's (???). To me this whole situation is confusing.
Basically, from what I can decipher atm, ATI's Radeon 8500/7500DV drivers for Linux do not fulfill what I want them to do (capture TV and play games under Linux), and therefore atm these two cards are not for me.
I would prefer to support ATI over nvidia, as they 'seem' the more open of the two companies.
However, their performance, or rather the performance of the 3rd-party people who develop their drivers, seems somewhat under par.
So in conclusion, I think I am going to stick with an nvidia card for gaming under Linux and shop around for a different capture card. I am really looking for suggestions as to what I should buy and would be happy to listen to any advice anyone has to offer on this matter.
Hmmmmm... no MX 460's for sale? (Score:2, Informative)
right. well, it's 9:02 in the AM. what's there to do anyway?
parlez-vous français?
Leadtek - WinFast GeForce 4 MX 460 [materiel.net]
MSI - G4 MX460 VT [materiel.net] (looks sexy in red!)
MSI G4MX460-VT - GeForce4 MX460 64Mo DDR sortie TV [eurisko.fr] if Materiel isn't good enough...
They're there.
Re:rumor has it... (Score:2, Funny)
Re:rumor has it... (Score:2)
Re:Who cares if NVIDIA uses closed drivers (Score:2, Informative)
Yes, OpenGL works on Radeons but it just works like a fast Rage 128. For example, no vertex shaders.
That's my gripe.