120+ GeForce FX Reviews Collected 142
Peter writes "We at 8Dimensional just finished our list of GeForce FX reviews. It aims to show all reviews of these video cards currently online; 120+ are listed at the moment." Hmmm, time to upgrade from an Xpert@Play98 ...
Too bad they will all have to be redone (Score:5, Interesting)
Re:Too bad they will all have to be redone (Score:5, Interesting)
It's worth noting that ATI also "cheated", but they still correctly rendered the scene. All they did was reorder the shader instructions so they were optimized for their architecture. It only boosted their performance by 3%, IIRC.
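The reordering described above can be sketched like this (a toy register machine in Python, with a hypothetical two-op instruction set, not ATI's actual shader ISA): as long as no instruction is moved ahead of the one that produces its input, the reordered program gives bit-identical output.

```python
# Toy model of semantics-preserving shader instruction reordering.
# Instructions are (dst, op, src1, src2) tuples over a register file.

def run(program, regs):
    """Execute the instruction list and return the final register file."""
    regs = dict(regs)
    for dst, op, a, b in program:
        if op == "mul":
            regs[dst] = regs[a] * regs[b]
        elif op == "add":
            regs[dst] = regs[a] + regs[b]
    return regs

original = [
    ("r2", "mul", "r0", "r1"),  # r2 = r0 * r1
    ("r3", "add", "r2", "r0"),  # r3 depends on r2
    ("r4", "mul", "r1", "r1"),  # independent of r2 and r3
]

# Hoist the independent mul so it can issue earlier (hiding latency on a
# pipelined GPU); the r2 -> r3 dependency is preserved, so output matches.
reordered = [
    ("r2", "mul", "r0", "r1"),
    ("r4", "mul", "r1", "r1"),
    ("r3", "add", "r2", "r0"),
]

inputs = {"r0": 2.0, "r1": 3.0}
assert run(original, inputs) == run(reordered, inputs)
```

This is the key distinction from shader *replacement*: a dependency-respecting reorder cannot change the rendered image, only the schedule.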
Re:Too bad they will all have to be redone (Score:2)
Re:Too bad they will all have to be redone (Score:5, Informative)
If it was a really narrow path optimization then it's borderline, but far different from the wholesale cheating of NVIDIA, who rewrote shaders with completely different results, didn't clear the screen at certain points, and added hidden clip planes to eliminate pixel fill. All very underhanded, and why you would make excuses for this I don't know. Sure, it's a synthetic benchmark, but if it doesn't matter, then don't cheat.
Re:Too bad they will all have to be redone (Score:2, Informative)
Um... he was talking about 3DMark, not the Quack issue (BTW, there are no shaders in Quake 3), which, although people insist on constantly bringing up, was most likely not a cheat; ATi not only fixed the 5 problematic textures, but also incre
Re:Too bad they will all have to be redone (Score:2, Informative)
John Carmack released a document YEARS ago explaining to hardware vendors how to optimize their OpenGL implementation for the Quake 3 engine. Such things as Vertex Array Client States (i.e. GL_VERTEX_ARRAY) never changing (always enabled), specifics for multi-texturing, vertex structure size, etc...
Given that anyone who cares to search Google for a minute or two can pull up this document ( The original doc is gone
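The kind of advice in that document can be illustrated with a mock driver (a hypothetical Python sketch, not real OpenGL calls): state that never changes, such as the vertex array client state, should be enabled once and left on, rather than toggled around every draw call, because each state transition costs driver-side work.

```python
# Mock driver that counts state transitions, to show why "enable once,
# leave it on" beats per-draw toggling of never-changing state.

class MockDriver:
    def __init__(self):
        self.enabled = set()
        self.transitions = 0  # driver-side work we want to minimize

    def enable(self, state):
        if state not in self.enabled:
            self.enabled.add(state)
            self.transitions += 1

    def disable(self, state):
        if state in self.enabled:
            self.enabled.remove(state)
            self.transitions += 1

def naive_frame(gl, draws):
    """Toggle the vertex array state around every single draw call."""
    for _ in range(draws):
        gl.enable("VERTEX_ARRAY")
        # ... draw ...
        gl.disable("VERTEX_ARRAY")

def quake3_style_frame(gl, draws):
    """Enable once at startup and leave it on, as the document advises."""
    gl.enable("VERTEX_ARRAY")
    for _ in range(draws):
        pass  # ... draw ...

naive, tuned = MockDriver(), MockDriver()
naive_frame(naive, 100)
quake3_style_frame(tuned, 100)
assert naive.transitions == 200 and tuned.transitions == 1
```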
Re:Too bad they will all have to be redone (Score:2)
Re:Too bad they will all have to be redone (Score:1)
You could argue that replacing complete shaders is optimizing (as long as the shader delivers the same output), and you could be right if this were a game. However, the 3d
Re:Too bad they will all have to be redone (Score:2)
Re:Too bad they will all have to be redone (Score:2)
uncharacteristic of current games... (Score:2)
You don't need to buy the top of the line card to play the current games well. So you want a benchmark that'll tell you how your card will deal with the features that are new to the card and will make their way down to the games later.
If you're buying a top of the line card, either you just want the status of having a top of the line card, in which case you want it to pass all benchmarks with flying colors, or you want a card that you won't be replacing for a while...in which case you w
Re:Too bad they will all have to be redone (Score:3, Interesting)
And yet when it comes to the 3DMark test, ATI creams NVidia when that new patch is applied.
To me that makes the 3DMark benchmark very suspect. The two possibilities I can see are that NVidia's original complaints against the benchmark are justified and yes, they cheated to boost the score, OR 3DMark are bitter at NVidia a
Patches? (Score:1)
So here now are my benchmark results for the GeForce 2 MX 64MB on a P4 @ 2.4GHz with Gentoo 1.4rc4: Amazing, isn't it?
And they all prove... (Score:5, Funny)
Re:And they all prove... (Score:1, Offtopic)
Re:And they all prove... (Score:1)
RTFA -- the 5200 doesn't even have a fan (Score:2, Interesting)
But going back to your inappropriate comment
Re:RTFA -- the 5200 doesn't even have a fan (Score:2)
Re:And they all prove... (Score:2)
Re: (Score:1)
You know, slashdot itself would be more useful (Score:5, Interesting)
Re:You know, slashdot itself would be more useful (Score:2)
is it really an upgrade? (Score:5, Funny)
Hmmm.. I don't know if a GeforceFX is actually an *upgrade* from the Xpert@play98. What benchmark are you using?
Re:is it really an upgrade? (Score:5, Funny)
The "decibels produced by the video card" benchmark.
mark on a bench (Score:1, Funny)
The GeforceFX has a higher score on my benchmark by 19 meters. Thus it truly is an upgrade for me.
Re:is it really an upgrade? (Score:1)
uhhh (Score:5, Insightful)
While the idea of a site that shows all the reviews in one place is noble... unless you have huge amounts of content, it's easier to search Google for the reviews. The good ones usually end up on top as well.
Re:uhhh (Score:1)
[snicker] (Score:3, Funny)
quote: "...120+ are listed at the moment".
So, production quotas went better than expected, huh?
I have to admit, I knew the horse-power game was one that Nvidia was going to trip over eventually when the Radeons were *really* pushing the Z-axis occlusion.
Face it, Nvidia became top dog by pure horse-power, and to some extent, deeper color depths and kick ass drivers.
ATI finally got their hardware on track not by being able to "outshow" Nvidia's muscle, but by outclassing them by being smarter with the bandwidth and 'getting a clue' with drivers.
(witness the Dawn demo on a 9800 running *faster* than on the FX series.)
Speaking of the Dawn demo, does anyone else remember that this was to showcase the power of the FX, yet the 5200 is a *SLIDESHOW*?
Final thought: Did they use 3Dmark in all the benchmarks? (/low blow, sorry)
Re:[snicker] (Score:2, Insightful)
That is faulty logic; just because ATi was using Z-axis occlusion and ATi beat nVidia (which is also using z-occlusion nowadays) does not mean that that is why they beat nVidia. The Kyro II used tile based deferred rendering which was far superior to the z-occlusion ATi and nVidia were and are still doing, and look where PowerVR is today.
ATI f
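The difference the parent describes can be modeled with a toy example (invented depths, a single pixel, in Python): early-Z rejection only helps when geometry happens to arrive front-to-back, while a tile-based deferred renderer resolves visibility first and shades exactly one fragment per pixel regardless of submission order.

```python
# Toy model: count fragment shading work at one pixel under two
# hidden-surface-removal strategies.

def early_z(fragments):
    """Immediate-mode with a depth test before shading: order-dependent."""
    nearest, shaded = float("inf"), 0
    for depth in fragments:      # fragments in submission order
        if depth < nearest:      # passes the depth test...
            nearest = depth
            shaded += 1          # ...so it gets shaded (maybe wastefully)
    return shaded

def tile_deferred(fragments):
    """Resolve visibility for the whole tile first, then shade one winner."""
    return 1 if fragments else 0

back_to_front = [9.0, 5.0, 2.0]  # worst case for early-Z: every fragment shaded
front_to_back = [2.0, 5.0, 9.0]  # best case: only the nearest is shaded

assert early_z(back_to_front) == 3
assert early_z(front_to_back) == 1
assert tile_deferred(back_to_front) == 1
```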
Re:[snicker] (Score:2)
Nvidia used brute force and overtook 3dfx (barring 3dfx's stumbles and fall).
Ati tried to "brute force" the Radeon series and could not keep up with Nvidia, until they (Ati) got smart with hardware and software optimizations (i.e., finesse).
The point is that Ati overtook Nvidia because of points we both brought up:
Smarter with bandwidth, not overdrawing a scene, texture compression and a whole host o
Re:[snicker] (Score:1)
Not exactly. 3dfx (3Dfx, before they changed their name, bought STB, and went crazy, wasn't that bad at all) fucked themselves. If anything, 3dfx was guilty of brute-forcing. The Voodoo2 was not much more than a brute-force step from the Voodoo1. The Voodoo3 wasn't much better, with the addition of fairly poor 2D. "22bpp", blah. Gamers wanted 32bpp, nVidia gave them 32bpp. 3dfx had their eyes firmly closed. And let's n
time? (Score:5, Insightful)
Nope. The most stressful video application most users do is DVD playback, and even that is loooooong past the point where hardware-assistance is needed.
The video card market has gotten absolutely ridiculous in the last 2 years. It's strange: when Intel and AMD fight it out, prices plummet, yet while nVidia and ATI have been fighting it out, prices have skyrocketed. Sure, so have features, but they're so random and game- or API-dependent that most people don't even know how to turn them on in different games or in the drivers' advanced settings.
Up until UT2K3 it was completely absurd, because anything with DDR could play any game just fine. Now with the new crop it's even worse, because modern cards still can't play the very new and upcoming games well. So buying a high-end card now is overkill for older games and underpowered for upcoming games.
Re:time? (Score:3, Informative)
Do you honestly believe that a million people bought the R9700 in the first few months after it came out just to get an "extra 20fps", or to win a "pissing contest" (another slashdot favourite)?
Re:time? (Score:3, Informative)
Picked up an ATI 9700 Pro; it has given me about 9 months of great use, and should be good for about another year before I need to upgrade.
BTW, to see if your system is worth an upgrade, check out FutureMark's online browser; you can search for a CPU near yours and then check whether a faster video card will improve your FPS. Good reference.
Re:time? (Score:1)
Re:time? (Score:1)
Re:time? (Score:1, Informative)
I was using 3DMark 2001, but 2003 is out. Download the demo (if they still offer it) and sign up with the profile manager.
Then you can do 2 neat things: Compare by CPU and Compare by GFX Card. So you can see what a faster CPU does with the same GFX card, or what the same CPU does with faster GFX cards. Really ace.
Re:time? (Score:3, Interesting)
The GeForceFX and Radeon 9700 support the latest incarnations of DirectX 9, and therefore appeal to rich Windoze gamers.
BTW, If you want an example of why the Radeon 7000 and GeForce4MX are considered obsolete by some, check out thi
mod parent down (Score:1)
Re:mod parent down (Score:2)
Then a link perhaps?
Re:time? (Score:1)
How often do CPU manufacturers release completely new chip designs? Once every couple of years, maybe. I know it has certainly been a few years since the Athlon was released, it has been a while since the P4 was released, and there is no sign of the next CPU from Intel yet.
Now think about how often new GPU's are released. Once a year at the most. The Radeon 9700 card hasn't even been out for a year, and ATI are already
Re:time? (Score:3, Interesting)
Just a while ago, I was looking for an Nvidia card with TV-out (nothing else is even likely to work under Linux/FreeBSD). Searched Pricewatch and found one for $20... Do you really need a video card to be cheaper?
There is PLENTY of blame to go around for videocard prices:
Stores try to only stock the most expensive items, because that means higher margins. You don't walk into Best Buy/Circuit City and see SIS videocards, because the
Re:time? (Score:2)
As for the great ga
The Differing Economics: CPUs vs. Video Cards (Score:2)
There are many quite logical reasons high-end video card prices rise, having to do with the differing economics of the businesses involved:
120 reviews... (Score:5, Funny)
Re: (Score:1)
No... (Score:1)
ati (Score:2)
Whoah, really? Probably time for me to upgrade to an Xpert@Play98...
120+ reviews? (Score:3, Insightful)
Re:120+ reviews? (Score:1)
120 reviews in one place... (Score:2, Insightful)
Re:120 reviews in one place... (Score:1)
Of course it helps if you have a job that pays well.
Sorry. (Score:1)
Re:Sorry. (Score:2)
Just because you run Linux doesn't mean you have to give up gaming. It usually means that you have a lot of heartache
Xpert@Play98 (Score:5, Funny)
BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
Wow, that's some card!
Fastest card in existence? (Score:3, Interesting)
Another annoying thing... Looks like Nvidia and ATI are now price-tiering cards. Until recently, the most you would pay retail for the best consumer-level card was around $400. Now it looks like Nvidia and ATI want to push us to the $500 level. What's next year's card, $600?
Re:Fastest card in existence? (Score:1)
Re:Fastest card in existence? (Score:1)
Re:Fastest card in existence? (Score:1)
Re:Fastest card in existence? (Score:3, Interesting)
I believe there are some sites that use gaming benchmarks to review these cards; try highend3
Re:Fastest card in existence? (Score:2)
The only thing you seem to be getting out of 3D labs
Re:Fastest card in existence? (Score:2, Informative)
Re:Fastest card in existence? (Score:1)
3dlabs is barely holding on to their tiny niche. While CAD cards need to do different things than gamer cards, this is basically how it works: Nvidia/ATI build the best cards they can. Then they tweak them and triple the price for the CAD market.
120+ reviews... in other news... (Score:2)
This is news? (Score:3, Funny)
Now if they had really 'collected' them, they could have perhaps summarised all the conclusions into one short conclusion to give an overall 'world' view or something.
This page could have been written by a dumb search robot just as easily as by a human. Where's the human value in it? Where's the humanity!?!? This is how the Matrix started, you know.
I play so much here (Score:1)
FX 5200 (Score:1)
The one thing they can all agree on: (Score:2)
Of course, that's only among those that actually reviewed Linux support...
A short list of errors made in the reviews... (Score:1)
1. Anand posted benches that claimed the 5900U was platform-limited (at 223fps!) running Q3 at 1600x1200 with 4xAA and 8xAF.
2. Lars at Tom's mislabeled the D3 Medium Quality + 4xAA benches as High Quality + 4xAA
3. Kyle and Anand both ran D3 in Medium Quality with 8xAF set in the drivers, despite what seems to be the fact that Nvidia drivers interpret Medium Quality as forcing no AF, while ATI drivers do not
5. ExtremeTech's 3dMark03 build 320 vs. b
How is Linux support for these cards? (Score:2)
Back it goes today!
Jon Acheson
Re:problem (Score:1)
More realistic than Operation Flashpoint?
Re:hey (Score:1, Funny)
Re: (Score:2)
Re: (Score:2)