3dfx Voodoo 5 Review
gnewt writes "The guys at The Tech Report have put together a pretty comprehensive review of 3dfx's V5 5500 AGP. There's a comparison against several NVIDIA products, including the GeForce2, on image quality, FSAA, performance, and mip mapping. Overall, a nice introduction to modern 3D accelerators."
Re:And how is this news ? (Score:1)
Please Mister, don't take my Karma. It's all I have left.
Re:Yes... or no. (Score:2)
IANANeurologist, but I seem to recall some kinda fringe-sciencey research on increased framerates in film suggesting that higher framerates make a stronger impression on the brain -- they seem more real to the viewer and are more likely to be remembered or to influence his/her thinking. You may not be able to tell the difference between 60 and 80 fps at first glance, but subconsciously, there is an impression. The military was even rumored to be thinking of using faster-than-normal framerates in their training films.
This makes sense to me -- for one thing, whatever framerate the eye allegedly runs at, I doubt it does so with atomic-clock precision. Even if you're only seeing at 40 frames per second, you're not seeing precisely one frame every 1/40th of a second. Framerates that double your rate of perception will seem clearer because there's less chance of your nervous system detecting the flicker between frames or other subtleties that betray that what you're seeing is not real.
Also, the nervous system may very well be able to register hundreds of images per second in moments of high stress -- many people in life-threatening situations later recall a sensation of time slowing down and suddenly registering an outrageous level of detail with surprising clarity. While a heated game of Quake 3 doesn't necessarily qualify as life-threatening, a faster framerate may be better at fooling a brain in a high state of excitement -- it may even help stimulate the brain in and of itself.
--
Pretty sobering (Score:2)
The only question I have is how hot this thing is going to get. My Voodoo 3000 AGP gets blisteringly hot after some games, and it only has a heatsink. This card has two fans!
Re:Yes... (Score:2)
Modern? (Score:1)
Do you mean last year's cards? Last month's? Or ones that are coming out in the future?
They're ALL modern, since 3D cards for the PC haven't existed for all that long...
Come on, don't be so hostile (Score:1)
BTW, I am not karma whoring; I am honestly interested. If anybody has ever seen websites with studies on the above, dissertations, whatever, please post the link.
Re:Yes... (Score:2)
The thread was talking about optimal frame-per-second rates.
"Hz" reflects how often a phosphor on a CRT (not LCD or plasma) is reheated. If it's close enough to the refresh rate of fluorescent lighting (60Hz NA; 50Hz EU) then you get flickering. It is limited by the bandwidth of the video card+cable (expressed in MHz, typically), or sometimes by the quality of the monitor.
"FPS" is how often the image on screen is changed. In a 2D environment FPS is noncritical -- it's almost always "high enough". In a 3D environment, FPS becomes limited by the CPU, GPU, and memory bandwidth. It is very often not enough.
Obviously, if FPS is higher than Hz, not every frame will actually get displayed on your monitor. Low FPS will not cause flicker, however -- it causes a jumpy image.
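To make that concrete, here's a little back-of-the-envelope Python (the numbers are illustrative, not measurements): a display refreshing at a fixed Hz shows only whichever rendered frame is newest at each refresh, so everything above the refresh rate is simply never seen.

# Toy model: a fixed-Hz display sampling a faster renderer over one second.
REFRESH_HZ = 85
RENDER_FPS = 120  # illustrative numbers

refresh_times = [i / REFRESH_HZ for i in range(REFRESH_HZ)]
render_times = [i / RENDER_FPS for i in range(RENDER_FPS)]

# At each refresh the display shows the most recently completed frame.
shown = {max(t for t in render_times if t <= r) for r in refresh_times}
print(len(render_times), "frames rendered,", len(shown), "ever displayed")
# -> 120 frames rendered, 85 ever displayed; the other 35 are dropped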
cheers,
mike
ATI and Matrox have the right idea. (Score:2)
Zuh? (Score:1)
Mind you, I could be wrong.
Re:Yes... (Score:1)
Re:Time? (Score:1)
Why on /.? (Score:1)
Re:Method of creating a proxy or bridge? (Score:1)
based on the AMD 751 chipset has AGP now.)
Pan
Re:Yes... (Score:2)
You push the hardware as hard as it can go and you get a good idea of how low your framerate will drop in a frantic 10-people-on-the-screen-why-am-I-reloading moment.
I just put together an AMD 900 w/ a GeForce2 MX for a friend and it almost made me reconsider my personal boycott of nVidia. On my old computer and on the new one, sitting still in the average game, I get a good framerate. But when things heat up, the new computer doesn't lose framerate. And that's the difference between life and death in a game.
So why is it that we've made all these advances in graphics hardware but we still accept 30-40fps as a 'standard'? Honestly, I'd prefer a game that looks a little less glitzy and delivers 40fps at ALL times. The gameplay is what it's about, isn't it?
Better than Geforce! (Score:1)
He said he played Quake3 in 1600x1200 with all the prettys turned on and gets an fps of around 80 or so!
Sounds like a good card!!!!
My V3 will have to last me for a while though, 'cause I just bought a new car and am outta money.
Besides, does Linux have support for the V5 yet???
And I have heard nothing about a V4. Are they making one? Or is it like Slackware going from 4.x to 7.x?
Re:Time? (Score:2)
Re:Yes... (Score:1)
I keep waiting for the movie industry to adopt "Showscan" -- the 2.5x higher frame-rate (60fps) projection of movies, preferably also on 70mm film. Now THAT would be worth paying 8 bucks to see!
- Spryguy
3D... What about 2D speed? (Score:1)
A screen repaint takes about five minutes on the Dell Optiplex GX-1 P3-500 that they gave me here at work. On the Sun Ultra 60 I'm currently using it's about 30 seconds.
I've been asked why I've ditched the WinNT box. They ask me if I need a new one. It was new in February.
On a side note, it doesn't feel much faster than the P2-266 that it replaced.
Jeff
Re:I can't wait for Tomb Raider IV (Score:1)
----
It seems... (Score:2)
And this isn't meant as flamebait, but I'm sure the flames will ensue anyway...
Re:And how is this news ? (Score:1)
LOL, no seriously, it's not about the karma (despite me reading
I do know that the 'flamebait' threshold is different for different people, so since this post is sure to be a bit controversial, I tried to show that I was aware of it and wasn't just mindlessly trying to start a fanboy war.
Thanks for the laugh, though; if I had any mod points left I would mod you up as Funny.
Well said. (Score:1)
I'm now happily playing UT with my new Voodoo 3.
Re:Method of creating a proxy or bridge? (Score:2)
Hmmm.. Sounds like fun!! I've got a Windsor-block 351 out back, and I'm sure I can get a slightly rusty 'Scort with a blown motor for $200.. But I'd have to beef the frame, knock the firewall back, put in a new rear-end..
Nah, I think I'll just go play another round of Need For Speed II SE on my pimped-out 5x86-166 (Peltier, 256M of EDO, and an SLI pair of 12MB Voodoo2s)
Re:Yes... (Score:1)
Re:Time? (Score:1)
The facts about perceivable FPS differenc (Score:1)
Refresh rate - this affects the 'flicker' you can see on the screen. In a bright environment you will perceive flicker at higher refresh rates than in a dark environment. In a dark environment you can get away with 60Hz (one of the reasons for dark rooms when doing video editing, etc).
Frame rates.
The actual issue is 'the amount an object moves between frames' - not the frame rate.
So if you do a 180 degree spin you might need 200FPS to avoid perceiving stutter. If you are doing a really slow pan you can get away with a much lower frame rate.
There is no such thing as the 'maximum framerate a human can perceive'. You see stuttering if an object moves more than a certain amount between frames.
I asked this exact same question of one of our Chief Scientists (I work for SGI) and he referenced a paper on this and summarised it:
Reference: Visual Detection of Motion, Andrew T. Smith and Robert J. Snowden, ISBN 0-12-651660-X.
Comment:
In there it indicates that deltamax should be less than 15 arc mins. That is, for perception of smooth motion, objects should not displace more than 15 arc mins from one frame to another.
As a reference, 1 arc min subtends approx 1.8mm at a distance of 6m. This is how we get the term 6/6 vision using the Snellen chart.
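Out of curiosity, I turned that deltamax rule into a few lines of Python. The speeds are just illustrative, but they show why a fast spin needs a huge frame rate while a slow pan doesn't:

# Smooth motion needs < 15 arc min of displacement per frame
# (per the Smith/Snowden reference above).
DELTAMAX_ARCMIN = 15.0

def min_fps(degrees_per_second):
    """Minimum frame rate that keeps per-frame motion under deltamax."""
    arcmin_per_second = degrees_per_second * 60.0
    return arcmin_per_second / DELTAMAX_ARCMIN

print(min_fps(180))  # spinning 180 degrees per second -> 720.0 fps
print(min_fps(10))   # a slow 10 degree-per-second pan -> 40.0 fps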
Re:V5 heat. (Score:2)
Why don't you ask the people who wrote the story? (Score:1)
"THIS REVIEW IS LATE. Very late. 3dfx's Voodoo 5
has been on retail now longer than Sam Walton.
And I couldn't be more pleased about that fact. We
delayed our review's release after 3dfx and NVIDIA
introduced new drivers, and we've got a full comparison using the 3dfx's latest release drivers and NVIDIA's
Detonator 3s. "
All news to me--I haven't been following the Voodoos of late, since I can't afford a new card at the moment...but I'm interested and haven't really read a decent review written by a non-idiot. As for not being worthy of a slashdot post...you fucking trolls don't think *anything* is worthy of a slashdot post. Go start your own fucking site!
Textures (Score:1)
I was surprised to hear and see the obvious artifacts nVidia's overambitious compression creates.
I'm curious, though, whether there's an ultimate texture solution. I'm always reading about double and triple texture redundancy, and the failure of memory bandwidth to get the textures to the chips (made worse by sharing the memory between dual chips).
Is there a perfect way of doing this? Or just compression, redundancy, and too little bandwidth?
Two words and one number: Quake III Arena. (Score:1)
3dfx's linux support is something of a myth (Score:3)
"A note on Linux performance
We managed to get pretty far down the road of testing the Voodoo 5 using XFree86 4.x with Linux before we discovered that 3dfx's current X drivers don't support dual-chip operation with the V5. That means no SLI mode and no FSAA, either. At that point, we decided to forego benchmarking the V5 in Linux until the drivers have some time to mature. 3dfx has committed to open source principles and to Linux support, and we have every reason to believe they'll continue their tradition of decent Linux support. But it ain't there yet."
Gee, no problem, I'm only missing one of my two CPUs and half the memory. I wonder if that means I still need the external power?
So the V5's Linux support is about as good as Win9x's support for a dual-processor system: you only get half of what you're trying to use.
Funny how the GeForce2 GTS's Linux scores come within 1% of its Windows scores.
But then, what do I know?
Re:And how is this news ? (Score:1)
Thanks, I really didn't mean any harm. It popped into my head so I had to post it. No hostility intended.
Possible reason for the hostility. (Score:1)
This was on [name that site] last week!
How is this news?
You guys suck.
You call this new news?
What the hell?
Natalie Portman says it's old news!
I'm a Mac user and I resent the Linux bias!
I'm a Windows user and I resent the Linux bias!
I'm a BeOS user and I resent the Linux bias!
I'm a *BSD user and I resent the Linux bias!
I'm not a Linux user so how is this news?
I knew this already so therefore it's not news to anyone.
I hate you guys now that you're corporate.
I'll just go post Emily Dickinson poems now.
I think this story is lame. How is it news then?
and on and on and on... *sighs* The hostility, at least in my case, comes from listening to a bunch of snot-nosed brats complain that, because *they* don't find a story newsworthy, it shouldn't matter to *me*. Or to anyone else, for that matter. If you don't know yet, then, hell, I guess you don't deserve to know.
As far as the Linux bias goes, there are ways of filtering the crap out if you sign up for an account. It's easy.
Actually, the idea is quite good. (Score:2)
Actually, the modular design is a brilliant idea - it gives built-in design scalability, and gives you huge memory bandwidth gains. Memory bandwidth limits fill rate for games at the resolutions I play at, so this is a big win.
3dfx's mistake was making their individual chips sub-optimal.
The logical way of salvaging what they can is to switch to 0.15 micron (or better if they can get it) and put multiple VSA cores on one die. This gives them a higher maximum clock frequency and lower power dissipation. Have multiple memory busses running from the die (or interleave requests across one bus), and you have a card with decent processing power and enough memory bandwidth to beat the hell out of anyone else. Part count goes down when multiple cores are bundled onto one die, so the card costs a lot less too.
All of this is effectively "for free", as the R&D has already been done in the development of the Voodoo 5 series.
It remains to be seen whether 3dfx has the money left to pursue this course.
Tweaking the chips themselves wouldn't hurt, but is secondary compared to the huge benefits of the modular architecture itself.
Re:Textures (Score:1)
The Dreamcast uses Vector Quantisation to compress its textures. On a VGA adapter it doesn't look too bad at all in Shenmue. VQ has some thought behind it, while from what I understand of S3/MS DXT, it's pretty crude conceptually. Consoles can innovate, whereas PCs are hindered by standards and existing APIs.
With 8:1 compression as claimed for the DC (VQ can do much better -- I've looked into it, so the 8x ratio is conservative), that's equivalent to the bandwidth ratio between PCI (132Mbytes/s) and AGP 4x (1Gbyte/s). FWIW, I wish there were a few more PCI graphics cards out there too.
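For the curious, here's a rough Python sketch of why VQ decode is so cheap. I'm assuming a layout like the one reported for the DC (a 256-entry codebook of 2x2 16-bit texel blocks, one byte of index per block); I don't know the exact hardware format, but the point is that decompression is a pure table lookup:

import numpy as np

def vq_decompress(indices, codebook):
    # indices: (H/2, W/2) uint8 block indices
    # codebook: (256, 2, 2) array of 16-bit texels (assumed layout)
    h, w = indices.shape
    out = np.zeros((h * 2, w * 2), dtype=codebook.dtype)
    for by in range(h):
        for bx in range(w):
            out[by*2:by*2+2, bx*2:bx*2+2] = codebook[indices[by, bx]]
    return out

# Each 2x2 block is 8 bytes of raw texels but only 1 byte of index,
# hence ~8:1 before counting the fixed 256*8 = 2KB codebook.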
$800 is far too much for me (Score:1)
---
Every secretary using MSWord wastes enough resources
Re:Yes... (Score:2)
On a 24fps movie the shutter is opened twice per frame. This helps eliminate strobing of the image. Even with this measure, you can see strobing under normal lighting conditions. There is a *technical* reason why movies are shown in the dark.
For comparison, NTSC television is shot at 60 fields per second. These 60 fields are interlaced to give 30 frames per second. (Well, actually 29.97 fps.)
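(In toy Python terms, assuming each field is just a list of scan lines, the weave looks like this:)

def weave(odd_field, even_field):
    """Interleave two sequential fields into one full frame."""
    frame = []
    for odd_line, even_line in zip(odd_field, even_field):
        frame.append(odd_line)
        frame.append(even_line)
    return frame  # 60 fields/s in -> 30 woven frames/s out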
In any case, you can tell the difference in frame rate. For example, Showscan, used for amusement rides, is shot at 60 frames per second, giving a strong "you are there" sensation. The faster the frame rate, the more your brain tends to believe your eyes. (There is an upper limit, but I don't know it.)
Interestingly, the guy who created Showscan, Douglas Trumbull, rejects the use of Showscan for storytelling. "It was film that looked too real -- too much perhaps like video." (Videography Magazine, May 2000, page 30)
There is also the issue of frame rate variance discussed by another poster. When the scene gets 'busy' you want to make sure your frame rate stays high. This is more critical for computer applications because they have an effectively high shutter speed. By this I mean each frame is still and sharp; it lacks motion blur. If it had accurate motion blur, a lower frame rate would be more tolerable. (Inaccurate motion blur will confuse the viewer's notion of how things are moving.) Without motion blur, images are more easily perceived as "strobing" or flickering, which of course detracts from the realism of the environment. Since adding accurate motion blur is hard, you have to pump up those frame rates, and keep them high.
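Here's a quick sketch of what "accurate motion blur" costs, assuming a hypothetical render(t) that returns a frame as a numpy array: every displayed frame becomes several renders averaged over the exposure time, which is exactly why games skip it and pump the frame rate instead.

import numpy as np

SUBSAMPLES = 8  # temporal samples per displayed frame

def blurred_frame(render, t, frame_dt):
    """Average several renders spread across one frame's exposure."""
    acc = np.asarray(render(t), dtype=np.float64)
    for i in range(1, SUBSAMPLES):
        acc = acc + render(t + frame_dt * i / SUBSAMPLES)
    return acc / SUBSAMPLES  # one output frame for SUBSAMPLES renders of work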
Re:Yes... (Score:1)
Re:What do you mean? (Score:1)
I sold the MX yesterday, and I'm actually DOWNGRADING back to the V3.
"modern" (Score:1)
---
Solaris/FreeBSD/Openstep/NeXTSTEP/Linux/ultrix/OS
Re:Yahoo! Porn in better colors, and 3D (Score:2)
----
Method of creating a proxy or bridge? (Score:1)
Re:Yes... (Score:2)
The explanation lies in the fact that the human eye can only see X frames per second. If I remember correctly, it's somewhere in the neighborhood of 40-60. So once you get above that range, your eyes/brain can't tell much (if any) difference from increased framerates.
=================================
Re:Method of creating a proxy or bridge? (Score:1)
--
3DFX Sucks (Score:1)
Re:Yes... (Score:1)
It doesn't make a difference to me, other than the psychological impact of "I have the fastest card out there".
--
Re:Method of creating a proxy or bridge? (Score:1)
There are a number of decent computers that could benefit from a GeForce2 or a Voodoo 5500 but don't have AGP slots. My computer, for example, came with an on-board AGP SiS graphics chip... not the greatest in the world.
Ranessin
Re:Yes... (Score:2)
More importantly, note that the rate is an AVERAGE. Even if the game averages, say, 100 FPS, in moments of intense action the rate still drops WAY WAY down. I wouldn't dare play an intensive game at 40 FPS -- as soon as the action started, you'd be likely to drop into the single digits!
Of course, for single-player games, 40 is probably perfectly adequate.
Not to mention, with full-screen antialiasing just beginning to take off, all of that extra rendering power can now be used to make the images look better! Right now, only the most powerful video cards can perform any anti-aliasing at all, and it's not even particularly good anti-aliasing. There is still a ways to go yet...
Re:Yes... (Score:4)
As others have mentioned, the top end is probably closer to 60fps than 40.
More important, though, is the headroom you get with a faster card. A game like Q3 running at a 40fps average has a standard deviation of about 7fps [avsim.com], which means over 15% [wolfram.com] of your frames are under 33fps, and about 3% [wolfram.com] are under 26fps. These are very noticeable slowdowns.
At 80fps mean, your standard deviation may jump to 14 fps (it's not a linear progression in real life, but for argument's sake...), 97% of your frames are at 52fps+, and 99.85% above 38fps. So it's smooth all the time, not just when you're standing around with nothing happening.
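The arithmetic is easy to check in Python, assuming frame rates are roughly normally distributed around the mean (a simplification, as noted):

from math import erf, sqrt

def frac_below(threshold, mean, sd):
    """Fraction of frames below threshold under a normal model (the CDF)."""
    z = (threshold - mean) / sd
    return 0.5 * (1 + erf(z / sqrt(2)))

print(frac_below(33, 40, 7))       # ~0.16 -> over 15% of frames under 33fps
print(frac_below(26, 40, 7))       # ~0.02 -> a few percent under 26fps
print(1 - frac_below(52, 80, 14))  # ~0.977 -> ~97% of frames at 52fps+
print(1 - frac_below(38, 80, 14))  # ~0.9987 -> 99.85%+ above 38fps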
And that's why NVidia is still in business.
cheers,
mike
Re:3dfx's linux support is something of a myth (Score:1)
Re:V5 heat. (Score:1)
Re:V5 heat. (Score:1)
Volts or MHz?
Re:3dfx's linux support is something of a myth (Score:1)
Funny how nVidia's drivers hard-lock my machine, forcing me to hit the power switch. Funny how all queries sent to the e-mail address nVidia supplied went unanswered. Funny how the IRC channel they suggested was quite useless for solving this problem... And funny how nVidia has lost my business, since it obviously never meant anything to them in the first place.
Ranessin
Re:Pretty sobering... ow! Hot! (Score:1)
Just program it to not send all the data (Score:1)
Re:The Voodoos (Score:1)
Re:Method of creating a proxy or bridge? (Score:2)
And how is this news ? (Score:2)
Come on, now you only have to post info about the Radeon and we're all set.
OTOH, I wonder why Anand pitted the MX against the G400 -- maybe he didn't want to piss off Matrox too much? If the MX totally destroyed the G400, can you imagine what the GeF2 would have done?
To the moderators: please resist the urge to mod this as flamebait. I have used the V1 (owned), the V3 (work), the V5 (work), the G400 (own a Max) and the GeForce2 (work), so I feel I am fairly neutral here.
Re:Yes... (Score:2)
Perceptually, there is no difference when just looking. The difference comes with strain on the eye: the eye has to work harder to fool the brain at lower refresh rates, and this causes all kinds of muscle strain.
Of course, when graphics cards can put up one frame per CRT scan, that is the utmost limit, beyond which there can be no perceived difference, simply because the monitor is the limiting factor.
Just because your eyes cannot discern between events .05 secs apart (in the best case) does not mean that there is no perceived difference. For instance, stand outside at night, stare into a camera flash (this takes less than .1 secs if you are close enough to the camera), and then try to see anything for the next hour (assuming you didn't blow out your rods and cones). Admittedly, this is the most extreme of situations, but it illustrates the point: it isn't just that the eye can only "reset" the retina 10-20 times a second; the mechanics of the eye also matter for a pleasurable experience.
Intro to modern graphics cards (Score:3)
-----------------------
Re:Method of creating a proxy or bridge? (Score:2)
Re:Yes... (Score:2)
A lot of people talk about visible framerates, and compare games with film. This simply cannot be done. Film contains temporal anti-aliasing, which smooths out a lot of the visual jerkiness that a game would exhibit rendering the same scene at the same speed.
Re:Yes... (Score:1)
And if a decent 3d viewing mechanism [slashdot.org] is ever mainstreamed, FPS will be pretty much divided by two...
___
Re:And how is this news ? (Score:1)
Re:Yes... (Score:1)
Time? (Score:3)
The VSA-100 is probably their biggest creation in a long time, but the simplicity of the chip, and its reliance on parallelism across multiple VSA-100s (which basically amounts to SLI on a single card), suggests that 3dfx should spend less money on those dumb commercials and more on actually making something good.
Those little tricks don't work anymore when you're starting to become the underdog, with companies like nVidia and ATI nipping at your heels...
Another former Voodoo user (Score:1)
Re:Yes... (Score:1)
You obviously don't play FPS games, or you've never had high enough frames per second to notice the difference. The difference in your ability to play well between 40 frames per second and 60 is amazing. You are correct, however, that there is an upper limit on the perceivable effect of adding more frames per second, but it is closer to 80. I saw a nice scientific explanation of it all, but I'm too lazy to look up the reference.
What do you mean? (Score:1)
Re:Why this?? (Score:1)
--
Re:Yes... (Score:2)
You can definitely notice improvements above 40fps. True, you get diminishing returns after the 30-40fps point, but there's something silky-smooth about 60+ fps that puts 40fps to shame.
Obviously the eye and the brain can process information MUCH faster than 30 or 40Hz... why do you think monitors look better at 80Hz than 60Hz? I'm not sure where that conventional wisdom (anything over 30-40fps doesn't matter) got started, but it's totally wrong.
Even if you couldn't tell the difference between 40 and 60fps, a card with excess power is going to be more future-proof. A card that can only run today's games at 30fps is going to have a hard time with next year's games. Today's monster card that runs today's games at 200fps is going to work nicely with next year's games, too. :-)
Whoa, that troll almost bit my hand off when I fed him!
Re:Yes... (Score:3)
Maybe not consciously, but it DOES make a difference. Ever seen an IMAX movie? They're shot at 48FPS instead of 24, and it definitely makes a difference.