Hardware

3dfx Voodoo 5 Review 72

gnewt writes "The guys at The Tech Report have put together a pretty comprehensive review of 3dfx's V5 5500 AGP. There's a comparison against several NVIDIA products, including the GeForce2, on image quality, FSAA, performance, and mip mapping. Overall, a nice introduction to modern 3D accelerators."
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward
    To the moderators: please resist the urge to mod this as flamebait.

    Please Mister, don't take my Karma. It's all I have left.

  • The explanation lies in the fact that the human eye can only see X frames per second. If I remember correctly, it is somewhere in the neighborhood of 40-60. So once you get above that range, your eyes/brain can't tell much (if any) of a difference from increased framerates.

    IANANeurologist, but I seem to recall some kinda fringe-sciencey research on increased framerates in film suggesting that higher framerates make a stronger impression on the brain -- they seem more real to the viewer, and are more likely to be remembered or to influence his/her thinking. You may not be able to tell the difference between 60 and 80 fps at first glance, but subconsciously, there is an impression. The military was even rumored to be thinking of using faster-than-normal framerates in their training films.

    This makes sense to me -- for one thing, whatever framerate the eye allegedly runs at, I doubt it does so with atomic-clock precision. Even if you're only seeing at 40 frames per second, you're not seeing exactly one frame every 1/40th of a second. Framerates that double your rate of perception will seem clearer because there's less of a chance of your nervous system detecting the flicker between frames or other subtleties that betray that what you're seeing is not real.

    Also, the nervous system may very well be able to register hundreds of images per second in moments of high stress -- many people in life-threatening situations later recall a sensation of time slowing down, and suddenly registering an outrageous level of detail with surprising clarity. While a heated game of Quake 3 doesn't necessarily qualify as life-threatening, a faster framerate may be better at fooling a brain in a high state of excitement -- it may even help stimulate the brain in and of itself.


    --
    perl -e '$_="06fde129ae54c1b4c8152374c00";
    s/(.)/printf "%c",(10,32,65,67,69,72,
  • I was beginning to think 3DFX was dead in the water. True, the card doesn't have the same feature set, but the performance is more than respectable.

    The only question I have is how hot this thing is going to get. My Voodoo 3000 AGP gets blisteringly hot after some games, and it only has a heatsink. This card has two fans!

  • Ugh! I can't stand to sit down at other people's computers when they have the refresh rate set at 60Hz: it flickers so badly. I need the refresh rate set to at least 72Hz before I stop noticing flicker; 75 is better, and 85 or above is best.
  • Define "modern."

    Do you mean last year's cards? Last month's? Or ones that are coming out in the future?

    They're ALL modern, since 3D cards for the PC haven't existed for all that long...

  • Why so much hostility? I never really understood why people can be so much more hostile online than in real life... maybe it's that people are hostile in real life as well, but the repercussions for showing it are much higher there than in the impermanent internet world... (not that the real world is permanent at all; nithya is the right word IIRC ;)

    BTW, I am not karma whoring; I am honestly interested. If anybody has ever seen websites with studies on the above, dissertations, whatever, please post the link.
  • 60Hz: it flickers so badly. I need the refresh rate set at least 72Hz before I stop noticing flicker

    The thread was talking about optimal frame-per-second rates.

    "Hz" reflects how often a phosphor on a CRT (not LCD or plasma) is reheated. If it's close enough to the refresh rate of fluorescent lighting (60Hz NA; 50Hz EU) then you get flickering. It is limited by the bandwidth of the video card+cable (expressed in MHz, typically), or sometimes by the quality of the monitor.

    "FPS" is how often the image on screen is changed. In a 2D environment FPS is noncritical -- it's almost always "high enough". In a 3D environment, FPS becomes limited by the CPU, GPU, and memory bandwidth. It is very often not enough.

    Obviously, if FPS is higher than Hz, not every frame will actually get displayed on your monitor. Low FPS will not cause flicker, however -- it causes a jumpy image.

    cheers,
    mike
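
    A quick illustration of the Hz vs. FPS point above -- a minimal sketch, not anything from the review or the thread; the 75 Hz monitor and the frame rates below are made-up numbers. The monitor can only show one new image per refresh, so the frame rate that actually reaches the screen is capped by the refresh rate; rendering below the refresh rate doesn't cause flicker, it just means fewer distinct images per second.

        # Toy model of the Hz vs. FPS interaction; figures are illustrative assumptions.
        def displayed_fps(render_fps: float, refresh_hz: float) -> float:
            """Frames per second that actually reach the screen: capped by the refresh rate."""
            return min(render_fps, refresh_hz)

        for fps in (30, 60, 90, 120):
            print(f"rendering at {fps} fps on a 75 Hz monitor -> "
                  f"{displayed_fps(fps, 75):.0f} fps displayed")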

  • I think the memory bandwidth problem is best solved by ATI and Matrox. ATI uses its HyperZ architecture to greatly reduce the number of reads into the Z-buffer, which really helps memory bandwidth. The fact that it can beat a GeForce2 in some tests (only one or two now that the Detonator 3 drivers are out; in the others it still loses significantly -- check www.sharkyextreme.com for the latest benchmarks in the ATI 32MB Radeon review) even though it runs at a significantly lower clock and has a good deal less fill-rate is a good indication of this. Matrox seems to have the best idea with its DualBus. Put two 128-bit DDR SDRAM interfaces on the board and watch the bandwidth problems magically disappear.
  • I thought he pitted the MX vs the 450 because of similar features and price points. He was also doing it because there were few/no good "which card is better for Linux" reviews out there.

    Mind you, I could be wrong.
  • Yes, I find my eyes are just about as strained either way. Kidding aside, 60Hz is flickery and causes eyestrain for me, and 80Hz+ also wears my eyes out; I can't see it, but I can feel it afterwards. Personally, I like refresh rates somewhere between 67Hz and 80Hz. That for me is easiest on the eyes. Of course this all goes out the window when using an active matrix LCD, because LCDs work differently and for me they never cause eyestrain. Never mind the fact that many older color active matrix LCDs, and maybe even some newer ones too, use a vertical refresh rate of a bit less than 60Hz.
  • Actually, this isn't really the Voodoo architecture at all anymore. The V3+ boards are really based on the Banshee architecture, with some of the more advanced (SLI) features retrofitted onto the chip. I recall reading an article at the time saying that 3dfx felt the Banshee 2 just wasn't as sexy as the Voodoo 3, and therefore they went back to their franchise name.
  • Why are there graphic accelerator reviews (OLD ones) posted on slashdot? Are we really out of topics to post about? I mean, there are a lot of sites that'll link to stuff like that (bluesnews, perhaps?). I never thought that slashdot would degenerate to this level.
  • Most alpha systems require PCI. (The newer XP1100
    based on the AMD 751 chipset has AGP now.)

    Pan
  • You've hit the nail on the head. This is why graphics card reviewers use tests like the old Crusher demo.

    You push the hardware as hard as it can go and you can get a good idea of how low your framerate will go in a frantic 10-people-on-the-screen-why-am-i-reloading mode :o)

    I just put together an AMD 900 w/ a GeForce2 MX for a friend, and it almost made me reconsider my personal boycott of nVidia. On my old computer and on the new one, sitting still in the average game, I get a good framerate. But when it heats up, the new computer doesn't lose framerate. And that's the difference between life and death in a game.

    So why is it that we've made all these advances in graphics hardware but we still accept 30-40fps as a 'standard'? Honestly, I'd prefer a game that looks a little less glitzy and delivers 40fps at ALL times. The gameplay is what it's about, isn't it?
  • A friend of mine just bought a Voodoo5 5500 and he says it kicks his GeForce256 up and down!
    He said he played Quake3 in 1600x1200 with all the pretties turned on and gets an fps of around 80 or so!
    Sounds like a good card!!!!
    My V3 will have to last me for a while though, cause I just bought a new car and am outa money.
    Besides, does Linux have support for the V5 yet???
    And I have heard nothing about a V4 -- are they making one? Or is it like Slack, from 4.x to 7.x?
  • But Banshee was basically just a 3D/2D hybrid card that combined the Voodoo chip (not Voodoo1, but the Voodoo chip architecture) with a 2D chip. Voodoo3 just extended Banshee, like you said.
  • Yeah, but even at the movies, I can see the flickering (especially in large mostly white shots) and it bugs the hell out of me.

    I keep waiting for the movie industry to adopt "Showscan" -- the much-higher-frame-rate projection of movies (preferably also using 70mm film). Now THAT would be worth paying 8 bucks to see!

    - Spryguy
  • I am not playing games, but I do need a high speed graphics subsystem. Right now I have an ADS (Agilent's Advanced Design System) layout that's 16 inches tall by 17 inches wide, has six metal layers and will contain about 40,000 wires when I'm done.

    A screen repaint takes about five minutes on the Dell Optiplex GX-1 P3-500 that they gave me here at work. On the Sun Ultra 60 I'm currently using it's about 30 seconds.

    I've been asked why I ditched the WinNT box, and whether I need a new one. It was new in February.

    On a side note, it doesn't feel much faster than the P2-266 that it replaced.

    Jeff

  • Are you implying her shorts aren't normally short, and her shirt isn't normally tight? Do they even make shirts that *wouldn't* be tight on Lara Croft?
    ----
  • As much as I have liked 3DFX products in the past, I think they have made a few (fatal?) errors with this product. For one, reading the review, you get the definite impression that this product was cobbled together quickly, with many shortcuts that affect the overall quality, and then rushed to market in order to keep afloat against NVIDIA. The product almost has an air of desperation about it. While they do have a few good ideas, like being able to daisy-chain the processors in future products, I would bet heat and power considerations will limit this in the future. Overall, it is almost as if 3DFX is trying to cheat their customers with a holding action while they, hopefully, deliver a once-again superior product in the future.

    And this isn't meant as flamebait, but I am sure the flames will ensue...
  • Please Mister, don't take my Karma. It's all I have left

    LOL. No, seriously, it's not about the karma (despite reading /. for more than 3 years I have very little, being mostly a lurker); it's that a couple of times my posts were thought to be flamebait when I was just trying to spark a discussion.

    I do know that the 'flamebait' threshold is different for different people, so since this post is sure to be a bit controversial, I tried to show that I was aware of it and wasn't just mindlessly trying to start a fanboy war.

    Thanks for the laugh, though, if I had mod points left I would mod you up as funny ;)
  • I have a TNT2 in a Zip-Loc bag in my closet. It ended up there when I finally got sick of waiting for Nvidia to deliver on their promise of solid Linux drivers.

    I'm now happily playing UT with my new Voodoo 3.
  • taking a 450hp V8 engine and putting it in a 15-year-old Ford Escort

    Hmmm.. Sounds like fun!! I've got a Windsor-block 351 out back, and I'm sure I can get a slightly rusty 'Scort with a blown motor for $200.. But I'd have to beef up the frame, knock the firewall back, put in a new rear end..

    Nah, I think I'll just go play another round of Need For Speed II SE on my pimped-out 5x86-166 (Peltier, 256M of EDO and an SLI pair of 12MB Voodoo IIs)
  • Forget that. I've become addicted. For anything other than high speed gaming, I gotta have a good LCD.

  • Effectively that happened -- with 3DFX's merger with Gigapixel earlier this year, they've now got a whole new team designing the cool hardware for next year. The products you're seeing now were mostly designed in 1998/early 1999 -- it takes a while to go from design to fab to board production.
  • There are two factors to worry about:
    Refresh rate - this affects the 'flicker' you can see on the screen. In a bright environment you will perceive flicker at higher refresh rates than in a dark environment. In a dark environment you can get away with 60Hz (one of the reasons for dark rooms when doing video editing, etc).

    Frame rates.

    The actual issue is 'the amount an object moves between frames' - not the frame rate.

    So if you do a 180-degree spin you might need 200FPS to not perceive stutter. If you are doing a really slow pan you can get away with a much lower frame rate.

    There is no such thing as the 'maximum framerate a human can perceive'. You see stuttering if an object moves more than a certain amount between frames.

    I asked this exact same question of one of our Chief Scientists (I work for SGI) and he referenced a paper on this and summarised it:

    Reference:

    Visual Detection of Motion, Andrew T. Smith and Robert J. Snowden (ISBN 0-12-651660-X)

    Comment:

    In there it indicates that delta-max should be less than 15 arc minutes. That is, for perception of smooth motion, objects should not displace more than 15 arc minutes from one frame to the next.

    As a reference, 1 arc minute subtends approximately 1.8mm at a distance of 6m. This is how we get the term 6/6 vision using the Snellen chart.
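
    To make the 15-arc-minute rule concrete, here is a rough worked example -- my own numbers and derivation, not taken from the book referenced above. Given how fast something sweeps across your view, the frame rate needed so it never moves more than 15 arc minutes between frames is just angular speed divided by that threshold.

        # Sketch of the "delta-max < 15 arc minutes per frame" criterion;
        # the angular speeds below are illustrative assumptions.
        import math

        DELTA_MAX_ARCMIN = 15.0  # max displacement per frame for smooth motion

        def min_fps_for_smooth_motion(angular_speed_deg_per_s: float) -> float:
            """Minimum frame rate keeping per-frame motion under 15 arc minutes."""
            return (angular_speed_deg_per_s * 60.0) / DELTA_MAX_ARCMIN

        print(min_fps_for_smooth_motion(360))  # a 180-degree spin in half a second: 1440 fps
        print(min_fps_for_smooth_motion(5))    # a slow 5 deg/s pan: 20 fps is enough
        # Sanity check on the reference: 1 arc minute at 6 m is roughly 1.7-1.8 mm.
        print(6000 * math.tan(math.radians(1 / 60)))

    On this criterion even very high frame rates can stutter during a fast spin, which matches the point above that what matters is how far things move per frame, not the frame rate itself.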

  • Don't worry, the V5 can't get any hotter than the V3 did. An actual temperature reading on a Voodoo 3 3500 was 180F, legally hot enough to cook raw meat in New York City, and the temperature limit set by Intel at which point performance and stability start to degrade. The V3 AGP series was a thermal mistake. The only way for 3dfx to avoid noise on the bus from the multiple fans, however, was to add the external power connector.
  • From the actual review:

    "THIS REVIEW IS LATE. Very late. 3dfx's Voodoo 5
    has been on retail now longer than Sam Walton.
    And I couldn't be more pleased about that fact. We
    delayed our review's release after 3dfx and NVIDIA
    introduced new drivers, and we've got a full comparison using the 3dfx's latest release drivers and NVIDIA's
    Detonator 3s. "

    All news to me--I haven't been following the Voodoos of late, since I can't afford a new card at the moment...but I'm interested and haven't really read a decent review written by a non-idiot. As for not being worthy of a slashdot post...you fucking trolls don't think *anything* is worthy of a slashdot post. Go start your own fucking site!
  • "Turning texture compression off on either card results in a serious performance hit, so you will definitely want to leave it on. However, I find NVIDIA's compression artifacting much less bearable than 3dfx's. And it's not just in the sky texture--on the GeForce, simple gray wall textures have purple splotches in them, and the whole image looks less "pure" with compression turned on."

    I was surprised to hear about and see the obvious artifacts nVidia's overambitious compression creates.

    I'm curious, though, if there's an ultimate texture solution. I'm always reading about double and triple texture redundancy, and the failure of memory bandwidth to get the textures to chips (made worse by sharing the memory between dual chips.)

    Is there a perfect way of doing this? Or just compression, redundancy and too little bandwidth?

  • by Tridus ( 79566 ) on Wednesday September 20, 2000 @10:03AM (#765985) Homepage
    From page 10 in the article:

    "A note on Linux performance
    We managed to get pretty far down the road of testing the Voodoo 5 using XFree86 4.x with Linux before we discovered that 3dfx's current X drivers don't support dual-chip operation with the V5. That means no SLI mode and no FSAA, either. At that point, we decided to forego benchmarking the V5 in Linux until the drivers have some time to mature. 3dfx has committed to open source principles and to Linux support, and we have every reason to believe they'll continue their tradition of decent Linux support. But it ain't there yet."

    Gee, no problem, I'm only missing one of my two CPUs, and half the memory. I wonder if that means I still need the external power?

    So the V5's Linux support is about as good as Win9x's support for a dual-processor system, that being you only get half of what you're trying to use.

    Funny how the GeForce2 GTS's Linux scores are within 99% of the Windows scores.

    But then, what do I know?
  • Thanks, I really didn't mean any harm. It popped into my head so I had to post it. No hostility intended.

  • The most amazing thing about the Slashdot commenting system (at least, if you don't let the system filter anything out) is that most of the comments read like this:

    This was on [name that site] last week!
    How is this news?
    You guys suck.
    You call this new news?
    What the hell?
    Natalie Portman says it's old news!
    I'm a Mac user and I resent the Linux bias!
    I'm a Windows user and I resent the Linux bias!
    I'm a BeOS user and I resent the Linux bias!
    I'm a *BSD user and I resent the Linux bias!
    I'm not a Linux user so how is this news?
    I knew this already so therefore it's not news to anyone.
    I hate you guys now that you're corporate.
    I'll just go post Emily Dickinson poems now.
    I think this story is lame. How is it news then?

    and on and on and on... *sighs* The hostility, at least in my case, comes from listening to a bunch of snot-nosed brats complain that, because *they* don't find a story newsworthy, it shouldn't matter to *me.* Or to anyone else, for that matter. If you don't know yet, then, hell, I guess you don't deserve to know.

    As far as the Linux bias goes, there are ways of filtering the crap out if you sign up for an account. It's easy.
  • VSA-100 is probably their biggest creation in a long time, but the simplicity of the chip, and its need for parallelism across multiple VSA-100s (which basically amounts to SLI on the same card), suggests that 3dfx should spend less money on those dumb commercials and spend more on actually making something good.

    Actually, the modular design is a brilliant idea - it gives built-in design scalability, and gives you huge memory bandwidth gains. Memory bandwidth limits fill rate for games at the resolutions I play at, so this is a big win.

    3dfx's mistake was making their individual chips sub-optimal.

    The logical way of salvaging what they can is to switch to 0.15 micron (or better if they can get it) and put multiple VSA cores on one die. This gives them a higher maximum clock frequency and lower power dissipation. Have multiple memory busses running from the die (or interleave requests across one bus), and you have a card with decent processing power and enough memory bandwidth to beat the hell out of anyone else. Part count goes down when multiple cores are bundled onto one die, so the card costs a lot less too.

    All of this is effectively "for free", as the R&D has already been done in the development of the Voodoo 5 series.

    It remains to be seen whether 3dfx has the money left to pursue this course.

    Tweaking the chips themselves wouldn't hurt, but is secondary compared to the huge benefits of the modular architecture itself.
  • The Dreamcast uses Vector Quantisation to compress its textures. On a VGA adapter it doesn't look too bad at all on Shenmue. VQ has some thought behind it, while from what I understand of S3/MS DXT it's pretty crude conceptually. Consoles can innovate, whereas PCs are hindered by standards and existing APIs.

    With 8:1 compression as claimed for the DC (VQ can do much better -- I've looked into that, so the 8x ratio is conservative), that's equivalent to the bandwidth ratio between PCI (132Mbytes/s) and AGP 4x (1Gbytes/s). FWIW, I wish there were a few more PCI graphics cards out there too.
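
    A quick arithmetic check of the bandwidth claim above, using the rough figures quoted in the comment (the PCI and AGP numbers are themselves approximations):

        # Bandwidth ratio sketch using the comment's rough figures.
        PCI_MB_S = 132       # 32-bit / 33 MHz PCI
        AGP_4X_MB_S = 1000   # "1 Gbytes/s" as quoted above
        VQ_RATIO = 8         # claimed Dreamcast VQ compression ratio

        print(f"AGP 4x : PCI bandwidth ratio ~ {AGP_4X_MB_S / PCI_MB_S:.1f} : 1")
        print(f"PCI moving 8:1-compressed textures ~ {PCI_MB_S * VQ_RATIO} MB/s "
              f"of uncompressed texture data")

    The ratio works out to roughly 8:1, which is why 8:1 texture compression over PCI lands in the same ballpark as raw AGP 4x bandwidth.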

  • But it can be bought for money at least (here [price.ru]).
    ---
    Every secretary using MSWord wastes enough resources
  • I-Max is shot at 48 frames per second, but even movies shot at 24 fps are screened at 48 fps. I would recommend trying to get your frame rates above 48fps throughout your whole experience.

    On a 24fps movie the shutter is opened twice per frame. This helps eliminate strobing of the image. Even with this measure, you can see strobing under normal lighting conditions. There is a *technical* reason why movies are shown in the dark.

    For comparison, NTSC television is shot at 60 fields per second. These 60 fields are interlaced to give 30 frames per second. (Well, actually 29.97 fps.)

    In any case, you can tell the difference in frame rate. For example Showscan, used for amusement rides, is shot at 60 frames per second, giving a strong "you are there" sensation. The faster the frame rate, the more your brain tends to believe your eyes. (There is an upper limit, but I don't know it.)

    Interestingly, the guy who created Showscan, Douglas Trumbull, rejects the use of Showscan for storytelling: "It was film that looked too real -- too much perhaps like video." (Videography Magazine, May 2000, page 30)

    There is also the issue of frame rate variance discussed by another poster. When the scene gets 'busy' you want to make sure your frame rate stays high. This is more critical for computer applications because they have an effectively high shutter speed. By this I mean each frame is still and sharp; it lacks motion blur. If it had accurate motion blur, a lower frame rate would be more tolerable. (Inaccurate motion blur will confuse the viewer's notion of how things are moving.) Without motion blur, images are more easily perceived as "strobing" or flickering, which of course detracts from the realism of the environment. Since adding accurate motion blur is hard, you have to pump up those frame rates -- and keep them high.
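
    Since the comment above hinges on motion blur, here is a minimal sketch of the usual accumulation-buffer idea -- a toy example of my own; the stand-in renderer and timings are made up, not anything described in the thread. Averaging several sub-frame samples per displayed frame, the way an open camera shutter does, turns a hard-edged jump into a smear.

        # Toy accumulation-buffer motion blur; render() and the numbers are
        # illustrative assumptions.
        import numpy as np

        def render(t: float) -> np.ndarray:
            """Stand-in renderer: a bright dot sweeping across a 1x64 'image'."""
            img = np.zeros(64)
            img[int(t * 63) % 64] = 1.0
            return img

        def blurred_frame(t0: float, shutter: float, samples: int = 8) -> np.ndarray:
            """Average renders taken while the 'shutter' is open."""
            times = np.linspace(t0, t0 + shutter, samples)
            return np.mean([render(t) for t in times], axis=0)

        sharp = render(0.5)                         # one hard-edged sample
        smeared = blurred_frame(0.5, shutter=0.05)  # samples spread over the shutter time
        print("lit pixels, no blur:  ", np.count_nonzero(sharp))
        print("lit pixels, with blur:", np.count_nonzero(smeared))
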
  • Thanks for bringing up the point about the monitor's refresh rate. Not sure that people realize that you CAN'T see 90 fps if the refresh rate of the monitor is 75Hz.
  • EXACTLY. I bought a GF2MX thinking that my Voodoo3 was out of date, but I've figured out that for my purposes, it's just as good and even better. I was getting 80fps in UT with the V3... 60 with the MX. I also couldn't overclock the thing without crashes related to nvdsp.drv.

    I sold the MX yesterday, and I'm actually DOWNGRADING back to the V3.

  • No offense but I think you meant "0 day du0d!" graphics accelerators. Modern tends to be mapped to yearly time, not monthly.

    ---
    Solaris/FreeBSD/Openstep/NeXTSTEP/Linux/ultrix/OSF/...
  • Of course. Look at Tomb Raider.
    ----
  • Is there a method of using some but not all of the bandwidth of a video card, so that it can be adapted to a smaller or older motherboard without the AGP interface?
  • You are correct, however, that there is an upper limit on the perceivable effect of adding more frames per second, but it is closer to 80. I saw a nice scientific explanation of it all, but I'm too lazy to look up the reference.

    The explanation lies in the fact that the human eye can only see X frames per second. If I remember correctly, it is somewhere in the neighborhood of 40-60. So once you get above that range, your eyes/brain can't tell much (if any) of a difference from increased framerates.

    =================================
  • You want a PCI version? 3dfx [3dfx.com] has got one. AGP still isn't a big deal -- sure it's faster, but not by a whole lot.
    --
  • Got the Voodoo2 the very first week it came out. I got one of the Creative cards when they came out of nowhere and beat Diamond to market. Got a Banshee later that year. Direct3D was OK. Glide was a pain on both cards; the newer versions of Glide always had trouble running older games. The V3 had the same problem, I heard. Now I'm on a GeForce2 with 64MB of DDR RAM. I run all my games at the highest resolution with all the features maxed out. Never had a compatibility problem with a game. 3DFX is a has-been. They made some bad business decisions and now they are paying for it. Even the V5 still has a lot of roots in the original Voodoo.
  • I've played 40, 60, even 80 on Q3A, Tribes, Tribes 2 beta, Quake2, Q2CTF, QWCTF, GlQuake....

    It doesn't make a difference to me, other than the psychological impact of "I have the fastest card out there".

    --


  • There are a number of decent computers that could benefit from a GeForce2 or a Voodoo 5500 but don't have AGP slots. My computer, for example, came with an on-board AGP SiS graphics chip... not the greatest in the world.

    Ranessin
  • Man, you haven't played many games lately, have you? Not only can you notice the difference, but the difference is MUCH bigger than you think. Try it yourself, and then do a quick 180-degree turn in a game.

    Even more important, notice that the rate is an AVERAGE. Even if the game has an average of, say, 100 FPS, in moments of intense action the rate still drops WAY WAY down. I wouldn't dare play an intensive game at 40 FPS -- as soon as the action started, you'd be likely to drop into the single digits!

    Of course, for single player games, 40 is probably perfectly adequate.

    Not to mention, with full-screen antialiasing just beginning to take off, all of that extra rendering power can now be used to make the images look better! Right now, only the most powerful video cards can perform any anti-aliasing at all, and it's not even particularly good anti-aliasing. There is still a ways to go yet...

  • by nakaduct ( 43954 ) on Wednesday September 20, 2000 @08:19AM (#766003)
    Above an average of about 40FPS, nobody notices anymore - they can't!

    As others have mentioned, the top end is probably closer to 60fps than 40.

    More important, though, is the headroom you get with a faster card. A game like Q3 has a standard deviation of about 7fps [avsim.com], which means over 15% [wolfram.com] of your frames are under 33fps, and about 3% [wolfram.com] are under 26fps. These are very noticeable slowdowns.

    At 80fps mean, your standard deviation may jump to 14 fps (it's not a linear progression in real life, but for argument's sake...), 97% of your frames are at 52fps+, and 99.85% above 38fps. So it's smooth all the time, not just when you're standing around with nothing happening.

    And that's why NVidia is still in business.

    cheers,
    mike
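
    The percentages in the comment above fall out of treating frame rate as (roughly) normally distributed; this reproduces them. It is a sketch only -- real frame-time distributions are not actually normal, and the 40 fps mean is implied by the quoted claim rather than stated in the post.

        # Reproducing the comment's numbers with a normal-distribution model.
        from statistics import NormalDist

        def fraction_below(threshold_fps, mean_fps, sd_fps):
            return NormalDist(mean_fps, sd_fps).cdf(threshold_fps)

        # 40 fps average with a 7 fps standard deviation
        print(fraction_below(33, 40, 7))        # ~0.159 -> "over 15% ... under 33fps"
        print(fraction_below(26, 40, 7))        # ~0.023 -> "about 3% ... under 26fps"

        # 80 fps average with a 14 fps standard deviation
        print(1 - fraction_below(52, 80, 14))   # ~0.977  -> "97% ... at 52fps+"
        print(1 - fraction_below(38, 80, 14))   # ~0.9987 -> "99.85% above 38fps"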

  • Yeah, and when I bought my Voodoo3, there was no Glide support for Linux. Strange; Glide works fine now. I guess that's because they fixed it. :^P
  • That's what I did to my voodoo3 2000 PCI! (of course, it was on my OLLLD 233 MMX, which my mom now uses)
  • >>> slap on some peltier REAL cooling action! .. Ramp it up to 220! <<<

    Volts or Mhz?
  • "Funny how the GeForce2 GTS's Linux scores are within 99% of the Windows scores."

    Funny how nVidia's drivers hard lock my machine, forcing me to hit the power. Funny how all queries sent to the e-mail address nVidia supplied went unanswered. Funny how the irc channel they suggest was quite useless for solving this problem... And funny how nVidia has lost my business since it obviously never meant anything to them in the first place.

    Ranessin
  • Only question I have is how hot this thing is going to get? My Voodoo 3000 AGP gets blisteringly hot after some games, and it only has a heatsink. This card has two fans!

    Look at any high-end general-purpose (aka Intel, AMD, Motorola) processor. Notice, please, how the heatsinks have been:
    • Increased in surface area with longer, thinner spines and small ridges on the spines;
    • Supplemented with on-board fans.
    I put in a CPOC one-slot fan in my eMachines minitower when I filled the slots, and it's a damned good thing I did. (Now, in the winter, I can thaw my fingers at the back of my rig!)
  • That should at least be possible -- say, limit the amount of data via assembler. I know I have seen this done with hard disks before: someone took a program that could read the data more slowly and cause the platters to spin slower, to allow for a more minimal interface.
  • Hell yes! Bob Costas is a little bitch! Fuck him!
  • There are also 1 or 2 companies that have announced they will be making PCI versions of their GeForce MX cards. (Don't know the ones off hand)
  • Subject says it all: the V5 is hardly news. The post smells only of 'the other day we posted the GeForce2 MX and the G400, now we gotta post something about the V5, otherwise the 3dfx fanboys are gonna get upset'.

    Come on, now you only have to post info about the Radeon and we're all set.

    OTOH, I wonder why Anand pitted the MX against the G400 -- maybe he didn't want to piss off Matrox too much? If the MX totally destroyed the G400, can you imagine what the GeF2 would have done?

    To the moderators: please resist the urge to mod this as flamebait. I have used the V1 (owned), the V3 (work), the V5 (work), the G400 (own a Max) and the GeForce2 (work), so I feel I am fairly neutral here.
  • Set your video refresh rate to below 60Hz (that's about 60 FPS) and stare at it for about 6 hrs. Sleep it off. Then set your video refresh rate to above 80Hz (that's about 80 FPS) and stare at it for 6 hrs. Compare the strain on your eyes. (For those who do not know what is going to happen, I present this warning: you must have a monitor that can support such high refresh rates.)

    Perceptually, there is no difference when looking. The difference comes with strain on the eye. The eye has to work harder to fool the brain at lower refresh rates. This causes all kinds of muscle strain.

    Of course, when graphics cards can put up one frame per CRT scan, that is the utmost limit, beyond which there can be no perceived difference, simply because the monitor is the limiting factor.

    Just because your eyes cannot discern between events .05 secs apart (in the best case) does not mean that there is no perceived difference. For instance, stand outside at night, stare into a camera flash (this takes less than .1 secs if you are close enough to the camera) and then try to see anything for the next hour (assuming you didn't blow out your rods and cones). Admittedly, this is the most extreme of situations, but it illustrates the point that what matters for a pleasurable experience isn't just how often the eye can "reset" the retina (10-20 times a second), but the mechanics of the eye as a whole.

  • by Datafage ( 75835 ) on Wednesday September 20, 2000 @08:29AM (#766014) Homepage
    No, this is not an introduction to modern graphics cards, it's an outdated review. An actual introduction can be had at Maximum Hardware [maximumhardware.com].

    -----------------------

  • What if you wanted two monitors? I dunno about multiple monitor support for these, but one can easily do two video cards (one in the AGP slot, one in PCI) with no problem...
  • I can tell the difference between 60 and 50 fps. I find after playing at above 60fps that 50fps or lower is jerky, and much harder to play well at. You get used to it after a while, but it is noticeable.

    A lot of people talk about visible framerates, and compare games with film. This simply cannot be done. Film contains temporal anti-aliasing, which smooths out a lot of the visual jerkiness that a game would exhibit rendering the same scene at the same speed.
  • Not to mention, with full screen antialiazing just beginning to take off, all of that extra rendering power can now be used to make the images look better!

    And if a decent 3d viewing mechanism [slashdot.org] is ever mainstreamed, FPS will be pretty much divided by two...
    ___

  • The most obvious reason to compare the G400 to the MX is that they both support dual monitors, whereas the GeF2 does not. Plus, the MX and the G400 are probably a lot closer in price to each other than to the GeF2... which, in ads I've seen, costs as much as an entry-level PC. :)
  • I believe the number is closer to 70fps... or at least a number less than that causes eye strain, or something of that nature.
  • by Frac ( 27516 ) on Wednesday September 20, 2000 @08:34AM (#766020)
    Isn't it time to fire the entire 3dfx R&D department, assuming they have one? I still find it very amusing that after 4 years (when I got my Diamond Monster 3d Voodoo1), 3dfx is still trying to squeeze the last drops out of the dying Voodoo architecture.

    VSA-100 is probably their biggest creation in a long time, but the simplicity of the chip, and its need for parallelism across multiple VSA-100s (which basically amounts to SLI on the same card), suggests that 3dfx should spend less money on those dumb commercials and spend more on actually making something good.

    Those little tricks don't work anymore when you're starting to become the underdog, with companies like nVidia and ATI nipping at your heels...

  • I used a Voodoo1 and I was blown away by the amazing difference. But my next card was a TNT, then an S3 Savage2000, and now I use a GeForce2 GTS, courtesy of Elsa. I have not been inspired to buy a Voodoo for quite some time, as my impression has always been that they just keep rehashing the same old technology. The good news for Voodoo owners is that they still push enough frames to compete, and they are well supported in Linux.

  • Above an average of about 40FPS, nobody notices anymore - they can't!

    You obviously don't play FPS games, or you've never had high enough frames per second to notice the difference. The difference in your ability to play well between 40 frames per second and 60 is amazing. You are correct, however, that there is an upper limit on the perceivable effect of adding more frames per second, but it is closer to 80. I saw a nice scientific explanation of it all, but I'm too lazy to look up the reference.
  • The Voodoo 3 does not suck. Having the fastest 3D accelerator is not all it's cracked up to be. The V3 is compatible with everything: BSD, BeOS, Linux, Win2K... and HAS been compatible for a long time. Sure, NVIDIA _now_ runs great on Linux, but I've been enjoying Unreal Tournament on Linux for a damn long time! As far as I'm concerned, 3Dfx is going to continue getting my money.
  • No fuckin' kidding. I'd really like to see some benchmarks of "that new Athlon" from AMD, and see how it compares with a Celeron 300A overclocked to 1.4GHz, being cooled with liquid hydrogen.
    --
  • You can definitely notice improvements above 40fps. True, you get diminishing returns after the 30-40fps point, but there's something silky-smooth about 60+ fps that puts 40fps to shame.

    Obviously the eye and the brain can process information MUCH faster than 30 or 40Hz... why do you think monitors look better at 80Hz than at 60Hz? I'm not sure where that conventional wisdom (anything over 30-40fps doesn't matter) got started, but it's totally wrong.

    Even if you couldn't tell the difference between 40 and 60fps, a card with excess power is going to be more future-proof. A card that can only run today's games at 30fps is going to have a hard time with next year's games. Today's monster card that runs today's games at 200fps is going to work nicely with next year's games, too. :-)

    Whoa, that troll almost bit my hand off when I fed him!

  • by TheTomcat ( 53158 ) on Wednesday September 20, 2000 @07:59AM (#766026) Homepage
    Above an average of about 40FPS, nobody notices anymore - they can't!

    Maybe not consciously, but it DOES make a difference. Ever seen an I-Max movie? They're shot at 48FPS instead of 24, and it definitely makes a difference.

"The one charm of marriage is that it makes a life of deception a neccessity." - Oscar Wilde

Working...