Hardware

3dfx Delays Voodoo5 Schedule

Ant writes: "Yahoo posted the press release that 3dfx Interactive® Inc. has temporarily delayed the release of its Voodoo5(TM) 5500 AGP. The press release states that the company is taking this action to ensure that it meets its own high standards for product quality."
This discussion has been archived. No new comments can be posted.

  • I'm glad 3dfx is delaying their card to wait for higher quality. Maybe that means the Linux drivers will ship at the same time.
  • by jawad ( 15611 )
    So why does buy.com [buy.com] already have it up for sale [buy.com]?
  • With this delay, and 3dfx's obvious performance shortcomings, nVidia is going to keep extending its lead.
  • NOOO!

    Dammit, I WANT to believe in 3dfx. They support Linux. For a long time they were king-of-the-hill!

    But now, even with the GeForce2 out, they have to delay a card that requires me to hook up an HD connector to power it?!?

    Dammit, if nVidia would just open up their drivers a little bit more, I wouldn't mourn for these guys, but ARGH!
  • I would like to know what the likely technical reasons are. Lately, we've seen a lot of video cards dealing with heat issues.

    Interestingly enough, this corresponds pretty closely with rumors of Nintendo delaying the Dolphin (their new game system). Although the rumors are not confirmed, it could be that Nintendo and 3dfx are facing similar technical challenges and running into the same kinks. Hrm.

    Just speculation...

  • Does anyone have a game that actually maxes out any of the current generation of video cards, or the previous generation for that matter?
    Q3A at 170fps is meaningless.
  • Haha, they better get it up to its "High Standards" because the 6000 with 4 procs can't even beat the GeForce2 DDR ..
  • Does anyone know if 3dfx will be producing any more 3D add-on cards? The Voodoo2 kicked much ass, and I've heard rumors of another. 3dfx, if emailed, changes their answer about every month.

  • I think that a delay is a pleasant slowdown to the current pace of new graphics cards; it seems like only yesterday that Voodoo3s and GeForces were new...

    I doubt that there will be any real reason to upgrade to one of these new cards for some time tho'.
  • You may have overlooked it, but Buy.com is listing the card as a backordered item. This way you can buy the card now, and it will ship when it is available.
  • by subtraho ( 187805 ) on Thursday May 25, 2000 @11:44AM (#1047521) Homepage
    Shouldn't this be termed "pulling a Blizzard" by now? They've honed it to a fine art.

  • Why are all the people on Slashdot so worried about nVidia opening their drivers when they don't care about the source to the games that they run?

    I mean, I've been sitting back watching the 3dfx vs. nVidia bitchfest for a while now, but I guess I'm a cheat because I have a TNT2 with closed drivers for Q2/Q3 and two V2's in SLI for all my WINE/3dfx games.

  • 3dfx may support Linux, but it seems like half-ass support. If you look at the recent benchmarks on linuxgames.com, the 3dfx cards have the largest gap between the Windows and Linux drivers; it says you will only get about 75% of the performance you get under Windows. Is this going to change in the future, or will we continue to see half-ass support?
  • It would be nice to think that the company is holding back the release because of their own "high quality standards," but we all know they probably found some hardware guru sniffing glue and decided that maybe they should hold back the release until they checked out his facts. If they are holding it back because the product doesn't meet their expectations, it certainly is a breath of fresh air. I really hate rushed projects, and I hate being involved in them, but hey, s@^! happens.

  • From buy.com's own Shipping Definitions:

    Back Order
    This product is not currently in stock. We have orders placed with our supplier but have not received a date yet as to when we will receive this product. We will make every effort to fill your order as quickly as possible and to keep you updated on our progress.

    That would be why.

  • Just wondering if there's any word as to whether the Voodoo5 will work with AMD boxes any better than the Voodoo3. I (and apparently many, many others) had to go through a number of tricks to get the dang thing just to work...
  • who cares bout the 5500 ? the only card i'm gettin from 3dfx this season is the 6000.. i mean.. dual chips ? ati's been there .. QUAD CHIPS is where its at.. (the gigapixel gloating value helps too (hopefully they'll market it the same way apple marketed the "GIgaFLoP"))
  • by Anonymous Coward on Thursday May 25, 2000 @11:52AM (#1047528)
    Yes, let's applaud them. I have to post this anonymously, because I still care about my job, but I just wanted to say how much I hate the practice that has become standard in the software industry in general and at my company in particular - if you promise a release date, you HAVE to ship on it, no matter what the quality is - who cares if marketing underestimated the time needed? And then every two weeks after that we ship "patches," and everything gets fucked - who wins? Customers? No way! Developers? Hardly! But hey, the marketing promises got "delivered"!

    I hope sooner or later more and more companies (and shareholders, analysts, etc.) will begin to realize that if you delay your product by two months, NOTHING bad is going to happen, but if you ship crap, it's not going to improve your customer relationships.

    I suggest we write polite and GRATEFUL e-mails to 3dfx THANKING them for caring about quality and expressing support - who else if not /. crowd can "feel their pain"? I'm sending an e-mail right now...
  • Their v5-6000 has four separate chips running in parallel... each with its own cooling fan. The board requires its own external power supply.

    And if that isn't scary enough, Quantum 3D is building 8-, 16-, and 32- (THIRTY FRICKIN TWO) processor... well, "cards" isn't the right term: they are external rack-mountable boxes. The 32-processor unit uses 1600 WATTS.

    URL: http://www.quantum3d.com/product%20pages/aalchemy3.html [quantum3d.com]

    Thanks shugashack [shugashack.com]
  • Well, maybe, but Blizzard (like 3dfx) doesn't usually release utter crap the way IS has.

  • Q3A does max out these cards; just don't run them at 640x480. I have a GeForce and a Celeron 450 and only manage 45fps average at 1024x768x16 with all the bells and whistles turned on.
  • I'm obviously disappointed to hear about this, but hopefully they'll be able to up the quality and be able to compete a little better with a late release. I'm getting sick of listening to the comparisons and sneers from Diamond fans. Voodoo forever...
  • This troll you're referring to (the one who claims to work for J J J J J ulius games), is NOT the REAL STEVE WOSTON.

    The real Steve Woston's site is here [mnc.co.za].

    Steve is genuinely upset that he's being impersonated on Slashdot.

  • There are actually already Linux DRI drivers out on an (unstable) cvs branch, that are supposed to work pretty well in 16 bit mode already, although 32 bit mode reportedly crashes the machine. It's the tdfx-0-0-2 branch I believe...see The DRI Page at Sourceforge [sourceforge.net] for more details.


    One Microsoft Way

  • If it is a heat dissipation problem then I can see the modifications already:

    1. Seat your VooDoo5 in the AGP slot
    2. Connect the monitor to the DB15 socket
    3. Connect your garden hose to the 1/2" NPT nozzle next to the DB15. Note: Operator is responsible for providing adequate drainage.

  • You'd better read the fine print at buy.com! Looks like they will happily take yer money for vaporware. ;)
  • I'd be concerned about a video card that consumes 200 W....think heat generation...and you have to have an extra power brick just for it.


  • by doogles ( 103478 ) on Thursday May 25, 2000 @12:00PM (#1047539)
    ... Ok, let's say they release crap. We'd go on and on badmouthing them for incompatibility, lax Quality Assurance Standards, and all the glitches because of "rushing the product to market too quickly".

    So, for once, a company decides (for whatever reasons: technical, political, financial .. who knows) to withhold a product until it meets higher standards.

    And yet, somehow, we find a way to bash them for this, claiming that they need to "pick up the pace" and that they're "already behind nVidia in the video card wars".

    Let's cut them some slack, and not judge until we have a final product in our hands. I'm telling you, this is the only forum in the world where we can badmouth companies no matter WHICH choice they make. =)
  • Since no one has read the news release, here it is (I can tell by the idiotic comments made here that no one has):

    Media Advisory: 3DFX DELAYS VOODOO5 SCHEDULE

    Date: May 24, 2000

    What: 3dfx Interactive® Inc. (NASDAQ: TDFX) today announced that it has temporarily delayed the release of its Voodoo5(TM) 5500 AGP. The company is taking this action to ensure that it meets its own high standards for product quality. The company discovered that the Voodoo5 may be experiencing field failure rates at very low levels in certain configurations. The company is conducting further tests to determine whether a problem actually exists. 3dfx anticipates this action will delay product availability between seven and 14 days.

    "We believe that this affects only a small number of configurations, but we feel that this is the safest thing to do," said Randy Schussler, vice president of operations at 3dfx Interactive. "We're taking this action to ensure that our customers receive a high quality product that exceeds their expectations."
  • It *was* almost yesterday that the GeForce was new. It was more like a year and a half ago (or longer, I think) that the Voodoo3 was new. It really shows, too, because the Voodoo3 really sucks compared to a lot of today's top graphics cards.


  • Doesn't 4 come after 3? Shouldn't it be the VooDoo4?
  • by Anonymous Coward on Thursday May 25, 2000 @12:02PM (#1047543)
    I mean, really. I wrote GLIDE-only games and everything. When the Voodoo cards first came out, they blew everything else completely out of the water: They kicked ass.

    But times have changed. The Voodoo 5 isn't much to get excited about. The guy who sits next to me at work has one in his machine to check compatibility with our product. When the GeForces first arrived in the office, there was a bit of "Hey! I want that in my machine!" going on, but when the Voodoo 5 arrived, no one even really wanted to install the thing.

    Actual quote from a coworker: "OK, I tell you what. You run up the demo, and I'll see how big my yawn is."(1)

    Why the lack of interest? Well, what's to get excited about? Sure, it's fast, but, as someone else pointed out, no games max out the card's speed because they still need to run on slow-ass machines without crawling, and as you start to add scaling functions, you start to add overhead -- remember, the CPU still has plenty of work to do.

    Now, the GeForce (and IIRC ATI's new card, the Radeon) has hardware to take some of the geometry strain off the CPU. Plus, newer cards are adding sexy new stuff such as cute pixel-shader features. When you get down to it, these sorts of things are far more interesting than raw fill rate / card tri rate, which is all the 3Dfx cards actually offer.

    This is because when you get down to it, texture-mapped triangles are not very interesting. Sure, they make a good building block, but there are things you just can't realistically represent that way, unless you generate textures, texture coordinates, and do interesting things with them, on the fly.

    It's with tricks like these that we can improve lighting models, reflectivity effects, and volumetric effects, to bring the otherwise rather flat, plastic world nearer to the more realistic and impressive world of raytracing, but in real time. Sure, you use cheap hacks, but at 60fps, no one notices... ;-) And then the gamers are happy, and we game developers are happy too :)

    (1) the yawn was medium-sized, by the way. Their full-screen anti-aliasing is quite good quality. Nothing else startling to look at though.
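The hardware T&L mentioned above offloads exactly this kind of per-vertex math from the CPU. A minimal sketch of that work (hypothetical vertex data and matrix, plain Python, not any real driver API):

```python
# Sketch of the per-vertex transform work that hardware T&L offloads
# from the CPU. The vertex and matrix here are made up for
# illustration; a real pipeline would also light and clip each vertex.

def mat_vec(m, v):
    """Multiply a 4x4 matrix by a 4-component vector."""
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

def project(vertex, mvp):
    """Transform a 3D vertex by a model-view-projection matrix,
    then do the perspective divide to get normalized coords."""
    x, y, z, w = mat_vec(mvp, vertex + [1.0])
    return (x / w, y / w, z / w)

# Identity transform with w = z, i.e. a crude perspective divide:
mvp = [[1, 0, 0, 0],
       [0, 1, 0, 0],
       [0, 0, 1, 0],
       [0, 0, 1, 0]]

print(project([2.0, 4.0, 2.0], mvp))  # -> (1.0, 2.0, 1.0)
```

Doing this for every vertex of every frame is the "geometry strain" the parent is talking about; moving it onto the card frees the CPU for game logic.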

  • A sad day at 3dfx today. The Voodoo 5 cards are so large and heavy, it appears that one of the primary engineers for the card was killed when the industrial strength crane used to lift the card out of the case accidentally failed and dropped the card over his head, crushing his body in the process.

    Engineers were later heard to report that the card in question was still functional and "played Quake 3 Arena 47% gorier than before".

    Boy it's a long day at work today...
  • by cronio ( 13526 ) on Thursday May 25, 2000 @12:07PM (#1047545) Homepage
    I asked Daryll about this recently on the dri-devel mailing list... here is what he said:

    3dfx does have a high-performance OpenGL implementation. They put a reasonable amount of manpower into it. I think this is showing you the performance improvement you can get by putting real resources behind the project.

    The nVidia situation is interesting. In that case, they are using essentially the same code base between Linux and Windows. The question to ask is why the Linux version is then slower. It could be an OS issue, a compiler issue, a driver issue, or something else entirely. If we close that gap, whatever it is, all the implementations get better. By the way, the 2.3 kernels have been significantly faster, so the 2.4 release may help with the difference.

    The argument is that Open Source efforts can do it better, but you have to qualify that a bit. What defines better? In some cases we're not concentrating on the same focus. For example, Mesa tries very hard to be a complete and conforming version of OpenGL. In some cases that may mean losing some performance compared to tweaking of Q3A at the expense of everything else. Some of the security and stability fixes in the MGA DRI code mean we lose a bit of performance. The 3dfx in-a-window implementation under Windows is quite a bit slower than their full screen mode.

    You also have to compare manpower efforts. 3dfx and ATI put a lot of people on their drivers. They are each paying PI for one engineer. That limits not only how fast the drivers can be produced but how good they will be. We're also really not seeing much help from the community. We'd love to have more people contribute.

    The 3dfx driver has remained mostly unchanged (except for bug fixes) for the last year. That's because I've been paid to get it running on different boards (V3/V4/V5), to fix some bugs, and to improve the infrastructure (DRI). We really haven't had the resources to spend doing optimizations and rewriting some of the really ugly parts in the code.

    The bottom line is that there is room to improve. I have no doubt that with the right attention all the drivers would be very close to their Windows counterparts. We just need some good people to do the work.



  • I'm still holding out for the Voodoo5, because as far as I can tell it is the only PCI card to give good 3d performance.
    Are there any other feasible options?
  • Ok, with the Red Wings out of it, who do you like?

    This brings up a question I've always wanted to post on Ask Slashdot...what hockey team is the favorite among geeks? An argument could be made for any of the following:

    Pittsburgh - mascot is the Penguin. 'nuff said.

    Ottawa - play in the Corel Center. Possibly the only Linux ad you'll see in all of sports.

    New Jersey - their logo brings to mind the BSD daemon.

    San Jose - the Mecca of geekdom.

    Anybody got any more?
  • And technically illegal (? not sure about this - I do think they can ask you to stop if you cause too much interference) to use in a residential neighborhood. Only FCC Class A approval. Class A=commercial only (too much interference for residential). B=residential allowed (of course you can use a Class B in a commercial area too).

    With all the geeks running really ridiculous stuff in their residences, I think everything should be engineered to Class B standards (i.e. low interference).

  • Matrox is known for their excellent 2D. Some of their older cards, like the Millennium II or even the Mystique, are good cards, and being older, would be fairly inexpensive.

    Spooner
  • One of those two cards should do you well...the Matrox G200 has a better quality picture.

  • It's too bad that in my experience 3dfx has the same quality problem that MS has. They make some very sweet hardware at times, but software support? ha!

    I got the Voodoo3 3500 AGP when it came out, and while it runs Glide games like it should, there's still no OpenGL support. (No, I mean FULL support... for Windows.) And maybe I'll get windowed Glide rendering one day... yeah, right.

    The video capture sucks royally too. My ($300+ CDN) 3500 on an AGP bus, in my P3-450 with 128MB of RAM (and AGP aperture set properly,) barely manages to get 20 frames per second at 320x240 truecolor, set it any higher and it chops like crazy.

    Funny that an $80 CDN PCI Hauppauge WinTV tuner card in my... Cyrix 200MX with 32MB RAM (before the CPU wore out,) was able to capture 30fps at the same quality settings!

    Also, I'm sure many of you have heard of the "DVD acceleration." This is... a hardware video overlay. That's it. Luckily I didn't trust them and got a full hardware decoder card anyhow.

    Personally, while you can count on game companies supporting 3dfx, I won't expect quality until I see it from them.

    (BTW, when the card works for 3d, it's awesome, it just sorely lacks in some major points that you should expect from any decent manufacturer.)
  • As we have seen in the past with such notable examples as Microsoft and Intel, lack of competition in the marketplace can be a bad thing, and right now nVidia has far and away the best bang for the buck. Are we looking at another portion of the industry that is going to be dominated by a single company, or will ATI step up to bat with the Radeon? Can 3dfx make a comeback at this point?
  • Well those Quantum 3D things are meant only for commercial use anyway. You have to wonder though... a lot of people run their computers with the cover off their case (for different reasons)... so much for class B certification.

    I forgot to mention that the V5 6000 (4 chips) is going to be $600, so it is questionable whether that could be considered for home use... no way in hell am I putting something giving off that much heat and noise (from the fans) in my computer. I can barely stand the noise coming from my computer as it is, from the one fan on my TNT2, Celeron, two hard drives, and power supply (although the power supply I have is very quiet).
  • There is a Voodoo4; it uses the same basic architecture that the Voodoo5 series will use, except that, instead of two VSA-100 graphics processors, there will be only one, accompanied by either 16 or 32 megs of RAM.
  • (Score 3: Insightful)

    Maybe I shouldn't troll while I'm logged in... ;)
  • It may never be released at all. If it has been released already, it's simply a single-CPU card (the V5 5500 is two of the CPUs used on the V4), and the card compares favorably to cards released a year ago, maybe. Compare it to something like the GeForce 2, and to say it gets crushed by the foot of god is an understatement.
  • The upcoming Voodoo5 5500 is a dual-chip solution - effectively doing SLI on one board. The 6000 will be a quad-chip part (which is why it has that nasty $599 estimated retail price). The VSA-100 is the actual chip that 3dfx developed. It supposedly will scale to 32 processors... a third-party manufacturer is already on board for an 8-chip card for professional 3D / CAD, etc...

    Seems nVidia will strengthen their lead with another 3dfx delay (cool, 'cuz I like nVidia; bad, because competition drives prices down and performance up... though in many ways nVidia acts as competition to itself by releasing new products on a 6-month cycle).
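The back-of-the-envelope math behind the multi-chip scaling described above; the per-chip clock and pipeline figures are assumptions based on the commonly quoted VSA-100 specs, not official numbers:

```python
# Rough fill-rate arithmetic for 3dfx's multi-chip VSA-100 scaling.
# Assumed specs: VSA-100 at ~166 MHz with 2 pixel pipelines,
# i.e. ~332 Mtexels/s per chip. Figures are illustrative only.

def fill_rate_mtexels(chips, mhz=166, pipelines=2):
    """Theoretical peak fill rate in Mtexels/s for an n-chip board."""
    return chips * mhz * pipelines

print(fill_rate_mtexels(1))  # single-chip Voodoo4-class part: 332
print(fill_rate_mtexels(2))  # dual-chip 5500-class part:      664
print(fill_rate_mtexels(4))  # quad-chip 6000-class part:      1328
```

That quad-chip figure (~1.3 Gtexels/s) matches the number usually quoted for the V5 6000, which is why the chip count, price, and power draw all scale together.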
  • Matrox really makes the best stuff for normal everyday use. You should be able to pick up a G200 for fairly cheap now, and it'll do everything you want from it pretty happily.
  • One of those two cards should do you well...the Matrox G200 has a better quality picture.

    He said work WELL, not leak 5 megs of RAM every time you open/close an XMMS OpenGL plugin.

    The nVidia drivers are crap, and they're really slow coming up to speed. I own a TNT1, and I wouldn't recommend buying one if your main platform is X. Go with a Matrox/3dfx card.

    -- iCEBaLM
  • You say that 3dfx still holds the raw fill rate / card tri rate crown, but this is untrue. Only the Voodoo 6 will be able to compete with the GeForce 2, and it seems that by the time it gets released, nVidia's next-gen part will be out, and it is going to whoop the butts of every other GPU out there. Take my word for it; NDAs do not allow me to go any further...
    The only things that 3dfx has over the competition are their *special* features: T-buffer, FSAA...
    Unfortunately, as said above, the FSAA is only OK, not on the killer-app level that would make people give up raw fill.
  • I guess you're ignoring the fact that the 6000 generates heat like a pig and draws power in the same realm as my entire laptop?
  • I knew it! Go ATI and Radeon!

    Uh, more like go NVidia GeForce 2. Has anyone actually got their hands on a Radeon yet? GeForce2 is almost shipping AFAIK.

    3dfx sucks. $700 for the top of the line Voodoo 5, which will just barely beat out the GeForce 2? Give me a break.
  • The V5 is crap; take a look at it head-on vs. a real card like the GeForce 2. It's a total joke.

    So they get the Ion Storm award, not the Blizzard award.
  • It draws more power, generates more heat, and does 32-bit color. Of course, everybody else has been doing that for years, but it's a brand-new feature from 3dfx!
  • I just did. Anyone else?
  • That Quantum 3D thing is a beast! I have to say it, but could you imagine a Beowulf cluster of these things??

    Girlfriend: Hey Dman33, what is that smell in the computer room?
    Dman33: Oh, the cat.
    Girlfriend: No, not that type of smell, this is like burnt toast.
    Dman33: Yeah, like I said, it is the cat.
    Girlfriend: ?
    Dman33: Damn thing crawled between my dual 1600W 32-processor video cards...must have made contact or something, little critter got fragged big-time!
  • The difference between Microsoft and nVidia is that nVidia is tops because they're better than a wide range of other video card makers, while Microsoft is tops because Windows is (marginally) better than no operating system at all (don't argue that you can get Linux - Joe Shmoe couldn't install Linux if his life depended on it).

    Don't mistake market dominance due to a quality product with market dominance due to monopolistic practices.

  • As to your first question, Colorado - they play the purest and most skilled style of hockey among the final four.

    To your second (favorite team among geeks), a good candidate might be Washington. The owner there (a former AOL guy, I think) gave all the players lap tops, and everybody involved with the team participates in forums via their website. Their coach, Ron Wilson, is an admitted internet junkie.

    My personal fave, of course, is the Red Wings. Been a fan since the early 80's, so it's been a great ride.

  • Obviously 3dfx has seen the fiasco Intel has had over their RAMBUS(C) chips and observed that Intel could pay out over $600 million. Judging by user comments regarding AMD, PC133, Rambus, and power supplies, they have quite rightly decided to duck and cover. If I were a shareholder, I would rather they ship bad software than bad hardware, and bad hardware is what they have seen in beta.
  • Yes, nVidia is now king of 3D performance--by a slim but tangible margin. But the really interesting, freaky fact is that 3Dfx is now king of 3D visual quality. Thanks to 3Dfx's superior hardware-assisted implementation of 4x anti-aliasing, their visual quality kicks the crap out of nVidia's inferior technique. All of the reviews I've read (linked over at AnandTech) that were done with the drivers 3Dfx will ship with, rather than the older, immature drivers that came with the pre-release factory samples floating around the review sites, state that the Voodoo 5 5500's 4x anti-aliasing (and arguably its 2x anti-aliasing) really trounces the visual quality of the GeForce 2's anti-aliasing.

    The weird thing is that 3Dfx used to be the king of performance, evangelizing fill rate and frame rate over visual quality, and nVidia argued that visual quality was more important than fill rates. That's the essence of the whole argument that broke out when the TNT and TNT2 had 32-bit color but the Voodoo 3's had only 16-bit color but more speed. Now the roles are reversed. I feel like Alice, through the looking glass...

    But unfortunately I fear that 3Dfx's superior image quality is just a fluke, and that they're touting it now because it's 3Dfx's only advantage. Remember that the Voodoo 4 and 5 were supposed to be out by last Christmas, before their design and fab difficulties, so effectively they're now a product cycle behind nVidia. If 3Dfx fails to treat the current emerging lineup as an "interim" line of products, and doesn't bust its collective ass to get another and vastly superior product cycle out the door before this Christmas, it will go down faster than a freshman at a frat party. Goodbye, 3Dfx.

    I hope this doesn't happen, because I have respect for what 3Dfx did to advance 3D on the PC, and I'd hate to see yet another graphics company go bust, but at this point it looks grim for 3Dfx. The top of their current emerging lineup has superior image quality, perhaps the best in 3D right now, but at framerates which can't be considered more than just "passable." Their Voodoo 5 6000 is an utter joke, if and when it finally gets released; it'll cost twice as much as a GeForce 2, but definitely won't double the performance, and will require 4 chips and an external power connector. Yes, the very thought of such a massive, impressive piece of hardware makes me want to jam my slot 1 into a tight little socket 370, so to speak [nudge-nudge, wink-wink], but the card isn't financially sound since we know nVidia's next product cycle will probably surpass it.

    But, anyway, it is interesting how the roles have reversed, and 3Dfx's visual quality is now their selling point while nVidia's raw performance is now their selling point. My poor ATI A-i-W 128 feels so...inadequate... I need some video card Viagra...
  • Ok, I get it now :) I really meant old Voodoo and Voodoo2 cards, but hey... they prolly fit the IS model better anyway, simply because they have just as much of an overinflated ego as Romero does. I'm surprised 3Dfx hasn't offered to make me their bitch yet. *grin*
  • by Tridus ( 79566 ) on Thursday May 25, 2000 @12:29PM (#1047572) Homepage
    First of all, 3dfx should be *commended* for delaying a product if they don't think it's ready. This is not a bad thing. This is a good thing. This is exactly what we want from other companies.

    Praise them, they are doing the right thing.

    Unfortunately, the Voodoo 5 sucks badly. The GeForce 2 rips it up in the power consumption, heat, feature, and sheer power fields.

    I don't really think it's 3dfx's fault entirely; maybe they lost some good talent or something. I mean, they haven't done *anything* that was top of the line since the Voodoo2. They ruled back in those days. Then there was the Banshee. Then the Voodoo3. Both of which were lame.

    Now we have the V4 (which is a joke), and the V5 (which is a bigger more expensive joke).

    So, let's applaud them for their policy, and slam them for their technology. At least then we are doing it right.
  • Sorry, it's over. If 3dfx were a horse, we'd have to shoot it. They've lost a lot of key people, and they just can't compete with nVidia. The 5500 was supposed to compete with the GeForce, and now that the GeForce2 has beaten it to market, they're too far behind. Plus, the whole VSA multi-chip thing just doesn't work out too well. For one thing, you need 64MB to compete with a 32MB GeForce or other card, and the whole HD power cable thing is ridiculous. I hate to say it, but the company that created the consumer 3D market is done for.
  • Yeah, but programmers and hardware designers do not like the number four. Windows 4 = Win95. Palm Pilot 4 = Palm 5. Etc.

    Scratch that. Maybe marketers do not like #4.

    Tom
  • The latest Ultima, "Ultima CXVII - Lord British Teleports Away", will max out just about anything available if you set the clipping planes way back and run at the highest LOD.

  • The "low end" VooDoo5 was beating the GeForce2 at high res. If you play games at 640x480, I feel sorry for you... sniper fodder.

    My "old" VooDoo3 3500 will pull 100+ fps in Glide (1024x768). I only expect the new card to let me do the same with full-screen AA, and probably at a higher res.

    The GeForce2 is a nice card - there is no arguing that. But for the price of a DDR GeForce, I can get a card that will outperform the GeForce, outperform the GeForce2 at high res, support all my Glide games (Glide wrappers don't cut it), and give me open source Linux support. Case closed as far as I'm concerned.

    SQ

  • by Temporal ( 96070 ) on Thursday May 25, 2000 @12:37PM (#1047577) Journal

    I wonder how much of this was a quality concern, and how much was the sudden realization that the nVidia GeForce 2 is faster, cheaper, available NOW, and doesn't require a freaking AC adapter plugged into its rear end?

    From what I've read, the V5 6000 is the only card from 3dfx that has any chance of beating the GF2, but only in a few select situations, such as running older games at super-high-res with 4xFSAA. And then it is only a little bit faster. (yes, the GeForce 2 does FSAA.) Add to that the fact that the 6000 will cost US$600 (when it finally comes out) as opposed to the GF2's current price of $300, and you have a sorry situation indeed...


  • by Ryvar ( 122400 )
    nVidia, in the GeForce GTS at least, uses a form of antialiasing similar to Photoshop's bicubic filtering, which is computationally FAR more expensive than 3Dfx's pseudo-nearest-neighbor anti-aliasing approach. nVidia is sacrificing fillrate (of which the GeForce 2 GTS has marginally more than even the TOP of the line V5) for visual quality, same as always. Don't forget per-pixel shading, etc., either. Of course, in terms of overall 'value' of the cards, there's the simple fillrate advantage of the GTS (1.6 Gtexels/sec as opposed to the V5 6000's 1.3) and nVidia having T&L, which 3Dfx STILL does not have.
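The 3dfx-style approach described above amounts to supersampling followed by a simple box filter. A toy sketch of that downsample step (made-up grayscale values, not any real hardware path - real FSAA works on RGB in silicon):

```python
# Toy 4x supersampling AA: render at 2x width and height, then
# box-filter each 2x2 block down to one output pixel. Values here
# are made-up grayscale intensities for illustration.

def downsample_2x2(hi_res):
    """Average each 2x2 block of a high-res buffer into one pixel."""
    h, w = len(hi_res), len(hi_res[0])
    return [[(hi_res[y][x] + hi_res[y][x + 1] +
              hi_res[y + 1][x] + hi_res[y + 1][x + 1]) / 4
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]

# A hard black/white edge in the 2x buffer...
hi = [[0, 0, 255, 255],
      [0, 0, 255, 255],
      [0, 255, 255, 255],
      [0, 255, 255, 255]]

print(downsample_2x2(hi))  # edge pixels blend to intermediate grays
```

A fancier filter (weighting more neighbors, bicubic-style) costs more fill rate per output pixel, which is the trade-off the parent is pointing at.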
  • Why did you see fit to use your +1 bonus on an off topic post? Wouldn't it be better to dump the bonus and post as a lesser mammal?
  • by Anonymous Coward
    Full OpenGL support? Try a Windows build of Voodoo Mesa. It's not nearly as fast as the Quake-optimized GL's, but my Quakehead friends all found it quite usable and it delivers superb image quality.
  • nt 1 = "NT 3.51", nt 2 = "NT4", nt 3 = "W2000", so nt 4 = ?

    Gee, nice 'lameass' filter, I had to make all those lower case. Thanks, Taco.

    Pope

    Freedom is Slavery! Ignorance is Strength! Monopolies offer Choice!
  • I use a Cirrus Logic 5446, X supports it out of the box, 640 x 480@24 bpp and 1280 x 1024 at 8bpp.
  • Here [tech-report.com] is an article comparing the raw power of the cards. As you can see, the V5 5500 has nothing over the GeForce 2; the V5 6000 might have a chance, but only when T&L is not a factor.

    BTW, "128MB" on a V5 6000 is no better than 32 on a GeForce 2 due to the multiprocessor design. But the GF2 can have up to 128MB, which would be like a V5 with 512MB on-board. heh.

    ------

  • How do you download a hardware patch? You can't. That's why they are delaying the release date and recalling all the boards from the store.
  • According to an EBNews article [ebnews.com], "...A board using a single VSA-100 chip -- branded under the Voodoo4 name -- is functioning normally."

    The V4 (which has not yet shipped) is basically meant to be a rival for the TNT2 Ultra. According to Anandtech's 3dfx Voodoo4 4500 & Voodoo5 5500 Preview [anandtech.com], the V4 (with beta drivers) does pretty well against the TNT2 Ultra. It's a little slower at 640x480, a little faster at higher resolutions. At the same time, it offers some extra goodies like 2x FSAA, improved 16-bit color quality, and (possibly) a lower price.

  • If this had been raised a fortnight or so before launch, then yes, this would be a valid point. However, this announcement has come only a couple of days before (the launch date was 26/5, was it not?), and only a showstopper would cause a company to delay at that sort of notice (remember Intel and the i820 fiasco, with the memory-corruption problem with 3 Rambus RIMMs?).

    At the very least this suggests unrealistically tight testing schedules at 3dfx; at worst, well, how does Voodoogate sound?

  • How do you download a hardware patch?

    You use this hoary old thing called "surface mail" to upload your bad hardware and download a replacement.

    Case of Note: The companies making PowerPC accelerator cards, a market where there is a lot of competition, use this approach quite frequently with new models. Check any Mac Hardware discussion area, MacFixit, xlr8yourmac, or DealMac, and you will find that for every new release of an accelerator card, there are customers who find problems that eventually become "patched" at the factory, and customers who specifically wait until they feel a satisfactory number of patches have been implemented before downloading anything.

    3dfx doesn't have any real competition for their Voodoo line (ATI? I said 'real.') so they can afford a brief delay to patch their hardware. For them, it's cheaper than customer support.

  • by barleyguy ( 64202 ) on Thursday May 25, 2000 @01:39PM (#1047599)
    3dfx's FSAA is superior to the GeForce's. The way the GeForce does anti-aliasing is to render at a higher resolution and interpolate back down. It cuts your performance immensely, and only works with games that support the higher resolution.

    The way 3dfx does FSAA is to render slight variations at the same resolution, and then average them. It is better quality, and works with more games. It is also faster, in the case of 2x. This also allows for 4x FSAA, which is even better quality.

    I will probably go with 3dfx for my next video card, because I use a multimedia projector for gaming, which has the advantage of size, but the disadvantage of lower resolution (800x600). FSAA is great for this type of application.

    If you use a high-resolution monitor, though, you can just set your resolution to 1280x1024 and turn off FSAA. Aliasing isn't noticeable at high resolutions. In this case, I'd probably go for the GeForce 2.
  • by LordNimon ( 85072 ) on Thursday May 25, 2000 @01:42PM (#1047601)
    Voodoo 5 isn't much to get excited about.

    It is if you are a Macintosh user. The Voodoo 4 and 5 boards blow away anything else available for the Mac. Granted, some Macs can't take the boards because they don't have any free PCI slots (a big problem in the Mac world), but I have a PowerMac 8600/300 for which the Voodoo 5 5500 PCI is perfect. I'm more than happy to pay the $350 for that card.

  • nt 1 = "NT 3.51", nt 2 = "NT4", nt 3 = "W2000", so nt 4 = ?

    Nope.

    The first version of Windows NT was released as Windows NT 3.1. Then came 3.5 and 3.51. Then we had NT 4.

    Now we have Windows 2000.

  • .. than the nVidia cards. Sure, you can get nVidia's Quadros, but that is still pretty mid-range.

    Think about a big 8-or-more-CPU workstation from Sun with a rack-mounted 3dfx graphics system from Quantum. It may not sell in large quantities, but the margins sure are enormous on these beasts.

    Perhaps 3dfx should realize that their current offerings cannot compete with nVidia's in the mid-range market, keep the Voodoo4 and 5 as low-end, ditch the Voodoo6, and concentrate on really large graphical systems with 10+ VSA chips. That's Voodoo Scalable Architecture, all right.

    I hope 3dfx can regain some of their momentum, as I have a rather nostalgic feeling about them. nVidia's offerings are _much_ better though, and ATI is looking good. If the VSA line had reached product half a year ago, it might have been a success in the home and OEM market. Now it'll probably still stay afloat in the low-end market, but get killed in the mid-range.
  • "...nVidia in the GeForce GTS at least uses a form of antialiasing similar to Photoshop's bicubic filtering which is computationally FAR more expensive than 3Dfx's pseudo-nearest-neighbor anti-aliasing approach..."

    Thresh's Firingsquad recently performed a side-by-side visual-quality comparison of V5 vs. GeForce GTS FSAA [firingsquad.com]. According to the testers, the Voodoo5 had the best picture quality in 4x mode, while the GeForce was better than the V5's 2x mode in some games. Both cards seemed to have a few glitches in FSAA mode: the V5 had a "bleeding" problem at 1600x1200, while the GeForce's FSAA wouldn't work with D3D games.

    From their conclusion [firingsquad.com]:
    "The results from this set of tests were considerably different from that of last time. Seeing the games in motion side by side is truly the only way to compare the two cards. 2x FSAA comparisons yielded mixed results. The quality difference between the two cards was exceedingly close. We tended to like the GF2 FSAA when compared to the Voodoo 2x. However, if we take into account performance figures, the Voodoo is the clear choice. With 2x FSAA, the Voodoo performs considerably better than the GeForce2 FSAA. When it comes to 4x, the Voodoo has no competition in terms of FSAA quality."
  • "Did you forget Nvidia?"

    Mr. Bughunter was talking about the Mac market, where nVidia does not currently have any options. nVidia has said they may offer Mac cards in the future, but nVidia has already stated that the NV15 (GeForce 2) will not be one of them -- so Apple users will probably have to wait until at least early 2001.
  • Rendering at a higher resolution and interpolating down would be the same speed as 3dfx's 4xFSAA. 3dfx's implementation cuts performance quite a bit as well.

    I find it sort of odd that nVidia's method, as you describe it, does not produce the exact same image. It is essentially the same operation. 3dfx renders the same frame 4 times, with pixel offsets of (0,0), (0,0.5), (0.5,0), and (0.5,0.5). After averaging, that should look the same as just rendering at a higher res and then averaging every 4 pixels into one, right? What am I missing here? Can you point me to a comparison, preferably with screenshots?
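    A quick numpy sketch (a toy scene and offsets of my own choosing, not anything either vendor actually ships) suggests the two methods should indeed match whenever the sample grids line up:

```python
import numpy as np

def scene(x, y):
    # Toy "scene": a diagonal edge, the classic aliasing case.
    return (x + y < 8.0).astype(float)

W = H = 8  # output resolution

def render(w, h, dx=0.0, dy=0.0):
    # One sample per pixel, at the (possibly offset) pixel center.
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    return scene((xs + 0.5 + dx) * W / w, (ys + 0.5 + dy) * H / h)

# "Render high, filter down": 2x resolution, average each 2x2 block.
hi = render(2 * W, 2 * H)
downsampled = hi.reshape(H, 2, W, 2).mean(axis=(1, 3))

# "Render four offset frames, average them" at the target resolution.
offsets = [(-0.25, -0.25), (0.25, -0.25), (-0.25, 0.25), (0.25, 0.25)]
accumulated = np.mean([render(W, H, dx, dy) for dx, dy in offsets], axis=0)

# With quarter-pixel offsets the two sample grids coincide exactly,
# so the images are identical, which is the parent's point.
assert np.allclose(downsampled, accumulated)
```

    So any visual difference presumably comes from different sample positions, filtering, or shortcuts, not from the averaging itself.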

    ------

  • I'll risk a point to say this: The Trident 8900, while dead slow, is supported so well it is nearly unbelievable nowadays. Even a few DOS games had support directly for it, and their setup disks had enough Trident 8900-specific customizations to put any other card (including my G200) to shame.

    So why was the above post (-1, redundant)?
  • Thresh's Firingsquad has an excellent (and lengthy) article [firingsquad.com] in which they test the visual quality of the Voodoo5 vs. the GeForce2 in FSAA mode. A quote from the article explains the basic difference between the two different ways the cards perform FSAA:

    ...3dfx uses a method called RGSS or JGSS (Rotated Grid Super Sampling, Jittered Grid Super Sampling). NVIDIA uses another method called OGSS (Ordered Grid Super Sampling).

    The OG!
    OGSS is exactly what it sounds like. Ordered grid means that the image is processed in an ordered fashion. Cut the screen up into nice little blocks and you have an ordered grid. Now we have super-sampling. This means, in really dumb downed terms; that the picture is processed except with a bit more data in it. Mind you, this is all going on within a pixel, thus creating a much more detailed image. The image that is represented at 640x480 is actually processed with as much detail as would be present in something that has, as an arbitrary number, 1.5 times as much detail. So in order for the GeForce2 to display a scene with FSAA at 640x480 it must do the work required for displaying an image at 960x720 and then some. Other stuff like color blending goes on to smooth out the image also. So all in all a considerable amount is going on to create the effect that FSAA delivers.

    Voodooss
    The 3dfx card does another variation of FSAA called jittered grid super-sampling. JGSS is a derivative of RGSS. RGSS, as opposed to OGSS, takes the image that is going to be represented and processes all data at a slight tilt. Jittered grid has the tilted data set, but it has a randomizing factor thrown in to make it seem more natural. If all the data was rotated at the same angle it wouldn't make too much of a difference in comparison to OGSS. This is because our eyes tend to pick up on patterns relatively easily. The random patterns make sure your eyes don't catch on to what is going on. Following this, all the other color blending and hoo-haa takes place to spruce up the image.
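    For the curious, here is a rough Python sketch of the sample-placement difference the article describes. It is purely illustrative: the real chips use fixed, tuned patterns, and this ignores the grid rotation that the "R" in RGSS refers to:

```python
import random

def ordered_grid(n):
    # OGSS: sub-samples on a regular, axis-aligned grid inside the pixel.
    step = 1.0 / n
    return [((i + 0.5) * step, (j + 0.5) * step)
            for j in range(n) for i in range(n)]

def jittered_grid(n, rng):
    # JGSS (simplified): start from the same grid, then perturb each
    # sample randomly within its own cell, so that regular patterns in
    # the scene don't line up with the sample pattern.
    step = 1.0 / n
    return [((i + rng.random()) * step, (j + rng.random()) * step)
            for j in range(n) for i in range(n)]
```

    For n = 2, `ordered_grid` always yields the same four positions inside the unit pixel, while `jittered_grid` yields four positions that stay in their own quadrant but vary from invocation to invocation.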
  • Why is it that ppl are commending 3dfx for delaying their product for better quality, but when an intel release gets delayed for the same thing, we laugh? ^_^


    ---
  • Bruce,

    I think you are talking very subtle shades of "truly committed to Open Source."

    I disagree. There are people at 3dfx who are working hard to get open source in the door (Joseph & Keith). Maybe management isn't 100% on board, but neither is Matrox's.

    It's a shame that the ONLY company that has released all of its specs and driver code is being slammed here. Matrox always has to be cajoled into releasing their specs (just usable specs, even).

    In all of my talking with 3dfx people, never have they held back on the technical details of the Glide driver or the hardware. (I am working on porting the h3 branch of Glide to the Alpha platform.)

    Please read this message and tell me they aren't working. I just can't agree with your assessment that they are so much the better kind of folks!

    > Thanks for the detailed response. However, simply because we haven't
    > released info yet doesn't mean we aren't working on it as we speak and
    that
    > we don't intend to. Most of the data was never meant for public
    > consumption and much of it needs to be reformatted, completed, etc. for
    > release. Our culture is very engineering driven and so, as you might
    > expect, there are many radical linux heads running around the company
    saying
    > we should make all of our IP public including our our vlsi chip
    designs...
    > =)
    >
    > Anyway, everything you said is well understood here. Well understood.
    > We've been working with Precision Insight to enable fast 3D on linux for
    > quite some time. You will continue to see lots of action in the open
    > source/linux community soon from 3dfx. We intend to be a primary
    > contributor. To date, we've released our 2D specs, our new compression
    > algorithm (fxt1), and more WILL follow. =)
    >
    >
    > Keith

    I'll take a 3dfx or a Matrox over an nVidia or a Yamaha (which I have spent a lot of time talking to about OSS). 3dfx seems to have followed up on their promises of open-source drivers and specs, and has done really well. Please explain exactly what they have failed to provide to developers of open-source code, since they provided the documentation and source code last December.

    Panaflex
  • From what I can tell, this is a full recall now.

    --
  • by MyAss ( 144952 ) on Thursday May 25, 2000 @02:50PM (#1047627)
    If someone doesn't produce a competitor to nVidia, it's just a matter of time before nVidia becomes the Microsoft of 3D cards. Or maybe Intel would be a better example.
    Think about it... Because no one (until recently, with the Athlon) could touch Intel in the PC CPU market, they thought they could force consumers to buy what they wanted at their price. Which is why they went with RDRAM (they have a lot of money invested in RDRAM, so if you have to buy RDRAM to use Intel chips, they make even more money). It didn't matter that RDRAM wasn't as good as promised and that it cost tons of money; you had to buy it because that was all that the newer Pentiums could use. Then came AMD and the Athlon, which forced Intel to stop being lazy, lower prices, create the MTH, and push the Coppermine CPU out earlier. So what is the moral of the story?

    Competition is good, dammit! I hate it when people [slashdot.org] who own nVidia chips get all happy when a competitor stumbles. Don't you understand that competition keeps prices down and increases innovation? Look at Microsoft: no competition is why they can charge ridiculous prices for their shit OS. If 3dfx comes out with a card that is just as fast as or faster than the GeForce2, then prices will be even better.

    Moral? There is no reason to be happy that 3dfx is having trouble, even if you are a stalwart nVidia fan (unless, of course, you own stock in nVidia). Less competition just means higher prices and slower, less innovative hardware releases later.

  • by stripes ( 3681 ) on Thursday May 25, 2000 @03:04PM (#1047629) Homepage Journal
    Rendering at a higher resolution and interpolating down would be the same speed as 3dfx's 4xFSAA.

    That need not be the case. If you render the pixel four times and average on the fly, there is no need to store those four pixels to memory and then read them back. Rendering at 2x (four intermediate pixels per output pixel), you need to write pixel values five times, read them four times, and dedicate memory to the intermediate image (memory that is then unavailable for textures or on-card vertex lists).

    So if any portion of your performance is limited by memory bandwidth (or availability) on the 3D card, the 3dfx method will be faster. If not, they should be, as you suggested, pretty much the same (all else being equal).

    That's not saying that's what actually happens, but it is quite possible.
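    That bookkeeping, written out (per output pixel, framebuffer traffic only; texture reads, Z traffic, and overdraw are all ignored, and this is the poster's model rather than measured hardware behavior):

```python
SAMPLES = 4  # 2x2 FSAA

def supersample_traffic(samples=SAMPLES):
    # Render the sub-pixels to an intermediate buffer in memory, then
    # read them back and write the filtered output pixel.
    writes = samples + 1   # 4 sub-pixels plus 1 final pixel
    reads = samples        # read the 4 sub-pixels back to filter them
    return writes + reads

def on_chip_accumulate_traffic():
    # Average the samples before they ever leave the chip: one write.
    return 1

print(supersample_traffic())         # 9 framebuffer accesses
print(on_chip_accumulate_traffic())  # 1
```

    Nine framebuffer accesses versus one is why the on-chip route wins whenever memory bandwidth is the bottleneck.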

    3dfx's implementation cuts performance quite a bit as well.

    Could be.

    I find it sort of odd that nVidia's method, as you describe it, does not produce the exact same image. It is essentially the same operation. 3dfx renders the same frame 4 times, with pixel offsets of (0,0), (0,0.5), (0.5,0), and (0.5,0.5). After averaging, that should look the same as just rendering at a higher res and then averaging every 4 pixels into one, right? What am I missing here? Can you point me to a comparison, preferably with screenshots?

    Beats me. Either could be taking shortcuts which change the resulting image. It is also possible, though unlikely, that the FSAA actually takes more samples where the originals aren't "close enough" in color space. Many software 3D renderers (like POV-Ray) can do that. It can be fairly expensive in terms of runtime (depending on how you define "close enough", and how busy the image is), but can produce some stunning results.

    If they don't do that, I expect some future 3D card will.

  • 800 Mpixels/s with two texels per pipeline. Since almost every game uses multitexturing, that means 1600 Mtexels per second, compared to the V5 6000's 1200.

    Quake 3 is a T&L game, and depending on the map you play, T&L can make a big difference. Many more T&L games will be out in a few months. And I'm a game developer, so I sure as hell am seeing serious improvement NOW on my GF2 with T&L. Thank you.
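    The fillrate arithmetic above, spelled out (vendor-quoted peak numbers, not benchmarks; other posts in this thread quote slightly different figures for the V5 6000):

```python
# GeForce2 GTS: 4 pixel pipelines at 200 MHz, 2 texture units per pipe.
gts_pixel_rate = 4 * 200             # 800 Mpixels/s
gts_texel_rate = gts_pixel_rate * 2  # 1600 Mtexels/s with multitexturing

v5_6000_texel_rate = 1200            # Mtexels/s, as quoted in the parent

assert gts_texel_rate == 1600
assert gts_texel_rate > v5_6000_texel_rate
```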

    ------

  • I'm really getting sick of all this crap. Really I am. You are all posting like the foolish 16 to 18 year olds I LAN party with (I'm 26 myself). What am I talking about you ask?

    "3DFX is dead! Long live nVidia!"
    "3DFX was once good."
    "3DFX sux!"
    etc...

    I myself have bought 3DFX cards only in the past. I bought my Voodoo2 12MB for $300 just two years and three months ago, the first week it was out. I bought my Voodoo3 3000 OEM just over a year ago for $125. I got that great price as part of a 20-pack that my friends all chipped in to order. Remember, those were $175 and up when they were brand new, so $125 was a great deal. Just how often should you have to upgrade your freakin' graphics card? I cringe at once a year, but I love games and want performance. I look at nVidia and see that they have this "aggressive" release schedule to try and get a new product out every 6 months. BFD!!! Who can afford that crap? Not many. You'll have to skip a generation or two between your upgrades. Then what? You suck, right? Because you don't have the very best card available you must suck. You don't think so? Sure you do.

    Now I'm not saying that nVidia doesn't have a great card now. They do. But my Voodoo3 3000 is still a kick-ass card. It looks beautiful to me. Of course I don't get 120fps, but I like some eye candy. And speaking of eye candy... yeah, I know it's Glide... but... I've seen Unreal Tournament on a GeForce that had to run D3D. It looks like SHIT!!! I'll take my proprietary Glide over that crap any day for Unreal Tournament.

    Another thing that bugs me is that everything seems to revolve around the innovations of nVidia. I'm sorry, but last I checked 3DFX had some pretty serious new features. Oh, wait! That's more of that eye candy that prevents us from getting 600fps. And nVidia has T&L. 3DFX doesn't. So... they said they didn't think it was that big of a deal. That's a D3D thing, if I remember correctly. And you all dis on 3DFX for not being as OpenGL-friendly as they should be. Then why even accept D3D as an option? D3D looks like shit. Back to the new features. I think that I'd like to see newer things added in, like motion blur, focus on what's in view, soft reflections, etc. What's wrong with trying to be more cinematic in games? I'm not saying that these new features will be as good as they hope they'll be (games will have to incorporate them, I think), but it's a goal. These guys I play Q3 with turn off all the eye candy on their GeForce cards to get more fps. Looks like Q1 now. Funny.

    And to all of you that think that nVidia is king of the hill and should crush the others, then get ready for lack of innovation and high prices. Competition is needed everywhere. Shall we get rid of AMD and just have Intel? And those damn long distance providers... let's just go back to AT&T. No, not at all... you need to be hoping that companies stay in place and keep trying to one-up each other.

    And quit rooting for a damn card manufacturer like it's a commercial sports team! It's sick! It's pathetic! Go Broncos! Seattle Sucks! Indians Rule! Go Green Bay! I hate sports. I'm not playing. It makes no difference to me who beat who. I think that those who root for teams (especially those 1500 miles away) think they are part of the team or something. "We" won the game? Who the fuck is "we"? They won, not you. Same goes for graphics cards. So you've got a GeForce and nVidia is on top. What does that make you? King of the video card industry? Part of the team? Step off, fool!

    The bottom line for me is a wait and see. I'll save my money for a while and wait and see how good these cards turn out to be. My money doesn't come easy. The next card I buy is going to have to push 2 years before the next upgrade. Does that seem unreasonable? I don't think that it is. I am fully open to buying any brand of card, but I want a good card. I don't buy into hype or even reviews because they are not all encompassing. And you would all do well to remember that.

    Forgive my rant, but I am just tired of the whole thing.

  • "Why is it that ppl are commending 3dfx for delaying their product for better quality, but when an intel release gets delayed for the same thing, we laugh?"

    Probably because Intel screwed up while they were trying to shove Rambus down everybody's throats, and the attempt backfired.
  • "...If someone doesn't produce a competitor to nvidia it just a matter of time before nvidia becomes the Microsoft [of] 3d card."

    nVidia's not going to become a Microsoft; they're going to join 'em. As one of the stronger supporters of D3D in its earlier days, nVidia's been sucking up since Day 1. Just recently, Microsoft awarded nVidia a contract to develop the X-box graphics chip, together with a $200 million investment.
  • by Guppy ( 12314 ) on Thursday May 25, 2000 @05:34PM (#1047646)
    3DFX One-Ups nVidia: "Fuck Off and Die" [somethingawful.com]

    San Francisco, CA - In a formal press release read to major gaming sites, 3DFX has "one upped" the video card war by telling nVidia to "fuck off and die".

    3DFX, known for its revolutionary line of "Voodoo"-based video cards, has been under fire recently for losing its competitive edge to nVidia, whose GeForce 2 line of cards is speculated to be slightly faster than the upcoming Voodoo 5.

    "Quite frankly, I'm sick to death of sidestepping the issues and trying to be 'Mister Nice Guy'. I hate nVidia and every fucking asshole that works there," quipped Brian Burke, 3DFX's PR spokesperson. "I hate their engineers, distributors, advertisers, executives, and janitors. I especially despise Derek Perez, who I formally challenge to a knife fight in the parking lot after this meeting..."


    Anybody who thinks the 3dfx vs. nVidia wars are getting increasingly ridiculous should go read the rest of this article, at SomethingAwful [somethingawful.com].
  • So, for once, a company decides (for whatever reasons: technical, political, financial .. who knows) to withhold a product until it meets higher standards.

    And yet, somehow, we find a way to bash them for this, claiming that they need to "pick up the pace" and that they're "already behind nVidia in the video card wars".


    Well, you must believe everything you see on the TV, right?

    When a company says it's delaying a launch "to meet higher standards", it does not mean that some engineers gathered together and decided that some parts needed additional polishing in order to be just right. What it means is that the company discovered a show-stopping bug, something so awful and horrible and bletcherous and unpatchable that even the marketing people agreed to postpone the launch. This, in turn, generally indicates that the product in question was rushed and suffers from the MOMOWFIL (Move On, Move On, We'll Fix It Later) syndrome which is not a good thing.

    In any case, there are ample reasons to suspect that 3dfx is already dead and we are now witnessing the last convulsions of a corpse...


    Kaa
  • by LordNimon ( 85072 ) on Thursday May 25, 2000 @07:08PM (#1047653)
    My 8600 is an established system with lots of memory, several SCSI devices, and a huge SCSI hard drive. I couldn't just get a new G4, I'd have to spend $1000+ in upgrades to make it as capable. And even if I did get one, I'd still keep the 8600 as a backup computer (our business runs on Mac technology). Besides, I could always transfer the Voodoo 5 to my G4 and use it in a dual-monitor setup.

    One thing I forgot to mention in my original post is that there is a petition [xlr8yourmac.com] that asks Apple to include AGP Voodoo 4/5 cards as a build-to-order (BTO) option for G4's. If you're a Mac user, I strongly suggest you sign the petition. We all know that ATI's monopoly on Mac video doesn't encourage them to make good drivers.

  • rock on! my trident 8900cl is still alive and kickin! the chip says (C) 1989 so it's been around for 11 years and it's still working just like the first day when i brought it home.

    but nowadays video cards go obsolete so fast; my g200 is only 2 years old and it's already considered a piece of junk...

    sigh, they just don't make video cards like they used to (=


    Zetetic
    Seeking; proceeding by inquiry.

    Elench
    A specious but fallacious argument; a sophism.

  • ATI cards are nice (for 2D only), the ATI Xpert types for example. Sure, they suck for any type of gaming, but they do have strong points. For example, this ATI Xpert 98 (8 megs) really kicks ass (for everything but games). It is well supported under X (I have got it to work under both Linux and OpenBSD), works great under SVGAlib (Linux only), is supported under BeOS, and works well under Windows 9x (haven't tested NT).

    In its day it ran about $80; you could probably pick one up now for about $50, I'd bet. Nice cards. It has worked under every environment I have thrown at it (except Quake 2, which sucks at anything over 640x480 but is playable at 640x480; Doom and Age of Empires look nice under it; Heretic 2 is EXTREMELY slow even with all the eye candy turned off).

"Protozoa are small, and bacteria are small, but viruses are smaller than the both put together."

Working...