Ant writes:
"Yahoo
posted the press release that 3dfx Interactive® Inc. has temporarily delayed the release of its Voodoo5(TM) 5500 AGP. The press release states that the company is taking this action to ensure that it meets its own high standards for product quality."
voodoo (Score:1)
So (Score:1)
Long live nVidia! (Score:1)
ARGH! (Score:1)
Dammit, I WANT to believe in 3dfx. They support Linux. For a long time they were king-of-the-hill!
But now, even with the GeForce2 out, they have to delay a card that requires me to hook up a hard-drive power connector to power it?!?
Dammit, if nVidia would just open up their drivers a little bit more, I wouldn't mourn for these guys, but ARGH!
Possible technical reasons (Score:1)
Interestingly enough, this corresponds pretty closely with rumors of Nintendo delaying Dolphin (their new game system). Although the rumors are not confirmed, it could be that Nintendo and 3dfx are running into similar technical challenges and hitting the same kinks. Hrm.
Just speculation...
Is it really needed? (Score:1)
Q3A at 170fps is meaningless.
High standards? (Score:2)
3dfx SLI (Score:1)
Too fast... (Score:1)
I doubt that there will be any real reason to upgrade to one of these new cards for some time tho'.
They are taking preorders.. (Score:1)
Shouldn't this be termed... (Score:3)
Makes Me Think (Score:1)
I mean, I've been sitting back watching the 3dfx vs. nVidia bitchfest for a while now, but I guess I'm a cheat because I have a TNT2 with closed drivers for Q2/Q3 and two V2s in SLI for all my Wine/3dfx games.
Windows drivers much Linux? (Score:2)
would be nice... (Score:1)
Re:So (Score:1)
From buy.com's own Shipping Definitions:
Back Order
This product is not currently in stock. We have orders placed with our supplier but have not received a date yet as to when we will receive this product. We will make every effort to fill your order as quickly as possible and to keep you updated on our progress.
That would be why.
Is it gonna work with AMD? (Score:1)
5500 ? (Score:1)
Let's.... applaud them!!! (Score:5)
I hope sooner or later more and more companies (and shareholders, analysts, etc.) will begin to realize that if you delay your product by two months, NOTHING bad is going to happen, but if you ship crap, it's not going to improve your customer relationships.
I suggest we write polite and GRATEFUL e-mails to 3dfx THANKING them for caring about quality and expressing support - who else if not
3dfx... scary (Score:2)
And if that isn't scary enough, Quantum 3D is building 8, 16, and 32 (THIRTY FRICKIN TWO) processor... well, "cards" isn't the right term: they are external rack mountable boxes. It uses 1600 WATTS.
URL: http://www.quantum3d.com/product%20pages/aalchemy
Thanks shugashack [shugashack.com]
Re:Shouldn't this be termed... (Score:1)
Re:Is it really needed? (Score:1)
Maybe it's not such a bad thing... (Score:1)
Re: (Score:1)
Re:steve woston forever (Score:1)
The real Steve Woston's site is here [mnc.co.za].
Steve is genuinely upset that he's being impersonated on Slashdot.
Re:voodoo (Score:1)
One Microsoft Way
Re:Possible technical reasons (Score:1)
If it is a heat dissipation problem then I can see the modifications already:
1. Seat your VooDoo5 in the AGP slot
2. Connect the monitor to the DB15 socket
3. Connect your garden hose to the 1/2" NPT nozzle next to the DB15. Note: Operator is responsible for providing adequate drainage.
buy.com... (Score:1)
and 200 W power consumption (Score:1)
--
Only here at Slashdot ... (Score:3)
So, for once, a company decides (for whatever reasons: technical, political, financial) to hold a product back until it's actually ready.
And yet, somehow, we find a way to bash them for this, claiming that they need to "pick up the pace" and that they're "already behind nVidia in the video card wars".
Let's cut them some slack, and not judge until we have a final product in our hands. I'm telling you, this is the only forum in the world where we can badmouth companies no matter WHICH choice they make. =)
News Release.... (Score:1)
Re:Too fast... (Score:1)
One Microsoft Way
I've always wondered (Score:1)
I used to be a big 3Dfx fan... (Score:4)
But times have changed. The Voodoo 5 isn't much to get excited about. The guy who sits next to me at work has one in his machine to check compatibility with our product. When the GeForces first arrived in the office, there was a bit of "Hey! I want that in my machine!" going on, but when the Voodoo 5 arrived, no one even really wanted to install the thing.
Actual quote from a coworker: "OK, I tell you what. You run up the demo, and I'll see how big my yawn is."(1)
Why the lack of interest? Well, what's to get excited about? Sure, it's fast, but, as someone else pointed out, no games max out the card's speed, because they still need to run on slow-ass machines without crawling, and as you start to add scaling functions you start to add overhead -- remember, the CPU still has plenty of work to do.
Now, the GeForce (and IIRC ATI's new card, the Radeon) has hardware to take some of the geometry strain off the CPU. Plus, newer cards are adding sexy new stuff such as cute pixel-shader features. When you get down to it, these sorts of things are far more interesting than raw fill rate / card tri rate, which is all the 3Dfx cards actually offer.
This is because when you get down to it, texture-mapped triangles are not very interesting. Sure, they make a good building block, but there are things you just can't realistically represent that way, unless you generate textures, texture coordinates, and do interesting things with them, on the fly.
It's with tricks like these that we can improve lighting models, reflectivity effects, and volumetric effects, to bring the otherwise rather flat, plastic world nearer to the more realistic and impressive world of raytracing, but in real time. Sure, you use cheap hacks, but at 60fps, no one notices... ;-) And then the gamers are happy, and we game developers are happy too :)
(1) the yawn was medium-sized, by the way. Their full-screen anti-aliasing is quite good quality. Nothing else startling to look at though.
The REAL reason for the delay... (Score:2)
Engineers were later heard to report that the card in question was still functional and "played Quake 3 Arena 47% gorier than before".
Boy it's a long day at work today...
Re:Windows drivers much Linux? (Score:3)
3dfx does have a high-performance OpenGL implementation. They put a reasonable amount of manpower into it. I think this is showing you the performance improvement you can get by putting real resources behind the project.
The nVidia situation is interesting. In that case, they are using essentially the same code base between Linux and Windows. The question to ask is why the Linux version is then slower. It could be an OS issue, a compiler issue, a driver issue, or something else entirely. If we close that gap, whatever it is, all the implementations get better. By the way, the 2.3 kernels have been significantly faster, so the 2.4 release may help with the difference.
The argument is that Open Source efforts can do it better, but you have to qualify that a bit. What defines better? In some cases we're not concentrating on the same focus. For example, Mesa tries very hard to be a complete and conforming version of OpenGL. In some cases that may mean losing some performance compared to tweaking of Q3A at the expense of everything else. Some of the security and stability fixes in the MGA DRI code mean we lose a bit of performance. The 3dfx in-a-window implementation under Windows is quite a bit slower than their full screen mode.
You also have to compare manpower. 3dfx and ATI put a lot of people to work on their drivers. They are each paying PI for one engineer. That limits not only how fast the drivers can be produced but how good they will be. We're also really not seeing much help from the community. We'd love to have more people contribute.
The 3dfx driver has remained mostly unchanged (except for bug fixes) for the last year. That's because I've been paid to get it running on different boards (V3/V4/V5), to fix some bugs, and to improve the infrastructure (DRI). We really haven't had the resources to spend on optimizations or on rewriting some of the really ugly parts of the code.
The bottom line is that there is room to improve. I have no doubt that with the right attention all the drivers would be very close to their windows counterparts. We just need some good people to do the work.
One Microsoft Way
What to get for PCI (Score:1)
Are there any other feasible options?
OT: Hockey (Score:1)
This brings up a question I've always wanted to post on Ask Slashdot...what hockey team is the favorite among geeks? An argument could be made for any of the following:
Pittsburgh - mascot is the Penguin. 'nuff said.
Ottawa - play in the Corel Center. Possibly the only Linux ad you'll see in all of sports.
New Jersey - their logo brings to mind the BSD daemon.
San Jose - the Mecca of geekdom.
Anybody got any more?
Re:3dfx... scary (Score:1)
With all the geeks running really ridiculous stuff in their residences, I think everything should be engineered to Class B standards (i.e. low interference).
Re:Slightly offtopic (Score:1)
Spooner
a TNT1 or a Matrox G200 (Score:1)
--
MS syndrome (Score:2)
I got the Voodoo 3 3500 AGP when it came out, and while it runs Glide games like it should, there's still no OpenGL support. (No, I mean FULL support... for Windows.) And maybe I'll get windowed Glide rendering one day... yeah, right.
The video capture sucks royally too. My ($300+ CDN) 3500 on an AGP bus, in my P3-450 with 128MB of RAM (and AGP aperture set properly,) barely manages to get 20 frames per second at 320x240 truecolor, set it any higher and it chops like crazy.
Funny that an $80 CDN PCI Hauppauge WinTV tuner card in my... Cyrix 200MX with 32MB RAM (before the CPU wore out,) was able to capture 30fps at the same quality settings!
Also, I'm sure many of you have heard of the "DVD acceleration." This is... a hardware video overlay. That's it. Luckily I didn't trust them and got a full hardware decoder card anyhow.
Personally, while you can count on game companies supporting 3dfx, I won't expect quality until I see it from them.
(BTW, when the card works for 3d, it's awesome, it just sorely lacks in some major points that you should expect from any decent manufacturer.)
what about competition? (Score:1)
Re:3dfx... scary (Score:1)
I forgot to mention that the V5 6000 (4 chips) is going to be $600, so it is questionable whether that could be considered for home use... no way in hell am I putting something that puts out that much heat and noise (from the fans) in my computer. I can barely stand the noise coming from my computer as it is, from the one fan on my TNT2, the Celeron, two hard drives, and the power supply (although the power supply I have is very quiet).
Re:I've always wondered (Score:1)
Re:So (Score:1)
Maybe I shouldn't troll while I'm logged in...
yes, but the Voodoo4 is so useless... (Score:1)
Re:3dfx SLI - Voodoo 5000 & Voodoo 6000 (Score:1)
Seems nVidia will strengthen their lead with another 3dfx delay (cool, cuz I like nVidia, bad because competition drives prices down and performance up...though in many ways nVidia acts as competition to itself by releasing new products on a 6 month cycle.)
Matrox (Score:1)
Re:a TNT1 or a Matrox G200 (Score:2)
He said work WELL, not leak 5 megs of ram every time you open/close an XMMS OpenGL plugin.
The nVidia drivers are crap, and they're really slow coming up to speed. I own a TNT1 and I wouldn't recommend buying one if your main platform is X. Go with a Matrox/3Dfx card.
-- iCEBaLM
Re:I used to be a big 3Dfx fan... (Score:1)
The only thing that 3dfx has over the competition is their *special* features : t-buffer, fsaa....
Unfortunately, as said above, the FSAA is only OK, not on the killer-app level that would make people give up raw fill.
Re:5500 ? (Score:1)
Re:3dfx (Score:1)
Uh, more like go NVidia GeForce 2. Has anyone actually got their hands on a Radeon yet? GeForce2 is almost shipping AFAIK.
3dfx sucks. $700 for the top-of-the-line Voodoo 5, which will just barely beat out the GeForce 2? Give me a break.
umm, the v5 *is* crap. (Score:2)
So they get the Ion Storm award, not the Blizzard award.
this is whats new. (Score:1)
Re:Let's.... applaud them!!! (Score:1)
Whoah... (Score:2)
Girlfriend: Hey Dman33, what is that smell in the computer room?
Dman33: Oh, the cat.
Girlfriend: No, not that type of smell, this is like burnt toast.
Dman33: Yeah, like I said, it is the cat.
Girlfriend: ?
Dman33: Damn thing crawled between my dual 1600W 32-processor video cards...must have made contact or something, little critter got fragged big-time!
Re:what about competition? (Score:1)
The difference between Microsoft and nVidia is that nVidia is tops because they're better than a wide range of other video card makers, while Microsoft is tops because Windows is (marginally) better than no operating system at all (don't argue that you can get Linux - Joe Shmoe couldn't install Linux if his life depended on it).
Don't mistake market dominance due to a quality product with market dominance due to monopolistic practices.
Re:OT: Hockey (Score:1)
To your second (favorite team among geeks), a good candidate might be Washington. The owner there (a former AOL guy, I think) gave all the players lap tops, and everybody involved with the team participates in forums via their website. Their coach, Ron Wilson, is an admitted internet junkie.
My personal fave, of course, is the Red Wings. Been a fan since the early 80's, so it's been a great ride.
Re:News Release...Field Failures! (Score:1)
But now, the roles are reversed... (Score:2)
The weird thing is that 3Dfx used to be the king of performance, evangelizing fill rate and frame rate over visual quality, and nVidia argued that visual quality was more important than fill rates. That's the essence of the whole argument that broke out when the TNT and TNT2 had 32-bit color but the Voodoo 3's had only 16-bit color but more speed. Now the roles are reversed. I feel like Alice, through the looking glass...
But unfortunately I fear that 3Dfx's superior image quality is just a fluke, and that they're touting it now because it's 3Dfx's only advantage. Remember that the Voodoo 4 and 5 were supposed to be out by last Christmas, before their design and fab difficulties, so effectively they're now a product cycle behind nVidia. If 3Dfx fails to treat the current emerging lineup as an "interim" line of products, and doesn't bust its collective ass to get another and vastly superior product cycle out the door before this Christmas, it will go down faster than a freshman at a frat party. Goodbye, 3Dfx.
I hope this doesn't happen, because I have respect for what 3Dfx did to advance 3D on the PC, and I'd hate to see yet another graphics company go bust, but at this point it looks grim for 3Dfx. The top of their current emerging lineup has superior image quality, perhaps the best in 3D right now, but at framerates which can't be considered more than just "passable." Their Voodoo 5 6000 is an utter joke, if and when it finally gets released; it'll cost twice as much as a GeForce 2, but definitely won't double the performance, and will require 4 chips and an external power connector. Yes, the very thought of such a massive, impressive piece of hardware makes me want to jam my slot 1 into a tight little socket 370, so to speak [nudge-nudge, wink-wink], but the card isn't financially sound since we know nVidia's next product cycle will probably surpass it.
But, anyway, it is interesting how the roles have reversed, and 3Dfx's visual quality is now their selling point while nVidia's raw performance is now their selling point. My poor ATI A-i-W 128 feels so...inadequate... I need some video card Viagra...
Re:umm, the v5 *is* crap. (Score:1)
just addressing two things (Score:3)
Praise them, they are doing the right thing.
Unfortunately, the Voodoo 5 sucks badly. The GeForce 2 rips it up in the power consumption, heat, feature, and sheer power fields.
I don't really think it's entirely 3dfx's fault; maybe they lost some good talent or something. I mean, they haven't done *anything* that was top of the line since the Voodoo2. They ruled back in those days. Then there was the Banshee. Then the Voodoo3. Both of which were lame.
Now we have the V4 (which is a joke), and the V5 (which is a bigger more expensive joke).
So, let's applaud them for their policy, and slam them for their technology. At least then we are doing it right.
3DFX RIP (Score:1)
Re:I've always wondered (Score:1)
Scratch that. Maybe Marketers do not link #4
Tom
Re:Is it really needed? (Score:1)
The latest Ultima, "Ultima CXVII - Lord British Teleports Away", will max out just about anything available if you set the clipping planes way back and run at the highest LOD.
Re:3dfx (Score:2)
The "low end" VooDoo5 was beating the GeForce2 at high res. If you play games at 640x480, I feel sorry for you... sniper fodder.
My "old" VooDoo3 3500 will pull 100+ fps in Glide (1024x768). I only expect the new card to let me do the same with full-screen AA, and probably at a higher res.
The GeForce2 is a nice card - there is no arguing that. But for the price of a DDR GeForce I can get a card that will outperform the GeForce, outperform the GeForce2 at high res, support all my Glide games (Glide wrappers don't cut it), and give me open source Linux support. Case closed as far as I'm concerned.
SQ
Quality, and... (Score:3)
I wonder how much of this was a quality concern, and how much was the sudden realization that the nVidia GeForce 2 is faster, cheaper, available NOW, and doesn't require a freaking AC adapter to be plugged into its rear end?
From what I've read, the V5 6000 is the only card from 3dfx that has any chance of beating the GF2, but only in a few select situations, such as running older games at super-high-res with 4xFSAA. And then it is only a little bit faster. (yes, the GeForce 2 does FSAA.) Add to that the fact that the 6000 will cost US$600 (when it finally comes out) as opposed to the GF2's current price of $300, and you have a sorry situation indeed...
------
Wrong (Score:1)
Re:Slightly offtopic (Score:1)
Re:MS syndrome (Score:1)
don't forget other Windez versions! (Score:2)
Gee, nice 'lameass' filter, I had to make all those lower case. Thanks, Taco.
Pope
Freedom is Slavery! Ignorance is Strength! Monopolies offer Choice!
Re:Slightly offtopic (Score:2)
Some numbers to go with that. (Score:2)
BTW, "128MB" on a V5 6000 is no better than 32 on a GeForce 2 due to the multiprocessor design. But the GF2 can have up to 128MB, which would be like a V5 with 512MB on-board. heh.
------
Re:Let's.... applaud them!!! (Score:2)
Problem does not affect Voodoo4 cards... (Score:2)
The V4 (which has not yet shipped) is basically meant to be a rival for the TNT2 Ultra. According to Anandtech's 3dfx Voodoo4 4500 & Voodoo5 5500 Preview [anandtech.com], the V4 (with beta drivers) does pretty well against the TNT2 Ultra. It's a little slower at 640x480, a little faster at higher resolutions. At the same time, it offers some extra goodies like 2x FSAA, improved 16-bit color quality, and (possibly) a lower price.
Re:Let's.... applaud them!!! (Score:2)
At the very least this suggests unrealistically tight testing schedules at 3DFX, at worst - well how does Voodoogate sound?
Re:Let's.... applaud them!!! (Score:2)
You use this hoary old thing called "surface mail" to upload your bad hardware and download a replacement.
Case of Note: The companies making PowerPC accelerator cards, a market where there is a lot of competition, use this approach quite frequently with new models. Check any Mac Hardware discussion area, MacFixit, xlr8yourmac, or DealMac, and you will find that for every new release of an accelerator card, there are customers who find problems that eventually become "patched" at the factory, and customers who specifically wait until they feel a satisfactory number of patches have been implemented before downloading anything.
3dfx doesn't have any real competition for their Voodoo line (ATI? I said 'real.') so they can afford a brief delay to patch their hardware. For them, it's cheaper than customer support.
Re:I used to be a big 3Dfx fan... (Score:3)
The way 3DFx does FSAA is to render slight variations at the same resolution, and then average them. It is better quality, and works with more games. It is also faster, in the case of 2x. This also allows for 4x FSAA, which is even better quality.
I will probably go with 3dFX for my next video card, because I use a multimedia projector for gaming, which has the advantage of size, but the disadvantage of lower resolution (800x600). FSAA is great for this type of application.
If you use a high-resolution monitor, though, you can just set your resolution to 1280x1024 and turn off FSAA. It's not noticeable at high resolutions. In this case, I'd probably go for the GeForce 2.
Re:I used to be a big 3Dfx fan... (Score:4)
It is if you are a Macintosh user. The Voodoo 4 and 5 boards blow away anything else available for the Mac. Granted, some Macs can't take the boards because they don't have any free PCI slots (a big problem in the Mac world), but I have a PowerMac 8600/300 for which the Voodoo 5 5500 PCI is perfect. I'm more than happy to pay the $350 for that card.
Re:don't forget other Windez versions! (Score:2)
Nope.
The first version of Windows NT was released as Windows NT 3.1. Then came 3.5 and 3.51. Then we had NT 4.
Now we have Windows 2000.
3dfx may survive and thrive in a different market (Score:2)
Sure, you can get the nVidia Quadros, but they're still pretty mid-range. Think about a big 8+ CPU workstation from Sun with a rack-mounted 3dfx graphics system from Quantum. It may not sell in large quantities, but the margins sure are enormous on these beasts.
Perhaps 3dfx should realize that their current offerings cannot compete with nVidia's in the mid-range market, keep the Voodoo4 and 5 as low-end, ditch the Voodoo6, and concentrate on really large graphical systems with 10+ VSA chips. That's Voodoo Scalable Architecture, all right.
I hope 3dfx can regain some of their momentum, as I have a rather nostalgic feeling about them. nVidia's offerings are _much_ better though, and ATI is looking good.
If VSA had reached products half a year ago, they might have been a success in the home and OEM market. Now it'll probably still stay afloat in the low-end market, but get killed in the mid-range.
V5 vs. GeForce: The Coke vs. Pepsi Taste Test... (Score:2)
Thresh's Firingsquad recently performed a side-by-side visual quality comparison of the V5 vs. GeForce GTS FSAA [firingsquad.com]. According to the testers, the Voodoo5 had the best picture quality in 4x mode, while the GeForce was better than the V5's 2x mode in some games. Both cards seemed to have a few glitches in FSAA mode--the V5 had a "bleeding" problem at 1600x1200, while the GeForce's FSAA wouldn't work with D3D games.
From their conclusion [firingsquad.com]:
"The results from this set of tests were considerably different from that of last time. Seeing the games in motion side by side is truly the only way to compare the two cards. 2x FSAA comparisons yielded mixed results. The quality difference between the two cards was exceedingly close. We tended to like the GF2 FSAA when compared to the Voodoo 2x. However, if we take into account performance figures, the Voodoo is the clear choice. With 2x FSAA, the Voodoo performs considerably better than the GeForce2 FSAA. When it comes to 4x, the Voodoo has no competition in terms of FSAA quality."
Re:Let's.... applaud them!!! (Score:2)
Mr. Bughunter was talking about the Mac market, where nVidia does not currently have any options. nVidia has said they may offer Mac cards in the future, but nVidia has already stated that the NV15 (GeForce 2) will not be one of them -- so Apple users will probably have to wait until at least early 2001.
Re:I used to be a big 3Dfx fan... (Score:2)
Rendering at a higher resolution and interpolating down would be the same speed as 3dfx's 4xFSAA. 3dfx's implementation cuts performance quite a bit as well.
I find it sort of odd that nVidia's method, as you describe it, does not produce the exact same image. It is essentially the same operation. 3dfx renders the same frame 4 times, with pixel offsets of (0,0), (0,0.5), (0.5,0), and (0.5,0.5). After averaging, that should look the same as just rendering at a higher res and then averaging every 4 pixels into one, right? What am I missing here? Can you point me to a comparison, preferably with screenshots?
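For what it's worth, here's a toy sketch (Python with NumPy, using a made-up analytic function as a stand-in for a real rasterizer) of why the two approaches *should* match when the sample positions line up: averaging four quarter-pixel-offset renders hits exactly the same sample points as rendering at 2x and box-filtering each 2x2 block.

```python
import numpy as np

# Stand-in "renderer": samples a smooth scene function at pixel centers.
# Purely illustrative -- no real rasterizer involved.
def render(width, height, offset=(0.0, 0.0), scale=1):
    ys, xs = np.mgrid[0:height * scale, 0:width * scale]
    # Sample positions expressed in output-pixel coordinates.
    x = (xs + 0.5) / scale + offset[0]
    y = (ys + 0.5) / scale + offset[1]
    return np.sin(0.7 * x) * np.cos(1.3 * y)

W, H = 8, 8

# 3dfx-style: four renders at quarter-pixel offsets, then average.
offs = [(-0.25, -0.25), (0.25, -0.25), (-0.25, 0.25), (0.25, 0.25)]
fsaa = sum(render(W, H, off) for off in offs) / 4.0

# "Render at 2x, then box-filter each 2x2 block down" approach.
hires = render(W, H, scale=2)
downsampled = hires.reshape(H, 2, W, 2).mean(axis=(1, 3))

# For a pointwise scene function with matching sample positions,
# the two images come out identical.
assert np.allclose(fsaa, downsampled)
```

Note that with the half-pixel (0.5) offsets described above, rather than the ±0.25 used here, the two schemes sample *different* positions, which by itself could account for small differences between the resulting images.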
------
Defend your moderation (Score:2)
So why was the above post (-1, redundant)?
Different methods of performing FSAA (Score:2)
The OG!
OGSS is exactly what it sounds like. Ordered grid means that the image is processed in an ordered fashion: cut the screen up into nice little blocks and you have an ordered grid. Then there's super-sampling. In really dumbed-down terms, this means the picture is processed with a bit more data in it. Mind you, this is all going on within a pixel, thus creating a much more detailed image. The image that is displayed at 640x480 is actually processed with as much detail as would be present in something that has, as an arbitrary number, 1.5 times as much detail. So in order for the GeForce2 to display a scene with FSAA at 640x480, it must do the work required for displaying an image at 960x720 and then some. Other stuff like color blending goes on to smooth out the image as well. So all in all, a considerable amount is going on to create the effect that FSAA delivers.
Voodooss
The 3dfx card does another variation of FSAA called jittered grid super-sampling. JGSS is a derivative of RGSS. RGSS, as opposed to OGSS, takes the image that is going to be represented and processes all data at a slight tilt. Jittered grid has the tilted data set, but it has a randomizing factor thrown in to make it seem more natural. If all the data was rotated at the same angle it wouldn't make too much of a difference in comparison to OGSS. This is because our eyes tend to pick up on patterns relatively easily. The random patterns make sure your eyes don't catch on to what is going on. Following this, all the other color blending and hoo-haa takes place to spruce up the image.
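The difference between the two sampling patterns can be sketched in a few lines (the function names and the jitter amount here are made up for illustration; real hardware picks its patterns very differently):

```python
import random

def ogss_offsets(n=2):
    """Ordered grid: sub-pixel sample positions on a regular n x n lattice
    within the unit pixel."""
    step = 1.0 / n
    return [((i + 0.5) * step, (j + 0.5) * step)
            for j in range(n) for i in range(n)]

def jgss_offsets(n=2, jitter=0.2, rng=None):
    """Jittered grid: start from the ordered lattice, then perturb each
    sample randomly so the eye can't lock onto a repeating pattern."""
    rng = rng or random.Random(0)  # seeded for reproducibility
    step = 1.0 / n
    out = []
    for (x, y) in ogss_offsets(n):
        jx = x + rng.uniform(-jitter, jitter) * step
        jy = y + rng.uniform(-jitter, jitter) * step
        # Keep the perturbed sample inside the pixel.
        out.append((min(max(jx, 0.0), 1.0), min(max(jy, 0.0), 1.0)))
    return out
```

The ordered grid always yields the same tidy lattice (for n=2: the four quarter points of the pixel), while the jittered grid breaks that regularity, trading a bit of structure for noise the eye tolerates better.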
Commend delays? (Score:2)
---
Re:ARGH! (Score:2)
I think you are talking about very subtle shades of "truly committed to Open Source."
I disagree. There are people at 3dfx who are working hard to get Open Source in the door (Joseph & Keith). Maybe management isn't 100% on board, but neither is Matrox's.
It's a shame that the ONLY company that has released all specs and driver code is being slammed here. Matrox always has to be cajoled into releasing their specs (just usable specs, even).
In all of my talking with 3dfx people, never have they held back on the technical details of the glide driver or the hardware. (I am working on the h3 branch of glide to the alpha platform).
Please read this message and tell me they aren't working. I just can't agree with your assessment that they are so much the better kind of folks!
> Thanks for the detailed response. However, simply because we haven't
> released info yet doesn't mean we aren't working on it as we speak and
> that we don't intend to. Most of the data was never meant for public
> consumption and much of it needs to be reformatted, completed, etc. for
> release. Our culture is very engineering driven and so, as you might
> expect, there are many radical Linux heads running around the company
> saying we should make all of our IP public, including our VLSI chip
> designs... =)
>
> Anyway, everything you said is well understood here. Well understood.
> We've been working with Precision Insight to enable fast 3D on Linux for
> quite some time. You will continue to see lots of action in the open
> source/Linux community soon from 3dfx. We intend to be a primary
> contributor. To date, we've released our 2D specs, our new compression
> algorithm (FXT1), and more WILL follow. =)
>
> Keith
I'll take a 3dfx or a Matrox over a nVidia or a Yamaha (which I have spent a lot of time talking to about OSS) any day. 3dfx seems to have followed up on their promises of open source drivers and specs, and has done really well. Please explain exactly what they have failed at providing to developers of open source code, since they provided the documentation and source code last December.
Panaflex
Recall (Score:2)
--
Anybody worried about nvidia going Redmond? (Score:4)
Think about it... Because no one (until recently, with the Athlon) could touch Intel in the PC CPU market, they thought they could force consumers to buy what they wanted at their price. Which is why they went with RDRAM (they have a lot of money invested in RDRAM... so if you have to buy RDRAM to use Intel chips, they make even more money). It didn't matter that RDRAM wasn't as good as promised and that it cost tons of money; you had to buy it because that was all the newer Pentiums could use... Then came AMD and the Athlon, which forced Intel to stop being lazy, lower prices, create the MTH, and push the Coppermine CPU out earlier. So what is the moral of the story?
Competition is good, dammit! I hate it when people [slashdot.org] who own nVidia chips get all happy when a competitor stumbles. Don't you understand that competition keeps prices down and increases innovation... Look at Microsoft: no competition is why they can charge ridiculous prices for their shit OS. If 3dfx comes out with a card that is just as fast or faster than the GeForce2, then prices would be even better.
Moral? There is no reason to be happy that 3dfx is having trouble, even if you are a stalwart nVidia fan (unless of course you own stock in nVidia). Less competition just means higher prices and slower, less innovative hardware releases later.
Re:I used to be a big 3Dfx fan... (Score:3)
That need not be the case. If you render the pixel four times and average, there is no need to store those four pixels to memory and then read them back. Rendering at 2x (four intermediate pixels per output pixel), you need to write pixel values five times, read them four times, and have memory dedicated to the intermediate image (and not to textures or on-card vertex lists).
So if any portion of your performance is limited by memory bandwidth (or availability) on the 3D card, the 3Dfx method will be faster. If not, they should be, as you suggested, pretty much the same (all else being equal).
That's not saying that's what actually happens, but it is quite possible.
Could be.
Beats me. Either could be taking shortcuts which change the resulting image. It is also possible, but unlikely, that the FSAA actually takes more samples if the originals aren't "close enough" in color space. Many software 3D renderers (like POV-Ray) can do that. It can be fairly expensive in terms of runtime (depending on how you define "close enough" and how busy the image is), but can produce some stunning results.
If they don't do that, I expect some future 3D card will.
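The traffic comparison above can be put into back-of-the-envelope numbers. A sketch, counting only the anti-aliasing buffer traffic (32-bit pixels, 2x supersampling, no texture fetches counted, all assumptions mine):

```python
def traffic_render_then_filter(width, height, factor=2, bpp=4):
    """Render at factor x into an off-screen buffer, read it back, write
    the filtered result. At factor=2 that is 4+1 = 5 writes and 4 reads
    per output pixel, as described above."""
    hi = (width * factor) * (height * factor)   # intermediate pixels
    final = width * height                      # output pixels
    return (hi + final + hi) * bpp              # writes (hi+final) + reads (hi)

def traffic_accumulate_on_chip(width, height, bpp=4):
    """Average the sub-pixel samples on-chip; only the final pixel ever
    touches card memory."""
    return width * height * bpp

hog = traffic_render_then_filter(1024, 768)    # bytes per frame
lean = traffic_accumulate_on_chip(1024, 768)   # bytes per frame
```

At 1024x768 with 32-bit color, the render-then-filter path moves 9x the bytes of the on-chip path per frame, which is exactly why the memory-bandwidth-limited case favors accumulating on-chip.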
Re:Ehrm, not quite... (Score:2)
800 MPixels/s with two texels per pipeline. Since almost every game uses multitexturing, that means 1600 MTexels per second, compared to the V5 6000's 1200.
Quake 3 is a T&L game, and depending on the map you play, T&L can make a big difference. Many more T&L games will be out in a few months. And I'm a game developer, so I sure as hell am seeing serious improvement NOW on my GF2 with T&L. Thank you.
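The fill-rate arithmetic above amounts to a single multiplication; a sketch (the 800/1600 figures are the poster's, the helper name is made up):

```python
def mtexels_per_sec(mpixels_per_sec, texels_per_pixel):
    # With multitexturing, each rendered pixel fetches several texels,
    # so the effective texel rate is pixel rate x texels applied per pass.
    return mpixels_per_sec * texels_per_pixel

gf2 = mtexels_per_sec(800, 2)  # the 1600 MTexels/s figure quoted above
```

The comparison only holds when games actually apply two textures per pixel; single-textured pixels fall back to the raw pixel rate.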
------
3DFX vs. nVidia = Coke vs. Pepsi = Ford vs. Chevy (Score:2)
"3DFX is dead! Long live nVidia!"
"3DFX was once good."
"3DFX sux!"
etc...
I myself have bought only 3DFX cards in the past. I bought my Voodoo2 12MB for $300 just two years and three months ago, the first week it was out. I bought my Voodoo3 3000 OEM just over a year ago for $125. I got that great price as part of a 20-pack that my friends all chipped in to order. Remember, those were $175 and up when they were brand new, so $125 was a great deal. Just how often should you have to upgrade your freakin' graphics card? I cringe at once a year, but I love games and want performance. I look at nVidia and see that they have this "aggressive" release schedule to try and get a new product out every 6 months. BFD!!! Who can afford that crap? Not many. You'll have to skip a generation or two between your upgrades. Then what? You suck, right? Because you don't have the very best card available, you must suck. You don't think so? Sure you do.
Now I'm not saying that nVidia doesn't have a great card now. They do. But my Voodoo3 3000 is still a kick-ass card. It looks beautiful to me. Of course I don't get 120fps, but I like some eye candy. And speaking of eye candy... yeah, I know it's Glide... but... I've seen Unreal Tournament on a GeForce that had to run D3D. It looks like SHIT!!! I'll take my proprietary Glide over that crap any day for Unreal Tournament.
Another thing that bugs me is that everything seems to revolve around the innovations of nVidia. I'm sorry, but last I checked 3DFX had some pretty serious new features. Oh, wait! That's more of that eye candy that prevents us from getting 600fps. And the nVidia has T&L. 3DFX doesn't. So... they said they didn't think it was that big of a deal. That's a D3D thing if I remember correctly. And you all dis on 3DFX for not being as OpenGL friendly as they should be. Then why even accept D3D as an option? D3D looks like shit. Back to the new features. I think that I'd like to see newer things added in, like motion blur, focus on what's in view, soft reflections, etc. What's wrong with trying to be more cinematic in games? I'm not saying that these new features will be as good as they hope they'll be (games will have to incorporate them, I think), but it's a goal. These guys I play Q3 with turn off all the eye candy on their GeForce cards to get more fps. Looks like Q1 now. Funny.
And to all of you who think that nVidia is king of the hill and should crush the others: get ready for lack of innovation and high prices. Competition is needed everywhere. Shall we get rid of AMD and just have Intel? And those damn long distance providers... let's just go back to AT&T. No, not at all... you need to be hoping that companies stay in place and keep trying to one-up each other.
And quit rooting for a damn card manufacturer like it's a commercial sports team! It's sick! It's pathetic! Go Broncos! Seattle Sucks! Indians Rule! Go Green Bay! I hate sports. I'm not playing. It makes no difference to me who beat who. I think that those who root for teams (especially those 1500 miles away) think they are part of the team or something. "We" won the game? Who the fuck is "we"? They won, not you. Same goes for graphics cards. So you've got a GeForce and nVidia is on top. What does that make you? King of the video card industry? Part of the team? Step off, fool!
The bottom line for me is a wait and see. I'll save my money for a while and wait and see how good these cards turn out to be. My money doesn't come easy. The next card I buy is going to have to push 2 years before the next upgrade. Does that seem unreasonable? I don't think that it is. I am fully open to buying any brand of card, but I want a good card. I don't buy into hype or even reviews because they are not all encompassing. And you would all do well to remember that.
Forgive my rant, but I am just tired of the whole thing.
Re:Commend delays? (Score:2)
Probably because Intel screwed up while they were trying to shove Rambus down everybody's throats, and their attempt backfired.
nVidia already happens to be in Microsoft's pants (Score:2)
nVidia's not going to become a Microsoft, they're going to join 'em. As one of the stronger supporters of D3D in its earlier days, nVidia's been sucking up since Day 1. Just recently, Microsoft awarded nVidia a contract to develop the X-box graphics chip, together with a $200 million investment.
OT: 3dfx tells nVidia to "F**k off and Die!" (Score:4)
San Francisco, CA - In a formal press release read to major gaming sites, 3DFX has "one upped" the video card war by telling nVidia to "fuck off and die".
3DFX, known for its revolutionary line of "Voodoo" based video cards, has been under fire recently for losing its competitive edge to nVidia, whose GeForce 2 line of cards is speculated to be slightly faster than the upcoming Voodoo 5.
"Quite frankly, I'm sick to death of sidestepping the issues and trying to be 'Mister Nice Guy'. I hate nVidia and every fucking asshole that works there," quipped Brian Burke, 3DFX's PR spokesperson. "I hate their engineers, distributors, advertisers, executives, and janitors. I especially despise Derek Perez, who I formally challenge to a knife fight in the parking lot after this meeting..."
Anybody who thinks the 3dfx vs. nVidia wars are getting increasingly ridiculous should go read the rest of this article, at SomethingAwful [somethingawful.com].
Re:Only here at Slashdot ... (Score:2)
And yet, somehow, we find a way to bash them for this, claiming that they need to "pick up the pace" and that they're "already behind nVidia in the video card wars".
Well, you must believe everything you see on the TV, right?
When a company says it's delaying a launch "to meet higher standards", it does not mean that some engineers gathered together and decided that some parts needed additional polishing in order to be just right. What it means is that the company discovered a show-stopping bug, something so awful and horrible and bletcherous and unpatchable that even the marketing people agreed to postpone the launch. This, in turn, generally indicates that the product in question was rushed and suffers from the MOMOWFIL (Move On, Move On, We'll Fix It Later) syndrome which is not a good thing.
In any case, there are ample reasons to suspect that 3dfx is already dead and we are now witnessing the last convulsions of a corpse...
Kaa
Re:I used to be a big 3Dfx fan... (Score:4)
One thing I forgot to mention in my original post is that there is a petition [xlr8yourmac.com] that asks Apple to include AGP Voodoo 4/5 cards as a build-to-order (BTO) option for G4's. If you're a Mac user, I strongly suggest you sign the petition. We all know that ATI's monopoly on Mac video doesn't encourage them to make good drivers.
they just dont make video cards like they used to (Score:2)
but nowadays, video cards go obsolete so fast; my G200 is only 2 years old and it's already considered a piece of junk...
sigh, they just dont make video cards like they used to (=
Zetetic
Seeking; proceeding by inquiry.
Elench
A specious but fallacious argument; a sophism.
Re:Slightly offtopic (Score:2)
ATI cards are nice (for 2D only), the ATI Xpert types for example. Sure, they are 2D only and they suck for any type of gaming, but they do have strong points. For example, this ATI Xpert 98 (8 megs) really kicks ass (for everything but games). It is well supported under X (I've got it to work under both Linux and OpenBSD), works great under SVGA (Linux only), is supported under BeOS, and works well under Windows 9x (haven't tested NT).
In its day it ran about $80; I'd bet you could probably pick one up now for about $50. Nice cards; it has worked under all the environments I have thrown at it (except Quake 2, which sucks at anything over 640x480 but is playable at 640x480; Doom and Age of Empires look nice under it; Heretic 2 is EXTREMELY slow even with all the eye candy turned off).