3dfx Voodoo Graphics Gets Windows XP x64 Support
ryszards writes, "GlideXP author Ryan 'Colourless' Nunn has turned his insanity up a notch with a driver that allows running the 32-bit NT Glide .dlls for a Voodoo Graphics board on Windows XP x64. Already supporting Voodoo Graphics and Voodoo 2 on 32-bit Windows XP, adding XP x64 to the mix lets even more folks reminisce about the good old early days of consumer 3D acceleration hardware. Any excuse to fire up GLQuake one more time!"
Much better choices than GLQuake available (Score:5, Interesting)
Re:Much better choices than GLQuake available (Score:5, Informative)
The Voodoo 2 totally lacked 32-bit rendering (which was less of a problem back then, given that the other cards weren't fast enough to render anything at 32 bit reliably anyway), and the Voodoo 3 "only" boasted a so-called "22-bit post-filter", which provided a MUCH better visual experience at a negligible framerate cost. However, (at least European) gaming mags went rabid over the fact that "Voodoo 3 still does not support 32-bit color depth!1" (which, again, was nothing to really care about, given other cards' performance at True Color settings!), and to this day I'm sure that this kind of hype (and the pushing of NVIDIA's TNT2 chip along with it) did a great deal to sink 3DFX in the end.
Voodoo 5 supported True Color rendering from the beginning, but by then the market (or rather the marketing machinery) had moved on to the next hot subject, namely "T&L" (which, again, had virtually no real impact on anything that truly mattered for real-world games), and due to lack of sales and the high costs 3DFX burdened itself with by acquiring STB, one of the greatest computer graphics companies ever went out of business. Just sad.
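A toy reconstruction of the idea behind a dither-plus-post-filter scheme (our own illustration, NOT 3dfx's actual algorithm): dither an 8-bit channel down to 5 bits, then average small neighborhoods on the way out. The alternating dithered levels cancel much of the quantization error, which is roughly why a filtered 16-bit framebuffer could look far better than plain 16 bit. The `BAYER` thresholds and the 2x2 box filter here are illustrative choices.

```python
# Sketch only: compare plain 5-bit rounding against dither + 2x2 box filter.
BAYER = [[0.125, 0.625],
         [0.875, 0.375]]   # 2x2 ordered-dither thresholds (illustrative)

def to5(v, x, y):
    """Dither 8-bit value v at pixel (x, y) down to a 5-bit level."""
    return min(31, int(v * 31 / 255 + BAYER[y % 2][x % 2]))

def from5(level):
    """Expand a 5-bit level back to the 8-bit scale."""
    return level * 255 / 31

W, H = 64, 2
ramp = [[x * 255 // (W - 1) for x in range(W)] for _ in range(H)]

# Plain 5-bit rounding: each pixel carries up to half a step (~4) of error.
plain_err = sum(abs(from5(round(v * 31 / 255)) - v)
                for row in ramp for v in row) / (W * H)

# Dither, then average each 2x2 block on the way out ("post-filter").
filt_err = 0.0
for x0 in range(0, W, 2):
    cells = [(ramp[y][x0 + dx], x0 + dx, y) for dx in (0, 1) for y in (0, 1)]
    avg = sum(from5(to5(v, x, y)) for v, x, y in cells) / 4
    target = sum(v for v, _, _ in cells) / 4
    filt_err += abs(avg - target)
filt_err /= W // 2

print(plain_err, filt_err)  # the filtered error comes out well below plain rounding
```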
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
I had the good fortune to play GLQuake on an SGI Onyx connected to a 24" screen via 13W3 (shielded cable) back in those days; it put the Voodoo cards to shame.
3dfx lost their way (Score:5, Interesting)
32bit colour and T&L may not have set the world on fire when they were released, but that's hardly surprising. Since the hardware was new, no software yet took full advantage of them.
It's different today. Try running modern games without T&L today, even on a modern CPU, and watch your game crawl - if it plays at all. And see if you can get a gamer to play in 16 bit without noticing the difference (and complaining). The TNT and GeForce chips set the scene for modern graphics, just like the Voodoo & Voodoo 2 did in their time with real 3D acceleration, dedicated texture units, SLI etc.
3dfx made many mistakes, which resulted in them simply being out-innovated and out-executed by the competition while they struggled with the consequences of their poor business decisions. They showed the way, but the Voodoo 5/6's multichip approach was never the right direction for the mainstream future.
Re: (Score:2)
It really kind of sucks for old games
Why? (Score:1)
You can't do reasonable color blending and lighting with 16 bits of output colorspace.
Supporting 16-bit textures to save space is a different matter, but the pipeline needs to be at least 8 bits wide per channel (10, or better yet 16, bits is preferable).
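The parent's point can be seen with a toy count of distinct output shades (a sketch, not any real card's blend path): blending white over black across a smooth alpha ramp, a 5-bit channel (as in 16-bit 565) can only produce 32 distinct results against 256 for an 8-bit channel, which is exactly the banding players notice.

```python
# Count distinct blended shades at 5-bit vs 8-bit channel precision.
def blend_white_over_black(alpha, bits):
    levels = (1 << bits) - 1
    return round(alpha * levels)   # quantize to the channel's integer levels

steps = [i / 1023 for i in range(1024)]          # a smooth 0..1 alpha ramp
shades5 = len({blend_white_over_black(a, 5) for a in steps})  # 565-style channel
shades8 = len({blend_white_over_black(a, 8) for a in steps})  # 8 bits per channel
print(shades5, shades8)  # 32 256
```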
Re: (Score:2)
When the GeForce was rel
WTF Are you talking about? (Score:2)
The Voodoo 2 did the multi-chip approach LONG before the Voodoo 5 was around, and it WAS the solution and the direction that created the graphics future we have right now. The 12 meg Creative 3D Blaster Voodoo2 had three processors on board, and it BEAT THE SHIT OUT OF EVERY OTHER CARD ON THE MARKET. What kind of nonsense are you spouting?
Re: (Score:1, Informative)
The Voodoo 2 was capable of two-card SLI, which is what "multi-chip" typically refers to here. (You wouldn't call a multi-chip IBM RS-series processor "multi-processor" when it was a single l
And what's that got to do with anything? (Score:4, Informative)
Also, the Voodoo 2 didn't have 3 processors on board; it had 3 chips, each of which was part of a single unit. One chip handled the frame buffer; the other two were texture units. Together they formed what would be a single pipeline on a modern card. While they were separate chips, you had to have one frame buffer chip and at least one texture chip. Adding more texture chips made multi-texturing faster, but not single texturing. In no case did it help geometry.
The Voodoo 5 was different. Each VSA was its own self-contained chip. You could use one, or you could use more. However, they weren't very powerful: it took two of them to make a showing at all against the likes of a GeForce, never mind a GeForce 2. That was not the right way to go. More chips is a valid approach in visualization systems (which 3dfx chips were often used in), but not for consumer desktops. As the SLI market shows, there IS a small market among the ubergamers, but it has to be optional, not mandatory just to get reasonable performance.
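The pass-count arithmetic behind the parent's texture-chip point can be sketched as follows (a sketch of the principle, not of any specific card's scheduler): each rendering pass lets every texture unit (TMU) sample one texture layer, so extra TMUs help only when there are multiple layers to apply.

```python
# Why extra TMUs speed up multi-texturing but not single texturing.
def passes_needed(texture_layers, tmus):
    """Rendering passes needed to apply all texture layers."""
    return -(-texture_layers // tmus)   # ceiling division

# Single texturing: a second TMU buys nothing.
assert passes_needed(1, 1) == 1 and passes_needed(1, 2) == 1
# Base texture + lightmap (the classic Quake case): one TMU needs two
# passes, two TMUs do it in a single pass.
assert passes_needed(2, 1) == 2 and passes_needed(2, 2) == 1
print(passes_needed(2, 2))  # 1
```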
Multichip is expensive (Score:3, Informative)
Example: in 1996, when 3Dlabs designed the Permedia, it was a multi-chip solution (just like their workstation products) consisting of a pixel processor and a vertex processor. In 1997, 3Dlabs combined the multi-chip Permedia into the single-chip Permedia 2. Despite being priced much cheaper than the Permedia, the Permedia 2 made 3Dlabs much more money thanks to the low-cost, single-chip design.
3DFX designed the V
Re: (Score:2)
Re: (Score:1)
Re: (Score:2)
All these years, and quake is still my DM game of choice, thanks to the FuH.
Ziggurat Vertigo for teh win. I don't think that level has ever been equalled for pure high-larious fun.
wrong: designed for speed, not eyecandy. (Score:1)
Nowadays the GPU also enhances eye candy, so GPU-based games can look better, but the first cards were designed for speed.
Imho, the first Quake 2 screens look like crap because of that: they were designed for crappy 3D graphics accelerators. Even the mp2 format is "liquid", so it renders like crap.
Re: (Score:1)
Yeh, coz that software rendering at 320 x 200 in q95.bat really kicked arse eyecandy-wise!
Re: (Score:1)
Hrm ... (Score:2)
Can someone please explain in detail about this? It would be news to me if said sources were actually available, and I simply didn't misread the thread.
Re: (Score:3, Informative)
The Glide interface DLLs (still 32-bit) can then communicate correctly with the card using this minimal kernel driver.
Re: (Score:2, Informative)
3dfx was acquired by NVIDIA in 2000.
Re: (Score:2)
Re: (Score:1)
Re: (Score:1)
Insanity! (Score:3, Interesting)
Voodoo3 3500 with TV in/out (Score:2)
I got mine when it was cheap and affordable. Unreal Tournament looked awesome and so did my Linux Desktop!
Who else remembers the "Don't worry we will support your Voodoo cards"... Next day... Site not found!
Re: (Score:1)
Re: (Score:1)
Insane!
Speaking of Glide (Score:1)
Re: (Score:2, Informative)
Yeah, there are loads of 'em [google.co.uk]
I used to use a Glide Wrapper so I could play The Sentinel Returns [yahoo.com] properly on my system.
Re: (Score:3, Funny)
Bob
Re: (Score:1)
Re: (Score:1)
Re:Speaking of Glide (Score:5, Informative)
Dunno about that...
Last time I saw it running on Glide under a recent Windows, there seemed to be a bug where it didn't wait for vsync properly and the CPU got way ahead of the graphics, leading to really ugly control latency. I probably screwed up somewhere
The win32 software renderer didn't have this problem.
-lead programmer of Montezuma's Return
Re: (Score:3, Informative)
Re: (Score:2)
I remember people using it to run UltraHLE (the N64 emulator, which only supported Glide) on non-Voodoo cards.
Voodoo 2. (Score:1)
Re: (Score:1)
Good old glide... RIP (Score:2, Interesting)
Even today, very few games have made me react like I did ("OMG lookit that!") to the Voodoo driven games of yesteryear - did anyone here run Unreal 1 in "software" and then in "glide" and compare the experience?
Back then, we thought that the Trident, nVidia Riva TNT and Cirrus Logic graphics cards were crap compared to
Re:Good old glide... RIP (Score:5, Insightful)
Which, of course, brings up the biggest problem with 3dfx cards prior to the Voodoo 4: OpenGL support. OpenGL was implemented for GLQuake, Quake 2, Half-Life, Hexen 2, Heretic 2, and SiN using special "MiniGL" drivers that translated specific OpenGL calls into their Glide equivalents.
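The MiniGL idea can be sketched as a thin shim exposing a few OpenGL-style entry points and forwarding them to Glide-style calls. The `grXxx` functions and all signatures below are simplified stand-ins of our own, not the real Glide or MiniGL code.

```python
# Toy MiniGL-style shim: translate GL-level calls into "Glide" calls.
calls = []  # record of "Glide" calls the shim makes

def grTexSource(handle):        # stand-in for binding a texture in Glide
    calls.append(("grTexSource", handle))

def grBufferSwap(interval):     # stand-in for a Glide buffer swap
    calls.append(("grBufferSwap", interval))

# Only the entry points the games actually used got implemented, which is
# why MiniGL drivers ran Quake-engine titles but not arbitrary GL apps.
MINIGL = {
    "glBindTexture": lambda target, tex: grTexSource(tex),
    "SwapBuffers":   lambda: grBufferSwap(1),
}

def gl_call(name, *args):
    if name not in MINIGL:
        raise NotImplementedError(name + ": not in this MiniGL subset")
    return MINIGL[name](*args)

gl_call("glBindTexture", 0x0DE1, 7)   # GL_TEXTURE_2D, texture id 7
gl_call("SwapBuffers")
print(calls)  # the GL-level calls arrived as Glide-level calls
```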
Re: (Score:1)
Re: (Score:1)
Re: (Score:1)
It was the TNT2, not the original TNT. (Score:2)
The TNT2 wiped the floor with most of the Voodoo cards (and was on par with a Voodoo 3).
Although my card of choice was a 3DLabs Permedia 3. I was lucky enough to get one. If the game supported OpenGL, it flew. I remember playing Quake II and Unreal and thinking how much cooler it was than the STB Velocity I used to have. Also it was the only card that played the ports of FFVII and FFVIII with any stability in Windows 98.
Re: (Score:1)
Oh wow... (Score:1)
Re: (Score:2)
http://www.mameworld.net/vlinde/ [mameworld.net]
finishes the work he is doing and makes Nintendo 64 emulation possible the way it should have been done (instead of that "HLE" crap most N64 emulators use)
SLI (Score:4, Funny)
why is this modded funny (Score:2)
I was thinking the same thing. When I had my dual Voodoo2 SLI setup I was king of the hill, playing Unreal Tournament in 1024x768 mode.
Welcome to 1998 (Score:1)
GLQuake? (Score:2, Funny)
There's got to be a better game example for this.
I hope my old Voodoo hasn't been thrown out. Like to see it in action
Re: (Score:1)
I guess with the modern buy and toss in 2 months game cycle that's popular now it isn't much of a concern to most people, but I often like to revi
Security issues (Score:4, Informative)
Two services, both running as privileged users, directly map memory and IO space to a user-space process without any significant checks on what is asking for access or what it's asking for access to, in a common driver running under a networked OS.
You might ask why you'd have a Glide card in a server, but just how many drivers for other hardware use this same sort of rubbish to interface with their hardware without us knowing? How many still do it under XP, 2003, Vista, etc.?
Every time you install a device driver you are really granting complete machine access to the driver, without audit, without checks. Even on XP x64, he's shown that the ability to create such a driver (one that has privileged access and will grant it to any software that asks for it) requires only a trivial recompile of a badly designed driver, using publicly available source code, and an install.
Have people known about this particular driver issue for a long time? Although deliberately introducing malware onto a system via this method would of course require the administrator's co-operation, how many third-party device drivers, services, etc. can be subverted to provide that level of access to any software that asks for it?
That's the scary bit; the fact that the author must be a bit mental to want to run a Voodoo on an XP x64 machine is reassuring in comparison.
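The failure mode described in the parent can be boiled down to a toy sketch (all names ours, purely illustrative; this is not the actual GlideXP driver code): a privileged service mapping a resource for whoever asks, versus one that validates both the caller and the requested range before granting anything.

```python
# The anti-pattern vs. the checked path, reduced to two small functions.
SENSITIVE_IO_RANGE = range(0x3C0, 0x3E0)  # pretend device IO space

def map_io_unchecked(requested, caller):
    # The anti-pattern: no check on who `caller` is, no check that
    # `requested` stays inside the device's own range.
    return list(requested)

def map_io_checked(requested, caller):
    # What the post is asking for: validate before granting anything.
    if not caller.get("is_admin"):
        raise PermissionError("caller is not privileged")
    if not all(port in SENSITIVE_IO_RANGE for port in requested):
        raise ValueError("request outside the device's IO range")
    return list(requested)

anyone = {"is_admin": False}
grabbed = map_io_unchecked(range(0x00, 0x10), anyone)  # 16 arbitrary ports granted

try:
    map_io_checked(range(0x00, 0x10), anyone)
    denied = False
except PermissionError:
    denied = True
print(len(grabbed), denied)  # 16 True
```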
Re: (Score:2, Funny)
Dude, it's even worse than that! Did you know that the Windows kernel talks directly to the CPU? They haven't even attempted to put an abstraction layer between them!
And the situation is no better under Linux either. Users of Intel Mac
Re: (Score:2)
OpenGL on the other hand, is a more general purpose graphics API designed to run on a wide range of hardware, and is designed for quality rather than gaming performance.
Voodoo power (Score:1)
Voodoo5 (Score:1)
Amiga drivers? (Score:2)
They're even still being sold!
http://www.vesalia.de/e_mediator.htm [vesalia.de]
non-passthrough? (Score:2)
or does it only work in passthrough mode?
Re: (Score:2)
Re: (Score:2)
Somebody has a bad memory. You can link TWO (2) Voodoo 2 cards together in SLI. It is possible to link more together, as 3DFX did in their arcade boards, but this requires custom glue logic.
The Voodoo2 was great in that it could use memory from other graphics cards. It's funny to run GLQuake and see 268 megs of memory pop up on the console while loading.
That's just not true. The only successful card to
Re: (Score:1)
Cool but useless (Score:2)
hehe, I might actually try and slap something together.
Love this kind of stuff, ever since I found out XP does NOT support the old tape drives off the floppy port, for no particular reason. I was convinced it should work, so I set about finding a driver. Eventually I got a combination that worked! Then I decided to scrap it as too slow, after days of fussing with it. But damnit, I wanted to make that choice, NOT MS
Re: (Score:1, Funny)
Re: (Score:1, Insightful)
Re:Daryl Strauss would be proud (Score:4, Interesting)
I'd argue that's not a much-utilized benefit. If you have old hardware, you'd be more likely to keep the old software as well. Old software and old hardware will work exactly the same now as they would have 5 or 10 years ago.
If you are going to get new software, you'd probably get new supported hardware as well to get any benefit out of it.
Re: (Score:3, Insightful)
Now, RC1 comes along and Creative decides to not release a driver for it. Now, granted, the X-Fi series of cards is far, far ahead and beyond what my SBLive does. However
Besides, the thought of buying PCI anything with PC
Re: (Score:2)
Re: (Score:2)
For the record, I also still have a Voodoo 3 3000 PCI th
Re: (Score:2)
Also note that there is a lively Vista forum for creative labs owners over here:
http://forums.creative.com/creativelabs/board?boar d.id=Vista [creative.com]
Friedmud
Re: (Score:2)
I have the same issue with old Tulip 10/100 network cards: no 64-bit Windows supports them, even though the cards were designed for 64-bit machines and are well supported by Tru64 and OpenVMS on the Alpha.
Re: (Score:2)
I went out and bought the Live card. I had it for one week and the card fried. Creative told me to send it in, and when I did they claimed it was not defective or damaged and should work (it still smelled like smoke); they sent it back saying it was fine and refused to fix it, even after hours of phone conversations. And yes, I tried it in other systems (about 15 or so before I gave up on it).
I guess I just got a lemon.
Re: (Score:2)
Re: (Score:1)
Knock Knock Knock, anybody home?
I remember my old Voodoo2 card with some fondness; it worked fine for every 3D game I used it for at 800x600, which was the only resolution my monitor would do non-interlaced. I had no intention of replacing my monitor, as I was a poor student at the time with no dosh.
Then along comes that glorious piece of crap that was Windows XP. Now all of a sudden my Voodoo2 is unsupported. Not by my games, mind you; I want to carry on playing the same games I have been without putting
Re: (Score:2)
Re:Daryl Strauss would be proud (Score:4, Insightful)
Yeah, and only the first 6 years of that timeframe was spent waiting for the x64 edition of Windows XP
Microsoft Windows XP Professional x64 Edition, released on April 25, 2005 by Microsoft, is a variation of the typical 32-bit Windows XP operating system for x86 personal computers.
Oh wait, the linked article doesn't even say anything about x64 support for the Linux 3Dfx driver. So what exactly are you trying to say, again?
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
What's worse, though, is that 64-bit Windows is over 10 years behind the first 64-bit OS, despite NT running (in a limited fashion, not taking advantage of 64-bit features) on 64-bit hardware since its early releases (MIPS, Alpha, etc.).
Re: (Score:2)
-Pan
Re: (Score:2)
Antialiasing still better than today's cards (Score:2)
Re: (Score:1)
Re: (Score:2)