3dfx Voodoo Graphics Gets Windows XP x64 Support 104

ryszards writes, "GlideXP author Ryan 'Colourless' Nunn has turned his insanity up a notch with a driver that allows running the 32-bit NT Glide .dlls for a Voodoo Graphics board on Windows XP x64. Already supporting Voodoo Graphics and Voodoo 2 on 32-bit Windows XP, adding XP x64 to the mix lets even more folks reminisce about the good old early days of consumer 3D acceleration hardware. Any excuse to fire up GLQuake one more time!"
  • by inio ( 26835 ) on Monday September 18, 2006 @05:11AM (#16128698) Homepage
    Really, GLQuake? You want quake-glide: it talks Glide natively instead of going through the OpenGL wrapper. Or better yet, Unreal/Unreal Tournament. Those games never looked better than when they were running on a Voodoo 2.
    • by c0l0 ( 826165 ) on Monday September 18, 2006 @06:04AM (#16128806) Homepage
      They did, on a Voodoo 3 or, better yet, Voodoo 5 ;)

      The Voodoo 2 totally lacked 32bit rendering (which was less of a problem back then, given that the other cards' performance numbers were not high enough to render anything at 32bit reliably anyway), and the Voodoo 3 "only" boasted a so-called "22bit post-filter", which provided a MUCH better visual experience than plain 16bit at negligible framerate losses. However, (at least European) gaming mags went rabid about the fact that "Voodoo 3 still does not support 32bit color depth!1" (which, again, was nothing to really care about, given other cards' performance at True Color settings!), and to this day I'm sure that this kind of hype (and the pushing of NVIDIA's TNT2 chip along with it) did a great deal to sink 3DFX in the end.

      Voodoo 5 supported True Color rendering from the beginning, but the market (or rather the marketing machinery) had moved on to the next hot subject, namely "T&L", by then (which, again, had virtually no real impact on anything that truly mattered for real world games), and due to lack of sales and the high costs 3DFX burdened itself with by acquiring STB, one of the greatest computer graphics companies ever went out of business. Just sad. :(
      • Not to mention many monitors' colour quality was poor at best unless you shelled out for expensive ones. 32 bit colour is kinda unnecessary when you can't tell the difference between shades of red.
        • Having owned a Voodoo2 I can tell you that the difference between 16- and 32-bit color was noticeable, and I had a shitty 15" no-name CRT at the time. Dithering [wikipedia.org] caused a very noticeable pattern in 16-bit mode.
        • by Bert64 ( 520050 )
          Yes, indeed...
          I had the good fortune to play GLQuake on an SGI Onyx connected to a 24" screen via 13W3 (shielded cable) back in those days; it put the Voodoo cards to shame.
      • 3dfx lost their way (Score:5, Interesting)

        by Namarrgon ( 105036 ) on Monday September 18, 2006 @07:28AM (#16129100) Homepage

        32bit colour and T&L may not have set the world on fire when they were released, but that's hardly surprising. Since the hardware was new, no software yet took full advantage of them.

        It's different today. Try running modern games without T&L today, even on a modern CPU, and watch your game crawl - if it plays at all. And see if you can get a gamer to play in 16 bit without noticing the difference (and complaining). The TNT and GeForce chips set the scene for modern graphics, just like the Voodoo & Voodoo 2 did in their time with real 3D acceleration, dedicated texture units, SLI etc.

        3dfx made many mistakes, which resulted in them simply being out-innovated and out-executed by the competition while they struggled with the consequences of their poor business decisions. They showed the way, but Voodoo 5/6's multichip approach was never the right direction for the mainstream future.

        • I still wish someone else would implement 16 bit color without making it look like crap, though. Unreal was beautiful on a Voodoo3, even in 16-bit. Even on my new card, 16bit looks grainy and nasty.

          It really kind of sucks for old games :(
          • 16-bit is a joke.
            You can't do reasonable color blending and lighting with 16 bits of output colorspace.
            Supporting 16-bit textures to save space is different, but the pipeline needs to be at least 8 bits wide per channel (10, or better yet 16 bits, is preferred).
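            A quick illustration of the quantization (my own sketch with made-up numbers, not from any driver): an RGB565 framebuffer keeps only 5 bits of red and blue and 6 of green, so runs of neighbouring 8-bit shades collapse into one stored value, and every blend re-quantizes into that coarse grid:

            ```c
            #include <stdio.h>
            #include <stdint.h>

            /* Pack 8-bit-per-channel color into RGB565: 5 bits red, 6 green, 5 blue. */
            static uint16_t pack565(uint8_t r, uint8_t g, uint8_t b)
            {
                return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
            }

            /* Expand the red channel back to 8 bits (simple truncation). */
            static uint8_t red8(uint16_t c)
            {
                return (uint8_t)(((c >> 11) & 0x1F) << 3);
            }

            int main(void)
            {
                /* Eight consecutive 8-bit reds all land on the same 5-bit value,
                   so any blend result in between simply cannot be stored. */
                for (int r = 96; r < 104; r++)
                    printf("8-bit red %3d -> stored as %3d\n",
                           r, red8(pack565((uint8_t)r, 0, 0)));
                return 0;
            }
            ```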
        • I seem to recall that the thing that started the death of 3dfx was bump mapping. The TNT supported it, and it made a huge difference to how games looked. A friend of mine had one, and games which supported it looked much better at 1024x768 with bump mapping than they did at 800x600 without (which was the best my VooDoo 2 could support). The VooDoo 3 was an anticlimax; not sufficiently better than the 2 to be worth the upgrade, and the other cards needed insane amounts of power.

          When the GeForce was rel

        • They showed the way, but Voodoo 5/6's multichip approach was never the right direction for the mainstream future.

          The Voodoo 2 did the multi-chip approach LONG before Voodoo 5s were around, and it WAS the solution and direction to create the future of graphics that we have right now. The 12 meg Creative 3D Blaster Voodoo2 had three processors on board, and it BEAT THE SHIT OUT OF EVERY OTHER CARD ON THE MARKET. What kind of nonsense are you spouting?
          • Re: (Score:1, Informative)

            by Anonymous Coward
            Get real. Those three chips on the Voodoo 2 were one "pixelFX" rasteriser and two "texelFX" texture mapping units, together resulting in a single pixel pipeline capable of dual texturing -- whereas Nvidia's Riva TNT had two single-texturing pipelines (which could combine into dual-texturing) plus a 2D core on a single chip, not three.

            Voodoo 2 was capable of two-card SLI for what "multi-chip" typically refers to. (You wouldn't call a multi-chip IBM RS-series processor "multi-processor" when it was a single l
          • by Sycraft-fu ( 314770 ) on Monday September 18, 2006 @01:12PM (#16131896)
            He's right that when the Voodoo 5 came out, multi-chip was not the way to go. It was too expensive and their chips were too slow. A single chip GeForce 2 beat the Voodoos soundly on non-T&L games and annihilated them on ones that did use it. The proof would be in the fact that 3dfx fell from the preeminent 3D company down to something nVidia bought up.

            Also, the Voodoo 2 didn't have 3 processors on board; it had 3 chips, each of which was part of a single unit. One chip did the frame buffer, the other two were texture units. Together they formed what is a single pipeline on a modern card. While separate chips, you had to have one frame buffer chip and at least one texture chip. Adding more texture chips made multi-texturing faster, but not single texturing. In no case did it help geometry.

            The Voodoo 5 was different. Each VSA was its own self-contained chip. You could use one or you could use more. However, they weren't very powerful. It took 2 of them to make a showing at all against things like a GeForce, never mind a GeForce 2. That was not the right way to go. More chips is a valid approach in visualization systems (which 3dfx chips were oft used in), but not for consumer desktops. As is seen with the SLI market, there IS a small market for it among the ubergamers, but it's got to be optional, not mandatory, to get reasonable performance.
          • 3DFX was the last holdout on combining all their chips into a single core logic for mainstream product lines.

            Example: in 1996, when 3dlabs designed the Permedia, it was a multi-chip solution (just like their workstation products) consisting of a pixel and a vertex processor. In 1997, 3dlabs combined the multi-chip Permedia into the single-chip Permedia 2. Despite being priced much cheaper than the Permedia, the Permedia 2 made 3dlabs much more money due to the low-cost, single-chip design.

            3DFX designed the V
      • by RESPAWN ( 153636 )
        Really, the difference between 16bit color and 32bit color probably isn't so great that you'll really notice it in the middle of a heated game. I actually use a Voodoo3 3000 PCI as a secondary video card. When looking at the two monitors side by side in Windows, I can barely notice a difference at all. Granted, I do most of my photo editing on the primary display with full 32bit color, but there's really nothing wrong with the display on my Voodoo card. Why should I go out
        • The Voodoo 3 can do 32bit 2D, i.e. the desktop, but not 3D. I used to use a PCI one too in my multimonitor setup before replacing it with a cooler-running Matrox G450 DH. The 3dfx-based card was nice for those games that were DirectX 3 but had Glide, and for shit like Glide Winamp plugins. DirectX 3 had no hardware-accelerated support, so Glide was one of the only ways to do it for a while on Windows. Need for Speed 2 SE and Wipeout something-or-other were like that, IIRC.
    • FuHQuake, now that's where it's at.

      All these years, and quake is still my DM game of choice, thanks to the FuH.

      Zigurat Vertigo for teh win. I don't think that level has ever been equalled for pure high-larious fun.
    • Quake was not designed for OpenGL, and the initial graphics cards were a quality decrease. It was the speed that was much better.
      Nowadays the GPU also enhances eye candy, so GPU-based games can look better, but the first cards were designed for speed.

      Imho, the first Quake2 screens look like crap because of that: designed for crappy 3D graphics accelerators. Even the md2 model format is "liquid", so it renders like crap.
      • by lendude ( 620139 )
        Huh? - the initial 3D graphics cards were a (visual I presume) "quality decrease" in Quake???


        Yeh, coz that software rendering at 320 x 200 in q95.bat really kicked arse eyecandy-wise!

      • by CaseyB ( 1105 )
        Um, no. GLQuake looked better in every respect than software rendering. Higher resolution, higher framerate, and 16 bit color. Some folks complained about the "blurriness" of bilinear texture interpolation, but that was only because they hadn't yet realized that it looked better. :)
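        For reference, the "blurriness" is just this (a textbook sketch in grayscale floats, not GLQuake's actual code): each on-screen pixel becomes a weighted average of the four nearest texels, where the software renderer picked the single nearest one.

        ```c
        /* Sample a w-by-h grayscale texture at texel coordinates (u, v) with
           bilinear filtering: blend the four surrounding texels weighted by
           the fractional distances to each. */
        static float bilinear(const float *tex, int w, int h, float u, float v)
        {
            int   x0 = (int)u,        y0 = (int)v;
            int   x1 = (x0 + 1) % w,  y1 = (y0 + 1) % h;   /* wrap addressing */
            float fx = u - (float)x0, fy = v - (float)y0;
            float top = tex[y0 * w + x0] * (1.0f - fx) + tex[y0 * w + x1] * fx;
            float bot = tex[y1 * w + x0] * (1.0f - fx) + tex[y1 * w + x1] * fx;
            return top * (1.0f - fy) + bot * fy;
        }
        ```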
  • Judging by the thread, it seems Voodoo opened up the sources to their drivers, since he talks about how they were written.

    Can someone please explain this in detail? It would be news to me if said sources were actually available and I simply didn't misread the thread.
    • Re: (Score:3, Informative)

      It looks like the low-level kernel drivers were just memory mapping and port I/O.
      The Glide interface DLLs (still 32bit) can then communicate correctly with the card using this minimal kernel driver.
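      Roughly the shape of it from the user-mode side (a sketch under assumptions: the device name and IOCTL code below are invented for illustration, not GlideXP's real interface): the Glide DLL opens the helper driver and asks it for a user-space mapping of the board, after which rendering traffic never touches the kernel again.

      ```c
      #include <windows.h>
      #include <stdio.h>

      /* Hypothetical control code; the real driver's interface will differ. */
      #define IOCTL_MAP_CARD \
          CTL_CODE(FILE_DEVICE_UNKNOWN, 0x800, METHOD_BUFFERED, FILE_ANY_ACCESS)

      int main(void)
      {
          /* Open the minimal kernel driver's device object. */
          HANDLE dev = CreateFileA("\\\\.\\VoodooMap", GENERIC_READ | GENERIC_WRITE,
                                   0, NULL, OPEN_EXISTING, 0, NULL);
          if (dev == INVALID_HANDLE_VALUE)
              return 1;

          /* Ask the driver to map the board's register/framebuffer range into
             this process; kernel-side that would be MDL mapping work. */
          void *regs = NULL;
          DWORD got = 0;
          if (DeviceIoControl(dev, IOCTL_MAP_CARD, NULL, 0,
                              &regs, sizeof(regs), &got, NULL))
              printf("card mapped into user space at %p\n", regs);

          CloseHandle(dev);
          return 0;
      }
      ```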

    • Re: (Score:2, Informative)

      by Zooka ( 457908 )
      ''it seems Voodoo opened up the sources''

      3dfx was acquired by NVIDIA in 2000.
    • The drivers leaked onto the internet. A whole CD full of documents about 3dfx and the source code of the drivers was leaked [aselabs.com], and as far as I understand, the 3dfx driver makers can therefore make drivers practically from scratch. Development is going on at 3dfxzone.it [3dfxzone.it], but the source code may be very difficult to find. The releaser doesn't have it up any longer. Maybe through a p2p network... The Glide drivers were made open source by 3dfx itself when they wanted them to work on Linux.
      • The source isn't too hard to find. I used to host it myself. Ask just about anyone over at x-3dfx [ezboard.com] and you should be able to get it. It's a .rar that's about 30 megs that expands to about 300. It's pretty interesting to go through.
  • Insanity! (Score:3, Interesting)

    by NeuralAbyss ( 12335 ) on Monday September 18, 2006 @05:14AM (#16128705) Homepage
    Absolute insanity... although, I guess this proves that Voodoo cards aren't just legacy hardware.. they're supported..
    • Best damn video card ever! Did I say EVA! Yeah I said best damn card ever....
      I got mine when it was cheap and affordable. Unreal Tournament looked awesome and so did my Linux Desktop!

      Who else remembers the "Don't worry we will support your Voodoo cards"... Next day... Site not found!

    • The card is (in theory) supported; the problem is the games aren't. I haven't tried firing up Interstate '76 or MechWarrior 2 lately, but I'm pretty sure DOS/Glide/SB16 is not a recipe for success under Windows 2000. It's really too bad; I had a blast playing those games 'back in the day' with my Voodoo2.

      Absolute insanity... although, I guess this proves that Voodoo cards aren't just legacy hardware.. they're supported..
  • Has anyone bothered to make a Glide emulator for some of those games that only supported Glide? There's got to be 1 or 2 Montezuma's Return fans out there :/
    • Re: (Score:2, Informative)

      Yeah, there are loads of 'em [google.co.uk]

      I used to use a Glide Wrapper so I could play The Sentinel Returns [yahoo.com] properly on my system.

    • Re: (Score:3, Funny)

      Don't know about Montezuma's Return, but I often get a serious case of Montezuma's Revenge after visiting my local Mexican restaurant and I'm not a fan...

      Bob
    • Re:Speaking of Glide (Score:5, Informative)

      by Atman Binstock ( 172632 ) on Monday September 18, 2006 @06:56AM (#16128959)
      Has anyone bothered to make a Glide emulator for some of those games that only supported Glide? There's got to be 1 or 2 Montezuma's Return fans out there :/

      Dunno about that...

      Last time I saw it running on Glide under a recent Windows, there seemed to be a bug where it didn't wait for vsync properly and the CPU got way ahead of the graphics, leading to really ugly control latency. I probably screwed up somewhere :(
      The win32 software renderer didn't have this problem.

      -lead programmer of Montezuma's Return
    • Re: (Score:3, Informative)

      by Phaid ( 938 )
      Yup. There are a number of Glide emulators, but dgVoodoo [freeweb.hu] is the one I have had the most success with (for Red Baron 3D). If it matters, it's also likely the highest performing one since it is a direct Glide to DirectX emulation, whereas most of the others were Glide to OpenGL emulators. I say "if it matters" because on a modern system the overhead of this conversion will make no difference given the simplicity of the games that used Glide.
    • by Bert64 ( 520050 )
      Yes, there was a glide emulator...
      I remember people using it to run UltraHLE (the N64 emulator, which only supported glide) on non voodoo cards.
  • Not that I mind much, but who uses a Voodoo 2 these days?
  • I remember the good old days, when you got a free copy of POD Racer [wikipedia.org] with your 3DFX Voodoo card, and then your eyes popped out with the visual brilliance of the 3D accelerated graphics :)

    Even today, very few games have made me react like I did ("OMG lookit that!") to the Voodoo driven games of yesteryear - did anyone here run Unreal 1 in "software" and then in "glide" and compare the experience?

    Back then, we thought that the Trident, nVidia Riva TNT and Cirrus Logic graphics cards were crap compared to
    • by VGPowerlord ( 621254 ) on Monday September 18, 2006 @06:21AM (#16128859)
      How about Quake in software mode and then in GLQuake?

      Which, of course, brings up the biggest problem with 3dfx cards prior to the Voodoo 4: OpenGL support. OpenGL was implemented for GLQuake, Quake 2, Half-Life, Hexen 2, Heretic 2, and Sin using special "MiniGL" drivers that translated the specific OpenGL calls those games used into their Glide equivalents.
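      A heavily stripped-down sketch of that idea (not actual 3dfx source; GrVertex and grDrawTriangle are genuine Glide names, but the projection, color, and texture state a real MiniGL tracked are omitted): the replacement opengl32.dll exports the GL 1.1 entry points Quake-engine games call and turns them into Glide calls.

      ```c
      #include <glide.h>   /* 3dfx Glide SDK */

      static GrVertex tri[3];
      static int      n;

      /* The game emits vertices between glBegin(GL_TRIANGLES) and glEnd();
         every third vertex completes one Glide triangle. A real MiniGL would
         first transform to screen space and fill in 1/w for perspective. */
      void __stdcall glVertex3f(float x, float y, float z)
      {
          GrVertex *v = &tri[n++];
          v->x   = x;      /* placeholder: should be post-projection screen x */
          v->y   = y;
          v->oow = 1.0f;   /* placeholder: should be 1/w */
          (void)z;
          if (n == 3) {
              grDrawTriangle(&tri[0], &tri[1], &tri[2]);
              n = 0;
          }
      }
      ```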
    • by TravisO ( 979545 )
      Actually, when I got Unreal 1, I had a P2-233 (on a dual mobo, as I assumed dual CPUs would be the future; little did I know at the time that your OS had to support it. Well, at least I never bought the 2nd CPU, and I knew I was wrong about duals when the P2 400MHz came out). I had a Diamond MM Viper card, which had a VERY basic 3d chip and, IIRC, Unreal 1 didn't support it. Anyway, Unreal 1 on software rendering (well, at least I had fast MMX, I figured) was a mixed blessing; visually, the game was amazing, and the b
  • LOOK GUYS I CAN RUN ULTRAHLE ON WINDOWS XP X64 NOW! OH WAIT, NO I CAN'T...UltraHLE still doesn't work on XP.
  • SLI (Score:4, Funny)

    by pbjones ( 315127 ) on Monday September 18, 2006 @06:36AM (#16128903)
    I hope it supports using SLI, I still have at least 2 of these gems.
  • GLQuake? (Score:2, Funny)

    by twazzock ( 928396 )
    GLQuake still runs fine on my shiny new nVidia, and at crazy resolutions and frame rates :) Applications written against OpenGL have a tendency to work forever anyway (unlike a certain other API we all know *cough* DirectX *cough*), at least graphically.
    There's got to be a better game example for this.

    I hope my old Voodoo hasn't been thrown out. Like to see it in action ;)
    • by archen ( 447353 )
      DirectX is another nail in the coffin for why I gave up on PC gaming. The vast majority of games I bought don't work on my newer PCs. I bought the "updated" version of TIE Fighter but never got around to playing it until 2 months ago - but I couldn't get it to work because it DEMANDS DirectX 6 and no other version. Strangely enough, the DOS version still works fine...

      I guess with the modern buy and toss in 2 months game cycle that's popular now it isn't much of a concern to most people, but I often like to revi
  • Security issues (Score:4, Informative)

    by ledow ( 319597 ) * on Monday September 18, 2006 @07:09AM (#16129014) Homepage
    I think the main focus of the article should really be the poor driver design and the huge security problems.

    Two services, both running as privileged users, directly map memory and I/O space into a user-space process without any significant checks on what is asking for access, or on what it's asking for access to, in a common driver running under a networked OS.

    You might ask why anyone would have a Glide card in a server, but just how many drivers for other hardware use this same sort of rubbish to interface with their hardware without us knowing? How many still do it under XP, 2003, Vista etc.?

    Every time you install a device driver you are really granting complete machine access to the driver, without audit, without checks. Even on XP x64, he's shown that the ability to create such a driver (one that has privileged access and will grant it to any software that asks for it) requires only a trivial re-compile of a badly-designed driver, using publicly available source code, and an install.

    Have people known about this particular driver issue for a long time? Although deliberately introducing malware onto a system via this method would of course require the administrator's co-operation, how many third-party device drivers, services, etc. can be subverted to provide that level of access to any software that asks for it?

    That's the scary bit - the fact that the author must be a bit mental to want to run a VooDoo on an XP x64 machine is re-assuring in comparison.
    • Re: (Score:2, Funny)

      by Anonymous Coward
      Two services, both running as privileged users, directly map memory and I/O space into a user-space process without any significant checks on what is asking for access, or on what it's asking for access to, in a common driver running under a networked OS.

      Dude, it's even worse than that! Did you know that the Windows kernel talks directly to the CPU? They haven't even attempted to put an abstraction layer between them!

      And the situation is no better under Linux either. Users of Intel Mac
  • I'm really convinced that this guy has invoked the true malevolent power of voodoo forces to achieve such an insane act - the word "Windows" in the title has no pertinence to my declaration.
  • I just got done moving and was unpacking my "computer parts" box and found that I still have a Voodoo 5. Time to put it in my new gaming rig and fire that baby up.
  • Someone on the linked forum made a joke about Amiga drivers for the voodoo boards...
    They're even still being sold!
    http://www.vesalia.de/e_mediator.htm [vesalia.de]
  • Does the Voodoo(2) work as an extra PCI card, for "triple-head"?

    Or does it only work in passthrough mode?
    • by Khyber ( 864651 )
      The Voodoo2 was a 3D-only card. Its successor, the Banshee, allowed for 2D acceleration as well. So the answer is it was still just a pass-thru card. But what kicked ass was you could link 4 voodoo 2's together and destroy nearly any game made back then. At least nVidia took a page from that notebook and implemented it somewhat properly to match today's hardware, though I'm wondering why they never implemented SLI and brought about dual-AGP slot boards. Just for shits, I do have a Voodoo2 in this computer
      • But what kicked ass was you could link 4 voodoo 2's together and destroy nearly any game made back then

        Somebody has a bad memory. You can link TWO (2) Voodoo 2 cards together in SLI. It is possible to link more together, as 3DFX did in their arcade boards, but this requires custom glue logic.

        The Voodoo2 was great in that it could use memory from other graphics cards. It's funny to run GLQuake and see 268 megs of memory pop up on the console while loading.


        That's just not true. The only successful card to
    • by qa'lth ( 216840 )
      Actually, there were very beta drivers released by 3dfx that could do this. Also, I believe there's a howto out there to get X11 to treat it as a 2d framebuffer, too.
  • Wait that defines half my life....

    hehe, I might actually try and slap something together.

    Love this kind of stuff ever since I found out XP does NOT support the old tape drives off the floppy port, for no particular reason. I was convinced it should work, so I set about finding a driver. Eventually I got a combination that worked! Then I decided to scrap it as too slow after days of fussing with it. But damnit, I wanted to make that choice, NOT MS :)
