Legacy 32-bit PhysX Removal Cripples Performance On New GPUs (youtube.com) 23

Longtime Slashdot reader UnknowingFool writes: Gamers Nexus tested the effect of removing legacy 32-bit PhysX support on the newest generation of Nvidia cards with older games, and the results are not good. With PhysX on, the latest-generation Nvidia card was slightly beaten by a GTX 580 (released 2010) in some games and handily beaten by a GTX 980 (2014) in others.

With the launch of the 5000 series, Nvidia dropped 32-bit CUDA support going forward. Part of that change was dropping support for 32-bit PhysX. As a result, older titles that use it perform poorly on 5000 series cards because PhysX falls back to the CPU for its calculations. Even the latest CPUs do not perform as well as 15-year-old GPUs when it comes to PhysX.

The best performance on the 5080 came from turning PhysX off; however, that removes many effects like smoke, breaking glass, and rubble from scenes. The second-best option was to pair a 5000 series card with an older card like a 980 to handle the PhysX computations.


Comments Filter:
  • by Racemaniac ( 1099281 ) on Thursday March 13, 2025 @06:05AM (#65229793)

How hard would it be for Nvidia to release a 32-bit PhysX driver that just uses the 64-bit PhysX under the hood to provide some backwards compatibility?

Sure. We just need Nvidia to open source their proprietary PhysX protocol.

    • PhysX is not a "driver", it's a userspace DLL that talks to the driver.

For the RTX 50 series, NVIDIA has deprecated the 32-bit interfaces of the driver, so the 32-bit userspace has to talk to the driver in 64-bit mode. A wrapper could probably be created.
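The wrapper idea above resembles the handle-thunking pattern that 32-to-64-bit compatibility layers (WoW64-style) use: the legacy-facing side hands out small integer handles that fit the 32-bit ABI, while the real 64-bit objects live behind a translation table. A minimal sketch of that pattern in Python; all names here are illustrative, not NVIDIA's actual PhysX API:

```python
# Sketch of a 32-bit -> 64-bit thunking layer. The legacy-facing side
# hands out small integer handles, while the real 64-bit objects live
# behind a translation table. Names are illustrative, not NVIDIA's API.

class Physics64:
    """Stand-in for the modern 64-bit implementation."""
    def simulate(self, dt: float) -> float:
        return dt * 2.0  # placeholder computation

class Thunk32:
    """Legacy-facing layer: 32-bit callers only ever see small handles."""
    def __init__(self):
        self._table = {}   # handle (fits in 32 bits) -> 64-bit object
        self._next = 1

    def create_scene(self) -> int:
        handle = self._next
        self._next += 1
        self._table[handle] = Physics64()
        return handle      # safe to store in a 32-bit field

    def simulate(self, handle: int, dt: float) -> float:
        return self._table[handle].simulate(dt)

shim = Thunk32()
h = shim.create_scene()
assert h < 2**32           # the handle fits the legacy ABI
assert shim.simulate(h, 0.5) == 1.0
```

The hard part in practice is not the table but marshalling every structure across the pointer-width boundary, which is why such wrappers are rarely trivial.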

IIRC, Nvidia's older cards require staying on older driver branches, and having multiple driver versions on one system has not worked (at least on Linux).

        If it is purely a library issue, great. If it touches the driver in a PhysX specific way (now or future hardware releases) it may make a big mess.

I am unsure how the author was able to run a GTX 980 (requires v470 or earlier, I think) and a 5000 series card (assuming current release NV drivers) together in the same machine / OS.

      • Exactly. They don't need to support 32-bit CUDA in hardware, just make PhysX work with 64-bit CUDA. But NVIDIA is too busy printing money from the AI boom to care about gaming.
  • Stick with your 3000 series until they fix this shit.
    • by Luckyo ( 1726890 )

It's unlikely to get fixed. NV has deprioritized the GPU sector heavily and moved developers to the AI segment instead over the last couple of years. Dropping 32-bit CUDA is likely part of a wider effort to dump old, largely unused features, with the likely goal of simplifying driver support (i.e. providing working drivers with less work, given fewer engineers allocated to the GPU sector).

The fewer features you need to support in drivers, the less work it takes to keep churning out new drivers.

Dropping 32-bit PhysX support is never going to save enough dollars to offset the reduction in sales to gamers and the drop in share price from bad publicity attached to Nvidia in any way.

Sure, gaming is not their primary market anymore. But the customers there are loud and relentless, generating bad press, and that bad press will affect brand recognition and share prices - and the combination of bad press, brand recognition, and hiccups in the share price WILL affect the big purchase decisions for AI p

        • Let us guess: You're short on NVDA?
        • Re: (Score:2, Interesting)

          by thegarbz ( 1787294 )

Dropping 32-bit PhysX support is never going to save enough dollars to offset the reduction in sales to gamers and the drop in share price from bad publicity attached to Nvidia in any way.

NVIDIA has done nothing but piss off gamers for 4 years now. This won't make any difference. The handful of people running 10+ year old games won't even register as a rounding error on the balance sheet.

          There really isn't any point in keeping legacy cruft around. You want to play old games, use old tech. That's always been the case.

    • until they fix this shit.

There's nothing to fix. They deprecated something which hasn't been used in games for over a decade. Your choice is the same as it has always been with technology: either maintain older hardware which supports the software you use, or hope someone can program their way out of it for you. The vendors won't support you.

  • Fascinating (Score:4, Funny)

    by pele ( 151312 ) on Thursday March 13, 2025 @07:02AM (#65229829) Homepage

Tetris plays just fine even on VESA Local Bus with enough memory for 15-bit colour.

Tetris plays just fine even on VESA Local Bus with enough memory for 15-bit colour.

No it doesn't. The original Tetris is completely unplayable on current hardware. Some modified version, adjusted for the fact that you are not playing it on the original hardware, may still work.

Tetris as released was not only a 16-bit game *support for which has been deprecated*, but the code also scaled the block speed with CPU ticks, which makes it unplayable on a modern GHz+ CPU.

      Tetris runs in your case much the same way as any game that has shipped an updated PhysX DLL over the years runs on mode
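The timing bug described above - game speed tied to CPU ticks instead of wall-clock time - can be shown with a toy gravity loop. A minimal sketch (the tuning numbers are illustrative, not from any real Tetris release):

```python
# Toy illustration of tick-based vs. time-based game speed. A fall
# speed tuned per loop iteration ("CPU tick") scales with how fast the
# machine runs; a wall-clock-based speed does not. Numbers illustrative.

CELLS_PER_TICK = 0.01      # tick-based tuning, done on the original CPU
CELLS_PER_SECOND = 1.0     # time-based tuning

def fall_distance_tick_based(ticks_per_second: float, seconds: float) -> float:
    # Distance fallen grows with CPU speed: the bug.
    return CELLS_PER_TICK * ticks_per_second * seconds

def fall_distance_time_based(ticks_per_second: float, seconds: float) -> float:
    # Distance depends only on real elapsed time, not CPU speed.
    return CELLS_PER_SECOND * seconds

# On a machine 1000x faster, the tick-based block falls 1000x as far
# per second of real time; the time-based block is unchanged.
slow, fast = 100.0, 100_000.0
assert fall_distance_tick_based(fast, 1.0) == 1000 * fall_distance_tick_based(slow, 1.0)
assert fall_distance_time_based(fast, 1.0) == fall_distance_time_based(slow, 1.0)
```

This is the same class of bug that made many DOS-era games need slowdown utilities or emulator throttling on later hardware.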

A nice video, except it repeated several times that RT tech may await the same future. No, that's not going to happen, ever: RT is an official, standardized extension for both Vulkan and Direct3D. It's not going anywhere; it's vendor neutral.

Then you should watch the video again, because that's not what he said. He specifically called out vendor software features, and then said there's no guarantee that RT will work the way it does today for the *specific hardware features*, noting "that stuff could change".

      And it did. DirectX supported ray tracing long before AMD started including the hardware to accelerate it. Being an open standard doesn't do shit for you if someone isn't providing the compatible hardware for you to use it.

      • 1. What are the NVIDIA specific "vendor software features" for RT?
        2. Is there a single game that uses them?

Sadly, Steve sometimes speaks straight from his arse.

      • by Khyber ( 864651 )

        "DirectX supported ray tracing long before AMD started including the hardware to accelerate it"

See, this is where OpenGL ruled - you simply added in the support. You didn't need specialized hardware: if your GPU had the horsepower, you just wrote the extension and you were good to go.

        I was using ATI and doing raytracing LONG AGO. I did it with a lil program called GemRay/GemCad, used for designing faceted stones.

        Yea, it was slow. It still worked.
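The extension mechanism the comment above is describing works by runtime discovery: under classic OpenGL, glGetString(GL_EXTENSIONS) returns one space-separated list of vendor extension names, and a client checks for a capability by exact name match. A minimal sketch of that check (the extension names below are illustrative, not a claim about which RT extensions exist):

```python
# Sketch of classic OpenGL extension discovery: the driver exposes a
# space-separated list of extension names, and clients test for a
# capability by exact token match (substring matching is a classic bug,
# since one name can be a prefix of another). Names are illustrative.

def has_extension(extension_string: str, name: str) -> bool:
    return name in extension_string.split()

exts = "GL_ARB_multitexture GL_EXT_texture_filter_anisotropic GL_XX_fake_rt"
assert has_extension(exts, "GL_EXT_texture_filter_anisotropic")
assert not has_extension(exts, "GL_EXT_texture")   # prefix, not a match
```

Modern core GL replaced the single string with per-index queries (glGetStringi) partly because of exactly this kind of parsing fragility.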

    • Just because something is a standard, it doesn't mean it will be implemented well. Nothing prevents Nvidia from implementing RT exclusively in software 20 years from now in their drivers.
      • Except NVIDIA can't afford to do that because AMD and Intel will continue to support these features in HW. It would be suicide.

32-bit PhysX/CUDA have been deprecated because they were proprietary and barely used.

        Nothing RT related in Vulkan/Direct3D is proprietary.

For once I am on their side. It's not their role to support old features forever. Silicon real estate is valuable. If the games are oh so valuable, then gamers should petition the developers to update their code to 64-bit, where I presume you can still use PhysX. Get the companies to open-source these old games, like id Software used to do, and let the community port them. I think we are lucky that games from 2008 are still playable at all in 2025.
