Hardware Games

Legacy 32-bit PhysX Removal Cripples Performance On New GPUs (youtube.com) 120

Longtime Slashdot reader UnknowingFool writes: Gamers Nexus tested the effect of removing legacy 32-bit PhysX support on the newest generation of Nvidia cards with older games, and the results are not good. With PhysX on, the latest-generation Nvidia card was slightly beaten by a GTX 580 (released 2010) in some games and handily beaten by a GTX 980 (2014) in others.

With the launch of the 5000 series, NVidia dropped 32-bit CUDA support going forward. Part of that change was dropping support for 32-bit PhysX. As a result, older titles that used it perform poorly on 5000-series cards, since PhysX defaults to the CPU for its calculations. Even the latest CPUs do not perform as well as 15-year-old GPUs when it comes to PhysX.

The best-performing option on the 5080 was to turn PhysX off; however, that removes many effects like smoke, breaking glass, and rubble from scenes. The second-best option was to pair a 5000-series card with an older card like a 980 to handle the PhysX computations.
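The fallback described in the summary can be sketched as a simple capability check with a slow CPU path. This is purely illustrative; none of these names are the real PhysX or CUDA API:

```python
# Illustrative sketch of a GPU-with-CPU-fallback dispatch, loosely modeled
# on how a physics runtime behaves when 32-bit GPU acceleration is gone.
# All names here are hypothetical, not the actual PhysX API.

def gpu_supports_32bit_physx(gpu_generation: int) -> bool:
    # RTX 50-series (generation >= 50) dropped 32-bit CUDA/PhysX support.
    return gpu_generation < 50

def simulate_step(gpu_generation: int, particles: int) -> str:
    if gpu_supports_32bit_physx(gpu_generation):
        return f"GPU path: {particles} particles"  # fast, accelerated
    # Old 32-bit titles silently land here on a 50-series card, which is
    # why frame rates collapse when debris/smoke effects kick in.
    return f"CPU path: {particles} particles"      # slow fallback

print(simulate_step(40, 10_000))  # GPU path: 10000 particles
print(simulate_step(50, 10_000))  # CPU path: 10000 particles
```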


  • by Racemaniac ( 1099281 ) on Thursday March 13, 2025 @06:05AM (#65229793)

    How hard would it be for Nvidia to release a 32-bit PhysX driver that just uses the 64-bit PhysX under the hood to provide some backwards compatibility?

    • by Carewolf ( 581105 ) on Thursday March 13, 2025 @06:09AM (#65229805) Homepage

      Sure. We just need Nvidia to open source their proprietary PhysX protocol.

    • by Artem S. Tashkinov ( 764309 ) on Thursday March 13, 2025 @07:06AM (#65229831) Homepage

      PhysX is not a "driver", it's a userspace DLL that talks to the driver.

      For the RTX 50 series, NVIDIA has deprecated the 32-bit interfaces of the driver, so 32-bit userspace has to talk to the driver in 64-bit mode. A wrapper could probably be created.
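A wrapper like the one suggested above would most likely take the shape of an out-of-process proxy: a 32-bit shim exporting the old PhysX entry points and forwarding each call over IPC to a 64-bit helper that talks to the driver. Here is a rough sketch of the forwarding pattern (Python for brevity; every `physx_*` name is made up, not a real PhysX export, and a thread stands in for the 64-bit helper process to keep the sketch self-contained):

```python
# Sketch of the out-of-process proxy pattern a 32->64-bit shim could use.
# In reality the two sides would be separate 32-bit and 64-bit processes
# talking over IPC; a thread stands in for the helper here.
import threading
from multiprocessing import Pipe

def physx_server(conn):
    """Helper side: receive (opcode, args), do the work, send the reply."""
    while True:
        msg = conn.recv()
        if msg is None:                      # shutdown sentinel
            break
        opcode, args = msg
        if opcode == "raycast":
            origin, direction = args
            conn.send(("hit", origin + direction))  # stand-in result

def physx_raycast(conn, origin, direction):
    """Shim side: marshal the call over the pipe, block on the reply."""
    conn.send(("raycast", (origin, direction)))
    return conn.recv()

client_end, server_end = Pipe()
helper = threading.Thread(target=physx_server, args=(server_end,))
helper.start()
print(physx_raycast(client_end, 1.0, 2.0))   # ('hit', 3.0)
client_end.send(None)
helper.join()
```

The real complication is marshalling: every pointer-bearing 32-bit structure has to be translated to its 64-bit layout and back, which is why such a shim is plausible but not trivial.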

      • IIRC, older NVidia cards require staying on older drivers, and having multiple driver versions on one system has not worked (at least on Linux).

        If it is purely a library issue, great. If it touches the driver in a PhysX specific way (now or future hardware releases) it may make a big mess.

        I am unsure how the author was able to run a GTX 980 (requires v470 or earlier, I think) and a 5000-series card (assuming current-release NV drivers) together in the same machine/OS.

      • by kurkosdr ( 2378710 ) on Thursday March 13, 2025 @08:14AM (#65229903)
        Exactly. They don't need to support 32-bit CUDA in hardware, just make PhysX work with 64-bit CUDA. But NVIDIA is too busy printing money from the AI boom to care about gaming.
        • But NVIDIA is too busy printing money

          NVIDIA has nothing to do with what you said. 64-bit PhysX has existed for over a decade, as has CPU-accelerated PhysX. The move to both put the writing on the wall for 32-bit support. The only software really affected by this decision is old software using a massively out-of-date version of PhysX.

          You should be asking DICE if they are willing to dedicate a few coders time to updating a 15 year old game to use the current version of PhysX. I'm guessing the answer is HAHAHAHHA since they a

          • Guess I'm a fool for hoping to run DooM and Quake3 on my new computer, then.

            "Old game" shouldn't need to mean "Dead game", dude.

        • Just add an old GPU? For example, drop a 1050 into the gaming box and let it handle 32-bit PhysX.
          • The problem with this approach is that not all PC cases can accept a second GPU and still maintain good airflow to the first GPU, and every new PC build you'll build from now on relies on ancient used hardware.
            • by drnb ( 2434720 )

              The problem with this approach is that not all PC cases can accept a second GPU and still maintain good airflow to the first GPU, and every new PC build you'll build from now on relies on ancient used hardware.

              Given the small number of people who would care about Physx, I'd say finding a pre-50x0 card would not be much of a problem.

              Put the old card on a vertical mount adapter to help with airflow. If it's only doing the physics, it may not get very hot.

              • Given the small number of people who would care about Physx, I'd say finding a pre-50x0 card would not be much of a problem.

                I am not saying they'll become difficult to find; I am saying that your new PC will rely on old used hardware that already has significant mileage on it.

                • by drnb ( 2434720 )
                  OK, it still solves the lack of 32-bit PhysX support.

                  Is it lacking the RGB LEDs so it clashes with respect to colors? :-)

                  It may be old and have mileage but it will be asked to do something less taxing than it used to do.
      • And what if NVidia simply did not include the hardware circuitry in the 5000 series to do the acceleration? Remember, CPUs can do the necessary computations; they just do not have the specialized circuitry to do them quickly. Even the latest CPUs struggle with PhysX computations.
  • Stick with your 3000 series until they fix this shit.
    • by Luckyo ( 1726890 )

      It's unlikely to get fixed. NV has deprioritized the GPU sector heavily and moved developers to the AI segment over the last couple of years. Dropping 32-bit CUDA is likely part of a wider effort to dump old, largely unused features, with the likely goal of simplifying driver support (i.e. providing working drivers with less work, since fewer engineers are allocated to the GPU sector).

      The fewer features you need to support in drivers, the less work is needed to keep churning out new drivers.

      • by phoenix321 ( 734987 ) on Thursday March 13, 2025 @07:27AM (#65229847)

        Dropping 32-bit PhysX support is never going to save enough dollars to offset the reduction in sales to gamers and the drop in share price from bad publicity attached to Nvidia.

        Sure, gaming is not their primary market anymore. But the customers there are loud and relentless, generating bad press, and that bad press will affect brand recognition and share prices - and the combination of bad press, brand recognition, and hiccups in the share price WILL affect the big purchase decisions for AI projects, because those are still made by actual humans, and actual humans have all kinds of factors influencing their decisions. Every business decision is influenced by these factors; it is more or less related to the halo effect (https://en.wikipedia.org/wiki/Halo_effect) and the bikeshed problem (https://en.wikipedia.org/wiki/Law_of_triviality).

        Nvidia got bad press for these things in the 5000 series and from the distance of a market analyst or purchasing expert, they might look like this:
        A) Nvidia's 5000 GPU series is melting power connectors because of bad design and material choices.
        B) One fix to the overheating issue is a 10-dollar part from AliExpress that Nvidia cheaped out on, and it needs manual installation, possibly voiding the warranty.
        C) Nvidia's 5000 GPU series has abysmally slow performance in some workloads because of a critically important feature they deliberately removed to save literally only a few pennies per unit.
        D) Nvidia's stock price is already dropping because Chinese AI companies might do more AI with less hardware.
        E) Nvidia is hugely, critically dependent on the China/Taiwan situation through their only supplier, TSMC, and, further upstream, the Netherlands' ASML.
        F) Nvidia's 5000 GPU series was received with lukewarm praise from experts because of lackluster performance increases over the earlier 4000 series, coupled with outrageous price hikes.

        And now you're pondering a million or billion dollar investment in Nvidia AI modules, thinking about the value for money you get, the reliability of the product, the reliability of the supplier, the longevity of the product, and even the medium-term support situation. Half of the above is related to gaming workloads and the opinion of gamers, sure, but imagine you're pondering committing a billion dollars towards a risky endeavour in AI. Would you commit that billion to a company that quite obviously cheaps out on 10-dollar parts for critically important features like power delivery, cheaps out on literal PENNIES per unit for backward compatibility that would be trivially easy to maintain, and has all their eggs in one TSMC-colored basket in the ONE Taiwanese city of Hsinchu, sitting right on the possible invasion beach in Taiwan?

        • Re: (Score:2, Interesting)

          by thegarbz ( 1787294 )

          Dropping 32bit PhysX support is never going to save enough Dollars to offset the reduction in sales to gamers and the drop in share prices because of bad publicity in any way attached to Nvidia.

          NVIDIA has done nothing but piss off gamers for 4 years now. This won't make any difference. The handful of people running 10+ year old games won't even reflect in a rounding error on the balance sheet.

          There really isn't any point in keeping legacy cruft around. You want to play old games, use old tech. That's always been the case.

          • NVIDIA has done nothing but piss off gamers for 4 years now. This won't make any difference. The handful of people running 10+ year old games won't even reflect in a rounding error on the balance sheet.

            You do know what a Steam library is, right? People have these games in their libraries, and they cannot be played the same way going forward on newer hardware. It is not like companies still sell these games on Steam today... oh wait.

            • I do know what a Steam library is. I also note that many of the older legacy games in a Steam library have been updated by the developers to run on modern systems. Command and Conquer (in my Steam library) objectively would not run on my hardware or OS right now; the fact that it is in my Steam library and currently works is thanks to EA (yes, I just threw up in my mouth). If DICE wants people with RTX 50-series cards to buy Mirror's Edge, then they should put coders on the case to update PhysX to a new version.

              • You missed my whole point. Your focus was solely on the number of consumers who are actively playing the game. That is not the number affected. The number that is affected is people who own these games, e.g. in their Steam library, and people who will buy these games in the future. It would be one thing if these games were last sold a decade ago on out-of-print CD/DVD-ROMs that you can only get used on eBay. While NVidia might think they can dismiss a small number of active players, they ignore the larger number of owners and future buyers.
            • Yup. And there's no guarantee that any game in that library will work with whatever hardware I happen to want to run it on.

              NV is not obligated to maintain old features on the off chance that you decide to buy a decade old game on a Steam sale.

        • by Luckyo ( 1726890 )

          Problem isn't just costs. It's also the fact that there's a very limited, very small pool of expert software engineers that do drivers for nvidia GPUs. It's a very exclusive club, and almost no one is capable of even being considered to join it, much less invited, much less actually does it well. You can't just pay more money and generate more of these people, because they don't exist, and almost no one can be trained to do this job.

          And those people are overwhelmingly allocated to AI accelerator drivers now. GPUs retained barely anyone, because that's not where growth and profits are.

          • And those people are overwhelmingly allocated to AI accelerator drivers now. GPUs retained barely anyone, because that's not where growth and profits are.

            Are you suggesting that Nvidia converted gaming guys into AI guys or that they got rid of gaming software guys in order to hire AI guys? Neither seems reasonable.

            You do realize that Nvidia has been one of the few large tech companies that has laid off very few people in the last decade. Also that the money saved by laying off a few gaming guys wouldn't materially affect Nvidia's financials.

            • by Luckyo ( 1726890 )

              Suggesting? No. They told us, straight up in their annual reports. Over many years now. That they're refocusing on AI. This has been ongoing for at least half a decade at this point, and if you extend it to "CUDA and server space", we're in at over a decade of refocusing, retraining and moving people.

    • It works with 4000 series too. 40xx is a bit overpriced but not horribly so. 4060 Ti 16GB is almost reasonable when on sale and it gets you access to all of the latest features. Raytracing aside, it will also do ultra 4k60 in most titles. You do need PCIE 4 to get what performance it has out of it, so it's not a good upgrade for older PCs, those users should stick with their 3070 or what have you.

      • by AvitarX ( 172628 )

        Or you can get a 500 series card and use it as a PhysX accelerator with your modern card for the graphics.

        • You can't, because you can't install two versions of the driver at once.

          • by AvitarX ( 172628 )

            They were doing something in the video linked in the summary.

            I was only half paying attention, but some old accelerator card was being used with the modern cards and it gave good performance.

          • by AvitarX ( 172628 )

            It was a 5080+980.

            • Right, the 980 is the oldest card which will run on a driver for a modern card. And it will get cut off next. They don't support old cards forever. They do support them for a long time, so far I really have no cause for complaint there. I've been almost exclusively NVidia (especially on desktop) since the TNT2. But I'm still expecting to go AMD next, because I expect that by then they will have this spotty ROCm bullshit sorted, and I prefer an open driver. My laptop has Vega and works great.

              • by AvitarX ( 172628 )

                So basically, when the 4000 series is no longer supported, it'll be impossible to run these games with PhysX if CPUs can't handle it (and it actually seems quite possible that CPUs won't be able to)?

  • Fascinating (Score:4, Funny)

    by pele ( 151312 ) on Thursday March 13, 2025 @07:02AM (#65229829) Homepage

    Tetris plays just fine even on VESA Local Bus with enough memory for 15-bit colour.

    • Tetris plays just fine even on VESA Local Bus with enough memory for 15-bit colour.

      No it doesn't. The original Tetris is completely unplayable on current hardware. Some version modified to account for the fact that you are not playing it on the original hardware may still work.

      Tetris as released was not only a 16-bit game *support for which has been deprecated*, but its code also cycled the block speed based on CPU ticks, which makes it unplayable on a modern GHz+ CPU.

      Tetris runs in your case much the same way as any game that has shipped an updated PhysX DLL over the years runs on modern GPUs as well.
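The tick-counting problem described above, and the usual wall-clock (delta-time) fix, can be sketched roughly like this (an illustrative toy, not actual Tetris code):

```python
# Why tying game speed to CPU ticks breaks on faster machines, and the
# standard delta-time fix. Illustrative only, not real Tetris source.

def blocks_dropped_tick_based(loop_iterations: int, ticks_per_drop: int) -> int:
    # Original-style logic: one "tick" per busy-loop iteration, so a CPU
    # that spins the loop 1000x faster also drops blocks 1000x faster.
    return loop_iterations // ticks_per_drop

def blocks_dropped_delta_time(elapsed_seconds: float, seconds_per_drop: float) -> int:
    # Fixed logic: speed is tied to elapsed wall-clock time, independent
    # of how many loop iterations the CPU manages per second.
    return int(elapsed_seconds / seconds_per_drop)

# One simulated second of play on a slow vs. a fast machine:
print(blocks_dropped_tick_based(1_000, ticks_per_drop=500))      # 2
print(blocks_dropped_tick_based(1_000_000, ticks_per_drop=500))  # 2000 -> unplayable
print(blocks_dropped_delta_time(1.0, seconds_per_drop=0.5))      # 2, on any CPU
```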

      • Tetris plays just fine even on vesa local bus with enough memory for 15 bit colour.

        No it doesn't. The original Tetris is completely unplayable on current hardware. Some version modified to account for the fact that you are not playing it on the original hardware may still work.

        Tetris as released was not only a 16-bit game *support for which has been deprecated*, but its code also cycled the block speed based on CPU ticks, which makes it unplayable on a modern GHz+ CPU.

        Tetris runs in your case much the same way as any game that has shipped an updated PhysX DLL over the years runs on modern GPUs as well.

        Psst: Hey, buddy. Your spectrum is showing.

        • You and the OP missing the point that software doesn't run infinitely and needs to be maintained for modern platforms doesn't put other people on the spectrum, it just advertises your own lack of intelligence.

          • You and the OP missing the point that software doesn't run infinitely and needs to be maintained for modern platforms doesn't put other people on the spectrum, it just advertises your own lack of intelligence.

            You're taking two jokes uber-seriously and using them as an excuse to rant. Seems pretty spectrumy to me. I should know, I have my share of spectrum traits.

            • Oh I see, you "joke" about people with mental illnesses, not once but twice. Well fuck you very much.

              As for the OP, the point is that his point is interesting if incorrect. The question of how well old games play on modern hardware is the foundation for a valid discussion. But all you got is insults for potential people with mental illnesses. Do your share of spectrum traits give you a free pass at being an arsehole?

              • Oh I see, you "joke" about people with mental illnesses, not once but twice. Well fuck you very much.

                As for the OP, the point is that his point is interesting if incorrect. The question of how well old games play on modern hardware is the foundation for a valid discussion. But all you got is insults for potential people with mental illnesses. Do your share of spectrum traits give you a free pass at being an arsehole?

                Where did I insult anybody? You seem hyper-sensitive for somebody who seems to see themselves as a know-it-all. You pretty much insult every post of mine you come across. Maybe you need an internet break if you think simple teases are somehow insulting mental diseases. Good grief.

  • A nice video, except it repeated several times that RT tech may await the same fate. No! That is never going to happen: RT is an official standardized extension of both Vulkan and Direct3D. It's not going anywhere; it's vendor-neutral.

    • Then you should watch the video again because that's not what he said. He specifically called out vendor software features, and then mentioned no guarantee that RT will work the way it does today for the *specific hardware features* noting "that stuff could change".

      And it did. DirectX supported ray tracing long before AMD started including the hardware to accelerate it. Being an open standard doesn't do shit for you if someone isn't providing the compatible hardware for you to use it.

      • by Khyber ( 864651 )

        "DirectX supported ray tracing long before AMD started including the hardware to accelerate it"

        See this is where OpenGL ruled - you simply added in the support. You didn't need specialized hardware - if your GPU had the horsepower, you just made the extension and you were good to go.

        I was using ATI and doing raytracing LONG AGO. I did it with a lil program called GemRay/GemCad, used for designing faceted stones.

        Yea, it was slow. It still worked.

        • See this is where OpenGL ruled - you simply added in the support. You didn't need specialized hardware

          I recall not being able to run OpenGL games without a graphics card that supported hardware acceleration. The idea that specialised hardware isn't needed is, charitably, "technically correct." The reality is that if you want ray tracing, the hardware needs to be designed for it, or you end up with a slideshow (like AMD cards from 2 years ago, which matched the performance of NVIDIA cards when ray tracing was off).

    • Just because something is a standard, it doesn't mean it will be implemented well. Nothing prevents Nvidia from implementing RT exclusively in software 20 years from now in their drivers.
      • Except NVIDIA can't afford to do that, because AMD and Intel will continue to support these features in HW. It would be suicide.

        32-bit PhysX/CUDA were deprecated because they were proprietary and barely used.

        Nothing RT-related in Vulkan/Direct3D is proprietary.

  • Crippling implies it still works in some capacity on the GPU.
    • It does. All the games in question still run on the GPU in some capacity; it's the performance that is crippled. You can play Mirror's Edge, for example, just fine. Just expect 10 fps on your $1500 GPU if any glass shatters in the scene.

  • ..about gamers
    All they care about is AI
    I wouldn't be surprised if they stopped work on gaming hardware

    • ..about gamers

      They don't care about gamers who target playing games released over a decade ago. And honestly why would they? They aren't the target market for a fancy new raytracing capable GPU.

  • Single-slot, low-profile, low-power Quadro cards like the P600 or P620 support PhysX and can be found for well under $50 including shipping.

    No need to waste space or power on something bigger just for PhysX... just sayin'.
