

Legacy 32-bit PhysX Removal Cripples Performance On New GPUs (youtube.com)
Longtime Slashdot reader UnknowingFool writes: Gamers Nexus performed tests on the effect of removing legacy 32-bit PhysX support on the newest generation of Nvidia cards with older games, and the results are not good. With PhysX on, the latest-generation Nvidia card was slightly beaten by a GTX 580 (released 2010) on some games and handily beaten by a GTX 980 (2014) on others.
With the launch of the 5000 series, NVidia dropped 32-bit CUDA support going forward. Part of that change was dropping support for 32-bit PhysX. As a result, older titles that use it perform poorly on 5000-series cards because PhysX defaults to the CPU for calculations, and even the latest CPUs do not perform as well as 15-year-old GPUs when it comes to PhysX.
The best performance on the 5080 was achieved by turning PhysX off; however, that removes many effects like smoke, breaking glass, and rubble from scenes. The second-best option was to pair a 5000-series card with an older card like a 980 to handle the PhysX computations.
Could this be solved by a wrapper? (Score:5, Insightful)
How hard would it be for Nvidia to release a 32-bit PhysX driver that just uses the 64-bit PhysX under the hood to provide some backwards compatibility?
Re: Could this be solved by a wrapper? (Score:4, Informative)
Sure. We just need Nvidia to open source their proprietary physx protocol
Re: Could this be solved by a wrapper? (Score:5, Interesting)
Sure. We just need Nvidia to open source their proprietary physx protocol
On December 3, 2018, PhysX was made open source under a 3-clause BSD license [wikipedia.org], but this change applied only to computer and mobile platforms.[11]
On November 8, 2022, the open source release was updated to PhysX 5, under the same 3-clause BSD license.[12]
https://github.com/NVIDIA-Omni... [github.com]
Is that not enough to make a 32 bit stub?
Re: (Score:2)
They have not removed CUDA cores.
Re: (Score:2)
Is PhysX backwards compatible?
My understanding from the video in the summary is that PhysX 3.x and up games work pretty well; it's the games on older versions that are the problem.
Re: (Score:2)
My understanding is that those games work with 40xx cards, just not on 50xx. A lot of gamers will occasionally play retro games, it's really quite popular, so this is yet another big black eye for NVidia, which is running out of them. Not that they really give a crap, consumer GPU is like 10% of their business now or something? They don't really need us so long as this bubble continues, and they can also sell a lot of workstation GPUs.
This should be AMD's time to shine, but now a lot of gamers also want the
Re:Could this be solved by a wrapper? (Score:5, Interesting)
PhysX is not a "driver", it's a userspace DLL that talks to the driver.
For the RTX 50 series, NVIDIA has deprecated the 32-bit interfaces of the driver, thus 32-bit userspace has to talk to the driver in 64-bit mode. A wrapper could probably be created.
Re: (Score:2)
IIRC, NVidia has required older drivers for older cards, and having multiple driver versions on a system has not worked (at least on Linux).
If it is purely a library issue, great. If it touches the driver in a PhysX specific way (now or future hardware releases) it may make a big mess.
I am unsure how the author was able to run a 980GTX (requires v470 or earlier I think) and a 5000 series (assuming current release nv drivers) together in the same machine / OS.
Re:Could this be solved by a wrapper? (Score:5, Informative)
I am unsure how the author was able to run a 980GTX (requires v470 or earlier I think)
You think wrong. The 980GTX was specifically chosen since it is the oldest GPU currently supported by up-to-date drivers. They used driver version 572.47, released in February.
Re: (Score:1)
They don't need to support 32-bit CUDA in hardware, just make PhysX work with 64-bit CUDA.
Kind of, but also kind of not. They still have 32-bit DLLs for CUDA, but those don't support PhysX. Most, and maybe all, of the games that use PhysX are 32-bit applications. The only reason to "make PhysX work with 64-bit CUDA" would be if the 32-bit DLLs could thunk to the 64-bit code -- but they apparently cannot, or else Nvidia would have done that rather than sunset support.
Re: (Score:1)
The most programmer-visible difference between 32- and 64-bit platforms is the size of pointers, but an x86 CPU runs in a different execution mode when running x86-64 code versus traditional 32-bit x86 code. Handling those differences is a lot more complicated than just handling the size of long or long long.
Re: (Score:2)
The CPU is switched to long mode when the OS boots and stays in that mode, it doesn't switch back to protected mode to execute parts of individual processes.
Re: Could this be solved by a wrapper? (Score:2)
And it doesn't need to switch back, because long mode supports running 32-bit code perfectly well. That is exactly the reason why you can run 32-bit chroots, and even full 32-bit distros, on a 64-bit Linux kernel.
Re: (Score:2)
But NVIDIA is too busy printing money
NVIDIA has nothing to do with what you said. 64bit PhysX is a thing that has existed for over a decade, as is CPU accelerated PhysX. The move to both of them put the writing on the wall for 32bit support. The only software really affected by this decision is old software using a massively out of date version of PhysX.
You should be asking DICE if they are willing to dedicate a few coders time to updating a 15 year old game to use the current version of PhysX. I'm guessing the answer is HAHAHAHHA since they a
Re: (Score:2)
The problem is that nVidia released hardware that was not backwards compatible with software people still have a reasonable expectation of running. Stuff from the 2010s is not "retro" by any means.
Yes and no. The people most in the market for a fancy new $1500 video card are usually not buying them to play retro games. This definitely will affect some people, but you could just ... not buy the fancy new graphics card to play your 15 year old game.
But older games were shipped with the old x87 version of the library and it was too late to fix them.
I was playing Command and Conquer the other day. Just pointing this out to say it is never "too late" to fix anything. These games could be updated quite simply. C&C definitely wouldn't run on my hardware were it not for someone updating it to make it so
Re: (Score:2)
The problem is people are trying to run games on their brand new AI accelerators.
Re: Could this be solved by a wrapper? (Score:2)
Guess I'm a fool for hoping to run DooM and Quake3 on my new computer, then.
"Old game" shouldn't need to mean "Dead game", dude.
Just add an old GPU, let a 1050 do the PhysX? (Score:2)
Re: (Score:1)
And what if NVidia did not include hardware circuitry in the 5000 series that would have done the hardware acceleration?
I don't believe that there is any special hardware in the 4000 series for PhysX, at least not so special that it's not used for anything else. The only problem should be if there are data types needed for 32 bit PhysX which aren't provided by the PhysX part of the modern driver. Otherwise it should be possible to translate back and forth to support old applications.
Re: (Score:2)
AFAIK, new PhysX has added functionality more than replacement functionality. The new hardware supports data types just as small as the old hardware, so the same functionality ought to be in there even if NVidia is no longer interested in providing the API to permit 32 bit apps to access it.
So far nothing I have read has conflicted with this idea, but I'm willing to be convinced otherwise. I haven't had time to look at the situation in depth, so I certainly could be wrong.
Re: (Score:1)
NVidia is a classist and corrupt corporation that serves the needs of its upper class masters, all they care about is maximizing their returns by gouging the public.
Re: (Score:2)
What's worse is the Courts force them to behave sociopathically.
If they don't the Courts will allow "shareholder" lawsuits claiming they didn't maximize profits.
The case that cost Musk his ~$50B pay package was brought by someone with an extremely small number of shares.
What CEO will risk that to help out some folks playing games on older hardware?
It would be better to buy a graphics card from a privately-held company should one exist.
Re: (Score:2)
Depends on what the shareholders demand from the company. What a judge will determine is whether the board is doing what the company said it's going to do and investor bought shares for it to do. If a corporation is built to maximize shareholder profit (which, granted, is most of them), and someone buys its shares based on that chart, and its board doesn't focus on maximizing profits, then yes, it can be sued for that. There are corporations whose entire premise is being socially and/or environmentally cons
Re: (Score:2)
They can always spend a few days updating the game, and call it customer goodwill.
Goodwill can translate into future sales after all.
Re: (Score:2)
Sorry but that's not how any of that works. There is no legal requirement to maximise shareholder profit. Nor is there a remedy for shareholders to force a company to do so through the courts (only through their voting tally at the AGM). The courts are there to address an issue of lying to shareholders. If you don't tell them lies, they can't sue you.
The Musk thing was something different. The board was found to act impartially and against the agreement of the shareholders, something which they said they *w
Could this be solved by a second GPU? A 1050? (Score:2)
Re: (Score:2)
nah, i just buy second hand, no problem, free open source and second hand stuff, let other people be consumers, me, i'm frugal, which is why i can afford to be retired
I asked my last boss about the company's retirement plan, his response was, "buy Michelins"
You know what to do kids (Score:2)
Re: (Score:3)
It's unlikely to get fixed. NV has deprioritized the GPU sector heavily and moved developers to the AI segment in the last couple of years. Dropping 32-bit CUDA is likely part of a wider effort to dump old, largely unused feature support, with the likely goal of simplifying driver support (i.e. providing working drivers with less work, due to fewer engineers allocated to the GPU sector).
The fewer features you need to support in drivers, the less work is needed to keep churning out new drivers.
Re:You know what to do kids (Score:5, Interesting)
Dropping 32bit PhysX support is never going to save enough Dollars to offset the reduction in sales to gamers and the drop in share prices because of bad publicity in any way attached to Nvidia.
Sure, gaming is not their primary market anymore. But the customers there are loud and relentless, generating bad press, and that bad press will affect brand recognition and share prices - and the combination of bad press, brand recognition and hiccups in the share price WILL affect the big purchase decisions for AI projects, because those are still made by actual humans, and actual humans have all kinds of factors influencing their decisions. Every business decision is influenced by these factors; it is more or less related to the halo effect (https://en.wikipedia.org/wiki/Halo_effect) and the bikeshed problem (https://en.wikipedia.org/wiki/Law_of_triviality).
Nvidia got bad press for these things in the 5000 series and from the distance of a market analyst or purchasing expert, they might look like this:
A) Nvidia's 5000 GPU series are melting power connectors because of bad design decisions and material choices
B) One fix to the overheating issue is a 10-dollar part from AliExpress that Nvidia cheaped out on and that needs manual installation, possibly voiding the warranty.
C) Nvidia's 5000 GPU series have an abysmally slow performance in some workloads because of a critically important feature they deliberately removed to save literally only a few pennies per unit
D) Nvidia's stock price is already dropping because Chinese AI companies might do more AI with less hardware.
E) Nvidia is hugely, critically dependent on the China/Taiwan situation through their only supplier TSMC and TSMC's sub-contract to the Netherlands' ASML
F) Nvidia's 5000 GPU series was received with lukewarm praise from experts because of lackluster performance increases compared to the earlier 4000 series, coupled with outrageous price hikes.
And now you're pondering a million- or billion-dollar investment in Nvidia AI modules, thinking about the value for money you get, reliability of the product, reliability of the supplier, longevity of the product, and the medium-term support situation. Half of the above is related to gaming workloads and the opinion of gamers, sure, but imagine you're pondering committing a billion dollars towards a risky endeavour in AI. Would you commit that billion to a company that quite obviously cheaps out on a 10-dollar part for critically important features like power delivery, cheaps out on literal PENNIES per unit for backward compatibility that would be trivially easy to maintain, and has all their eggs in one TSMC-colored basket in Hsinchu, a Taiwanese city sitting right on the possible invasion beach?
Re: (Score:2, Interesting)
Dropping 32bit PhysX support is never going to save enough Dollars to offset the reduction in sales to gamers and the drop in share prices because of bad publicity in any way attached to Nvidia.
NVIDIA has done nothing but piss off gamers for 4 years now. This won't make any difference. The handful of people running 10+ year old games won't even reflect in a rounding error on the balance sheet.
There really isn't any point in keeping legacy cruft around. You want to play old games, use old tech. That's always been the case.
Re: (Score:3)
NVIDIA has done nothing but piss off gamers for 4 years now. This won't make any difference. The handful of people running 10+ year old games won't even reflect in a rounding error on the balance sheet.
You do know what a Steam library is right? People have these games in their libraries that cannot be played the same going forward with newer hardware. It is not like companies still sell these games on Steam today . . . oh wait.
Re: (Score:2)
I do know what a Steam library is. I also note that many of the older legacy games in a Steam library have been updated by the developers of the games to run on modern systems. Command and Conquer (in my Steam library) objectively would not run on my hardware or OS right now. The fact it is in my Steam library and currently works is thanks to EA (yes just threw up in my mouth). If DICE wants people with RTX50 series to buy Mirror's Edge then they should put coders on the case to update PhysX to a new versio
Re: (Score:2)
Problem isn't just costs. It's also the fact that there's a very limited, very small pool of expert software engineers that do drivers for nvidia GPUs. It's a very exclusive club, and almost no one is capable of even being considered to join it, much less invited, much less actually does it well. You can't just pay more money and generate more of these people, because they don't exist, and almost no one can be trained to do this job.
And those people are overwhelmingly allocated to AI accelerator drivers now
Re: (Score:2)
And those people are overwhelmingly allocated to AI accelerator drivers now. GPUs retained barely anyone, because that's not where growth and profits are.
Are you suggesting that Nvidia converted gaming guys into AI guys or that they got rid of gaming software guys in order to hire AI guys? Neither seems reasonable.
You do realize that Nvidia has been one of the few large tech companies that has laid off very few people in the last decade. Also that the money saved by laying off a few gaming guys wouldn't materially affect Nvidia's financials.
Re: (Score:1)
until they fix this shit.
There's nothing to fix. They deprecated something which hasn't been used in games for over a decade. Your choice has been the way it has always been with technology over time: either maintain older hardware which supports the software you use, or hope someone can program their way out of it for you. The vendors won't support you.
Re: (Score:2)
They deprecated something which hasn't been used in games for over a decade.
You do know you can still buy some of these games today, right?
Re: (Score:2)
You do know you can still buy some of these games today, right?
And? You can still buy a lot of things; there's never a guarantee they will run adequately. Games have always come with hardware restrictions. You can buy Command and Conquer as well. The *only* reason it will work on your machine is because EA thought it was worth money to them to make it work on your hardware. If DICE thinks Mirror's Edge should work then they should put a programmer on it to update the PhysX version and push out an update.
Whether something works for you is on you to research.
Whether someo
Re: (Score:2)
And if they want to keep selling them, they could probably make a really simple compatibility shim to use something other than the legacy library. Unless they didn't keep all the assets needed for this. The responsibility for game preservation falls on both sides, but I think if they want to keep selling it they should find a way.
Re: You know what to do kids (Score:2)
It works with 4000 series too. 40xx is a bit overpriced but not horribly so. 4060 Ti 16GB is almost reasonable when on sale and it gets you access to all of the latest features. Raytracing aside, it will also do ultra 4k60 in most titles. You do need PCIE 4 to get what performance it has out of it, so it's not a good upgrade for older PCs, those users should stick with their 3070 or what have you.
Re: (Score:2)
Or you can get a 500 series card and use it as a PhysX accelerator with your modern card for the graphics.
Re: You know what to do kids (Score:2)
You can't, because you can't install two versions of the driver at once.
Re: (Score:2)
They were doing something in the video linked in the summary.
I was only half paying attention, but some old accelerator card was being used with the modern cards and it gave good performance.
Re: (Score:2)
It was a 5080+980.
Re: (Score:2)
Right, the 980 is the oldest card which will run on a driver for a modern card. And it will get cut off next. They don't support old cards forever. They do support them for a long time, so far I really have no cause for complaint there. I've been almost exclusively NVidia (especially on desktop) since the TNT2. But I'm still expecting to go AMD next, because I expect that by then they will have this spotty ROCm bullshit sorted, and I prefer an open driver. My laptop has Vega and works great.
Fascinating (Score:4, Funny)
Tetris plays just fine even on vesa local bus with enough memory for 15 bit colour.
Re: (Score:2)
Tetris plays just fine even on vesa local bus with enough memory for 15 bit colour.
No it doesn't. The original Tetris is completely unplayable on current hardware. Some modified version to suit the fact that you are not playing it on the original hardware may still work.
Tetris as released was not only a 16-bit game *support for which has been deprecated* but the code also cycled the block speed based on CPU ticks, which means playing it on a modern GHz+ CPU is unplayable.
Tetris runs in your case much the same way as any game that has shipped an updated PhysX DLL over the years runs on modern GPUs as well.
Re: (Score:2)
Tetris plays just fine even on vesa local bus with enough memory for 15 bit colour.
No it doesn't. The original Tetris is completely unplayable on current hardware. Some modified version to suit the fact that you are not playing it on the original hardware may still work.
Tetris as released was not only a 16-bit game *support for which has been deprecated* but the code also cycled the block speed based on CPU ticks, which means playing it on a modern GHz+ CPU is unplayable.
Tetris runs in your case much the same way as any game that has shipped an updated PhysX DLL over the years runs on modern GPUs as well.
Psst: Hey, buddy. Your spectrum is showing.
Re: (Score:2)
You and the OP missing the point that software doesn't run infinitely and needs to be maintained for modern platforms doesn't put other people on the spectrum, it just advertises your own lack of intelligence.
Re: (Score:3)
You and the OP missing the point that software doesn't run infinitely and needs to be maintained for modern platforms doesn't put other people on the spectrum, it just advertises your own lack of intelligence.
You're taking two jokes uber-seriously and using them as an excuse to rant. Seems pretty spectrumy to me. I should know, I have my share of spectrum traits.
Re: (Score:2)
Oh I see, you "joke" about people with mental illnesses, not once but twice. Well fuck you very much.
As for the OP, the point is that his point is interesting if incorrect. The question of how well old games play on modern hardware is the foundation for a valid discussion. But all you got is insults for potential people with mental illnesses. Do your share of spectrum traits give you a free pass at being an arsehole?
Re: (Score:2)
Oh I see, you "joke" about people with mental illnesses, not once but twice. Well fuck you very much.
As for the OP, the point is that his point is interesting if incorrect. The question of how well old games play on modern hardware is the foundation for a valid discussion. But all you got is insults for potential people with mental illnesses. Do your share of spectrum traits give you a free pass at being an arsehole?
Where did I insult anybody? You seem hyper-sensitive for somebody who seems to see themselves as a know-it-all. You pretty much insult every post of mine you come across. Maybe you need an internet break if you think simple teases are somehow insulting mental diseases. Good grief.
Re: (Score:1)
If I wanted to run the original Tetris, I'd run it on 86Box. I am running Xenix-86 there, and it runs well. The emulation is almost TOO good, disk access is slow AF.
I played the original Tetris for Windows (whatever version that was, obviously not the first one) on a 286. I don't remember what the clock speed of the machine was, it wasn't mine (it was in a breakroom at SCO) and the score wrapped around to negative numbers so I decided that I won.
Re: (Score:2)
Exactly. The problem here is no one has coded a workaround for you for PhysX yet. PhysX runs fine on the modern CPU, you just need a version that is objectively designed to do so. The issue here is a small subset of games running a specific version of PhysX from back in the day where the API didn't even support SSE.
RT (Score:2)
A nice video, except it repeated several times that RT tech may await the same fate. No! Not going to happen, ever: RT is an official standardized extension for both Vulkan and Direct3D. It's not going anywhere; it's vendor neutral.
Re: (Score:2)
Then you should watch the video again because that's not what he said. He specifically called out vendor software features, and then mentioned no guarantee that RT will work the way it does today for the *specific hardware features* noting "that stuff could change".
And it did. DirectX supported ray tracing long before AMD started including the hardware to accelerate it. Being an open standard doesn't do shit for you if someone isn't providing the compatible hardware for you to use it.
Re: (Score:1)
1. What are the NVIDIA specific "vendor software features" for RT?
2. Is there a single game that uses them?
Sadly Steve sometimes is speaking straight from his arse.
Re: (Score:2)
There is DXR which is a part of the DirectX 12 Ultimate API, but there is the whole RTX from NVIDIA that has NVIDIA specific functions/extensions.
Re: (Score:2)
1. None. He didn't say vendor software features for RT. He said *hardware features* for RT. The software features he specifically called out were DLSS, FG, DLAA, etc.
2. Yes all games that use raytracing will use ray tracing hardware, and as he said there's no guarantee that this hardware will exist in the future.
And that is a very safe statement. Forward compatibility is never guaranteed, and is even *less* guaranteed when only a single company is pushing something (AMD had to be dragged kicking and screaming in
Re: (Score:2)
"DirectX supported ray tracing long before AMD started including the hardware to accelerate it"
See this is where OpenGL ruled - you simply added in the support. You didn't need specialized hardware - if your GPU simply had the horsepower just make the extension and you were good to go.
I was using ATI and doing raytracing LONG AGO. I did it with a lil program called GemRay/GemCad, used for designing faceted stones.
Yea, it was slow. It still worked.
Re: (Score:2)
See this is where OpenGL ruled - you simply added in the support. You didn't need specialized hardware
I recall not being able to run OpenGL games without a graphics card that supported hardware acceleration. The idea that specialised hardware isn't needed is charitably "technically correct". The reality is that if you want ray tracing, the hardware needs to be designed for it or you end up with a slideshow (like AMD cards from 2 years ago, which matched the performance of NVIDIA cards only when raytracing was off).
Re: (Score:2)
Except NVIDIA can't afford to do that because AMD and Intel will continue to support these features in HW. It would be suicide.
32bit PhysX/CUDA have been deprecated because they were proprietary and barely used.
Nothing RT related in Vulkan/Direct3D is proprietary.
Re: (Score:1)
Yeah because like no games use the Unreal engine..
It was proprietary but tons of titles use it (or can). That said, turning it off hardly renders these titles unplayable. They are also 'older' titles.
People are making a huge deal about this but I'd question how serious it is. Who is buying $1000+ GPUs to play 10+ year old games?
I'll grant for the kinda money they are asking, you should be able to expect to play legacy titles and you can without issue. Kick on the vanilla OpenGL or DX9/10/11/12 modes and enjoy. You wo
Re: RT (Score:2)
This also applies to their $500 GPUs. A lot of people buying those probably still play older games.
I don't like nvidia but.. (Score:1)
Re:I don't like nvidia but.. (Score:5, Insightful)
It's not an issue of silicon. They still support 32-bit data types just fine.
It's 100% a driver code issue.
Removal is not crippling. (Score:2)
Re: (Score:2)
It does. All the games in question still run in some capacity on the GPU. Their performance is crippled. You can play Mirror's Edge, for example, just fine. Just expect 10fps on your $1500 GPU if any glass shatters in the scene.
They don't care.. (Score:2)
..about gamers
All they care about is AI
I wouldn't be surprised if they stopped work on gaming hardware
Re: (Score:2)
..about gamers
They don't care about gamers who target playing games released over a decade ago. And honestly why would they? They aren't the target market for a fancy new raytracing capable GPU.
Inexpensive workaround... (Score:2)
Single-slot, low-profile, low-power Quadro cards like the P600 or P620 support PhysX and can be found well under $50 including shipping.
No need to waste space or power on something bigger just for physx.... just sayin.