

Legacy 32-bit PhysX Removal Cripples Performance On New GPUs (youtube.com) 120
Longtime Slashdot reader UnknowingFool writes: Gamer's Nexus performed tests on the effect of removing legacy 32-bit PhysX on the newest generation of Nvidia cards with older games, and the results are not good. With PhysX on, the latest generation Nvidia was slightly beaten by a GTX 580 (released 2010) on some games and handily beaten by a GTX 980 (2014) on some games.
With the launch of the 5000 series, NVidia dropped 32-bit CUDA support going forward. Part of that change was dropping support for 32-bit PhysX. As a result, older titles that used it perform poorly on 5000 series cards, as PhysX defaults to the CPU for calculations. Even the latest CPUs do not perform as well as 15-year-old GPUs when it comes to PhysX.
The best performance on the 5080 was with PhysX off; however, that removes many effects like smoke, breaking glass, and rubble from scenes. The second-best option was to pair a 5000 series card with an older card like a 980 to handle the PhysX computations.
Could this be solved by a wrapper? (Score:5, Insightful)
How hard would it be for Nvidia to release a 32-bit PhysX driver that just uses the 64-bit PhysX under the hood to provide some backwards compatibility?
Re: Could this be solved by a wrapper? (Score:4, Informative)
Sure. We just need Nvidia to open source their proprietary physx protocol
Re: Could this be solved by a wrapper? (Score:5, Interesting)
Sure. We just need Nvidia to open source their proprietary physx protocol
On December 3, 2018, PhysX was made open source under a 3-clause BSD license [wikipedia.org], but this change applied only to computer and mobile platforms.[11]
On November 8, 2022, the open source release was updated to PhysX 5, under the same 3-clause BSD license.[12]
https://github.com/NVIDIA-Omni... [github.com]
Is that not enough to make a 32 bit stub?
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
They have not removed CUDA cores.
Re: (Score:2)
Re: (Score:2)
Is PhysX backwards compatible?
My understanding from the video in the summary is that PhysX 3.x and up games work pretty well; it's the older versions in games that are the problem.
Re: (Score:2)
My understanding is that those games work with 40xx cards, just not on 50xx. A lot of gamers will occasionally play retro games, it's really quite popular, so this is yet another big black eye for NVidia, which is running out of them. Not that they really give a crap, consumer GPU is like 10% of their business now or something? They don't really need us so long as this bubble continues, and they can also sell a lot of workstation GPUs.
This should be AMD's time to shine, but now a lot of gamers also want the
Re:Could this be solved by a wrapper? (Score:5, Interesting)
PhysX is not a "driver", it's a userspace DLL that talks to the driver.
For the RTX 50 series, NVIDIA has deprecated the 32bit interfaces of the driver, thus the 32bit userspace has to talk to the driver in 64bit mode. Probably a wrapper could be created.
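A minimal sketch of the marshalling idea behind such a wrapper (hypothetical names and opcodes; a real shim would be a 32-bit stub DLL forwarding calls over IPC or shared memory to a 64-bit host process, not in-process Python):

```python
import struct

# Hypothetical sketch: a 32-bit "stub" serializes each call into a flat
# buffer with a fixed wire format; a 64-bit "host" deserializes it and
# invokes the real implementation. A real shim would ship these two
# halves as separate processes connected by a pipe or shared memory.

def encode_call(opcode: int, args: tuple) -> bytes:
    # Fixed wire format: u32 opcode, u32 arg count, then 32-bit floats.
    return struct.pack(f"<II{len(args)}f", opcode, len(args), *args)

# The "64-bit side": opcode -> stand-in implementation.
OP_ADD_FORCE = 1
HANDLERS = {
    OP_ADD_FORCE: lambda fx, fy, fz: (fx + fy + fz,),  # placeholder physics
}

def dispatch(buf: bytes) -> tuple:
    opcode, argc = struct.unpack_from("<II", buf)
    args = struct.unpack_from(f"<{argc}f", buf, 8)
    return HANDLERS[opcode](*args)

result = dispatch(encode_call(OP_ADD_FORCE, (1.0, 2.0, 3.0)))  # (6.0,)
```

The hard part in practice isn't the marshalling itself but faithfully reproducing the old 32-bit ABI, which is why nobody has done it yet.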
Re: (Score:2)
IIRC, NVidia's older cards require older driver branches, and having multiple driver versions on one system has not worked (at least on Linux).
If it is purely a library issue, great. If it touches the driver in a PhysX specific way (now or future hardware releases) it may make a big mess.
I am unsure how the author was able to run a 980GTX (requires v470 or earlier I think) and a 5000 series (assuming current release nv drivers) together in the same machine / OS.
Re:Could this be solved by a wrapper? (Score:5, Informative)
I am unsure how the author was able to run a 980GTX (requires v470 or earlier I think)
You think wrong. The 980GTX was specifically chosen since it was the oldest GPU currently supported by up-to-date drivers. They used driver version 572.47, last released in February.
Re:Could this be solved by a wrapper? (Score:4, Insightful)
Re: (Score:2)
But NVIDIA is too busy printing money
NVIDIA has nothing to do with what you said. 64bit PhysX is a thing that has existed for over a decade, as is CPU accelerated PhysX. The move to both of them put the writing on the wall for 32bit support. The only software really affected by this decision is old software using a massively out of date version of PhysX.
You should be asking DICE if they are willing to dedicate a few coders' time to updating a 15-year-old game to use the current version of PhysX. I'm guessing the answer is HAHAHAHHA since they a
Re: Could this be solved by a wrapper? (Score:2)
Guess I'm a fool for hoping to run DooM and Quake3 on my new computer, then.
"Old game" shouldn't need to mean "Dead game", dude.
Re: (Score:2)
The problem is that nVidia released hardware that was not backwards compatible with software people still have a reasonable expectation of running. Stuff from the 2010s is not "retro" by any means.
Yes and no. The people most in the market for a fancy new $1500 video card are usually not buying them to play retro games. This definitely will affect some people, but you could just ... not buy the fancy new graphics card to play your 15 year old game.
But older games were shipped with the old x87 version of the library and it was too late to fix them.
I was playing Command and Conquer the other day. Just pointing this out to say it is never "too late" to fix anything. These games could be updated quite simply. C&C definitely wouldn't run on my hardware were it not for someone updating it to make it so
Re: (Score:2)
You don't understand the PC market. One of the biggest advantages of PC gaming is the ability to play older games, and new players often get into those games too (this is a large part of GOG's business model), especially considering 10 and 15 year old graphics hold up quite nicely, or the state of the gaming industry today, which can be best described as "expensive flops galore, and Astro Bot."
Re: (Score:2)
The problem is people are trying to run games on their brand new AI accelerators.
Re: (Score:2)
Just add an old GPU, let a 1050 do the PhysX? (Score:2)
Re: (Score:2)
Re: (Score:2)
The problem with this approach is that not all PC cases can accept a second GPU and still maintain good airflow to the first GPU, and every new PC build you'll build from now on relies on ancient used hardware.
Given the small number of people who would care about Physx, I'd say finding a pre-50x0 card would not be much of a problem.
Put the old card on a vertical mount adapter to help with airflow. If its only doing the physics it may not get very hot.
Re: (Score:2)
I am not saying they'll become difficult to find, I am saying that your new PC will rely on old used hardware that already has significant use on its back.
Re: (Score:2)
Is it lacking the RGB LEDs so it clashes with respect to colors?
It may be old and have mileage but it will be asked to do something less taxing than it used to do.
Re: (Score:2)
Re: (Score:2)
The CPU is switched to long mode when the OS boots and stays in that mode, it doesn't switch back to protected mode to execute parts of individual processes.
Re: Could this be solved by a wrapper? (Score:2)
And it doesn't need to switch back, because long mode supports running 32-bit code perfectly well. That is exactly the reason why you can run 32-bit chroots, and even full 32-bit distros, on a 64-bit Linux kernel.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
AFAIK, new PhysX has added functionality more than replacement functionality. The new hardware supports data types just as small as the old hardware, so the same functionality ought to be in there even if NVidia is no longer interested in providing the API to permit 32 bit apps to access it.
So far nothing I have read has conflicted with this idea, but I'm willing to be convinced otherwise. I haven't had time to look at the situation in depth, so I certainly could be wrong.
Re: (Score:2)
Re: (Score:2)
If the issue is bitness, CUDA handles series of very small data types already, like you can use 8 bit floats [nvidia.com] and so on. CUDA also has them in 16, 24, 32, 128 and 256 bit sizes, single and double precision, SIMD, and large integers depending on specific architecture and CUDA version. PhysX is based on CUDA now, it's not dedicated hardware. Both the hardware and the API do more stuff than they used to, not less. There's no reason to believe that the hardware can't still do the same job.
With that said, that do
Re: (Score:2)
With that said, that doesn't mean they're not discontinuing parts of the PhysX code that runs on the GPU, none of which is OSS. Only the part which runs on the CPU is.
The issue is not whether PhysX will run on a CPU. We know it does. The issue is that it runs much faster on a GPU due to hardware accelerators. Just like audio/video encoding, a CPU can handle the computations, albeit more slowly, unless it has an encoder built in. I can see NVidia chip designers removing the bits that provided hardware acceleration just for 32-bit PhysX/CUDA, as technically the CPU could do the computations.
Re: (Score:2)
I can see NVidia chip designers removing the bits that provided hardware acceleration just for 32-bit PhysX/CUDA, as technically the CPU could do the computations.
Yes, I discussed that in the other paragraph of my comment, which you didn't quote. That's also not what you said originally, so you're also moving the goalposts. You suggested that the parts of the hardware needed to do it went missing, which they haven't done.
Re: (Score:2)
That's also not what you said originally, so you're also moving the goalposts.
You should scroll up. My initial comment in this thread [slashdot.org]: "And what if NVidia did not include hardware circuitry in the 5000 series that would have done the hardware acceleration. Remember CPUs can do the necessary computations; they just do not have specialized circuitry to do it quickly. Even the latest CPUs struggle with PhysX computations." It wasn't a long comment and you responded to it.
Re: (Score:2)
I already addressed why that was wrong in a prior comment. You don't seem to be able to read. The hardware acceleration is done with CUDA, they ported PhysX to CUDA years ago and now there IS no dedicated PhysX hardware AT ALL. NVidia is still using CUDA. You should try learning how any of this works.
Re: (Score:2)
I already addressed why that was wrong in a prior comment.
So in another comment which has nothing to do with this thread? Am I supposed to hold a seance to know which comment you are referring to?
You don't seem to be able to read.
You are the one who says I did not say something when my FIRST comment said exactly what I said. You are basically blaming me for your lack of reading comprehension.
The hardware acceleration is done with CUDA, they ported PhysX to CUDA years ago and now there IS no dedicated PhysX hardware AT ALL.
Sigh. And you still are missing the point. NVidia updated the 5000 series cores from the 4000 series. More than likely, chip designers removed and added circuits. You're absolutely sure they did not remove circuit
Re: (Score:2)
It was an earlier comment in this thread to which you replied, shitass
Re: (Score:2)
It was an earlier comment in this thread to which you replied, shitass
And being vague as usual. You do know we cannot read what you think, only what you post? And every time you do post details, I post the exact details about what you missed.
Re: Could this be solved by a wrapper? (Score:2)
Learn to read and you won't have to wonder what I posted, especially after you replied to it. Thanks for letting us know you reply without reading.
Re: (Score:2)
Re: (Score:2)
You wrote a bunch of stupid ignorant shit which I read, which is how I found out that you don't know how any of the things we are discussing work. You may now fuck off, kthxbye.
Re: (Score:2)
Re: (Score:2)
You replied to the post you asked me to reference. Then you told us you didn't read it. Now you want me to refer to it again, so you can not read it again? Get fucked, loser.
Re: (Score:2)
And my response to your answer was in my many, many other posts. You'll just have to read them all and guess which one. But AGAIN, I can't help it if you don't read.
Re: (Score:2)
What's worse is the Courts force them to behave sociopathically.
If they don't the Courts will allow "shareholder" lawsuits claiming they didn't maximize profits.
The case that cost Musk his ~$50B pay package was brought by someone with an extremely small number of shares.
What CEO will risk that to help out some folks playing games on older hardware?
It would be better to buy a graphics card from a privately-held company should one exist.
Re: (Score:2)
Depends on what the shareholders demand from the company. What a judge will determine is whether the board is doing what the company said it would do and what investors bought shares for it to do. If a corporation is built to maximize shareholder profit (which, granted, is most of them), and someone buys its shares based on that charter, and its board doesn't focus on maximizing profits, then yes, it can be sued for that. There are corporations whose entire premise is being socially and/or environmentally cons
Re: (Score:2)
They can always spend a few days updating the game, and call it customer goodwill.
Goodwill can translate into future sales after all.
Re: (Score:2)
Sorry but that's not how any of that works. There is no legal requirement to maximise shareholder profit. Nor is there a remedy for shareholders to force a company to do so through the courts (only through their voting tally at the AGM). The courts are there to address an issue of lying to shareholders. If you don't tell them lies, they can't sue you.
The Musk thing was something different. The board was found to act impartially and against the agreement of the shareholders, something which they said they *w
Could this be solved by a second GPU? A 1050? (Score:2)
Re: (Score:2)
nah, i just buy second hand, no problem, free open source and second hand stuff, let other people be consumers, me, i'm frugal, which is why i can afford to be retired
I asked my last boss about the company's retirement plan, his response was, "buy Michelins"
You know what to do kids (Score:2)
Re: (Score:3)
It's unlikely to get fixed. NV has deprioritized the GPU sector heavily and moved developers to the AI segment instead over the last couple of years. Dropping 32-bit CUDA is likely part of a wider effort to dump old, largely unused feature support, with the likely goal of simplifying driver support (i.e. providing working drivers with less work, due to fewer engineers allocated to the GPU sector).
The fewer features you need to support in drivers, the less work is needed to keep churning out new drivers.
Re:You know what to do kids (Score:5, Interesting)
Dropping 32bit PhysX support is never going to save enough Dollars to offset the reduction in sales to gamers and the drop in share prices because of bad publicity in any way attached to Nvidia.
Sure, gaming is not their primary market anymore. But the customers there are loud and relentless, generating bad press and that bad press will affect brand recognition and share prices - and the combination of bad press, brand recognition and hiccups in the share price WILL affect the big purchase decisions for AI projects, because they are still done by actual humans and actual humans will have all kinds of factors influence their decisions. Every business decision is influenced by these factors and it is more or less related to the halo effect (https://en.wikipedia.org/wiki/Halo_effect) and the color of the bikeshed-problem (https://en.wikipedia.org/wiki/Law_of_triviality).
Nvidia got bad press for these things in the 5000 series and from the distance of a market analyst or purchasing expert, they might look like this:
A) Nvidia's 5000 GPU series are melting power connectors because of bad decision and material choices
B) One fix to the overheating issue is a 10 Dollar part from AliExpress that Nvidia cheaped out on and that needs manual installation, possibly voiding the warranty.
C) Nvidia's 5000 GPU series have an abysmally slow performance in some workloads because of a critically important feature they deliberately removed to save literally only a few pennies per unit
D) Nvidia's stock price is already dropping because Chinese AI companies might do more AI with less hardware.
E) Nvidia is hugely, critically dependent on the China/Taiwan situation through their only supplier TSMC and their sub-contractor, the Netherlands' ASML
F) Nvidia's 5000 GPU was received with lukewarm praises from experts because of lackluster performance increases compared to the earlier 4000 series coupled with outrageous price hikes.
And now you're pondering a million- or billion-dollar investment in Nvidia AI modules, thinking about the value for money you get, reliability of the product, reliability of the supplier, longevity of the product, and even the medium-term support situation. Half of the above is related to gaming workloads and the opinion of gamers, sure, but imagine you're pondering committing a billion dollars towards a risky endeavour in AI. Would you commit that billion to a company that quite obviously cheaps out on a 10-dollar part for critically important features like power delivery, cheaps out on literal PENNIES per unit for a backward compatibility that would be trivially easy to maintain, and has all their eggs in one TSMC-colored basket in ONE Taiwanese city, Hsinchu, sitting right on the possible invasion beach in Taiwan?
Re: (Score:2, Interesting)
Dropping 32bit PhysX support is never going to save enough Dollars to offset the reduction in sales to gamers and the drop in share prices because of bad publicity in any way attached to Nvidia.
NVIDIA has done nothing but piss off gamers for 4 years now. This won't make any difference. The handful of people running 10+ year old games won't even reflect in a rounding error on the balance sheet.
There really isn't any point in keeping legacy cruft around. You want to play old games, use old tech. That's always been the case.
Re: (Score:3)
NVIDIA has done nothing but piss off gamers for 4 years now. This won't make any difference. The handful of people running 10+ year old games won't even reflect in a rounding error on the balance sheet.
You do know what a Steam library is right? People have these games in their libraries that cannot be played the same going forward with newer hardware. It is not like companies still sell these games on Steam today . . . oh wait.
Re: (Score:2)
I do know what a Steam library is. I also note that many of the older legacy games in a Steam library have been updated by the developers of the games to run on modern systems. Command and Conquer (in my Steam library) objectively would not run on my hardware or OS right now. The fact it is in my Steam library and currently works is thanks to EA (yes just threw up in my mouth). If DICE wants people with RTX50 series to buy Mirror's Edge then they should put coders on the case to update PhysX to a new versio
Re: (Score:2)
Re: (Score:2)
Yup. And there's no guarantee that any game in that library will work with whatever hardware I happen to want to run it on.
NV is not obligated to maintain old features on the off chance that you decide to buy a decade old game on a Steam sale.
Re: (Score:2)
Problem isn't just costs. It's also the fact that there's a very limited, very small pool of expert software engineers that do drivers for nvidia GPUs. It's a very exclusive club, and almost no one is capable of even being considered to join it, much less invited, much less actually does it well. You can't just pay more money and generate more of these people, because they don't exist, and almost no one can be trained to do this job.
And those people are overwhelmingly allocated to AI accelerator drivers now
Re: (Score:2)
And those people are overwhelmingly allocated to AI accelerator drivers now. GPUs retained barely anyone, because that's not where growth and profits are.
Are you suggesting that Nvidia converted gaming guys into AI guys or that they got rid of gaming software guys in order to hire AI guys? Neither seems reasonable.
You do realize that Nvidia has been one of the few large tech companies that has laid off very few people in the last decade. Also that the money saved by laying off a few gaming guys wouldn't materially affect Nvidia's financials.
Re: (Score:2)
Suggesting? No. They told us, straight up in their annual reports. Over many years now. That they're refocusing on AI. This has been ongoing for at least half a decade at this point, and if you extend it to "CUDA and server space", we're in at over a decade of refocusing, retraining and moving people.
Re: You know what to do kids (Score:2)
It works with 4000 series too. 40xx is a bit overpriced but not horribly so. 4060 Ti 16GB is almost reasonable when on sale and it gets you access to all of the latest features. Raytracing aside, it will also do ultra 4k60 in most titles. You do need PCIE 4 to get what performance it has out of it, so it's not a good upgrade for older PCs, those users should stick with their 3070 or what have you.
Re: (Score:2)
Or you can get a 500 series card and use it as a PhysX accelerator with your modern card for the graphics.
Re: You know what to do kids (Score:2)
You can't, because you can't install two versions of the driver at once.
Re: (Score:2)
They were doing something in the video linked in the summary.
I was only half paying attention, but some old accelerator card was being used with the modern cards and it gave good performance.
Re: (Score:2)
It was a 5080+980.
Re: (Score:2)
Right, the 980 is the oldest card which will run on a driver for a modern card. And it will get cut off next. They don't support old cards forever. They do support them for a long time, so far I really have no cause for complaint there. I've been almost exclusively NVidia (especially on desktop) since the TNT2. But I'm still expecting to go AMD next, because I expect that by then they will have this spotty ROCm bullshit sorted, and I prefer an open driver. My laptop has Vega and works great.
Re: (Score:2)
So basically, when the 4000 series are no longer supported, it'll be impossible to run these games with PhysX if CPUs can't handle it (and it seems quite possible that CPUs won't be able to)?
Re: (Score:2)
They deprecated something which hasn't been used in games for over a decade.
You do know you can still buy some of these games today, right?
Re: (Score:2)
You do know you can still buy some of these games today, right?
And? You can still buy a lot of things; there's never a guarantee they will run adequately. Games have always come with hardware restrictions. You can buy Command and Conquer as well. The *only* reason it will work on your machine is because EA thought it was worth money to them to make it work on your hardware. If DICE thinks Mirror's Edge should work then they should put a programmer on it to update the PhysX version and push out an update.
Whether something works for you is on you to research.
Whether someo
Re: (Score:2)
And?
If you don't understand how a game still being sold can now be effectively crippled going forward, then it is a waste discussing it with you.
Whether something works for you is on you to research.
So these games or the 5000 series cards have stickers that tell the consumer buying either that there is now incompatibility? No. That seems like a legal problem for everyone.
Forward compatibility has never been guaranteed and there's a whole world of games out there that don't work. That's what refund policies are for. The only question is, will someone think you as a customer are worth the money to satisfy.
Except we are not talking about forward compatibility anymore. There is current incompatibility. It is no longer theoretical.
Re: (Score:2)
And if they want to keep selling them, they could probably make a really simple compatibility shim to use something other than the legacy library. Unless they didn't keep all the assets needed for this. The responsibility for game preservation falls on both sides, but I think if they want to keep selling it they should find a way.
Re: (Score:2)
Fascinating (Score:4, Funny)
Tetris plays just fine even on vesa local bus with enough memory for 15 bit colour.
Re: (Score:2)
Tetris plays just fine even on vesa local bus with enough memory for 15 bit colour.
No it doesn't. The original Tetris is completely unplayable on current hardware. Some modified version to suit the fact that you are not playing it on the original hardware may still work.
Tetris as released was not only a 16-bit game *support for which has been deprecated*, but the code also cycled the block speed based on CPU ticks, which means playing it on a modern GHz+ CPU is unplayable.
Tetris runs in your case much the same way as any game that has shipped an updated PhysX DLL over the years runs on mode
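The CPU-tick problem described above can be modelled in a few lines (a toy model with made-up numbers, not the actual Tetris code): tying game speed to loop iterations makes it scale with CPU speed, while anchoring it to wall-clock time keeps it constant.

```python
# Toy model of why CPU-tick-based game speed breaks on faster hardware.
# The old scheme drops a block every N busy-loop iterations, so the drop
# rate scales with CPU speed; a wall-clock scheme stays constant.

def drops_per_second_tick_based(loop_iters_per_second: float,
                                iters_per_drop: float = 100_000.0) -> float:
    # Speed is proportional to how fast the CPU spins the loop.
    return loop_iters_per_second / iters_per_drop

def drops_per_second_timer_based(seconds_per_drop: float = 0.5) -> float:
    # Speed is anchored to wall-clock time, independent of the CPU.
    return 1.0 / seconds_per_drop

# An old CPU vs. a modern one, in (hypothetical) loop iterations/second:
old_cpu, new_cpu = 1e6, 1e10
# tick-based: 10 drops/s on the old CPU, 100,000 drops/s on the new one;
# timer-based: 2 drops/s on both.
```

This is the same class of bug as speed-sensitive DOS games that needed the Turbo button, and it is fixed the same way: by retiming the game loop, not by the GPU vendor.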
Re: (Score:2)
Tetris plays just fine even on vesa local bus with enough memory for 15 bit colour.
No it doesn't. The original Tetris is completely unplayable on current hardware. Some modified version to suit the fact that you are not playing it on the original hardware may still work.
Tetris as released was not only a 16-bit game *support for which has been deprecated*, but the code also cycled the block speed based on CPU ticks, which means playing it on a modern GHz+ CPU is unplayable.
Tetris runs in your case much the same way as any game that has shipped an updated PhysX DLL over the years runs on modern GPUs as well.
Psst: Hey, buddy. Your spectrum is showing.
Re: (Score:2)
You and the OP missing the point that software doesn't run infinitely and needs to be maintained for modern platforms doesn't put other people on the spectrum, it just advertises your own lack of intelligence.
Re: (Score:3)
You and the OP missing the point that software doesn't run infinitely and needs to be maintained for modern platforms doesn't put other people on the spectrum, it just advertises your own lack of intelligence.
You're taking two jokes uber-seriously and using them as an excuse to rant. Seems pretty spectrumy to me. I should know, I have my share of spectrum traits.
Re: (Score:2)
Oh I see, you "joke" about people with mental illnesses, not once but twice. Well fuck you very much.
As for the OP, the point is that his point is interesting if incorrect. The question of how well old games play on modern hardware is the foundation for a valid discussion. But all you got is insults for potential people with mental illnesses. Do your share of spectrum traits give you a free pass at being an arsehole?
Re: (Score:2)
Oh I see, you "joke" about people with mental illnesses, not once but twice. Well fuck you very much.
As for the OP, the point is that his point is interesting if incorrect. The question of how well old games play on modern hardware is the foundation for a valid discussion. But all you got is insults for potential people with mental illnesses. Do your share of spectrum traits give you a free pass at being an arsehole?
Where did I insult anybody? You seem hyper-sensitive for somebody who seems to see themselves as a know-it-all. You pretty much insult every post of mine you come across. Maybe you need an internet break if you think simple teases are somehow insulting mental diseases. Good grief.
Re: (Score:2)
Exactly. The problem here is no one has coded a workaround for you for PhysX yet. PhysX runs fine on the modern CPU, you just need a version that is objectively designed to do so. The issue here is a small subset of games running a specific version of PhysX from back in the day where the API didn't even support SSE.
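The x87-vs-SSE gap the comment alludes to comes down to one scalar operation per instruction versus four packed lanes per instruction. A toy lane model (pure Python, illustrative only; the instruction counts are the point, not the arithmetic):

```python
# Toy model of scalar (x87-style) vs 4-wide packed (SSE-style) addition.
# Real SSE performs one 4-lane float add per instruction; counting
# "instructions" below shows why a vectorized PhysX path can be roughly
# 4x faster per operation than a scalar one.

def scalar_add(a: list, b: list) -> tuple:
    out, instructions = [], 0
    for x, y in zip(a, b):          # one add instruction per element
        out.append(x + y)
        instructions += 1
    return out, instructions

def packed_add(a: list, b: list) -> tuple:
    out, instructions = [], 0
    for i in range(0, len(a), 4):   # one instruction per 4-float group
        out.extend(x + y for x, y in zip(a[i:i+4], b[i:i+4]))
        instructions += 1
    return out, instructions
```

For 8 elements, the scalar path issues 8 adds and the packed path 2, which is the kind of gap that separates the old x87 PhysX builds from a modern SSE-aware CPU implementation.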
Re: (Score:2)
The original Tetris runs on the Elektronika 60 [wikipedia.org], a PDP-11 clone. You won't be able to run it on a PC emulator like 86box. MAME might be able to do it, with a suitable emulated terminal hooked up.
RT (Score:2)
A nice video, except it repeated several times that RT tech may await the same fate. No! Not going to happen, ever: RT is an official standardized extension for both Vulkan and Direct3D. It's not going anywhere; it's vendor neutral.
Re: (Score:2)
Then you should watch the video again because that's not what he said. He specifically called out vendor software features, and then mentioned no guarantee that RT will work the way it does today for the *specific hardware features* noting "that stuff could change".
And it did. DirectX supported ray tracing long before AMD started including the hardware to accelerate it. Being an open standard doesn't do shit for you if someone isn't providing the compatible hardware for you to use it.
Re: (Score:2)
"DirectX supported ray tracing long before AMD started including the hardware to accelerate it"
See this is where OpenGL ruled - you simply added in the support. You didn't need specialized hardware - if your GPU simply had the horsepower just make the extension and you were good to go.
I was using ATI and doing raytracing LONG AGO. I did it with a lil program called GemRay/GemCad, used for designing faceted stones.
Yea, it was slow. It still worked.
Re: (Score:2)
See this is where OpenGL ruled - you simply added in the support. You didn't need specialized hardware
I recall not being able to run OpenGL games without a graphics card that supported hardware acceleration. The idea that specialised hardware isn't needed is charitably "technically correct". The reality is that if you want ray tracing, the hardware needs to be designed for it or you end up with a slideshow (like AMD cards from 2 years ago, which matched the performance of NVIDIA cards only when raytracing was off).
Re: (Score:2)
There is DXR which is a part of the DirectX 12 Ultimate API, but there is the whole RTX from NVIDIA that has NVIDIA specific functions/extensions.
Re: (Score:2)
1. None. He didn't say vendor software features for RT. He said *hardware features* for RT. The software features he specifically called out DLSS, FG, DLAA, etc.
2. Yes all games that use raytracing will use ray tracing hardware, and as he said there's no guarantee that this hardware will exist in the future.
And that is a very safe statement. Forward compatibility is never guaranteed, and is even *less* guaranteed when only a single company is pushing something (AMD had to be dragged kicking and screaming in
Re: (Score:2)
Re: (Score:2)
Except NVIDIA can't afford to do that because AMD and Intel will continue to support these features in HW. It would be suicide.
32bit PhysX/CUDA have been deprecated because they were proprietary and barely used.
Nothing RT related in Vulkan/Direct3D is proprietary.
Re: RT (Score:2)
This also applies to their $500 GPUs. A lot of people buying those probably still play older games.
Removal is not crippling. (Score:2)
Re: (Score:2)
It does. All the games in question still run in some capacity on the GPU. The performance is what's crippled. You can play Mirror's Edge, for example, just fine. Just expect 10fps on your $1500 GPU if any glass shatters in the scene.
They don't care.. (Score:2)
..about gamers
All they care about is AI
I wouldn't be surprised if they stopped work on gaming hardware
Re: (Score:2)
..about gamers
They don't care about gamers who target playing games released over a decade ago. And honestly why would they? They aren't the target market for a fancy new raytracing capable GPU.
Inexpensive workaround... (Score:2)
Single-slot, low-profile, low-power Quadro cards like the P600 or P620 support PhysX and can be found well under $50 including shipping.
No need to waste space or power on something bigger just for PhysX... just sayin'.
Re:I don't like nvidia but.. (Score:5, Insightful)
It's not an issue of silicon. They still support 32-bit data types just fine.
It's 100% a driver code issue.
Re: (Score:2)