NVidia Cripples PhysX "Open" API
An anonymous reader writes "In a foot-meet-bullet type move, NVidia is going to disable the PhysX engine if you are using a display adapter other than one that came from their company. This despite the fact that you may have an NVidia card on your system specifically to do this type of processing. 'For a variety of reasons - some development expense, some quality assurance and some business reasons - Nvidia will not support GPU accelerated PhysX with Nvidia GPUs while GPU rendering is happening on non-Nvidia GPUs.' Time to say hello to Microsoft's DirectX physics or Intel's Havok engine."
Havok (Score:5, Insightful)
Havok is a better engine anyway.
But that's the problem with corporate buyouts anyway. Even if it's kind of wrong to stop supporting other platforms, they have every right to do so.
Re:Havok (Score:4, Insightful)
Havok is a better engine anyway.
That may be the case but in the end we'll more than likely see corporate drama surrounding that effort as well.
I hate to say it but I think a DirectX option is the lesser of three evils.
Re: (Score:2)
Actually, Windows Mobile is the lesser of three evils there too, as far as openness goes. iPhone, Symbian, and Palm are all quite closed.
Re: (Score:3, Insightful)
Symbian that you get on your phone might as well not be open source. Symbian Signed? Please.
iPhone needs jailbreaking to be open.
Windows Mobile does not.
Hence the original statement from GP that I agree with - Windows Mobile is the lesser evil. Scary, that is.
Re: (Score:3, Funny)
That's a new record for a Microsoft product. Lesser of two evils? Okay, occasionally. But the lesser of three!? There's hope for them yet!
Why am I suddenly reminded of the game "Eternal Darkness"?
Re:Havok (Score:5, Funny)
That's a new record for a Microsoft product. Lesser of two evils? Okay, occasionally. But the lesser of three!? There's hope for them yet!
Microsoft <3
you mean like this?
Re:Havok (Score:5, Funny)
Close, but more like this: Microsoft <3 Evil. *
Re: (Score:2)
What's wrong with OpenCL exactly?
Re:Havok (Score:5, Insightful)
Havok and DirectX physics are completely open, and either party can use them; no proprietary API, licensing, or anything silly. No hardware vendor controls what happens.
PhysX is not. It is controlled by Nvidia. Gosh, they wouldn't have financial motives to abuse this power, would they? No, of course not...
Nvidia lately seems to have been getting around the whole market segmentation issue by ... paying off forum members in all the hot PC Hardware forums? Lately my favorite has been inundated with troll and fanboy posts proclaiming the wonders of PhysX (still waiting for a game where it actually adds anything) and the death of AMD/ATi.
Re: (Score:3, Informative)
I'm always impressed by Havok. Whenever I pick up a game that uses it I always smile as I know I'm going to enjoy the physics if nothing else.
This is a bonehead move from nVidia as they've essentially just killed PhysX.
Re: (Score:3, Interesting)
I'm always impressed by Havok. Whenever I pick up a game that uses it I always smile as I know I'm going to enjoy the physics if nothing else.
This is a bonehead move from nVidia as they've essentially just killed PhysX.
Or, they've strengthened PhysX's position, and along the way their graphics cards too. When a company buys some technology, it's never without a reason.
Re: (Score:3, Informative)
How can a developer now realistically choose PhysX when they know it would cut their target market by 25%?
They've killed it.
Re:Havok (Score:4, Insightful)
1. An onboard NVIDIA device with a discrete ATI graphics card. From what I've heard, PhysX running on integrated devices isn't any faster than running on the CPU in software mode, so no target market has been lost there.
2. Having both a discrete ATI graphics card, and an unused GeForce 8000+ or Tesla. That is a pretty fucking weird configuration. I can't see that being more than a tenth of a percent of gamers. I've personally never encountered someone who runs both.
Mountain. Molehill.
Re: (Score:3, Interesting)
2: I've considered using a mix of ATI and nVidia cards on my primary machine, which is also where I play games. Why? I'd like to move from having dual displays to having three, and I ostensibly do have enough hardware to do so. But due to nVidia's driver limitations, I'd have to turn off SLI in order to make all of the DVI outputs live at the same time, and I don't want to turn off SLI.
Currently, the way around this problem is to install another GPU of a different brand. In this way, one can utilize SL
Re:Havok (Score:4, Insightful)
@sopssa: "Havok is a better engine anyway."
By saying that in the context of this article you're implying that Havok is a more open, less IP/license/business-relationship-constrained option, and I don't think that's true. If Intel wants to exert its rights over the technology in the same way, we're right back to the same situation as with PhysX; Havok may be better technically, but it's worthless if no one can get their hands on it.
Wrong != right (Score:3, Funny)
That's quite a contradiction you made there.
Comment removed (Score:4, Insightful)
Re: (Score:3, Interesting)
Well, ATI cards have some issues still. One that comes to mind:
WoW under Wine. The minimap displays solid white because of an issue with pixel buffers in the ATI Catalyst drivers.
Not saying ATI sucks, but they do still have some issues that need to be addressed, particularly on the Linux side of the pond.
Re:Anti-trust? (Score:4, Interesting)
I gave up on Nvidia when they screwed over my 3D glasses setup; I'd gone through all the trouble of maintaining my rig with an NVidia graphics card, because their occasional driver updates for the stereoscopic driver still made my old VRStandard rig (coupled with a 120Hz-capable CRT) run well.
Lo and behold, their latest set "only" works either with the Nvidia-branded "Geforce 3D Vision" glasses and a short-list of extra-expensive "approved" 120-Hz LCD's, or else red/blue anaglyph setups. No reason for them to cut off older shutter glasses setups except to force people to buy their new setup if they wanted to continue to have stereoscopic 3D.
So add the PhysX thing in and we can chalk up two strikes for Nvidia. My new card when I updated my computer this summer was an ATi (no point wasting the $$$ on a Nvidia). One more strike and I won't bother going back to them ever. Boy am I glad I didn't buy that second-hand PCI PhysX board the other day...
Re:CRT? Are you from the past? (Score:5, Informative)
They didn't drop support for "CRT's". They decided that their stereographics driver would only work in the following configurations:
- anaglyph glasses with a "whatever" monitor (horrible color distortion and headache-inducing ghosting ensues).
- *THEIR* shutter glasses, with *THEIR* overpriced "partnered" LCD monitors.
Now, what is the difference (tech-wise) between their shutter glasses and mine? Only the fact that theirs send a specific "yes I'm nvidia" signal back to the card. What is the capability difference between their overpriced "partnered" LCD monitors and my 120Hz-capable CRT? Two things: Jack and Shit.
This is not about "dropping support for outdated technology." Prior to what they pulled, I could plug in an industry-standard shutter glasses set made by any of a number of manufacturers, combine them with any monitor capable of 120-Hz refresh (whether CRT, LCD, certain televisions, or even a few projector models), and enjoy stereoscopic gaming. After their "update" to the drivers and subsequent "update" to the stereoscopic drivers, the Nvidia cards would only recognize *THEIR* proprietary glasses (which again, hardware-wise are no different than the old type save for sending a "hi I'm from nvidia" signal to the card) and would only interoperate with a precious few "specially chosen" 120Hz LCD's.
This had nothing to do with "dropping support" for "obsolete equipment" (which wasn't in any way, shape, or form) and everything to do with trying to milk people for $500+ on a new rig by forcibly crippling industry-standard hardware.
Looks like you're wrong (Score:3, Informative)
Nvidia says they support any 100+Hz CRT.
http://www.nvidia.com/object/3D_Vision_Requirements.html [nvidia.com]
Re: (Score:3, Interesting)
As of last month. They just added that back. A bit too late (plus I'm still unwilling to infect my system with Vista, AND buy their $200 glasses when my old ones were exactly the same hardware, just to get it working again).
Re: (Score:3, Interesting)
I wonder if CRT persistence would become a problem at that high of a refresh rate. I had a similar system (Asus had 3D goggles that were tied to a dongle on the VGA port, pre-DVI) but the 3D would get really blurry at refresh rates higher than 60Hz due to phosphor persistence (essentially 30Hz per eye, though it's probably not that simple since they're alternating). IIRC
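For what it's worth, the arithmetic behind that: frame-sequential stereo halves the effective rate, so each eye sees refresh/2 (60Hz per eye at 120Hz total), and each frame only lasts 1/120 s, roughly 8.3 ms. Any phosphor persistence longer than that spills into the other eye's frame and reads as ghosting, so blur getting worse as the refresh rate climbs is exactly what you'd expect.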
Re: (Score:3, Insightful)
He doesn't care if they are on the up and up, he cares that they (from his point of view) arbitrarily removed functionality, that, for all he could tell, was working just fine. They don't have to be dishonest to make stupid decisions that make them worth avoiding as a supplier.
Re: (Score:3, Funny)
Capitalism can't be bettered - the Giant Invisible Hand has spoken through its prophets and all we can do is try to grasp the perfection of the pre-existing system. Any deviation swiftly results in systemic breakdown - surely the way the Swiss went to stuffing 45 Million people into Gulags in Swisberia, not three weeks after they tried socialized mail service, proves this. And of course the lifeless wastelands that are all that is left of the Scandinavian countries after they adopted public health care plan
Re:CRT? Are you from the past? (Score:4, Insightful)
No, but you can't blame a company for not wanting to support outdated technology.
That's like complaining that Microsoft won't release security updates for Windows 98. Sure, some people are still using it, and it might work perfectly well for them, but that doesn't mean MS is evil for not patching it.
Re:CRT? Are you from the past? (Score:5, Insightful)
What about a CRT is outdated? It has better black levels, faster refresh, and higher brightness than an LCD. It's analog, but good cabling will still give a crystal-clear image. I think the primary disadvantages of CRTs are that widescreen is so costly as to be impractical, they are heavy, and they suck a lot of power. But in terms of image quality a CRT is still extremely good.
CRTs are "outdated" because businesses want to sell LCDs. Flat and light is sexy. And LCDs sold like crazy back when the image quality was dramatically inferior to a CRT, and it took them years to catch up.
CRT technology is not obsolete, but the marketing of CRTs is dead. If you want to argue that we should use technology based on marketability alone, be my guest. I suspect most slashdotters will rip into you pretty brutally.
Re: (Score:3, Informative)
Nicely done. You had me until this line:
And XP sold like crazy back when its stability was dramatically inferior to Windows 98, and it took them two service packs to catch up.
I just couldn't suspend disbelief anymore after that. ;-)
Re:CRT? Are you from the past? (Score:5, Insightful)
That was a ridiculous thing to post.
A CRT doesn't need support, it needs to not be sabotaged.
His glasses don't need support, they need to not be sabotaged.
"Not supporting" both of them takes more effort than simply ignoring them.
Competent support of all that hardware would take less space in code than this comment window is high. Going to the trouble to restrict it was much more work... *after* the meetings, licenses, and money exchanges had all taken place.
The cynic in me believes that someone with a debugger is probably a single (or two) flipped bit(s) away from a working setup.
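To make that concrete, here is a purely hypothetical C++ sketch of the kind of vendor gate being described. None of these names or structures come from NVIDIA's actual driver; only the PCI vendor ID 0x10DE (NVIDIA's) is real:

#include <cstdint>
#include <vector>

// Hypothetical adapter description; a real driver would query this
// from the OS / PCI bus enumeration.
struct AdapterInfo {
    uint16_t pci_vendor_id;     // 0x10DE is NVIDIA's PCI vendor ID
    bool     is_active_display; // true if this adapter drives a monitor
};

// Hypothetical gate: refuse hardware PhysX if any active display
// adapter reports a non-NVIDIA vendor ID.
static bool physx_gpu_allowed(const std::vector<AdapterInfo>& adapters) {
    for (const AdapterInfo& a : adapters) {
        if (a.is_active_display && a.pci_vendor_id != 0x10DE)
            return false; // the "one or two flipped bits" would live right here
    }
    return true;
}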
Re:Anti-trust? (Score:4, Insightful)
Worse than that even, this is using your strength in one industry segment (physics acceleration) to support sales of an arguably different segment (graphics acceleration).
Re:Anti-trust? (Score:5, Informative)
Worse than that even, this is using your strength in one industry segment (physics acceleration) to support sales of an arguably different segment (graphics acceleration).
Which is nasty and unethical to be sure, but it's not illegal unless it can be legally shown that Nvidia is a monopoly. It's amazing to me how many slashbots don't understand this distinction.
I'm pissed at ATI for dropping binary support for FGLRX for Linux kernels later than 2.6.29, and was considering getting an Nvidia GPU in my next laptop, but now it looks an awful lot like Intel is getting my $50....
Re: (Score:2)
Which is nasty and unethical to be sure, but it's not illegal unless it can be legally shown that Nvidia is a monopoly. It's amazing to me how many slashbots don't understand this distinction.
Is there another hardware-accelerated physics computing system that we are not aware of?
Re: (Score:2)
Yes. ATI. Various custom providers. General purpose CPUs. Supercomputers.
It's laughable to claim NVidia has a monopoly on anything.
Re: (Score:2)
Ray-traced footguns, perhaps?
Re:Anti-trust? (Score:4, Informative)
Getting a bit off topic, but I like the direction ATI is taking recently with Open Source. Long term, I think they will be the better choice for Linux. :-)
In a recent test at Phoronix (http://www.phoronix.com/scan.php?page=article&item=amd_r600_r700_2d&num=1 [phoronix.com]), the OS driver already offered better 2D performance than the binary one.
Re:Anti-trust? (Score:5, Informative)
Read the f... Phoronix article ;-)
Yes, it seems the binary is really crappy in this case and the OSS driver at least passable. Although I'm not familiar with those benchmarks and how they measure up against similar software on Windows.
3D still seems to lag behind, otherwise we could officially forget the Catalyst driver and use the OSS driver exclusively for Linux. But I think we'll get there.
Re: (Score:3, Informative)
I'm pissed at ATI for dropping binary support for FGLRX for Linux kernels later than 2.6.29, and was considering getting an Nvidia GPU in my next laptop, but now it looks an awful lot like Intel is getting my $50....
It was my understanding they had only dropped updated support for older cards (R500?), which are pretty well supported by the OS driver these days anyway, now that ATI is publishing specs again. Am I confused?
Re:Anti-trust? (Score:5, Insightful)
This phrase "anti-trust", I don't think it means what you think it means.
How are they leveraging a monopoly to gain unfair advantage in a marketplace?
To me it seems more like NVIDIA has finally realized that they *can't* use it to gain unfair advantage so they're dumping it.
Re: (Score:2)
Why is this not anti-trust? ...
Because nVidia doesn't have a monopoly in the video card market.
Re: (Score:3, Interesting)
What's to say they won't release a more expensive dual or quad GPU card with no video output, at a higher cost (profit margin)? This sort of move indicates that's what they're planning on doing. Buying single core cheaper video card units might cannibalize that market.
Truth (Score:5, Funny)
At least he was 33.3% truthful.
But... (Score:5, Funny)
Re:But... (Score:5, Funny)
I think you accidentally the verb.
Just in time! (Score:2)
I was about to start using it; this announcement has saved me a lot of wasted effort.
Can someone explain this more clearly? (Score:2)
Was Nvidia previously offering a software framework that could run on any GPU, but now only supports their own? Can ATI (or anyone else) not implement the standard in their own drivers?
Re:Can someone explain this more clearly? (Score:5, Informative)
No. The framework would only run on their GPUs. However, you could have one of their cards in the system to do purely physics calculations, and then use a competitor's card to do the actual display and 3D rendering. They've now disabled this, so if your monitors are connected to, say, an ATI card, you can no longer use the Nvidia card in your system for physics processing.
Before you discount this as an unlikely scenario, consider motherboards with onboard NVidia chipsets. These are usually underpowered for full-time graphics duty, but are perfectly suited to physics calculations while a more powerful ATI card in the PCI-E slot does the graphics rendering. This is actually a fairly likely setup these days, and NVidia has just said they're going to block it.
Personally, I agree with others who have pointed out this must be an anti-trust issue. Intel and Microsoft have both been fined heavily recently for doing exactly this kind of anti-competitive behaviour.
Re:Can someone explain this more clearly? (Score:5, Insightful)
It's anti-consumer, but that doesn't trigger an anti-trust charge; they don't have a monopoly.
Why does everyone scream like it's illegal when a company does something they don't like? Unless they are king of the hill and using their power to force others into capitulating, it's not an issue for the courts. You don't have to buy nVidia. You don't have to use PhysX. You don't have to buy a Voodoo 3 card. Sure, a game may only support one of the above, but that's not something that justifies going after nVidia unless they owned the market.
Re:Can someone explain this more clearly? (Score:5, Informative)
It's anti-consumer, but that doesn't trigger an anti-trust charge; they don't have a monopoly.
The perpetrator doesn't have to have a monopoly for tying to be illegal - in the U.S., for example, you only need "sufficient market power" to affect a "not insubstantial amount of interstate commerce in the tied product market" [wikipedia.org]. I dare say that NVidia has pretty damn substantial market power in the GPU niche, and it is quite likely to affect sales of all other GPUs in a significant way. In the end, of course, it's up to the courts to decide, if it comes to that, but the allegation is not without merit.
Re:Can someone explain this more clearly? (Score:5, Insightful)
If they caused the ATI card to not function then I could understand it, but a secondary function on their own card?
Re: (Score:2)
No, PhysX is (and was) only ever hardware accelerated on Nvidia/Ageia hardware. Before you could add a second (Nvidia) card to your system and use it for PhysX. All this announcement is saying is that people using AMD as their primary GPU can no longer do this.
Good luck with that (Score:5, Insightful)
Nope... (Score:3, Insightful)
PhysX was trying to make a market for PPUs (and largely failing). nVidia bought them up to make the technology another marketing bullet point for their GPU parts, not to sell GPU parts as mere physics coprocessors. Sure, they'll take the business as it comes incidentally, but they have no interest in anything that could remotely be construed as putting something other than their role as a graphics adapter vendor first.
Closing the Architecture (Score:4, Insightful)
Re: (Score:3, Insightful)
Windows is an "approved list of hardware". Ever tried to run DirectX under anything else?
OpenGL3 is the first time that companies are breaking away from Windows.
You can't keep a PC closed forever because it's bad for business.
Re:Closing the Architecture (Score:5, Informative)
OpenGL3 is the first time that companies are breaking away from windows.
It seems like OpenAL was the first. Creative have been visibly pushing it now that Vista's forced-software-only sound API has made their sound cards pointless.
Re: (Score:2)
You are correct. I meant for graphics, but I didn't really think about that with OpenAL. Thank you for the correction.
Re:Closing the Architecture (Score:5, Insightful)
MS pulled a smackdown on Creative. Creative cards (and drivers, especially drivers! [FU creative]) have been sucking for years.
So, new OS comes out and MS removes all the hooks that 3rd parties have been putting into the Windows sound system, instantly leveling the playing field and removing a major source of Windows instability.
One of the few times MS really did the right thing.
oh well (Score:5, Informative)
PhysX seemed nice until they tried to close-source it. Does Nvidia have anything left this round? Bad yields [semiaccurate.com], PhysX being stupid and abusive when disabled (for example, it only uses one CPU core on AMD [driverheaven.net] instead of all available threads), not to mention their crippling of Batman as well [hardwarezone.com.sg].
So what's left for Nvidia? I don't see a whole lot.
Re: (Score:2)
Actually a lot.
They have the ION platform which is much better than what Intel supplies for netbooks and nettops.
They have hardware flash acceleration coming.
They used to have the best Linux drivers but I have not been keeping up with ATI's progress with their closed source drivers or the open source drivers that people are working on with the specs that ATI released.
And they have a lot of mindshare and support from game makers. I just hope that Nvidia gets heading back in the right direction. It is good t
Re: (Score:2)
ION is still only on par with ATI's integrated products. Not worse, not better. I like the Tegra solutions they have had, but you know, that's not exactly a huge growing business sector yet (although it could become one).
They are rapidly losing "mindshare" behind closed doors, because people aren't liking the results of PhysX and its impact on sales.
Hardware Flash acceleration? That's not unique to Nvidia or a solution to anything that exists. Nobody wants Flash; it's going out of style.
OpenCL? ATI and
Re: (Score:3, Informative)
They used to have the best Linux drivers but I have not been keeping up with ATI's progress with their closed source drivers or the open source drivers that people are working on with the specs that ATI released.
Contrary to the constant cheerleading here on slashdot, Nvidia still has the best Linux drivers by a wide margin. With steady progress being made on the open source drivers, this may not always be the case, but it is likely to be so for at least another year, maybe more. The simple fact is, ATI's
Re: (Score:2)
uh, no. If they had done this with their graphics card alone it would be fair. Go read the article. "We were able to confirm this by changing the ids of ATI graphics cards in the Batman demo. By tricking the application, we were able to get in-game AA option where our performance was significantly enhanced."
What are they trying to do? (Score:3, Interesting)
Stop things like this [anandtech.com] from working?
I don't get it.. (Score:2)
I know nothing about PhysX other than what I've gleaned from the article..
If you buy an Nvidia card to do some headless GPU grunt work, they will disable the functionality to do that unless the work is being shown through another Nvidia card?
The displaying of the work is pretty much superfluous to the work being done, and they've already made their money on selling and supporting the PhysX card.
Err?
ATI and Nvidia (Score:2)
Has anything changed with Windows 7 where you can run an ATI and Nvidia card at the same time? I know you could in XP, but I found out the hard way you couldn't in Vista. It was something to do with the new driver model.
I was trying this over a year ago to get dual monitors working while having SLI enabled under Vista. The recommended solution was to use an ATI card for the secondary output since the Nvidia drivers wouldn't see it and disable your ability to use SLI. When I tried to load the ATI drivers
Re: (Score:2)
This doesn't have anything to do with multiple graphics cards (except insofar as you have two cards capable of rendering accelerated graphics in the machine). A card set up purely as a PhysX processor isn't using the WDDM (the Vista/Win7 display driver architecture) pathway, which requires the same driver for all graphics cards. The non-Physx card used for graphics owns that pathway, and the Physx card runs independently. The Physx card is like a RAID or USB add-on card; there's no real limit to how many
Re: (Score:2, Insightful)
When I shop for a video card, I don't care if it is ATI or NVIDIA as long as the choice I am making is cost effective. I would much rather spend my money on the card that is cheaper for the same performance -- which happens to be ATI in this case. Originally I was going to pair an 8800GT with an ATI card for Windows 7, but this news blows. NVIDIA should straighten up and get over their emotional attention whoring. They won't get my money now unless they grow up.
Proprietary APIs (Score:4, Interesting)
I'm currently avoiding PhysX due to the fact that the license requires that credit be given to nVidia/PhysX in any advertisement that mentions the advertised product's physics capabilities. It's a real shame, because I hear that PhysX has a pretty robust physics implementation.
The current state of physics acceleration reminds me of the days when hardware-accelerated 3D graphics (except for high-end OpenGL stuff) were only supported through manufacturer-specific APIs. Hopefully, DirectX physics will be good enough that PhysX will ultimately become mostly irrelevant to game developers -- I'm just not convinced that Microsoft can pull it off.
You recommend against proprietary APIs and yet.. (Score:4, Insightful)
You express a desire for an API from Microsoft to become dominant?
PhysX doesn't matter because Nvidia is doomed (Score:2, Interesting)
Between the full stack (CPU+Chipset+GPU) provided by AMD and the full stack that will be Intel (with Larrabee in 2010), Nvidia has no future in either chipsets or GPUs. Any other outcome is a bet against integration, and in electronics, integration always wins.
Good thing too; both Intel and AMD are vastly more open (at least recently) with their hardware.
Re: (Score:3, Interesting)
Integration only wins when the integrated chip is "good enough". Intel has had "integrated, accelerated" graphics chips on their mobos for ages, but they've been so monumentally inferior that anyone who wanted to play even 'older' 3D games, like Q3-engine-based games, Far Cry, Unreal, most MMOGs released in the last 6 years, etc., needed an add-on GPU.
From the reviews I've seen, unless you want to muck around with real-time ray tracing (which Intel still hasn't gotten up to very good performance, from wha
Soon irrelevant anyway (Score:5, Insightful)
Once the big game engines and physics libraries get generic support for GPU programming through OpenCL, this will all be pretty moot anyway. From what I can tell, the Bullet physics library is already developing this, and I am sure closed-source competitors are doing the same. Relying on anything that will only run on a single vendor's hardware is just a losing business proposition (unless that vendor pays you for it, which I guess is how PhysX got going).
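Since people keep asking what OpenCL buys you here: below is a minimal C++ sketch (mine, not from any engine) of a vendor-neutral GPU step, a trivial Euler position update. The kernel and variable names are made up for illustration, and error handling is omitted; the point is that nothing in it cares whose GPU it runs on.

#include <CL/cl.h>
#include <cstdio>
#include <vector>

// Trivial OpenCL C kernel: pos[i] += vel[i] * dt (explicit Euler step).
static const char* kSrc =
    "__kernel void integrate(__global float* pos, __global const float* vel,\n"
    "                        const float dt) {\n"
    "    size_t i = get_global_id(0);\n"
    "    pos[i] += vel[i] * dt;\n"
    "}\n";

int main() {
    // Grab the first GPU from the first available platform: ATI, NVIDIA,
    // or anyone else shipping an OpenCL driver.
    cl_platform_id platform; cl_device_id device;
    clGetPlatformIDs(1, &platform, nullptr);
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, nullptr);
    cl_context ctx = clCreateContext(nullptr, 1, &device, nullptr, nullptr, nullptr);
    cl_command_queue q = clCreateCommandQueue(ctx, device, 0, nullptr);

    // Build the kernel from source at runtime, per the usual OpenCL flow.
    cl_program prog = clCreateProgramWithSource(ctx, 1, &kSrc, nullptr, nullptr);
    clBuildProgram(prog, 1, &device, nullptr, nullptr, nullptr);
    cl_kernel k = clCreateKernel(prog, "integrate", nullptr);

    const size_t n = 1024;
    std::vector<float> pos(n, 0.0f), vel(n, 1.0f);
    cl_mem dPos = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                 n * sizeof(float), pos.data(), nullptr);
    cl_mem dVel = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                                 n * sizeof(float), vel.data(), nullptr);
    float dt = 1.0f / 60.0f;
    clSetKernelArg(k, 0, sizeof(cl_mem), &dPos);
    clSetKernelArg(k, 1, sizeof(cl_mem), &dVel);
    clSetKernelArg(k, 2, sizeof(float), &dt);

    // Run one step over all n particles and read the result back.
    clEnqueueNDRangeKernel(q, k, 1, nullptr, &n, nullptr, 0, nullptr, nullptr);
    clEnqueueReadBuffer(q, dPos, CL_TRUE, 0, n * sizeof(float), pos.data(),
                        0, nullptr, nullptr);
    printf("pos[0] after one step: %f\n", pos[0]);
    return 0;
}

The same source runs unchanged on any vendor's OpenCL device, which is exactly why a vendor-locked physics API stops making sense.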
Bullet Physics for the Win! (Score:5, Informative)
http://www.bulletphysics.com/ [bulletphysics.com]
I don't have any affiliation with the project other than I've used it in my homegrown game engine that has never left my hard drive. It is however rather easy to use. When I was looking for a physics engine, Bullet turned out to be the best license, code base, and documentation set out there for no cost.
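For anyone curious what "rather easy to use" means, here's roughly what a minimal Bullet program looks like: a 1 kg sphere dropped from 10 m, stepped at 60 Hz. This is a sketch of Bullet's standard boilerplate from memory, not code from the project; cleanup and error handling are omitted.

#include <btBulletDynamicsCommon.h>
#include <cstdio>

int main() {
    // Standard Bullet plumbing: collision config, dispatcher, broadphase,
    // constraint solver, then the dynamics world itself.
    btDefaultCollisionConfiguration config;
    btCollisionDispatcher dispatcher(&config);
    btDbvtBroadphase broadphase;
    btSequentialImpulseConstraintSolver solver;
    btDiscreteDynamicsWorld world(&dispatcher, &broadphase, &solver, &config);
    world.setGravity(btVector3(0, -9.81f, 0));

    // A 1 m radius sphere starting 10 m up, with mass 1 kg.
    btSphereShape sphere(1.0f);
    btVector3 inertia(0, 0, 0);
    sphere.calculateLocalInertia(1.0f, inertia);
    btDefaultMotionState motion(
        btTransform(btQuaternion::getIdentity(), btVector3(0, 10, 0)));
    btRigidBody body(btRigidBody::btRigidBodyConstructionInfo(
        1.0f, &motion, &sphere, inertia));
    world.addRigidBody(&body);

    // Step at 60 Hz for one simulated second and watch it fall.
    for (int i = 0; i < 60; ++i)
        world.stepSimulation(1.0f / 60.0f);
    btTransform t;
    body.getMotionState()->getWorldTransform(t);
    printf("height after 1s: %.2f\n", t.getOrigin().getY());

    world.removeRigidBody(&body);
    return 0;
}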
THIS JUST IN! (Score:4, Funny)
Nvidia releases announcement that they will no longer provide free driver support to ATI for interaction between Nvidia hardware and ATI competing hardware. Notes that software APIs are available for ATI to pay for and release their own damn drivers.
NEWS AT 11!!!
I had never actually thought of this before. (Score:3, Interesting)
If you think about it, PhysX works on all GeForce 8-series cards and up.
That's a $30 card for PhysX support. I wonder if I can do this, since I have a spare x16 slot on my machine.
I don't really know if this will work though.
Re:Key word: "reportedly" (Score:5, Informative)
Can I use an NVIDIA GPU as a PhysX processor and a non-NVIDIA GPU for regular display graphics?
No. There are multiple technical connections between PhysX processing and graphics that require tight collaboration between the two technologies. To deliver a good experience for users, NVIDIA PhysX technology has been fully verified and enabled using only NVIDIA GPUs for graphics.