NVidia Cripples PhysX "Open" API
An anonymous reader writes "In a foot-meet-bullet type move, NVidia is going to disable the PhysX engine if you are using a display adapter other than one that came from their company. This despite the fact that you may have an NVidia card in your system specifically to do this type of processing. 'For a variety of reasons (some development expense, some quality assurance, and some business reasons) Nvidia will not support GPU-accelerated PhysX with Nvidia GPUs while GPU rendering is happening on non-Nvidia GPUs.' Time to say hello to Microsoft's DirectX physics or Intel's Havok engine."
Re:Anti-trust? (Score:5, Informative)
Worse than that even, this is using your strength in one industry segment (physics acceleration) to support sales of an arguably different segment (graphics acceleration).
Which is nasty and unethical to be sure, but it's not illegal unless it can be legally shown that Nvidia is a monopoly. It's amazing to me how many slashbots don't understand this distinction.
I'm pissed at ATI for dropping binary support for FGLRX for Linux kernels later than 2.6.29, and was considering getting an Nvidia GPU in my next laptop, but now it looks an awful lot like Intel is getting my $50....
oh well (Score:5, Informative)
PhysX seemed nice until they tried to close-source it. Does Nvidia have anything left this round? Bad yields [semiaccurate.com], PhysX being stupid and abusive when disabled (for example, it uses only one CPU core when running on AMD [driverheaven.net] instead of all available threads). Not to mention their crippling of Batman as well. [hardwarezone.com.sg]
So what's left for Nvidia? I don't see a whole lot.
Re:Can someone explain this more clearly? (Score:5, Informative)
No. The framework would only run on their GPUs. However, you could have one of their cards in the system to do purely physics calculations, and then use a competitor's card to do the actual display and 3d rendering. They've now disabled this, so if your monitors are connected to, say, an ATI card, you can no longer use the Nvidia card in your system for physics processing.
Before you discount this as an unlikely scenario, consider motherboards with onboard NVidia chipsets. These are usually underpowered for full-time rendering duty, but are perfectly suited to being used for physics calculations while a more powerful ATI card in the PCI-E slot does the graphics rendering. This is actually a fairly likely setup these days, and NVidia has just said they're going to block it.
Personally, I agree with others who have pointed out this must be an anti-trust issue. Intel and Microsoft have both been fined heavily recently for doing exactly this kind of anti-competitive behaviour.
Re:Havok (Score:3, Informative)
I'm always impressed by Havok. Whenever I pick up a game that uses it I always smile as I know I'm going to enjoy the physics if nothing else.
This is a bonehead move from nVidia as they've essentially just killed PhysX.
Re:Anti-trust? (Score:4, Informative)
Getting a bit off topic, but I like the direction ATI is taking recently with Open Source. Long term, I think they will be the better choice for Linux. :-)
In a recent test at Phoronix (http://www.phoronix.com/scan.php?page=article&item=amd_r600_r700_2d&num=1 [phoronix.com]), the OS driver already offered better 2D performance than the binary one.
Re:Closing the Architecture (Score:5, Informative)
OpenGL3 is the first time that companies are breaking away from windows.
It seems like OpenAL was the first. Creative have been visibly pushing it now that Vista's forced-software-only sound API has made their sound cards pointless.
Bullet Physics for the Win! (Score:5, Informative)
http://www.bulletphysics.com/ [bulletphysics.com]
I don't have any affiliation with the project other than I've used it in my homegrown game engine that has never left my hard drive. It is, however, rather easy to use. When I was looking for a physics engine, Bullet turned out to offer the best license, code base, and documentation out there for no cost.
Re:Key word: "reportedly" (Score:5, Informative)
Can I use an NVIDIA GPU as a PhysX processor and a non-NVIDIA GPU for regular display graphics?
No. There are multiple technical connections between PhysX processing and graphics that require tight collaboration between the two technologies. To deliver a good experience for users, NVIDIA PhysX technology has been fully verified and enabled using only NVIDIA GPUs for graphics.
Re:Anti-trust? (Score:3, Informative)
I'm pissed at ATI for dropping binary support for FGLRX for Linux kernels later than 2.6.29, and was considering getting an Nvidia GPU in my next laptop, but now it looks an awful lot like Intel is getting my $50....
It was my understanding they had only dropped updated support for older cards (R500?), which are pretty well supported by the OS driver these days anyway, now that ATI is publishing specs again. Am I confused?
Re:Anti-trust? (Score:5, Informative)
Read the f... Phoronix article ;-)
Yes, it seems the binary driver is really crappy in this case and the OSS driver at least passable, though I'm not familiar with those benchmarks or how they measure up against similar software on Windows.
3D still seems to lag behind, otherwise we could officially forget the Catalyst driver and use the OSS driver exclusively for Linux. But I think we'll get there.
Re:Anti-trust? (Score:1, Informative)
Actually, this is tying the products together. That is quite illegal.
http://en.wikipedia.org/wiki/Tying_(commerce)
It's amazing to me you didn't know that.
Re:Havok (Score:3, Informative)
How can a developer now realistically choose PhysX when they know it would cut their target market by 25%?
They've killed it.
Re:CRT? Are you from the past? (Score:5, Informative)
They didn't drop support for "CRT's". They decided that their stereographics driver would only work in the following configurations:
- anaglyph glasses with a "whatever" monitor (horrible color distortion and headache-inducing ghosting ensues).
- *THEIR* shutter glasses, with *THEIR* overpriced "partnered" LCD monitors.
Now, what is the difference (tech-wise) between their shutter glasses and mine? Only the fact that theirs send a specific "yes I'm nvidia" signal back to the card. What is the capability difference between their overpriced "partnered" LCD monitors and my 120Hz-capable CRT? Two things: Jack and Shit.
This is not about "dropping support for outdated technology." Prior to what they pulled, I could plug in an industry-standard shutter glasses set made by any of a number of manufacturers, combine them with any monitor capable of 120-Hz refresh (whether CRT, LCD, certain televisions, or even a few projector models), and enjoy stereoscopic gaming. After their "update" to the drivers and subsequent "update" to the stereoscopic drivers, the Nvidia cards would only recognize *THEIR* proprietary glasses (which again, hardware-wise are no different than the old type save for sending a "hi I'm from nvidia" signal to the card) and would only interoperate with a precious few "specially chosen" 120Hz LCD's.
This had nothing to do with "dropping support" for "obsolete equipment" (which it wasn't, in any way, shape, or form) and everything to do with trying to milk people for $500+ on a new rig by forcibly crippling industry-standard hardware.
Re:CRT? Are you from the past? (Score:3, Informative)
Nicely done. You had me until this line:
And XP sold like crazy back when its stability was dramatically inferior to Windows 98, and it took them two service packs to catch up.
I just couldn't suspend disbelief anymore after that. ;-)
Re:Can someone explain this more clearly? (Score:5, Informative)
It's anti-consumer, but that doesn't trigger an anti-trust charge; they don't have a monopoly.
The perpetrator doesn't have to have a monopoly for tying to be illegal: in the U.S., for example, you only need "sufficient market power" to affect a "not insubstantial amount of interstate commerce in the tied product market" [wikipedia.org]. I dare say that NVidia has pretty damn substantial market power in the GPU niche, and this move is quite likely to affect sales of all other GPUs in a significant way. In the end, of course, it's up to the courts to decide, if it comes to that, but the allegation is not without merit.
Looks like you're wrong (Score:3, Informative)
Nvidia says they support any 100+ Hz CRT.
http://www.nvidia.com/object/3D_Vision_Requirements.html [nvidia.com]
Re:oh well (Score:3, Informative)
They used to have the best Linux drivers but I have not been keeping up with ATI's progress with their closed source drivers or the open source drivers that people are working on with the specs that ATI released.
Contrary to the constant cheerleading here on Slashdot, Nvidia still has the best Linux drivers by a wide margin. With steady progress being made on the open source drivers, this may not always be the case, but it is likely to be so for at least another year, maybe more. The simple fact is, ATI's drivers have always been exceedingly poor, and the ATI Linux drivers were even worse. It takes a long time and a lot of effort to overcome the poor quality ATI worked hard to entrench.
The simple fact is, if you want quality 3D on Linux, there is only one game in town, Nvidia. And for the foreseeable future, the game will continue to be Nvidia.
If you want to purchase your 3D card based on ideology, with performance, usability, and functionality not mattering, ATI is likely what you want. If, on the other hand, you want a solution that is actually fast and reliable, then Nvidia is your only option. Anyone who says otherwise is attempting to blow their ideology up your ass, in the most dishonest way possible.
Don't believe me? Feel free to do some Googling for yourself. You'll have no trouble finding hordes of crashes, broken apps, and per-application, per-driver-release custom workarounds for ATI cards on Linux. As for Nvidia, with some minor exceptions, things generally just work. Even more so, most common distros provide the Nvidia drivers right from the installer, so you typically just install and things are running with HW 3D support.
Ahh... (Score:1, Informative)
But currently *everybody* can get their hands on it, and they don't even have to pay to use it in a commercial product anymore. Intel bought them and is basically giving the technology away to anyone who will use it, because Intel will optimize it for their own hardware and it helps make Intel look good with gamers.
Compare that to PhysX which only works on a subset of hardware, and whose owner (Nvidia) seems to be exhibiting control-freak tendencies towards it that rival those of Apple.
I know which one I'd choose. Disclaimer: I've used Havok on several AAA game projects, and I've never used PhysX. And that trend is likely to continue after this move by Nvidia.