NVidia Cripples PhysX "Open" API
An anonymous reader writes "In a foot-meet-bullet type move, NVidia is going to disable the PhysX engine if you are using a display adapter other than one that came from their company. This despite the fact that you may have an NVidia card in your system specifically to do this type of processing. 'For a variety of reasons (some development expense, some quality assurance and some business reasons) Nvidia will not support GPU-accelerated PhysX with Nvidia GPUs while GPU rendering is happening on non-Nvidia GPUs.' Time to say hello to Microsoft's DirectX physics or Intel's Havok engine."
Anti-trust? (Score:5, Interesting)
What are they trying to do? (Score:3, Interesting)
Stop things like this [anandtech.com] from working?
Weird to begin with (Score:3, Interesting)
Proprietary APIs (Score:4, Interesting)
I'm currently avoiding PhysX due to the fact that the license requires that credit be given to nVidia/PhysX in any advertisement that mentions the advertised product's physics capabilities. It's a real shame, because I hear that PhysX has a pretty robust physics implementation.
The current state of physics acceleration reminds me of the days when hardware-accelerated 3D graphics (except for high-end OpenGL stuff) were only supported through manufacturer-specific APIs. Hopefully, DirectX physics will be good enough that PhysX will ultimately become mostly irrelevant to game developers -- I'm just not convinced that Microsoft can pull it off.
PhysX doesn't matter because Nvidia is doomed (Score:2, Interesting)
Between the full stack (CPU+Chipset+GPU) provided by AMD and the full stack that will be Intel (with Larrabee in 2010), Nvidia has no future in either chipsets or GPUs. Any other outcome is a bet against integration, and in electronics, integration always wins.
Good thing too; both Intel and AMD are vastly more open (at least recently) with their hardware.
not a problem (Score:2, Interesting)
Re:Anti-trust? (Score:4, Interesting)
I gave up on Nvidia when they screwed over my 3D glasses setup; I'd gone through all the trouble of maintaining my rig with an NVidia graphics card, because their occasional driver updates for the stereoscopic driver still made my old VRStandard rig (coupled with a 120Hz-capable CRT) run well.
Lo and behold, their latest set "only" works either with the Nvidia-branded "Geforce 3D Vision" glasses and a short list of extra-expensive "approved" 120Hz LCDs, or else red/blue anaglyph setups. There's no reason for them to cut off older shutter glasses setups except to force people to buy their new setup if they want to continue to have stereoscopic 3D.
So add the PhysX thing in and we can chalk up two strikes for Nvidia. My new card when I updated my computer this summer was an ATi (no point wasting the $$$ on a Nvidia). One more strike and I won't bother going back to them ever. Boy am I glad I didn't buy that second-hand PCI PhysX board the other day...
Re:Havok (Score:3, Interesting)
I'm always impressed by Havok. Whenever I pick up a game that uses it I always smile as I know I'm going to enjoy the physics if nothing else.
This is a bonehead move from nVidia as they've essentially just killed PhysX.
Or, they've strengthened PhysX's position, and their graphics cards' along the way. When a company buys some technology, it's never without a reason.
Re:Anti-trust? (Score:3, Interesting)
What's to say they won't release a more expensive dual- or quad-GPU card with no video output, at a higher price (and profit margin)? This sort of move indicates that's what they're planning on doing. People buying cheaper single-GPU cards for the job might cannibalize that market.
Re:Just in time! (Score:3, Interesting)
CRT? Are you from the past? (Score:2, Interesting)
You really can't blame them for dropping support for CRTs. If you can even buy them anymore, you'd have to be insane to want to.
Re:Havok (Score:2, Interesting)
Yeah, but I have every right to stop being their customer as well. nVidia burned me twice in the last two years. Once on an m1330 laptop, where their chips were spec'd out wrong thermally, so they would basically melt themselves if OEMs followed nVidia's recommended cooling. nVidia worked hard to bury the issue, preventing people like myself from getting a legitimate replacement of the lemon we were sold. The other time, they REFUSED to add dual-monitor support for the desktop (not games, just the DESKTOP) if you were running SLI on a 7xxx-series graphics card. You could get it... if you upgraded to SLI 8xxx cards. Considering that the formerly excellent quality of their drivers is now in the gutter (and was headed downhill for a long time to get there), I saw no more reason to put up with them.
My desktop has an ATI graphics card now. My wallet did the talking, and it said "Fuck you, nVidia." The more shit they pull like this, I hope other people vote with their wallets as well. Punish this behavior: boycott nVidia.
Re:PhysX doesn't matter because Nvidia is doomed (Score:3, Interesting)
Integration only wins when the integrated chip is "good enough". Intel has had "integrated, accelerated" graphics chips on their mobos for ages, but they've been so monumentally inferior that anyone who wanted to play even 'older' 3D games, like Q3-engine-based games, Far Cry, Unreal, and most MMOGs released in the last 6 years, needed an add-on GPU.
From the reviews I've seen, unless you want to muck around with real-time ray tracing (which Intel still hasn't gotten to very good performance with, from what I understand, though they are still working on it), Larrabee will still be inferior to nVidia and AMD/ATI GPUs. If they prove me wrong, and Larrabee really is "good enough", then you might be right.
What it comes down to is, for nVidia to survive, they either have to a) come out with some tech breakthroughs that keep their chips much superior to Intel/AMD, then convince developers to forget about compatibility with such "inferior" chips (unlikely, but, hey, maybe possible?), b) start releasing their own CPU/Mobo/Integrated chip stacks (they are already working on this some, particularly in the ultra-mobile/netbook and htpc/media center device space), c) work with third-party Mobo manufacturers to integrate their chips into the mobos instead of Intel or AMD (I think they've been doing this for a couple years now?), d) get some big console 'win' - like convincing Microsoft, Sony, or Nintendo to use nVidia chips as the basis of their next generation console offering, e) All of the above.
I'm not ready to count nVidia out just yet, because they've been laying the groundwork for 4 or 5 years to have their own tech integrated into motherboards and devices.
I had never actually thought of this before. (Score:3, Interesting)
If you think about it, PhysX works on all 8-series cards and up.
That's a $30 card for PhysX support. I wonder if I can do this, since I have a spare x16 slot in my machine.
I don't really know if this will work, though.
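One caveat worth checking before buying the cheapest 8-series card you can find: the widely reported minimum for GPU PhysX at the time was an 8-series or later part with at least 32 CUDA cores and 256 MB of memory, which rules out the very bottom of the range. A minimal sketch of that check (the thresholds are my reading of NVIDIA's published requirements, not an official API):

```python
# Hedged sketch: does a card meet the commonly reported GPU PhysX
# minimums (GeForce 8-series or newer, >= 32 CUDA cores, >= 256 MB VRAM)?
# The thresholds below are assumptions, not queried from any driver.

MIN_CUDA_CORES = 32
MIN_VRAM_MB = 256

def supports_gpu_physx(series: int, cuda_cores: int, vram_mb: int) -> bool:
    """True if the card plausibly meets the reported PhysX minimums."""
    return (series >= 8
            and cuda_cores >= MIN_CUDA_CORES
            and vram_mb >= MIN_VRAM_MB)

# A bargain 8500 GT (16 stream processors) falls short;
# an 8600 GT (32 stream processors) makes the cut.
print(supports_gpu_physx(8, 16, 256))  # False
print(supports_gpu_physx(8, 32, 256))  # True
```

So "a $30 card" may be optimistic unless it's at least an 8600-class part.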
Re:CRT? Are you from the past? (Score:3, Interesting)
I wonder if CRT persistence would become a problem at that high of a refresh rate. I had a similar system (Asus had 3D goggles that were tied to a dongle on the VGA port, pre-DVI) but the 3D would get really blurry at refresh rates higher than 60Hz due to phosphor persistence (essentially 30Hz per eye, though it's probably not that simple since they're alternating). IIRC, that 60Hz was interlaced as well.
Made for some serious migraines, but it was neat to play Descent II in 3D for 15 minutes at a time until my head asplode.
I do have a 120Hz LCD monitor now, but I haven't sunk the extra dollars in for the 3D glasses. I love the frame rates I'm getting now - movement in FPSes is liquid smooth... very reminiscent of my CRT days, but I'm not sure I want to revisit the 3D stuff again until I see more user feedback.
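The per-eye arithmetic in that comment is easy to sketch: alternating shutter glasses show each eye every other frame, so the effective rate is half the display refresh (the caveat about interlacing is real, but ignored in this toy calculation):

```python
def per_eye_rate(refresh_hz: float) -> float:
    """Effective refresh per eye with alternating shutter glasses:
    each eye sees every other frame, i.e. half the display refresh."""
    return refresh_hz / 2.0

print(per_eye_rate(60))   # the old CRT rig: 30.0 Hz per eye -> flicker city
print(per_eye_rate(120))  # a 120Hz panel: 60.0 Hz per eye
```

Which is why 120Hz displays were the point of the newer glasses: 60Hz per eye is roughly flicker-free, while 30Hz per eye is migraine territory.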
Re:Havok (Score:4, Interesting)
Re:Havok (Score:2, Interesting)
Re:Havok (Score:3, Interesting)
Well, ATI cards have some issues still. One that comes to mind:
WoW under wine. The minimap displays solid white because of an issue with pixel buffers in the ATI Catalyst drivers.
Not saying ATI sucks, but they do still have some issues that need to be addressed, particularly on the Linux side of the pond.
Re:Havok (Score:3, Interesting)
2: I've considered using a mix of ATI and nVidia cards on my primary machine, which is also where I play games. Why? I'd like to move from having dual displays to having three, and I ostensibly do have enough hardware to do so. But due to nVidia's driver limitations, I'd have to turn off SLI in order to make all of the DVI outputs live at the same time, and I don't want to turn off SLI.
Currently, the way around this problem is to install another GPU of a different brand. In this way, one can utilize SLI on a single monitor, and use the other GPU for one or more secondary monitors.
And soon, it looks like that configuration will carry an additional caveat. Hooray.
Re:Looks like you're wrong (Score:3, Interesting)
As of last month. They just added that back. A bit too late (plus I'm still unwilling to infect my system with Vista, AND buy their $200 glasses when my old ones were exactly the same hardware, just to get it working again).