Graphics Hardware

NVidia Cripples PhysX "Open" API

An anonymous reader writes "In a foot-meets-bullet move, NVidia is going to disable the PhysX engine if you are using a display adapter other than one that came from their company. This despite the fact that you may have an NVidia card in your system specifically to do this type of processing. 'For a variety of reasons (some development expense, some quality assurance and some business reasons) Nvidia will not support GPU-accelerated PhysX with Nvidia GPUs while GPU rendering is happening on non-Nvidia GPUs.' Time to say hello to Microsoft's DirectX physics or Intel's Havok engine."
  • Anti-trust? (Score:5, Interesting)

    by headkase ( 533448 ) on Wednesday September 30, 2009 @04:09PM (#29598141)
    Why is this not anti-trust? When you've paid for the nVidia card to put into your machine, why should its functions depend on whether or not a competitor's hardware is present? What if Windows said, uh-oh, you have Linux installed on another partition, disabling Windows...
  • by H3lldr0p ( 40304 ) on Wednesday September 30, 2009 @04:27PM (#29598417) Homepage

    Stop things like this [anandtech.com] from working?

  • Weird to begin with (Score:3, Interesting)

    by dagamer34 ( 1012833 ) on Wednesday September 30, 2009 @04:31PM (#29598477)
    Who on earth has graphics cards from two different manufacturers? Regardless, it means they've directly tied PhysX to their hardware, and I just don't care for them anymore. ATI all the way, baby!
  • Proprietary APIs (Score:4, Interesting)

    by Adrian Lopez ( 2615 ) on Wednesday September 30, 2009 @04:32PM (#29598485) Homepage

    I'm currently avoiding PhysX because the license requires that credit be given to nVidia/PhysX in any advertisement that mentions the advertised product's physics capabilities. It's a real shame, because I hear that PhysX has a pretty robust physics implementation.

    The current state of physics acceleration reminds me of the days when hardware-accelerated 3D graphics (except for high-end OpenGL stuff) were only supported through manufacturer-specific APIs. Hopefully, DirectX physics will be good enough that PhysX will ultimately become mostly irrelevant to game developers -- I'm just not convinced that Microsoft can pull it off.

  • by Anonymous Coward on Wednesday September 30, 2009 @04:34PM (#29598509)

    Between the full stack (CPU + chipset + GPU) provided by AMD and the full stack that Intel will have (with Larrabee in 2010), Nvidia has no future in either chipsets or GPUs. Any other outcome is a bet against integration, and in electronics, integration always wins.

    Good thing too; both Intel and AMD are vastly more open (at least recently) with their hardware.

  • not a problem (Score:2, Interesting)

    by poly_pusher ( 1004145 ) on Wednesday September 30, 2009 @04:34PM (#29598511)
    According to Techspot [techspot.com], AMD has been working hard to develop Open Physics. Furthermore, Bullet Physics has been shown running on CUDA. So that sounds to me like doom for PhysX...
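    As a rough illustration of what that open alternative looks like from the developer's side, here is a minimal Bullet sketch in plain C++ (the ordinary CPU path, not the CUDA port; the single falling sphere and the 60 Hz step are just illustrative choices):

        #include <btBulletDynamicsCommon.h>
        #include <cstdio>

        int main() {
            // Standard Bullet setup: collision config, dispatcher, broadphase, solver, world.
            btDefaultCollisionConfiguration config;
            btCollisionDispatcher dispatcher(&config);
            btDbvtBroadphase broadphase;
            btSequentialImpulseConstraintSolver solver;
            btDiscreteDynamicsWorld world(&dispatcher, &broadphase, &solver, &config);
            world.setGravity(btVector3(0, -9.8f, 0));

            // One dynamic body: a 1 kg sphere dropped from y = 10.
            btSphereShape sphere(0.5f);
            btVector3 inertia(0, 0, 0);
            sphere.calculateLocalInertia(1.0f, inertia);
            btDefaultMotionState motion(
                btTransform(btQuaternion::getIdentity(), btVector3(0, 10, 0)));
            btRigidBody::btRigidBodyConstructionInfo info(1.0f, &motion, &sphere, inertia);
            btRigidBody body(info);
            world.addRigidBody(&body);

            // Step at 60 Hz for one second and print the sphere's height each frame.
            for (int i = 0; i < 60; ++i) {
                world.stepSimulation(1.0f / 60.0f);
                btTransform t;
                body.getMotionState()->getWorldTransform(t);
                std::printf("frame %02d: y = %.3f\n", i, t.getOrigin().getY());
            }

            world.removeRigidBody(&body);
            return 0;
        }

    (Links against the stock Bullet libraries: BulletDynamics, BulletCollision, LinearMath.)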
  • Re:Anti-trust? (Score:4, Interesting)

    by Moryath ( 553296 ) on Wednesday September 30, 2009 @04:42PM (#29598597)

    I gave up on Nvidia when they screwed over my 3D glasses setup; I'd gone through all the trouble of maintaining my rig with an NVidia graphics card, because their occasional driver updates for the stereoscopic driver still made my old VRStandard rig (coupled with a 120Hz-capable CRT) run well.

    Lo and behold, their latest set "only" works either with the Nvidia-branded "Geforce 3D Vision" glasses and a short list of extra-expensive "approved" 120 Hz LCDs, or else with red/blue anaglyph setups. There's no reason for them to cut off older shutter-glasses setups except to force people to buy their new kit if they want to continue to have stereoscopic 3D.

    So add the PhysX thing in, and we can chalk up two strikes for Nvidia. My new card when I updated my computer this summer was an ATi (no point wasting the $$$ on an Nvidia). One more strike and I won't bother going back to them ever. Boy, am I glad I didn't buy that second-hand PCI PhysX board the other day...

  • Re:Havok (Score:3, Interesting)

    by sopssa ( 1498795 ) * <sopssa@email.com> on Wednesday September 30, 2009 @04:45PM (#29598639) Journal

    I'm always impressed by Havok. Whenever I pick up a game that uses it I always smile as I know I'm going to enjoy the physics if nothing else.

    This is a bonehead move from nVidia as they've essentially just killed PhysX.

    Or, they've strengthened PhysX's position, and along the way their gfx cards' too. When a company buys some technology, it's never without a reason.

  • Re:Anti-trust? (Score:3, Interesting)

    by Hadlock ( 143607 ) on Wednesday September 30, 2009 @04:52PM (#29598721) Homepage Journal

    What's to say they won't release a more expensive dual- or quad-GPU card with no video output, at a higher price (and profit margin)? This sort of move indicates that's what they're planning on doing. Buying cheaper single-GPU video cards might cannibalize that market.

  • Re:Just in time! (Score:3, Interesting)

    by j00r0m4nc3r ( 959816 ) on Wednesday September 30, 2009 @04:52PM (#29598725)
    I don't see what the big deal is. They currently only support their cloth simulation on the GPU, so whether or not the GPU is being used doesn't affect rigid-body physics at all. Havok is ridiculously expensive, and they've dropped GPU support for their HavokFX system. I wouldn't discount PhysX based on this announcement alone unless all you care about is cloth.
  • by Rix ( 54095 ) on Wednesday September 30, 2009 @05:02PM (#29598855)

    You really can't blame them for dropping support for CRTs. If you can even buy them anymore, you'd have to be insane to want to.

  • Re:Havok (Score:2, Interesting)

    by Anonymous Coward on Wednesday September 30, 2009 @05:14PM (#29598967)

    Yeah, but I have every right to stop being their customer as well. nVidia burned me twice in the last two years. Once on an m1330 laptop, where their chips were spec'd wrong thermally, so they would basically melt themselves if OEMs followed nVidia's recommended cooling. nVidia worked hard to bury the issue, preventing people like myself from getting a legitimate replacement for the lemon we were sold. The other time, they REFUSED to add dual-monitor support for the desktop (not games, just the DESKTOP) if you were running SLI on a 7xxx-series graphics card. You could get it... if you upgraded to SLI 8xxx cards. Considering that the formerly excellent quality of their drivers is now in the gutter (and was headed downhill for a long time to get there), I saw no more reason to put up with them.

    My desktop has an ATI graphics card now. My wallet did the talking, and it said "Fuck you, nVidia." The more shit they pull like this, I hope other people vote with their wallets as well. Punish this behavior: boycott nVidia.

  • by JSBiff ( 87824 ) on Wednesday September 30, 2009 @05:22PM (#29599069) Journal

    Integration only wins when the integrated chip is "good enough". Intel has had "integrated, accelerated" graphics chips on their mobos for ages, but they've been so monumentally inferior that anyone who wanted to play even 'older' 3D games like Q3-engine-based games, Far Cry, Unreal, and most MMOGs released in the last six years needed to add a discrete GPU.

    From the reviews I've seen, unless you want to muck around with real-time ray tracing (which Intel still hasn't gotten to very good performance, from what I understand, but they are still working on it), Larrabee will still be inferior to nVidia and AMD/ATI GPUs. If they prove me wrong and Larrabee really is "good enough", then you might be right.

    What it comes down to is that for nVidia to survive, they have to either:
    a) come out with some tech breakthroughs that keep their chips much superior to Intel/AMD, then convince developers to forget about compatibility with such "inferior" chips (unlikely, but, hey, maybe possible?),
    b) start releasing their own CPU/mobo/integrated chip stacks (they are already working on this some, particularly in the ultra-mobile/netbook and HTPC/media-center device space),
    c) work with third-party mobo manufacturers to integrate their chips into the mobos instead of Intel's or AMD's (I think they've been doing this for a couple of years now?),
    d) get some big console 'win', like convincing Microsoft, Sony, or Nintendo to use nVidia chips as the basis of their next-generation console offering, or
    e) all of the above.

    I'm not ready to count nVidia out just yet, because they've been laying the groundwork for 4 or 5 years to have their own tech integrated into motherboards and devices.

  • by DragonTHC ( 208439 ) <Dragon&gamerslastwill,com> on Wednesday September 30, 2009 @05:53PM (#29599331) Homepage Journal

    If you think about it, PhysX works on all 8-series cards and up.

    That's a $30 card for PhysX support. I wonder if I can do this, since I have a spare x16 slot in my machine.

    I don't really know if this will work though.

  • by TommydCat ( 791543 ) on Wednesday September 30, 2009 @07:01PM (#29599979) Homepage

    What is the capability difference between their overpriced "partnered" LCD monitors and my 120Hz-capable CRT? Two things: Jack and Shit.

    I wonder if CRT persistence would become a problem at that high of a refresh rate. I had a similar system (Asus had 3D goggles that were tied to a dongle on the VGA port, pre-DVI) but the 3D would get really blurry at refresh rates higher than 60Hz due to phosphor persistence (essentially 30Hz per eye, though it's probably not that simple since they're alternating). IIRC, that 60Hz was interlaced as well.

    Made for some serious migraines, but it was neat to play Descent II in 3D for 15 minutes at a time until my head asplode.

    I do have a 120Hz LCD monitor now, but I haven't sunk the extra dollars in for the 3D glasses. I love the frame rates I'm getting now - movement in FPSes is liquid smooth... very reminiscent of my CRT days, but I'm not sure I want to revisit the 3D stuff again until I see more user feedback.

  • Re:Havok (Score:4, Interesting)

    by V!NCENT ( 1105021 ) on Wednesday September 30, 2009 @07:09PM (#29600045)
    Yeah, it really sucks that some major vendors work together to deliver you a platform-inde-fscking-pendent solution that speeds up your computer at no extra freaking cost, with no patents and other crap. What hidden agenda are you pushing?
  • Re:Havok (Score:2, Interesting)

    by V!NCENT ( 1105021 ) on Wednesday September 30, 2009 @07:16PM (#29600091)
    Add to that open source drivers if you're a Linux user and Coreboot support (maybe a little offtopic) and you know why AMD is 'The smarter choice' (yes copied that right from their marketing department).
  • Re:Havok (Score:3, Interesting)

    by rainmaestro ( 996549 ) on Wednesday September 30, 2009 @09:59PM (#29601115)

    Well, ATI cards still have some issues. One that comes to mind:
    WoW under Wine. The minimap displays solid white because of an issue with pixel buffers in the ATI Catalyst drivers.

    Not saying ATI sucks, but they do still have some issues that need to be addressed, particularly on the Linux side of the pond.

  • Re:Havok (Score:3, Interesting)

    by adolf ( 21054 ) <flodadolf@gmail.com> on Wednesday September 30, 2009 @10:03PM (#29601137) Journal

    2: I've considered using a mix of ATI and nVidia cards on my primary machine, which is also where I play games. Why? I'd like to move from having dual displays to having three, and I ostensibly do have enough hardware to do so. But due to nVidia's driver limitations, I'd have to turn off SLI in order to make all of the DVI outputs live at the same time, and I don't want to turn off SLI.

    Currently, the way around this problem is to install another GPU of a different brand. In this way, one can utilize SLI on a single monitor, and use the other GPU for one or more secondary monitors.

    And soon, it looks like that configuration will carry an additional caveat. Hooray.

  • by Moryath ( 553296 ) on Thursday October 01, 2009 @09:39AM (#29604857)

    As of last month, they just added that back. A bit too late (plus I'm still unwilling to infect my system with Vista, AND buy their $200 glasses when my old ones were exactly the same hardware, just to get it working again).
