
NVidia Cripples PhysX "Open" API

An anonymous reader writes "In a foot-meet-bullet type move, NVidia is going to disable the PhysX engine if you are using a display adapter other than one that came from their company. This despite the fact that you may have an NVidia card in your system specifically to do this type of processing. 'For a variety of reasons - some development expense, some quality assurance and some business reasons - Nvidia will not support GPU accelerated PhysX with Nvidia GPUs while GPU rendering is happening on non-Nvidia GPUs.' Time to say hello to Microsoft's DirectX physics or Intel's Havok engine."
  • Havok (Score:5, Insightful)

    by sopssa ( 1498795 ) * <sopssa@email.com> on Wednesday September 30, 2009 @04:07PM (#29598137) Journal

    Havok is a better engine anyway.

    But that's the problem with corporate buyouts anyway. Even if it's kinda wrong to stop supporting the other platforms, they have every right to do so.

    • Re:Havok (Score:4, Insightful)

      by negRo_slim ( 636783 ) <mils_orgen@hotmail.com> on Wednesday September 30, 2009 @04:14PM (#29598229) Homepage

      Havok is a better engine anyway.

      That may be the case but in the end we'll more than likely see corporate drama surrounding that effort as well.
      I hate to say it but I think a DirectX option is the lesser of three evils.

      • Re:Havok (Score:5, Funny)

        by Kratisto ( 1080113 ) on Wednesday September 30, 2009 @04:22PM (#29598333)
        That's a new record for a Microsoft product. Lesser of two evils? Okay, occasionally. But a lesser of three!? There's hope for them yet!
        • by sopssa ( 1498795 ) *

          Actually, Windows Mobile is less evil than three others too, as far as openness goes: iPhone, Symbian and Palm are all quite closed.

        • Re: (Score:3, Funny)

          That's a new record for a Microsoft product. Lesser of two evils? Okay, occasionally. But a lesser of three!? There's hope for them yet!

          Why am I suddenly reminded of the game "Eternal Darkness"?

        • Re:Havok (Score:5, Funny)

          by azior ( 1302509 ) on Thursday October 01, 2009 @02:11AM (#29602425)

          That's a new record for a Microsoft product. Lesser of two evils? Okay, occasionally. But a lesser of three!? There's hope for them yet!

          Microsoft <3

          you mean like this?

          • Re:Havok (Score:5, Funny)

            by Fantom42 ( 174630 ) on Thursday October 01, 2009 @09:12AM (#29604539)

            That's a new record for a Microsoft product. Lesser of two evils? Okay, occasionally. But a lesser of three!? There's hope for them yet!

            .

            Microsoft <3

            you mean like this?

            .

            Close, but more like this: Microsoft <3 Evil. *

      • Havok is a better engine anyway.

        That may be the case but in the end we'll more than likely see corporate drama surrounding that effort as well. I hate to say it but I think a DirectX option is the lesser of three evils.

        What's wrong with OpenCL exactly?

        • Re: (Score:3, Insightful)

          by jgtg32a ( 1173373 )
          The Khronos Group
          • Re:Havok (Score:4, Interesting)

            by V!NCENT ( 1105021 ) on Wednesday September 30, 2009 @07:09PM (#29600045)
            Yeah it really sucks that some major vendors work together to deliver you a platform inde-fscking-pendant solution that speeds up your computer at no extra freaking costs, patents and other crap. What hidden agenda are you pushing?
        • OpenCL isn't a physics engine. PhysX, Havok, and dx physics all are.
          • Re:Havok (Score:5, Insightful)

            by DJRumpy ( 1345787 ) on Wednesday September 30, 2009 @06:47PM (#29599871)
            Doesn't one normally wait until they have a good market for a product before they try to lock people in? This will only drive people to an engine that is more widely supported, or to an open standard that does the same thing. I understand the business reason, but it seems silly to show all your cards this early in the game.
            • Re:Havok (Score:5, Insightful)

              by electrosoccertux ( 874415 ) on Wednesday September 30, 2009 @11:13PM (#29601537)

              Havok and DX physics are completely open and either vendor can use them; no proprietary API or licensing or anything silly. No hardware vendor controls what happens.

              PhysX is not. It is controlled by Nvidia. Gosh, they wouldn't have financial motives to abuse this power would they? No of course not...

              Nvidia lately seems to have been getting around the whole market segmentation issue by ... paying off forum members in all the hot PC Hardware forums? Lately my favorite has been inundated with troll and fanboy posts proclaiming the wonders of PhysX (still waiting for a game where it actually adds anything) and the death of AMD/ATi.

      • Re: (Score:3, Insightful)

        by Sinan H ( 1548991 )
        No. Bullet Physics and OpenCL are the answer to this problem, not a closed standard like DirectCompute that you can only use on Windows. Havok will use Larrabee and PhysX uses CUDA, but they will all eventually use OpenCL. Although Bullet Physics can be ported to Larrabee and CUDA (demos already exist), support for OpenCL is the right way to go.
    • Re: (Score:3, Informative)

      I'm always impressed by Havok. Whenever I pick up a game that uses it I always smile as I know I'm going to enjoy the physics if nothing else.

      This is a bonehead move from nVidia as they've essentially just killed PhysX.

      • Re: (Score:3, Interesting)

        by sopssa ( 1498795 ) *

        I'm always impressed by Havok. Whenever I pick up a game that uses it I always smile as I know I'm going to enjoy the physics if nothing else.

        This is a bonehead move from nVidia as they've essentially just killed PhysX.

        Or, they've strengthened PhysX's position, and along the way their gfx cards' too. When a company buys some technology, it's never without a reason.

        • Re: (Score:3, Informative)

          by 0xygen ( 595606 )

          How can a developer now realistically choose PhysX when they know it would cut their target market by 25%?

          They've killed it.

          • Re:Havok (Score:4, Insightful)

            by jpmorgan ( 517966 ) on Wednesday September 30, 2009 @08:25PM (#29600543) Homepage
            25%? Really? There are two possible usage scenarios they've killed:

            1. An onboard NVIDIA device with a discrete ATI graphics card. From what I've heard, PhysX running on integrated devices isn't any faster than running on the CPU in software mode, so no target market has been lost there.

            2. Having both a discrete ATI graphics card, and an unused GeForce 8000+ or Tesla. That is a pretty fucking weird configuration. I can't see that being more than a tenth of a percent of gamers. I've personally never encountered someone who runs both.

            Mountain. Molehill.
            • Re: (Score:3, Interesting)

              by adolf ( 21054 )

              2: I've considered using a mix of ATI and nVidia cards on my primary machine, which is also where I play games. Why? I'd like to move from having dual displays to having three, and I ostensibly do have enough hardware to do so. But due to nVidia's driver limitations, I'd have to turn off SLI in order to make all of the DVI outputs live at the same time, and I don't want to turn off SLI.

              Currently, the way around this problem is to install another GPU of a different brand. In this way, one can utilize SL

      • Re: (Score:3, Insightful)

        by Korin43 ( 881732 )
        This is sort of a dick move, but I don't think this is going to hurt PhysX much. I mean really, how many people use two video cards in the first place? I really doubt a large portion of Nvidia users are going to care.
    • Re:Havok (Score:4, Insightful)

      by interval1066 ( 668936 ) on Wednesday September 30, 2009 @06:20PM (#29599625) Journal

      @sopssa: "Havok is a better engine anyway."

      By saying that in the context of this article you're implying that Havok is a more open, less IP/license/business-relationship-constrained option, and I don't think that's true. If Intel wants to exert its rights over the technology in the same way, we're right back to the same situation as with PhysX; Havok may be better technically but it's worthless if no one can get their hands on it.

    • Even if it's kinda wrong to stop supporting the other platforms, they have every right to do so.

      That's quite a contradiction you made there.

  • Anti-trust? (Score:5, Interesting)

    by headkase ( 533448 ) on Wednesday September 30, 2009 @04:09PM (#29598141)
    Why is this not anti-trust? When you paid for the nVidia card to put into your machine, why should its functions depend on whether or not a competitor's hardware is present? What if Windows said uh-oh, you have Linux installed on another partition, disabling Windows...
    • Re: (Score:2, Insightful)

      Look at it from a technical standpoint. They probably expected people (however wrongly) using PhysX to be doing so for games while using their card to render also. Throw a third-party bit of hardware in there, and when the inevitable crash and burn goes down, who is to blame? They don't know either... so they "solve" the problem by keeping you from ever being able to expose it.
      • Re:Anti-trust? (Score:4, Interesting)

        by Moryath ( 553296 ) on Wednesday September 30, 2009 @04:42PM (#29598597)

        I gave up on Nvidia when they screwed over my 3D glasses setup; I'd gone through all the trouble of maintaining my rig with an NVidia graphics card, because their occasional driver updates for the stereoscopic driver still made my old VRStandard rig (coupled with a 120Hz-capable CRT) run well.

        Lo and behold, their latest set "only" works either with the Nvidia-branded "Geforce 3D Vision" glasses and a short list of extra-expensive "approved" 120 Hz LCDs, or else red/blue anaglyph setups. No reason for them to cut off older shutter-glasses setups except to force people to buy their new setup if they wanted to continue to have stereoscopic 3D.

        So add the PhysX thing in and we can chalk up two strikes for Nvidia. My new card when I updated my computer this summer was an ATi (no point wasting the $$$ on a Nvidia). One more strike and I won't bother going back to them ever. Boy am I glad I didn't buy that second-hand PCI PhysX board the other day...

    • Re:Anti-trust? (Score:4, Insightful)

      by MozeeToby ( 1163751 ) on Wednesday September 30, 2009 @04:13PM (#29598217)

      Worse than that even, this is using your strength in one industry segment (physics acceleration) to support sales of an arguably different segment (graphics acceleration).

      • Re:Anti-trust? (Score:5, Informative)

        by mcrbids ( 148650 ) on Wednesday September 30, 2009 @04:19PM (#29598299) Journal

        Worse than that even, this is using your strength in one industry segment (physics acceleration) to support sales of an arguably different segment (graphics acceleration).

        Which is nasty and unethical to be sure, but it's not illegal unless it can be legally shown that Nvidia is a monopoly. It's amazing to me how many slashbots don't understand this distinction.

        I'm pissed at ATI for dropping binary support for FGLRX for Linux kernels later than 2.6.29, and was considering getting an Nvidia GPU in my next laptop, but now it looks an awful lot like Intel is getting my $50....

        • Which is nasty and unethical to be sure, but it's not illegal unless it can be legally shown that Nvidia is a monopoly. It's amazing to me how many slashbots don't understand this distinction.

          Is there another hardware-accelerated physics computing system that we are not aware of?

        • Re:Anti-trust? (Score:4, Informative)

          by Lonewolf666 ( 259450 ) on Wednesday September 30, 2009 @04:42PM (#29598603)

          Getting a bit off topic, but I like the direction ATI has been taking recently with open source. Long term, I think they will be the better choice for Linux.
          In a recent test at Phoronix (http://www.phoronix.com/scan.php?page=article&item=amd_r600_r700_2d&num=1 [phoronix.com]) the OS driver already offered better 2D performance than the binary one :-)

        • Re: (Score:3, Informative)

          by mikeee ( 137160 )

          I'm pissed at ATI for dropping binary support for FGLRX for Linux kernels later than 2.6.29, and was considering getting an Nvidia GPU in my next laptop, but now it looks an awful lot like Intel is getting my $50....

          It was my understanding they had only dropped updated support for older cards (R500?), which are pretty well supported by the OS driver these days anyway, now that ATI is publishing specs again. Am I confused?

        • Re:Anti-trust? (Score:4, Informative)

          by PitaBred ( 632671 ) <slashdot&pitabred,dyndns,org> on Wednesday September 30, 2009 @05:26PM (#29599115) Homepage
          fglrx has always sucked. Why not look at an ATI card because their open-source driver is really maturing? It's OpenGL 1.4, and all recent cards should be supported in Ubuntu 10.04. My Radeon 4670 is already supported in the Fedora Core 12 betas (or are they alphas? I can never remember). ATI's open-source drivers are currently supporting Doom3, OpenArena, Nexuiz... lots of stuff. And they're playing the game with Intel, doing the acceleration the right way, instead of replacing most of the graphics stack with their own binary module like Nvidia does.
    • Re:Anti-trust? (Score:5, Insightful)

      by Joce640k ( 829181 ) on Wednesday September 30, 2009 @04:21PM (#29598323) Homepage

      This phrase "anti-trust", I don't think it means what you think it means.

      How are they leveraging a monopoly to gain unfair advantage in a marketplace?

      To me it seems more like NVIDIA has finally realized that they *can't* use it to gain unfair advantage so they're dumping it.

    • Why is this not anti-trust? ...

      Because nVidia doesn't have a monopoly in the video card market.

    • Re: (Score:3, Interesting)

      by Hadlock ( 143607 )

      What's to say they won't release a more expensive dual or quad GPU card with no video output, at a higher cost (and profit margin)? This sort of move indicates that's what they're planning on doing. Buying cheaper single-GPU video cards might cannibalize that market.

  • Truth (Score:5, Funny)

    by DoofusOfDeath ( 636671 ) on Wednesday September 30, 2009 @04:11PM (#29598183)

    'For a variety of reasons - some development expense, some quality assurance and some business reasons - Nvidia will not support GPU accelerated PhysX with Nvidia GPUs while GPU rendering is happening on non-Nvidia GPUs.'

    At least he was 33.3% truthful.

  • But... (Score:5, Funny)

    by nicc777 ( 614519 ) on Wednesday September 30, 2009 @04:12PM (#29598199) Homepage Journal
    ...will it make my $TERM faster?
  • I was about to start using it, this announcement has saved me a lot of wasted effort.

    • Re: (Score:3, Interesting)

      by j00r0m4nc3r ( 959816 )
      I don't see what the big deal is. They currently only support their cloth simulation on the GPU, so whether or not the GPU is being used doesn't affect rigid body physics at all. Havok is ridiculously expensive, and they've dropped GPU support for their HavokFX system. I wouldn't discount PhysX based on this announcement alone unless all you care about is cloth.
  • Was Nvidia previously offering a software framework that could run on any GPU, but now only supports their own? Can ATI (or anyone else) not implement the standard in their own drivers?

    • by cheesybagel ( 670288 ) on Wednesday September 30, 2009 @04:22PM (#29598331)
      It is no standard. PhysX was an API made by a company (Ageia) who wanted to sell physics acceleration cards. Their cards never sold well, but the free-as-in-beer software libraries were used by a number of people (the libraries supported CPU execution as well). Then NVIDIA bought them and ported the thing to run on their GPUs. So I see this ending up like the 3Dfx Glide API for 3D graphics - some historic games used it, such as Mechwarrior, but no one uses it anymore.
    • by Unit3 ( 10444 ) on Wednesday September 30, 2009 @04:25PM (#29598393) Homepage

      No. The framework would only run on their GPUs. However, you could have one of their cards in the system to do purely physics calculations, and then use a competitor's card to do the actual display and 3D rendering. They've now disabled this, so if your monitors are connected to, say, an ATI card, you can no longer use the Nvidia card in your system for physics processing.

      Before you discount this as an unlikely scenario, consider motherboards with onboard NVidia chipsets. These are usually underpowered for full time duty, but are perfectly suited to being used for physics calculations while a more powerful ATI card in the PCI-E slot does the graphics rendering. This is actually a fairly likely setup these days, and NVidia has just said they're going to block it.

      Personally, I agree with others who have pointed out this must be an anti-trust issue. Intel and Microsoft have both been fined heavily recently for doing exactly this kind of anti-competitive behaviour.

      • by BitZtream ( 692029 ) on Wednesday September 30, 2009 @04:55PM (#29598765)

        It's anti-consumer, but that doesn't trigger an anti-trust charge; they don't have a monopoly.

        Why does everyone scream like it's illegal when a company does something they don't like? Unless they are king of the hill and using their powers to force others into capitulating with them, it's not an issue for the courts. You don't have to buy nVidia. You don't have to use PhysX. You don't have to buy a Voodoo 3 card. Sure, a game may only support one of the above, but that's not something that justifies going after nVidia unless they owned the market.

      • by Old97 ( 1341297 ) on Wednesday September 30, 2009 @04:59PM (#29598805)
        Microsoft and Intel are monopolies. Nvidia is not. You also can't designate a company as a monopoly by narrowly defining some market niche. Barriers to entry for the market in question are also a consideration. It's not an issue here. If you are not a monopoly, then you can engage in a broader set of behaviors. What got Microsoft in trouble is that they continued their anti-competitive behavior after they gained their monopoly and attempted to leverage their existing monopoly to gain unfair advantages in other markets, i.e. web browsers. If Apple had done that it would have been perfectly legal because they don't have a monopoly. If Microsoft had not had a monopoly, what they did to Netscape would have been legal.
      • by smoker2 ( 750216 ) on Wednesday September 30, 2009 @04:59PM (#29598819) Homepage Journal
        How can it be anti-trust if (a) they aren't a monopoly, and (b) they are disabling their own hardware?

        If they caused the ATI card to not function then I could understand it, but a secondary function on their own card?
      • by afidel ( 530433 ) on Wednesday September 30, 2009 @05:24PM (#29599093)
        No, the framework also has a software renderer, and on many cards you get significantly better overall game performance by using it (I know this is true for my 9600 GSO 384: I got a ~22% FPS boost by uninstalling the driver component for PhysX and using the software renderer). The software renderer is also significantly less likely to crash your system. So unless you have a slow CPU with a monster GPU and are willing to accept more crashes, there's really not a lot of reason to use the GPU-tied renderer.
    • No, PhysX is (and was) only ever hardware accelerated on Nvidia/Ageia hardware. Before, you could add a second (Nvidia) card to your system and use it for PhysX. All this announcement is saying is that people using AMD as their primary GPU can no longer do this.

  • by Flowstone ( 1638793 ) on Wednesday September 30, 2009 @04:17PM (#29598273)
    First they scoop up PhysX and try to create a market for PPUs. Now the only way PhysX is ever going to get any use is out of pure coincidence. Not the smartest move for Nvidia to make when Ati/AMD is on their heels with a new line of cards.
    • Nope... (Score:3, Insightful)

      by Junta ( 36770 )

      PhysX was trying to make a market for PPUs (and relatively failing). nVidia bought them up to make the technology another marketing bullet point for their GPU parts, not to sell GPU parts merely for physics calculations. Sure, they'll take the business as it comes incidentally, but they have no interest in anything that could remotely be construed as putting something other than their role as a graphics adapter vendor first.

    • And by "on their heels" you mean having better performance at half the cost?
    • by PhrostyMcByte ( 589271 ) <phrosty@gmail.com> on Wednesday September 30, 2009 @05:44PM (#29599255) Homepage
      An OpenCL implementation of Bullet physics [bulletphysics.com] is coming. It's Open Source and is already being used in commercial games -- once it gets GPU acceleration there will probably be little demand for PhysX.
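
      (Aside, for anyone who hasn't looked at Bullet: unlike PhysX, its CPU path is plain, openly licensed C++. A minimal sketch of a dynamics world with one falling sphere, assuming the stock Bullet 2.x API, looks roughly like the following; the OpenCL backend mentioned above is meant to sit behind the same world interface.)

          #include <btBulletDynamicsCommon.h>

          int main() {
              // Standard Bullet 2.x plumbing: broadphase, collision config, dispatcher, solver
              btDbvtBroadphase broadphase;
              btDefaultCollisionConfiguration config;
              btCollisionDispatcher dispatcher(&config);
              btSequentialImpulseConstraintSolver solver;
              btDiscreteDynamicsWorld world(&dispatcher, &broadphase, &solver, &config);
              world.setGravity(btVector3(0, -9.8f, 0));

              // One dynamic sphere of unit mass, dropped from y = 50
              btSphereShape shape(1.0f);
              btDefaultMotionState motion(btTransform(btQuaternion(0, 0, 0, 1), btVector3(0, 50, 0)));
              btVector3 inertia(0, 0, 0);
              shape.calculateLocalInertia(1.0f, inertia);
              btRigidBody::btRigidBodyConstructionInfo info(1.0f, &motion, &shape, inertia);
              btRigidBody body(info);
              world.addRigidBody(&body);

              // Step at 60 Hz for one simulated second; the second argument caps internal substeps
              for (int i = 0; i < 60; ++i)
                  world.stepSimulation(1.0f / 60.0f, 10);

              world.removeRigidBody(&body);
              return 0;
          }

      (A game loop would call stepSimulation once per frame with the real elapsed time and read each body's transform back out of its motion state for rendering.)
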
  • by headkase ( 533448 ) on Wednesday September 30, 2009 @04:24PM (#29598363)
    Here are some thoughts on the meaning of this. The PC is an open architecture; you are free to put whatever you want into your machine. If nVidia can dictate what their hardware works with, then they are effectively creating an "nVidia-Approved" list of hardware. First step down the slippery slope of closing the PC's openness. In the software world an equivalent would be Windows refusing to connect to network shares that were based off of Samba, or, the other way around, a Windows box refusing connections from Linux machines. Standards apply to hardware as well as software, and if any manufacturer gets away with an "approved" list then the platform as a whole will eventually suffer for it.
    • Re: (Score:3, Insightful)

      by poetmatt ( 793785 )

      Windows is an "approved list of hardware". Ever tried to run DirectX under anything else?

      OpenGL 3 is the first time that companies are breaking away from Windows.

      You can't keep a PC closed forever because it's bad for business.

      • by Ant P. ( 974313 ) on Wednesday September 30, 2009 @04:45PM (#29598635)

        OpenGL 3 is the first time that companies are breaking away from Windows.

        It seems like OpenAL was the first. Creative have been visibly pushing it now that Vista's forced-software-only sound API has made their sound cards pointless.

        • You are correct. I meant for graphics, but I didn't really think about that with OpenAL. Thank you for the correction.

        • by Joe U ( 443617 ) on Wednesday September 30, 2009 @11:11PM (#29601531) Homepage Journal

          MS pulled a smackdown on Creative. Creative cards (and drivers, especially drivers! [FU creative]) have been sucking for years.

          So, new OS comes out and MS removes all the hooks that 3rd parties have been putting into the Windows sound system, instantly leveling the playing field and removing a major source of Windows instability.

          One of the few times MS really did the right thing.

  • oh well (Score:5, Informative)

    by poetmatt ( 793785 ) on Wednesday September 30, 2009 @04:25PM (#29598391) Journal

    PhysX seemed nice until they tried to close-source it. Does Nvidia have anything left this round? Bad yields [semiaccurate.com], PhysX being stupid and abusive when disabled (it only uses 1 CPU core when on AMD, for example [driverheaven.net], instead of even all threads). Not to mention their crippling of Batman as well. [hardwarezone.com.sg]

    So what's left for Nvidia? I don't see a whole lot.

    • by LWATCDR ( 28044 )

      Actually a lot.
      They have the ION platform, which is much better than what Intel supplies for netbooks and nettops.
      They have hardware flash acceleration coming.
      They used to have the best Linux drivers but I have not been keeping up with ATI's progress with their closed source drivers or the open source drivers that people are working on with the specs that ATI released.
      And they have a lot of mindshare and support from game makers. I just hope that Nvidia gets headed back in the right direction. It is good t

      • ION is still only on par with ATI's integrated products. Not worse, not better. I like the Tegra solutions they have had, but you know, that's not exactly a huge growing business sector yet (although it could become one).

        they are rapidly losing "mindshare" behind closed doors, because people aren't liking the results of PhysX and its impact on sales.

        Hardware flash acceleration? That's not unique to Nvidia or a solution to anything that exists. Nobody wants Flash; it's going out of style.

        OpenCL? ATI and

      • Re: (Score:3, Informative)

        by GooberToo ( 74388 )

        They used to have the best Linux drivers but I have not been keeping up with ATI's progress with their closed source drivers or the open source drivers that people are working on with the specs that ATI released.

        Contrary to the constant cheerleading here on Slashdot, Nvidia still has the best Linux drivers by a wide margin. With steady progress being made with the open source drivers, this may not always be the case, but it is likely to be so for at least another year, maybe more. The simple fact is, ATI's

    • Yeah, they're _doomed_ because of the massive backlash from the 50 people in the world who would give a shit about this limitation. Doomed I tells ya!
  • by H3lldr0p ( 40304 ) on Wednesday September 30, 2009 @04:27PM (#29598417) Homepage

    Stop things like this [anandtech.com] from working?

  • I know nothing about PhysX other than what I've gleaned from the article...

    If you buy an Nvidia card to do some headless GPU grunt work, they will disable the functionality to do that unless the work is being shown through another Nvidia card?

    The displaying of the work is pretty much superfluous to the work being done, and they've already made their money on the sale, support, etc. of the PhysX card.

    Err?

  • Has anything changed with Windows 7 where you can run an ATI and Nvidia card at the same time? I know you could in XP, but I found out the hard way you couldn't in Vista. It was something to do with the new driver model.

    I was trying this over a year ago to get dual monitors working while having SLI enabled under Vista. The recommended solution was to use an ATI card for the secondary output since the Nvidia drivers wouldn't see it and disable your ability to use SLI. When I tried to load the ATI drivers

    • This doesn't have anything to do with multiple graphics cards (except insofar as you have two cards capable of rendering accelerated graphics in the machine). A card set up purely as a PhysX processor isn't using the WDDM (the Vista/Win7 display driver architecture) pathway, which requires the same driver for all graphics cards. The non-Physx card used for graphics owns that pathway, and the Physx card runs independently. The Physx card is like a RAID or USB add-on card; there's no real limit to how many

  • Weird to begin with (Score:3, Interesting)

    by dagamer34 ( 1012833 ) on Wednesday September 30, 2009 @04:31PM (#29598477)
    Who on earth has graphics cards from two different manufacturers? Regardless though, it means they've directly tied PhysX to their hardware, and I just don't care for them anymore. ATI all the way baby!
    • Re: (Score:2, Insightful)

      by BlueToast ( 1224550 )

      When I shop for a video card, I don't care if it is ATI or NVIDIA as long as the choice I am making is cost effective. I would much rather spend my money on the card that is cheaper for the same performance -- which happens to be ATI in this case. Originally I was going to pair an 8800GT with an ATI card for Windows 7, but this news blows. NVIDIA should straighten up and get over their emotional attention whoring. They won't get my money now unless they grow up.

  • Proprietary APIs (Score:4, Interesting)

    by Adrian Lopez ( 2615 ) on Wednesday September 30, 2009 @04:32PM (#29598485) Homepage

    I'm currently avoiding PhysX due to the fact that the license requires that credit be given to nVidia/PhysX in any advertisement that mentions the advertised product's physics capabilities. It's a real shame, because I hear that PhysX has a pretty robust physics implementation.

    The current state of physics acceleration reminds me of the days when hardware-accelerated 3D graphics (except for high-end OpenGL stuff) were only supported through manufacturer-specific APIs. Hopefully, DirectX physics will be good enough that PhysX will ultimately become mostly irrelevant to game developers -- I'm just not convinced that Microsoft can pull it off.

  • by Anonymous Coward

    Between the full stack (CPU + chipset + GPU) provided by AMD and the full stack that will be Intel (with Larrabee in 2010), Nvidia has no future in either chipsets or GPUs. Any other outcome is a bet against integration, and in electronics, integration always wins.

    Good thing too; both Intel and AMD are vastly more open (at least recently) with their hardware.

    • Re: (Score:3, Interesting)

      by JSBiff ( 87824 )

      Integration only wins when the integrated chip is "good enough". Intel has had "integrated, accelerated" graphics chips on their mobos for ages, but they've been so monumentally inferior that anyone who wanted to play even 'older' 3D games (Q3-engine based games, Far Cry, Unreal, most MMOGs released in the last 6 years, etc.) needed to add on a GPU.

      From the reviews I've seen, unless you want to muck around with real-time ray tracing (which Intel still hasn't gotten up to very good performance, from wha

  • not a problem (Score:2, Interesting)

    According to Techspot [techspot.com], AMD has been working hard to develop Open Physics. Furthermore, Bullet Physics has been shown running on CUDA. So that sounds to me like doom for PhysX...
  • Crazy (Score:3, Insightful)

    by Pedrito ( 94783 ) on Wednesday September 30, 2009 @04:37PM (#29598529)
    It's kind of crazy that this is even going to get attention. This is only going to affect people using PhysX (which requires an nVidia GPU at the moment) with an ATI card for rendering. I'm sure the two people with this configuration are going to be crushed. Yes, I realize more than 2 will have a mix of cards, and 2 is probably a bit of a low guess, but only a handful are going to actually be affected by the lack of PhysX support for the config, so please, let's not get all in a huff about it. From a support perspective, I can understand where nVidia is coming from. This could be a true support nightmare for them.
    • So would supporting people with non-Nvidia keyboards, cases, HDDs, motherboards etc. ... (get the point?)
  • by perrin ( 891 ) on Wednesday September 30, 2009 @04:37PM (#29598533)

    Once the big game engines and physics libraries get generic support for GPU programming through OpenCL, this will all be pretty moot anyway. From what I can tell, the bullet physics library is already developing this, and I am sure closed source competitors are doing that as well. Relying on anything that will only run on a single vendor's hardware is just a losing business proposition (unless that vendor pays you for it, which I guess is how PhysX got going).
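
    (To make the "generic GPU programming through OpenCL" point concrete: the per-frame work a GPU physics backend does boils down to data-parallel kernels run over every body or particle. A deliberately naive sketch in OpenCL C - a hypothetical kernel, not taken from Bullet's or any vendor's actual code - might look like this:)

        // Hypothetical example: semi-implicit Euler integration of point masses.
        // Real engines layer broadphase collision and constraint solving on top of this.
        __kernel void integrate(__global float4 *pos,   // xyz position, w unused
                                __global float4 *vel,   // xyz velocity, w unused
                                const float4 gravity,   // e.g. (0, -9.8, 0, 0)
                                const float dt)         // timestep in seconds
        {
            size_t i = get_global_id(0);   // one work-item per body
            vel[i] += gravity * dt;        // accumulate gravity
            pos[i] += vel[i] * dt;         // advance position
        }

    The host side compiles this with clBuildProgram and launches one work-item per body via clEnqueueNDRangeKernel; the same source runs on NVIDIA, ATI, or CPU OpenCL implementations alike, which is the whole appeal over a single-vendor API.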

  • by BitZtream ( 692029 ) on Wednesday September 30, 2009 @04:50PM (#29598691)

    http://www.bulletphysics.com/ [bulletphysics.com]

    I don't have any affiliation with the project other than having used it in my homegrown game engine that has never left my hard drive. It is, however, rather easy to use. When I was looking for a physics engine, Bullet turned out to have the best license, code base, and documentation set out there for no cost.

  • by Tanman ( 90298 ) on Wednesday September 30, 2009 @04:54PM (#29598747)

    Nvidia releases announcement that they will no longer provide free driver support to ATI for interaction between Nvidia hardware and ATI competing hardware. Notes that software APIs are available for ATI to pay for and release their own damn drivers.

    NEWS AT 11!!!

  • If you think about it, PhysX works on all 8-series cards and up.

    That's a $30 card for PhysX support. I wonder if I can do this since I have a spare x16 port on my machine.

    I don't really know if this will work though.
