Modders Get Nvidia's PhysX To Run On ATI Cards
stress_life writes "Following controversial allegations that Nvidia is cheating in 3DMark Vantage and Unreal Tournament 3 benchmarks, executives from Futuremark and Epic stepped forward to clear up the confusion.
However, the game was not over: enthusiasts from Israel ported the PhysX middleware to run on ATI Radeon cards, achieving remarkable performance. Owners of ATI Radeon cards will be able to play PhysX games too, such as Ghost Recon 2 and the aforementioned Unreal Tournament 3."
Cool (Score:1)
Since I tend to use mostly ATI cards, this is great news.
Could someone explain what these do. (Score:5, Interesting)
Sure, I grok the term "PPU" and can maybe even imagine it's got some fast elastic particle simulations.
But what "physics" is really there? What does the interface look like?
Is it real physics? Would it be good for, say, simulating chemical dynamics with quantum or classical force fields? Could I use it to model the hydrodynamics of a sailboat cutting through the water?
What about applied math or engineering physics, like, say, the propagation and attenuation of sound in a turbulent atmosphere or a concert hall?
What about a piece of rope falling, a flag in the wind, or a ball-and-spring model?
Just what does this do, and what does the interface look like?
If possible, compare it to CUDA, since I know what that does.
Re:Could someone explain what these do. (Score:5, Funny)
If possible, compare it to CUDA, since I know what that does.
You see, it's like a car... It takes instructions and makes something of it... like how a car takes steering and brakes and gets you from point A to point B.
Re: (Score:2)
If possible, compare it to CUDA, since I know what that does.
You see, it's like a car... It takes instructions and makes something of it... like how a car takes steering and brakes and gets you from point A to point B.
So, it's like a Hemi 'cuda?
Those things are pretty expensive nowadays.
http://dealinworf.ytmnd.com/ [ytmnd.com]
Re: (Score:2)
http://dealinworf.ytmnd.com/ [ytmnd.com]
Where, oh where, do you find this stuff?!?
Re: (Score:2)
The Internet is really, really great!
Re: (Score:1)
The Internet is really, really great!
For Porn? http://en.wikipedia.org/wiki/Avenue_Q [wikipedia.org]
Re: (Score:2)
They really need to come over to Amsterdam!
Those YouTube videos have the video and sound quality of '70s porn that's been transferred from 8mm to VHS to DVD to AVI.
Re:Could someone explain what these do. (Score:5, Informative)
The interface is a freely available SDK (for some uses). The physics is basically Newtonian mechanics (more on that in a moment). Physics for games is, first and foremost, an exercise in collision detection. The physics itself is simple; determining collisions in a series of finite-length steps is the hard part.
While I say the physics is basically Newtonian mechanics, there is also spring support, although all spring handling in finite-step simulations has errors (if you are not careful, the springs increase in oscillation over time instead of damping). Chemical dynamics and quantum force fields are out. Classical force fields are included; they operate based on propagation (distance, distance-squared, etc.) and other parameters.
The fluid/solid interaction is still being worked on, and fluids and cloth benefit most from hardware acceleration. Fluids are modeled as a set of points with mutual attraction/repulsion properties.
No sound properties.
Rope is emulated as a series of sticks with ball joints at the ends, a flag as a grid of springs with forces applied at points (cloth simulation is essentially a thing of springs), and the ball-and-spring model, yes.
You left out an important question, which is the rigidity of objects other than cloth/fluids: the ball that deforms as it bounces. Currently that's in the SDK, but I've not played with it yet.
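To make the spring caveat concrete, here's a minimal sketch (my own C++ illustration, not PhysX SDK code) of a damped spring stepped with semi-implicit Euler, the kind of fixed-step integration game engines favor. Swap the update order so position uses the old velocity (explicit Euler) and the oscillation grows instead of damping out:

#include <cstdio>

int main() {
    // Damped spring on a unit mass: a = -k*x - c*v. Illustration only.
    const double k = 100.0;   // stiffness
    const double c = 0.5;     // damping
    const double dt = 0.01;   // fixed time step
    double x = 1.0, v = 0.0;  // start stretched, at rest

    for (int i = 0; i < 1000; ++i) {
        double a = -k * x - c * v;
        v += a * dt;          // velocity first...
        x += v * dt;          // ...then position from the NEW velocity (semi-implicit)
        // Explicit Euler (position from the OLD velocity) pumps a little
        // energy into the system every step here, so the oscillation blows up.
    }
    std::printf("x after 10 s: %g\n", x);  // decays toward 0 with this scheme
    return 0;
}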
Re: (Score:2)
Actually, I'd think that the underlying algorithms would resemble Lagrangian Mechanics, rather than Newtonian Mechanics.
Re:Could someone explain what these do. (Score:5, Informative)
The PhysX system doesn't really care about heat or energy. It primarily concerns itself with force and momentum. That, as I understand it, is the principal difference between Newtonian and Lagrangian mechanics.
Re:Could someone explain what these do. (Score:5, Informative)
Nope. Not quite :-)
Lagrangian mechanics gives you a lot more flexibility in terms of your coordinate system, and tends to be much better for solving systems with many interacting forces. It's essentially a mathematical reformulation of Newtonian mechanics: the underlying laws are all the same, but the math used to arrive at a solution is quite different.
Of course, this is all for solving problems analytically. Computers most likely do things differently.
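For the curious, the contrast in standard textbook notation (nothing engine-specific): Newton works directly with forces in Cartesian coordinates,

\[ \mathbf{F} = m\ddot{\mathbf{x}}, \]

with constraint forces supplied explicitly, while Lagrange defines \( L(q, \dot q) = T - V \) in whatever generalized coordinates \( q \) suit the problem and solves

\[ \frac{d}{dt}\frac{\partial L}{\partial \dot q_i} - \frac{\partial L}{\partial q_i} = 0, \]

so constraints can be absorbed into the choice of coordinates instead of appearing as forces.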
Re: (Score:2)
Interesting. I'm not sure how it works under the hood. And during collisions (for instance, a ball could easily be touching the rim in two places and the backboard in one) there are multiple forces that need to be solved in an interdependent fashion. Could Lagrangian mechanics be used then? I suppose.
PhysX also has a special vehicle-wheel-with-suspension class of object. The fact that it seems not to be made up of other components implies that it has a different solve method. Given the difficulty of da
Re: (Score:1)
Yeah. I would be very surprised if they were solving this stuff analytically :-). My guess is that it's using techniques similar to those used for finite element analysis, with discrete time increments. For FEM, I would think an approximation of Newtonian mechanics should be a lot simpler to deal with and more appropriate for GPUs.
Re: (Score:2)
You're probably right, although computerized analytical solutions certainly aren't out of the question, as proven by Mathematica and the like.
It could very well be faster to analytically solve the equations of motion every so often, and simply plug into those results as time evolves.
I doubt that this is the case, as the underlying programming would be extremely complicated, and I'm not quite sure that it would even work.
Re: (Score:1)
Analytical solutions tend to be derived with algorithms that aren't vectorizable/parallelizable, whereas the strength of GPUs is in vectorized/parallel calculations. So yes, your approach might work, but wouldn't gain much efficiency from GPUs. Still, I have found some of the comments for this article much more informative than is usual for a Slashdot article. There's
Re: (Score:1)
Being able to choose whether to use an iterative or matrix solver was one of the nice things about that engine...
Re:Could someone explain what these do. (Score:5, Funny)
Was that accidental, or is that the worst physics joke ever? :-P If intentional, allow me to present you with a medal!
Re: (Score:1)
Determining collisions is easy.
The collisions get turned into a set of contact pairs, which get converted into a set of constraints.
Then when the simulation is stepped, the great big bloody matrix of constraints needs to be solved, which is hard as hell. This is where a massively parallel vector processor comes in (like what's inside a PPU, or better, another vector-processing chip with shiteloads more R&D behind it: the GPU).
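For a feel of how that solve usually goes, here's a stripped-down sketch (my own illustration, 1-D and frictionless, not any engine's actual code) of the iterative sweep most engines use instead of a direct matrix inversion (projected Gauss-Seidel, a.k.a. sequential impulses):

#include <algorithm>
#include <vector>

struct Body    { double invMass; double v; };         // 1-D bodies for brevity
struct Contact { int a, b; double restitution; };     // body b sits "in front of" a

// Repeatedly sweep the contacts, applying an impulse at each one so the
// pair stops approaching. Iterating propagates impulses between coupled
// contacts (the ball touching rim and backboard at once) and approximates
// the big simultaneous solve. Real engines add 3-D rotation, friction
// cones, and warm starting, but the sweep structure is the same.
void solveContacts(std::vector<Body>& bodies,
                   const std::vector<Contact>& contacts, int iterations) {
    for (int it = 0; it < iterations; ++it) {
        for (const Contact& c : contacts) {
            Body& A = bodies[c.a];
            Body& B = bodies[c.b];
            double relVel = B.v - A.v;            // negative means closing
            if (relVel >= 0.0) continue;          // already separating
            double invMassSum = A.invMass + B.invMass;
            if (invMassSum == 0.0) continue;      // both bodies static
            double j = -(1.0 + c.restitution) * relVel / invMassSum;
            j = std::max(j, 0.0);                 // contacts push, never pull
            A.v -= j * A.invMass;
            B.v += j * B.invMass;
        }
    }
}

Batches of independent contacts can be processed in parallel, which is where a wide vector processor earns its keep.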
Re:Could someone explain what these do. (Score:5, Informative)
From what I understand, nVidia took the PhysX engine they bought from Ageia and ported it to their own language (CUDA) so that it would run on their graphics cards and people didn't have to shell out for a second $300 "Physics Processing Unit," thus boosting nVidia's GPU sales.
And now someone's ported it to ATI.
*Nelson Laugh*
Approximation for gaming purpose (Score:5, Informative)
Is it real physics? Would it be good for, say, simulating chemical dynamics with quantum or classical force fields? Could I use it to model the hydrodynamics of a sailboat cutting through the water?
No. Most physics middleware provides a simplified model (collision detection, rigid-body physics, etc.) which is great for visual gimmicks in games, but is too much of an approximation to be used in research. You would need other engines which are optimized for accurate physics modelling; Gromacs comes to mind as an example.
Now, about the hardware behind this: Ageia's PPU could in theory be used to accelerate research calculations. The problem is the lack of a proper API. The only API available for this processor is PhysX, which is specialized for gaming-oriented physics. The SieveC compiler is supposed to be able to generate parallel programs for the PPU, but hasn't been released publicly.
By contrast, even if the GPU port of PhysX is only aimed at gaming-specific applications, ATI Radeon cards also expose the much more general-purpose API "Brook+" (the usage of which is already demonstrated in Folding@Home), and nVidia cards have the CUDA that you know.
Unlike PhysX, those APIs expose generic numerical methods and can be used for applications as diverse as the ones you mention, including implementing the game-specific Ageia PhysX itself.
PhysX is to CUDA what, for example, Gromacs is to Fortran: the first is a specific engine optimised to solve some very specific problems, the second is a general-purpose language that can be used to crunch numbers.
Re: (Score:2)
I imagine that the "physics" computations which take place in most games are heavily vectorized. Similarly, I'm sure that you could do some sort of interpolation/guesswork, either in hardware or in the API, that reduces the number of "hard" calculations required.
GPUs are designed specifically to do vector math, and it only seemed logical that an API would come along that properly exposed these functions to other software.
I imagine that it's very similar to CUDA, but more optimized, and with an API that contains f
Re: (Score:1)
Not any more. GPUs since the G80 and R600 are good at scheduling and running scalar math on the shader units. (Earlier ones tended to be fixed 4-wide or 2-wide.) But branching is still costly, and the GPU needs to bundle many scalar ops together (a bigger batch of pixels, even when the shader/program has a scalar op as such) for full performance.
Re:Could someone explain what these do. (Score:5, Funny)
Naw man. That's the sound lasers make: pew pew pew
Re: (Score:3, Insightful)
This thread is so full of misinformation I don't know where to start.
Newtonian mechanics (no matter whether you dress it up as Lagrangian or Hamiltonian mechanics) is basically just solving a second-order ODE with constraints. Depending on how you set up the constraints and discretize the system, you end up solving a linear system of equations on each time step. Oh, and forget analytical solutions. There are like a handful of mechanical systems that you can solve analytically (called integrable); the rest can
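Spelled out (standard constrained-dynamics form, not any particular engine's notation): with mass matrix \( M \), external forces \( f \), constraints \( C(q) = 0 \) with Jacobian \( J = \partial C / \partial q \), and constraint forces \( J^{T}\lambda \), each step you solve

\[ M\ddot q = f + J^{T}\lambda, \qquad J\ddot q = -\dot J\,\dot q, \]

for \( \ddot q \) and \( \lambda \) together; discretizing in time is what turns this into the linear system solved on each step.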
Re: (Score:2)
Also, it's not clear how much of a hit the graphics side takes with some physics sims running, or whether you can get the results back to the CPU efficiently if the physics sims need to be more than eye candy. And the fact that
Re: (Score:1)
There is a bigger problem, though: why would you even want a few thousand boxes colliding? (Youtu
Also fun on AMD/ATI cards-- Raytracing (Score:5, Interesting)
You might also find this interesting: AMD/ATI sure has been having a lot of fun lately.
http://www.tomshardware.com/news/Larrabee-Ray-Tracing,5769.html [tomshardware.com]
This latest round of cards from Nvidia and ATI seems to have been won by ATI as well. For $300 you can get the AMD 4870, with the performance of the $400 Nvidia 260, and sometimes as good (depending on the game) as the $600 280.
Re:Also fun on AMD/ATI cards-- Raytracing (Score:5, Informative)
Also, the 48XX series ships with Linux drivers.
Re:Also fun on AMD/ATI cards-- Raytracing (Score:5, Informative)
And the even worse news for NVidia is that preliminary numbers for the upcoming 4870 X2 indicate it will completely blow away anything NVidia currently has on the market.
Re: (Score:2, Informative)
In terms of performance per $.
NVidia is still king of the hill in raw performance.
You just have to pay.
Re:Also fun on AMD/ATI cards-- Raytracing (Score:5, Insightful)
Not any more. You haven't been keeping up too well with tech news eh? Read a few reviews and look at some benchmarks of the 4850 and 4870 cards. If it were just one or two review sites showing such favorable numbers for the new ATI cards, they might be suspect. It's not one or two. It's all of them.
Re:Also fun on AMD/ATI cards-- Raytracing (Score:5, Informative)
Yes, I did misread your comment. Nevertheless, most of my comment still stands. A 4870 in Crossfire performs significantly better than the X280 and the 9800 GX2 in every benchmark I've seen except Crysis, and these cards can also be run in quad-Crossfire mode. Oh, and two of them sell for less than one of NVidia's top dogs.
http://www.bjorn3d.com/read.php?cID=1301 [bjorn3d.com]
http://www.pcper.com/article.php?aid=581&type=expert [pcper.com]
http://techreport.com/articles.x/14990 [techreport.com]
Re: (Score:1)
And you can run Nvidia cards in Quad SLI.
What's your point?
Like I said, in terms of performance, NVidia wins.
You just have to pay more.
Sure, some games favor AMD (ATi) and some favor NVidia. What else is new?
Barring huge improvements for AMD (ATi) from driver updates in the future, NVidia will push itself as the performance king, and will still money hat developers to make sure a game is dripping with green (The Way it's Meant to be Played).
You stated that ATi blows away NVidia this round. I agree. I'm n
Re: (Score:1)
Yeah, but at what resolution.
And such.
People always want more.
I myself picked up an 8800GTX from CompUSA when it was 40% off (due to CompUSA closing down). Just cheap enough to warrant it over the 8800GT / a pair of them.
Re: (Score:2)
There aren't really any NEW games that will stress your video card out right now. Excluding Crysis, which nobody actually plays.
Re: (Score:2)
Nvidia has dropped the price of the 9800 GTX to the same level as the 4850, and it consistently beats the 4850 in most benches I've seen.
Add to that CUDA, PhysX, more overclocking headroom, and better Linux drivers (at least until radeonHD has matured), and Nvidia is still the better product from where I'm sitting.
The high end cards aren't as good performance per dollar, but high end cards never are and I really don't
Re: (Score:3, Insightful)
The latest round of cards came WAY too soon.
Re: (Score:2)
... and sometimes as good (depending on the game) as the $600 280.
Like this $600 280? [augustadrifting.com]
Re: (Score:3, Insightful)
$300? $400? $600? wtf!
It's been a while since I bought a video card. I totally splurged and got a $90 card! Worked for the stupid game I was trying to play I guess, but now that game is lame and I'm out $90! I wouldn't do it again--$90 is a silly amount to spend to replace existing functionality.
If I just wait a few years, any games I might still be interested in will be cheap and play on commodity hardware--and all I've lost is, well, nothing--actually gained a little extra time.
Re: (Score:2)
So basically you are holding out till the games and hardware are almost free?
Good idea; however, if you like to play online you might end up with a problem, since you are most likely going to be the only one left still playing the game...
Re: (Score:2)
Based on that, I'd say that a good multiplayer game will have online playability for at least 10 years. That is a pretty decent amount of time for hardware to catch up.
Re:Also fun on AMD/ATI cards-- Raytracing (Score:5, Insightful)
The problem with playing 10-year-old games online is that, for the most part, the only people still playing 10-year-old games online are really, REALLY good at them. New games will have a wide variety of players in terms of skill, while old games tend to have just the hardcore players. If you're waiting for prices to fall to play a game, you'll have missed out on the time it takes to learn how to play the game, both in general and against other players of a similar skill level, and you'll lose every online game you play.
Re: (Score:1)
Considering that in 10 years of waiting you could earn $1,000,000 in salary, I think you have to be quite damn poor not to be able to save $5/week for 20 weeks to afford a $100 video card. Many people spend more than $90 on one night of drinking, so c'mon, are you living in a 4th-world country earning $3 a day making iPods for Apple in a factory?
Re: (Score:2)
I don't pay a recurring fee to play anything (learned my lesson a long time ago), so I think the good games with open servers will still be available, and the lame ones, or the ones with closed pay-per-month servers, I'd have done without anyway.
If you like to pay money to waste time playing, however, I suppose my entire theory falls apart.
Re: (Score:2)
Guess you could say that for just about anything. Slashdot loves car references, but I'll pass.
But yeah, if you don't mind waiting till something is outdated, then you can save some money. But if you enjoy playing new games, using new cellphones, etc. then you go ahead and pay the money to do so.
Probable Patent Infringement (Score:5, Interesting)
My guess is that nVidia will put a stop to this pretty quickly. PhysX is covered by at least a couple of patents [uspto.gov]. There may be others pending or that were assigned to nVidia.
I don't know if PhysX is covered by patent protection in Israel, but it's possible. In any event, don't count on official PhysX support from ATI any time soon.
Re: (Score:3, Interesting)
Great, so here's yet another technology that will get split into many different versions by different companies...
Why can't these guys sit down together and come up with, say, OpenPhysX? (Think OpenGL.)
Re: (Score:1)
Re: (Score:2)
They could still agree on an API or something, not the actual code. It would mean that ATI, Intel, and nVidia could make "PhysX"-compatible hardware/drivers, but maybe nVidia would have the fastest implementation.
Re: (Score:3, Interesting)
The API is free and open and AMD (ATi) is free to implement it if they wish.
They simply haven't done so.
Re: (Score:2, Informative)
AMD / ATI / Havok
vs
Intel / nVidia / PhysX
Pick your side!
(Ok so it doesn't quite work like that but dividing battle lines evenly makes it less confusing than it really is)
Re:Probable Patent Infringement (Score:4, Insightful)
Except that, if I understand correctly, Havok == Intel since they purchased it... so ATI is between a rock and a hard place :)
Re:Probable Patent Infringement (Score:5, Informative)
Close, but off...
AMD/ATI vs. Intel/Havok vs. nVidia/PhysX. At least, Intel licensed code from Havok. Intel wants physics on the CPU, nVidia on the GPU, and AMD/ATI just wants to be able to use both.
Re: (Score:1)
Never hurts to support multiple things.
(I hate Creative and EAX!)
Re: (Score:2)
Well, since I'm on a Mac, it means Intel for me.
Now all I need is an actual game that would use it. Starcraft II comes to mind.
Re:Probable Patent Infringement (Score:5, Insightful)
Another vote for OpenPL. It only makes sense. You feed the coordinates from OpenGL to OpenPL; OpenPL returns a new velocity and position for the objects. Maybe toss in mesh deformation on impact. All handled by the same tightly integrated processor for speed. I want it, and I want it yesterday :)
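Purely as a daydream of what that hand-off could look like (every name below is invented; no such OpenPL API exists):

#include <vector>

struct Vec3 { float x, y, z; };

struct BodyState {        // what a hypothetical OpenPL would hand back per frame
    Vec3 position;
    Vec3 velocity;
};

// Imagined per-frame step: gravity plus integration and nothing else.
// A real physics layer would also report contacts and mesh deformation.
void oplStepSimulation(std::vector<BodyState>& bodies, float dt) {
    const Vec3 g{0.0f, -9.8f, 0.0f};
    for (BodyState& b : bodies) {
        b.velocity.x += g.x * dt;
        b.velocity.y += g.y * dt;
        b.velocity.z += g.z * dt;
        b.position.x += b.velocity.x * dt;
        b.position.y += b.velocity.y * dt;
        b.position.z += b.velocity.z * dt;
    }
    // The updated positions then feed straight back into the renderer's
    // vertex data, closing the OpenGL-to-OpenPL loop described above.
}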
Re: (Score:2, Informative)
I recall hearing chatter about CUDA bindings for Bullet, but I'm not sure if anything came of that.
Re: (Score:2)
Simple: right now, ATI is stomping on Nvidia in terms of price and performance. The only thing Nvidia has going for them is PhysX. Take that away, or make it an open standard that ATI can also play by, and Nvidia loses, becoming another "has-been" vendor of high-end cards.
Let them put an end to it if they can (Score:2)
Re: (Score:2)
We already knew that Nvidia is working on a CUDA version for x86 CPUs, but said it would leave a modification for ATI GPUs to others.
It will happen (Score:5, Insightful)
The nVidia people are probably well aware that hogging PhysX to themselves is a stupid idea. Game makers aren't going to go out of their way to support it unless it can be reasonably expected that most gamers will be able to use it. It's a dead fish unless ATi can use it. That doesn't mean they'll just hand it over.
Re: (Score:2, Insightful)
The nVidia people are probably well aware that hogging PhysX to themselves is a stupid idea. Game makers aren't going to go out of their way to support it unless it can be reasonably expected that most gamers will be able to use it.
Contrast with all the vendor specific OpenGL extensions that were used by developers...
Re: (Score:2)
I'll go one further. I suspect they'll ignore the situation and let things get rolling, and THEN license it to ATI.
I have been wondering how nVidia would clear the 'only our cards use it' hurdle, and figured they'd just push games that work with the software version of PhysX but let you turn on all the really cool effects if you have the hardware. Doing that while ignoring this hack for a while is a great way to get people interested.
Re: (Score:2)
Maybe I'm missing something (sorry, haven't read the article yet, only skimmed it), but this is basically an instance of using software on hardware it wasn't originally designed or intended for, right? How are patents going to prevent that? If Nvidia allows free downloads of this software (as they do with all their other driver code), then there's simply no way to prevent people from using it on other hardware. If the software isn't free, and people aren't paying for it, well, that's just simple soft
Re: (Score:2)
How? AMD/ATI didn't do anything. A third party let the cat out of the bag, and we all (except nVidia) benefit as a result.
AMD doesn't need to do anything more than not break the interface used to make the port possible.
Not That Big a Deal (Score:4, Insightful)
Re:Not That Big a Deal (Score:5, Insightful)
Well, the amazing part of the whole deal here is that benchmarks show almost no decrease in framerate with PhysX turned on. So yeah, it's kind of a big deal.
Re: (Score:3, Funny)
so basically, the games hardly use any physics?
Re: (Score:2)
Don't be stupid, of course it can. Load 42 into a register and return it. -1 for getting it completely wrong; what we need is the ultimate question to the ultimate answer.
"Controversial allegations": Stop right there! (Score:4, Informative)
So this whole thing was kicked off by a column on the Inquirer? The same people who brought us the Rydermark "scandal"? [wikipedia.org] The Inq has shown a blatant and consistent anti-Nvidia bias over the years, so why give this any credence?
Besides, the first question that popped into my head is one that is being asked a lot of places, but not answered: If accelerating PhysX on Nvidia's GPU hardware is cheating, wouldn't accelerating PhysX on Ageia's PPU hardware be considered cheating, too? Call me cynical, but I think AMD knows the answer to that, and would rather you didn't mention it, thank you very much.
Re: (Score:1)
If accelerating PhysX on Nvidia's GPU hardware is cheating, wouldn't accelerating PhysX on Ageia's PPU hardware be considered cheating, too?
No. The physics tests in 3DMark Vantage test just the physics processing. If you're playing a game, your GPU will be busy doing graphics acceleration too, leaving a lot less processing power for physics. The Ageia physics card is a separate processor, so using it shouldn't affect graphics processing (although it did a bit, probably due to keeping track of extra particle/physics data).
And if you're playing a game, your CPU will be busy doing AI, game logic, some animation, scheduling, etc. Whether you run the test on a GPU or CPU, the non-physics load is unrealistically light when running this test.
Re: (Score:2)
But AMD's and/or The Inq's assertion is that the physics calculations are intended to be done on the CPU. If that were true, Futuremark would have crafted their own physics code. They had to know that PhysX could be hardware-accelerated, so their choice of that API is a tacit acceptance of the effects of a PPU on benchmark scores.
Only now, that "PPU" just so happens to be an Nvidia GPU. That's why this whole thing stinks of intellectual dishonesty. If this were a pre-buyout Ageia showing off the effect of thei
What will ATI do? (Score:2, Insightful)
Relevant original phrase: All's fair in love and war.
Relevant original phrase with 21st century spin: All's fair in love and war so long as you do
Re: (Score:2)
Yeah, wasn't the UT3 Linux client supposed to come out months ago [phoronix.com]?
Oh, but I did find this [icculus.org] on his finger [icculus.org]. Ah, finger.
W
Soft hack, not hard (Score:2)
Get rid of the 'hardhack' tag, people. Sheesh. It's just software. There is no hardware hacking involved.
you can play ut3 on ati without physx... (Score:2)
Not exactly true: they could already play the games with an ATI card, just not with PhysX enabled.
On a side note, Sir Isaac Newton would be proud of these Israelis and their accomplishment of bringing 9.8 m/s^2 of constant acceleration to ATI gamers.
Re: (Score:2)
Or it took 1 minute, plus 4 more for Slashdot to allow him to comment again.