Is PhysX even relevant anymore?

With stronger CPUs and GPUs and the advance of programming, is PhysX even really a selling point?

I've owned and played games with PhysX, and with the new cards and CPUs there is really no difference with it turned on or off. I'm not counting PhysX-specific benchmarks and demos, just retail games.
 
The current game with the best PhysX is Borderlands 2. Try using your CPU vs GPU on that game and you will clearly see a difference in performance and visual quality.
 
It was never relevant because most games don't use it... the few games that did make good use of it you can count on one hand: the Batman Arkham games and Borderlands 2.
 
I played both the Batman and Borderlands series and I did not pay extra for PhysX, so it was and is relevant to me. All eye candy is good eye candy in my opinion :)
 
The Batman games are very different with PhysX, so I would say yes, it's still relevant. You have to check out Batman with and without it to see for yourself. Plenty of other games have extensive PhysX effects.

List of PhysX games and features: PhysX wiki
 
I don't think it's dead. I enjoy games more with PhysX enabled. It's icing on the cake.
 
Agreed that it never was relevant, but I really enjoy games with PhysX. Metro: Last Light is great and way more immersive with PhysX enabled.
 
PhysX is cool, but it's never been a game changer on the few games where it's used.
 
PhysX has the innate problem that the faster your primary GPU is, the faster your dedicated PhysX GPU needs to be. Games oftentimes run faster with PhysX on the primary card because the dedicated PhysX card can't keep up.

Nvidia has created a reason to get you to buy two cards rather than just one. When PhysX was first announced, it sounded cool and a lot of people jumped on board. Now that everyone sees that Nvidia does nothing with it besides particle effects, it's boring and no one cares anymore. Since the PhysX integration in many games is paid for by Nvidia (giving you a reason to use hardware PhysX), you can blame Nvidia for the lack of innovation, not developers.

Properly threaded physics that run on your CPU, like Havok, or open-source GPU-accelerated physics, like Bullet, are going to remain the superior method for games. I think Microsoft can also push DX11 DirectCompute to enhance GPU-accelerated physics. Let's also see what they reveal with DX12.
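
To illustrate the CPU path (just a from-memory sketch of Bullet's C++ API, not code from any shipping game), a minimal rigid-body world that runs the same on any vendor's hardware looks roughly like this:

Code:
// Minimal Bullet CPU rigid-body world: a single box falling under gravity.
// Illustrative only; exact headers/constructors can differ between Bullet versions.
#include <btBulletDynamicsCommon.h>

int main() {
    btDbvtBroadphase broadphase;
    btDefaultCollisionConfiguration config;
    btCollisionDispatcher dispatcher(&config);
    btSequentialImpulseConstraintSolver solver;
    btDiscreteDynamicsWorld world(&dispatcher, &broadphase, &solver, &config);
    world.setGravity(btVector3(0.0f, -9.81f, 0.0f));

    // One 1 kg dynamic box dropped from y = 10
    btBoxShape box(btVector3(0.5f, 0.5f, 0.5f));
    btVector3 inertia(0, 0, 0);
    box.calculateLocalInertia(1.0f, inertia);
    btDefaultMotionState motion(btTransform(btQuaternion::getIdentity(), btVector3(0, 10, 0)));
    btRigidBody::btRigidBodyConstructionInfo info(1.0f, &motion, &box, inertia);
    btRigidBody body(info);
    world.addRigidBody(&body);

    // Step at 60 Hz for two simulated seconds -- all of this runs on the CPU,
    // so it behaves identically whether the GPU is Nvidia, AMD, or Intel.
    for (int i = 0; i < 120; ++i)
        world.stepSimulation(1.0f / 60.0f);

    world.removeRigidBody(&body);
    return 0;
}

Nothing in that setup knows or cares what video card is installed, which is the whole appeal of CPU engines like Havok and Bullet for cross-vendor games.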
 
People seem to have forgotten or missed the fact there are two types of PhysX. GPU/PPU accelerated PhysX and CPU PhysX.

The PhysX engine has been used in 500+ games, just like Havok. I will agree that the GPU-accelerated side of PhysX has been less than stellar, with only 30+ games released, but Bullet isn't doing much better.

Saying PhysX is only particle physics is completely false and a pretty silly claim as well. PhysX supports rigid body dynamics, soft body dynamics, ragdolls/character controllers, vehicle dynamics, volumetric fluid simulation, cloth simulation including tearing, and pressurized cloth.
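To make the rigid-body part of that list concrete, here's roughly what a bare-bones scene looks like with the PhysX 3.x SDK (a from-memory sketch of the public C++ API, not code from any of the games mentioned here); the cpuDispatcher line is where an engine decides how much of the work stays on the CPU:

Code:
// Minimal PhysX 3.x rigid-body scene (illustrative sketch; exact calls may
// differ slightly between 3.x point releases).
#include <PxPhysicsAPI.h>
using namespace physx;

int main() {
    static PxDefaultAllocator allocator;
    static PxDefaultErrorCallback errorCallback;
    PxFoundation* foundation = PxCreateFoundation(PX_PHYSICS_VERSION, allocator, errorCallback);
    PxPhysics* physics = PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

    // Scene with a two-thread CPU dispatcher; a GPU dispatcher would be
    // plugged into this same scene description.
    PxSceneDesc sceneDesc(physics->getTolerancesScale());
    sceneDesc.gravity = PxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.cpuDispatcher = PxDefaultCpuDispatcherCreate(2);
    sceneDesc.filterShader = PxDefaultSimulationFilterShader;
    PxScene* scene = physics->createScene(sceneDesc);

    // One dynamic box dropped from y = 10
    PxMaterial* material = physics->createMaterial(0.5f, 0.5f, 0.1f);
    PxRigidDynamic* box = PxCreateDynamic(*physics, PxTransform(PxVec3(0, 10, 0)),
                                          PxBoxGeometry(0.5f, 0.5f, 0.5f), *material, 10.0f);
    scene->addActor(*box);

    // Fixed 60 Hz stepping
    for (int i = 0; i < 120; ++i) {
        scene->simulate(1.0f / 60.0f);
        scene->fetchResults(true);
    }

    scene->release();
    physics->release();
    foundation->release();
    return 0;
}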

GPU PhysX will always be a niche market but to say PhysX as a whole is not relevant anymore is foolish. Especially considering PhysX was used in Metro, Bioshock Infinite, ARMA 3, ACIV, and is going to be used in Star Citizen, Batman:AK, Wasteland 2, Daylight, EverQuest, and Goat Simulator.
 
It's a nice feature in high end gaming, when supported. Whether that makes it important or relevant is up to the gamer. Personally, I don't really care. It doesn't have the install base to cause developers to use it in meaningful game play ways, although it can improve performance and/or effects in some games.

I'm still holding out hope that Havok will finally release a GPU-accelerated version of its physics engine for Windows. While that's been available on the new generation of consoles since last year, it's not clear when it's going to be available for PC. This week may bring some updates: http://www.havok.com/gdc2014
 
I always thought it had a lot of promise. But, if you have to ask if it's still relevant, it's obviously not being marketed very well. I think the IDEA of PhysX is still very relevant. The other competitors are great, too.

Batman and others that use it have more atmosphere and are easier to get into than the more static and 'flat' worlds that don't use it. It adds something to the game, even if it's not noticed that much.
 
People seem to have forgotten or missed the fact there are two types of PhysX. GPU/PPU accelerated PhysX and CPU PhysX.
No one is/was talking about the non-GPU-accelerated version, because it doesn't depend on proprietary hardware and you really don't even know it's included in a game unless you see the logo on the box.

Saying PhysX is only particle physics is completely false and a pretty silly claim as well. PhysX supports rigid body dynamics, soft body dynamics, ragdolls/character controllers, vehicle dynamics, volumetric fluid simulation, cloth simulation including tearing, and pressurized cloth.
What it can do, and what's being done, are two different things.
GPU PhysX will always be a niche market
I disagree. If it's done in a way that all GPU's can support it, and it actually adds something to the game more than particle effects, I think developers would be more game to jump on board. Look at all the neat things you listed it can do.
but to say PhysX as a whole is not relevant anymore is foolish. Especially considering PhysX was used in Metro, Bioshock Infinite, ARMA 3, ACIV, and is going to be used in Star Citizen, Batman:AK, Wasteland 2, Daylight, EverQuest, and Goat Simulator.
Again, to what level and why (Nvidia paying for it)? More particle effects and maybe Batman's cape?
 
Warframe shows you the power of PhysX in this video with PhysX on /off transitions.
http://www.youtube.com/watch?v=IhBZeZ5uQBQ

My buddies with Nvidia cards turn it off 100% of the time because they couldn't see the enemies to kill them. I would end up with 80% of the damage done in Warframe if they ran PhysX, lol.
 
It isn't relevant because Nvidia butchered it. I doubt it was worth it for them in the long run.
 
In AC 4, after the PhysX patch, I turned it off. The clouds of smoke surrounding the ships during fights didn't let me see anything; it looked like Snoop Dogg fighting Willie Nelson :)
 
I liked the physx effects in Batman Arkham Asylum and Arkham City. Also, I never played it but heard the effects were good in Borderlands 2. Still, the number of titles using it is small, and it never had much relevance in the first place.
 
It isn't relevant because Nvidia butchered it. I doubt it was worth it for them in the long run.

Pretty much this. GPU PhysX is fantastic technology, but unfortunately Nvidia is hellbent on killing it by keeping it proprietary in hopes of pushing Nvidia sales. Sorry Nvidia, but AMD isn't going anywhere just because you have a nifty physics engine in your pocket.

At this point I actually hope that upcoming open-source GPU physics engines prove to be as effective and pretty as PhysX, so we can deliver a killing blow to it for good.
 
At this point I actually hope that upcoming open-source GPU physics engines prove to be as effective and pretty as PhysX, so we can deliver a killing blow to it for good.

I always saw PhysX as being like Glide: a great entry into the market, then a bunch of competition comes along that does it better/faster/cheaper and wipes out the original.

I still think that physics in games can be awesome as shit. Sure, there is the standard stuff now, which isn't bad. But compare non-PhysX to PhysX: a good difference. Bring on more GPU-based engines that kick ass.
 
Red Faction's Havok physics was better than anything PhysX ever did.
 
I had a 4870X2 with an 8800 GT as the PhysX card. Batman, at least for me, was a little different but not worth the extra GPU.

Goat Simulator looks to have a lot of chaos going on, though, and it would be interesting to see the difference there.

I would hope PhysX processing would help drafting/CAD programs and high-end software, but that's asking for a lot considering it has to be written to use a GPU, correct?
 
I had a 4870X2 with an 8800 GT as the PhysX card. Batman, at least for me, was a little different but not worth the extra GPU.

How did the 8800 perform for PhysX tasks? I've been wanting to buy a cheapo card specifically for it (I use an AMD gpu) but have yet to bite. Not sure how powerful/weak a GPU has to be to use it for a dedicated Physx card.


I would hope PhysX processing would help drafting/CAD programs and high-end software, but that's asking for a lot considering it has to be written to use a GPU, correct?

Believe it or not, the vast majority of CAD software wouldn't benefit from it at all. A 'decent' modern GPU and a high end CPU (with lots and lots of RAM) are usually enough for even the most complicated models/assemblies these days.

What would benefit from it, and already does to some degree... at least as far as CUDA and OpenCL goes, is finite element analysis software (FEA) that simulates certain real world pressure/force/inertia tests on parts and assemblies. It shows engineers and designers potential weak areas and design flaws that could cause a failure before you have to spend all kinds of money to start making prototype parts.
 
How did the 8800 perform for PhysX tasks? I've been wanting to buy a cheapo card specifically for it (I use an AMD gpu) but have yet to bite. Not sure how powerful/weak a GPU has to be to use it for a dedicated Physx card.

The 8800 GT seemed to do the trick very well; Arkham Asylum ran great and I saw all the dust fly and the cape wave in the wind. I think my 3DMark06 score improved a lot, but I can't remember or find my results file.


Believe it or not, the vast majority of CAD software wouldn't benefit from it at all. A 'decent' modern GPU and a high end CPU (with lots and lots of RAM) are usually enough for even the most complicated models/assemblies these days.

What would benefit from it, and already does to some degree... at least as far as CUDA and OpenCL goes, is finite element analysis software (FEA) that simulates certain real world pressure/force/inertia tests on parts and assemblies. It shows engineers and designers potential weak areas and design flaws that could cause a failure before you have to spend all kinds of money to start making prototype parts.

That's so weird to me for CAD, but it makes a lot of sense for simulation software. You would think any kind of rendering would utilize that type of processing.
 
The current game with the best PhysX is Borderlands 2. Try using your CPU vs GPU on that game and you will clearly see a difference in performance and visual quality.

If that were true, then my CPU would have been at over 30% usage when playing Borderlands 2. PhysX is gimped on purpose when using anything but an Nvidia GPU. If my CPU weren't up to the task, it would be maxed out by the game, but it's not even close. PhysX isn't made to take advantage of strong CPUs.
 
If that were true, then my CPU would have been at over 30% usage when playing Borderlands 2. PhysX is gimped on purpose when using anything but an Nvidia GPU. If my CPU weren't up to the task, it would be maxed out by the game, but it's not even close. PhysX isn't made to take advantage of strong CPUs.

So you consider a game that uses all cores to be one that takes advantage of strong CPUs, yet no current games can take advantage of all cores.
 
Is PhysX relevant anymore? That's a yes, maybe, or no question. Unlike generic physics, which your CPU or ATI video card can handle, hardware PhysX is coded for Nvidia GPUs only. An ATI card that tries to perform that function will often slow almost to a stall, because it cannot figure out what to do with PhysX. Newer CPUs are pretty good at doing PhysX if the software is coded to let the CPU handle it, which in most cases it is. That is why you don't see the FPS loss on ATI cards that you used to see in a game; the newer CPUs are able to handle it so much more easily now. So, is PhysX a lost cause? It is not a bad thing, but Nvidia should adapt to the idea of it being open source rather than closed. You will soon have Mantle up and running full force, and if it comes out as the next big thing and causes a rift in a game-coding war, then we will all lose.
 
So you consider a game that uses all cores to be one that takes advantage of strong CPUs, yet no current games can take advantage of all cores.

Civ 5 would like to have a word with you, mister...
 
What about a flight simulator with physics being simulated throughout? Microsoft is out of the game, but X-Plane might pull it off. Beyond the cheap visual effects (flying papers and shit), there are the flight dynamics, wind, cloud movement, etc. It wouldn't add much visually (some of it would be cool, like dynamic environments while on the ground), but it would enhance the simulation as well as offload some of that work to the GPU...
 
What about a flight simulator with physics being simulated throughout? Microsoft is out of the game, but X-Plane might pull it off. Beyond the cheap visual effects (flying papers and shit), there are the flight dynamics, wind, cloud movement, etc. It wouldn't add much visually (some of it would be cool, like dynamic environments while on the ground), but it would enhance the simulation as well as offload some of that work to the GPU...

The only way I could see something like that working is through something like OpenCL and not something like PhysX that's tied to one GPU brand.
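
As a quick illustration of that vendor-neutral angle (just a minimal device-enumeration sketch, nothing physics-specific), the same OpenCL host API sees Nvidia, AMD, and Intel hardware alike:

Code:
// List every OpenCL platform/device in the machine; a vendor-neutral physics
// engine could pick any of them. Illustrative sketch only.
#include <CL/cl.h>
#include <cstdio>

int main() {
    cl_platform_id platforms[16];
    cl_uint numPlatforms = 0;
    clGetPlatformIDs(16, platforms, &numPlatforms);

    for (cl_uint p = 0; p < numPlatforms && p < 16; ++p) {
        cl_device_id devices[16];
        cl_uint numDevices = 0;
        clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_ALL, 16, devices, &numDevices);

        for (cl_uint d = 0; d < numDevices && d < 16; ++d) {
            char name[256] = {0};
            clGetDeviceInfo(devices[d], CL_DEVICE_NAME, sizeof(name), name, nullptr);
            std::printf("Platform %u, device %u: %s\n", p, d, name);
        }
    }
    return 0;
}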
 