AMD and PhysX

Eviljoker

Limp Gawd
Joined: Jul 18, 2012
Messages: 398
I've been reading a lot into upgrading my GPU, and I've seen that PhysX is open source now. Can it be used on AMD cards and still show the PhysX effects?

Can anyone confirm this for me?
 
PhysX, on every modern implementation, just runs on the CPU. If using an Nvidia GPU you can designate it for PhysX, but I don't believe that this actually gets used.
 
So basically it does the same on AMD as it does on Nvidia? Even with my SLI setup it still seems to run off my CPU. Can you still see the PhysX effects though, like in BL2?
 
PhysX, on every modern implementation, just runs on the CPU. If using an Nvidia GPU you can designate it for PhysX, but I don't believe that this actually gets used.

So basically it does the same on AMD as it does on Nvidia? Even with my SLI setup it still seems to run off my CPU. Can you still see the PhysX effects though, like in BL2?

I believe what IdiotInCharge is referring to is when you go into the settings and designate which of your installed Nvidia GPUs you want to do PhysX; meaning it will not be part of your SLI setup, it will be your dedicated PhysX card...

Meaning, if you want SLI & PhysX, I believe you need to run a third Nvidia GPU...?
 
meaning it will not be part of your SLI setup, it will be your dedicated PhysX card...

Meaning, if you want SLI & PhysX, I believe you need to run a third Nvidia GPU...?

Not at all. It will run on whatever; the option allows you to use a second or third GPU for PhysX, but does not put it in 'exclusive' mode.


All of which is moot because games don't use PhysX in hardware.
 
Basically what I'm trying to figure out is: if I go AMD instead of Nvidia (again), will the games look the same, or just as good, graphics- and effects-wise?
 
Some games are written to make use of PhysX in hardware; this runs on either a CPU or an Nvidia gfx card. Effects will be reduced in software mode.
Most games are written for PhysX in software, which runs on the CPU only. There is no reliance on any brand's gfx cards.
(Note: they all use software; it's the use of particular hardware that defines the naming.)

If you want full PhysX effects for the few games that can do it, you will need to use Nvidia card(s).
There used to be a hack to allow an AMD card to sit alongside an Nvidia card for PhysX; I don't know if that is still feasible.
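
To make the software/hardware distinction concrete, here is a minimal sketch of CPU-only PhysX written against the public PhysX 4.x SDK (treat the setup as illustrative, not authoritative):

Code:
#include <PxPhysicsAPI.h>
using namespace physx;

static PxDefaultAllocator gAllocator;
static PxDefaultErrorCallback gErrorCallback;

int main()
{
    // Core SDK objects; nothing here touches a GPU of any brand.
    PxFoundation* foundation = PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
    PxPhysics* physics = PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

    // A scene that simulates entirely on CPU worker threads.
    PxSceneDesc sceneDesc(physics->getTolerancesScale());
    sceneDesc.gravity = PxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.cpuDispatcher = PxDefaultCpuDispatcherCreate(2); // two CPU worker threads
    sceneDesc.filterShader = PxDefaultSimulationFilterShader;
    PxScene* scene = physics->createScene(sceneDesc);

    // Step the simulation at 60 Hz; all rigid-body work runs on the CPU.
    scene->simulate(1.0f / 60.0f);
    scene->fetchResults(true);

    scene->release();
    physics->release();
    foundation->release();
    return 0;
}

That is all "software PhysX" means in practice: the same API, just dispatched to CPU threads instead of particular hardware.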
 
Want to enable PhysX? Get Nvidia. My love for replaying Arkham Knight is the only reason why I own an Nvidia card.

I've found no good way to make PhysX work on an AMD card.
 
I've been reading a lot into upgrading my GPU, and I've seen that PhysX is open source now. Can it be used on AMD cards and still show the PhysX effects?

Can anyone confirm this for me?
Yes, it's open source, and AMD could support it in their GPUs if they wanted. They don't.

That said, at this point PhysX can be done on the CPU without much of a performance hit, so you don't really gain much by using GPU PhysX.
 
Yes, it's open source, and AMD could support it in their GPUs if they wanted. They don't.

That said, at this point PhysX can be done on the CPU without much of a performance hit, so you don't really gain much by using GPU PhysX.
This is a fair point.
As far as I remember, games let you manually set the level of PhysX to use, and with today's CPUs being a lot faster, it might run OK on max.
But I can't remember if all games let you set the level of PhysX.
 
It's hilarious: PhysX came along right when CPUs started going all multicore and suddenly we had spare cores with nothing to do. Except for running things like PhysX.

If you are running it on the CPU, is it still technically PhysX?
I believe it is just physics calculation, but not PhysX.
Just like Creative made MP3 players, not iPods.

PhysX is a brand name covering a specific set of commands/methods for doing physics calculations.
Physics calculation is the task of calculating physics. It seems like people interchange brand names with generic words.

I.e. you can do PhysX on a GPU, and you can do physics calculations a different way on the GPU, but then it's not PhysX.
 
If you are running it on the CPU, is it still technically PhysX?
I believe it is just physics calculation, but not PhysX.
Just like Creative made MP3 players, not iPods.

PhysX is a brand name covering a specific set of commands/methods for doing physics calculations.
Physics calculation is the task of calculating physics. It seems like people interchange brand names with generic words.

I.e. you can do PhysX on a GPU, and you can do physics calculations a different way on the GPU, but then it's not PhysX.
Most PhysX is CPU-based; only a few games do GPU PhysX.
It is an extension.
 
Most PhysX is CPU-based; only a few games do GPU PhysX.
It is an extension.

But are we talking PhysX (the brand name and IP) or are we talking physics?
There are other physics calculation methods than just PhysX, and some posts in this thread appear to use physics and PhysX interchangeably.

E.g. if you are using the Havok physics engine you can do CPU and GPU physics as well, but none of it has any PhysX involvement.


I apologize for poking in; I just want to make sure I understand the posts correctly.
 
Yes. We are talking about PhysX physics calculations on the CPU instead of on the GPU.
 
If you are running it on the CPU, is it still technically PhysX?
I believe it is just physics calculation, but not PhysX.
Just like Creative made MP3 players, not iPods.

PhysX is a brand name covering a specific set of commands/methods for doing physics calculations.
Physics calculation is the task of calculating physics. It seems like people interchange brand names with generic words.

I.e. you can do PhysX on a GPU, and you can do physics calculations a different way on the GPU, but then it's not PhysX.
I credit you as a guy who is well-read on computer tech, yet you are asking the most basic of questions about an old technology when you have the internet in front of you.
I did expect better.
My last post spelled it out.
 
It's hilarious: PhysX came along right when CPUs started going all multicore and suddenly we had spare cores with nothing to do. Except for running things like PhysX.

To a degree- there's still plenty of usefulness for hardware offload though, of course. With respect to physics, running it on a GPU when trying to do more than basic stuff can make sense and probably will make sense at some point. That point is just not right now.


-example: Ray tracing can be considered a form of 'physics' calculation, as can acoustic wave bounces; as game environments get more sophisticated, especially with the advent of VR, putting all of the 'simulation' data on the GPU might be advantageous. I.e. everything has physical properties that affect structure, reflectivity, and acoustics. The more data you can put into the models, the more accurate it is; instead of loading .wav files for different firearms, for example, you can define a heavy machine gun or small pistol loosely and then fine-tune the output a bit, but do all of that on the GPU. You can then allow modifications and let the engine code on the GPU handle the changes. You can define environments and, instead of having to do the equivalent of cinematic set-design micromanagement, let the engine come up with how the room looks and sounds and 'feels'.

And with respect to physics, if you model the structure of the game world, you can let the player use different forces to literally tear it apart.


-less relevant example: In the enterprise world, hardware acceleration is still a thing; enterprises are still buying NICs with ever more hardware-offload capability to free up their ever-increasing CPU capacity for their ever-growing service needs. Part of that is due to the various life cycles of different parts, where a faster CPU simply can't be had between generations; part of that is actual need for compute; and part of it is latency-related. All of those parts will likely apply to gaming and other audiovisual real-time simulations in the future.
 
It's hilarious: PhysX came along right when CPUs started going all multicore and suddenly we had spare cores with nothing to do. Except for running things like PhysX.

To be fair, the original PhysX API was CPU multi-threaded until Nvidia bought it; after that it was capped at a single thread when not being used on an Nvidia GPU.
 
I credit you as a guy who is well-read on computer tech, yet you are asking the most basic of questions about an old technology when you have the internet in front of you.
I did expect better.
My last post spelled it out.

Asking is how you get information. It's what a forum is made for.
When a person puts out information that goes against your current understanding, asking about it is the proper way to respond, IMHO.

The fact that I accept the limitations of my knowledge and seek to add to it, I see as a good trait of mine.
Unlike what most people do: argue with a closed mind and bring up made-up theories and arguments with no grounding in real-world facts. As an example, see the Serious Sam thread, where a person constantly claimed that using procedural generation methods would mean everything would be random, despite several explanations that this was not the case.


You reused the terms when the issue was about the terms themselves. Reusing the terms without defining them does not help settle whether the terms were used correctly.
Your answer did not resolve the issue of defining the terms.

People tend to use PhysX to cover non-PhysX physics calculations as well.
Just like people misuse the words Kleenex or iPod for tissue paper and MP3 players.
That was what my question was based on.

Looking back at your answer: could it logically work with 'physics' exchanged for 'PhysX'? Yes, it would still be a valid answer. So it did not help with the misuse of the word PhysX, as your answer itself could potentially be affected by the same issue.


Anyway, I managed to read up on it myself when I got home from work.


To be fair, the original PhysX API was CPU multi-threaded until Nvidia bought it; after that it was capped at a single thread when not being used on an Nvidia GPU.

This appears to be incorrect from what I read:
Nvidia bought it in 2008.
In 2010 the lack of multithreading and SIMD support was brought up.
In 2011 the 3.0 SDK came out and had automatic multithreading optimization (prior to that, the developer had to code for it manually).

However, as always: if you have any other information on this, I would love to read it.
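
To illustrate the 3.0 change: in the 3.x and later SDKs the thread count is just a parameter on the CPU dispatcher, and the SDK splits simulation tasks across those workers automatically. A hedged snippet against the 3.x/4.x API:

Code:
// PhysX 3.x and later: multithreading is a dispatcher parameter; the SDK
// distributes simulation tasks across the workers itself (in 2.x the
// developer had to arrange this manually).
physx::PxU32 workerThreads = 4; // e.g. leave some cores free for the game itself
sceneDesc.cpuDispatcher = physx::PxDefaultCpuDispatcherCreate(workerThreads);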
 
Asking is how you get information. It's what a forum is made for.
When a person puts out information that goes against your current understanding, asking about it is the proper way to respond, IMHO.

The fact that I accept the limitations of my knowledge and seek to add to it, I see as a good trait of mine.
Unlike what most people do: argue with a closed mind and bring up made-up theories and arguments with no grounding in real-world facts. As an example, see the Serious Sam thread, where a person constantly claimed that using procedural generation methods would mean everything would be random, despite several explanations that this was not the case.


You reused the terms when the issue was about the terms themselves. Reusing the terms without defining them does not help settle whether the terms were used correctly.
Your answer did not resolve the issue of defining the terms.

People tend to use PhysX to cover non-PhysX physics calculations as well.
Just like people misuse the words Kleenex or iPod for tissue paper and MP3 players.
That was what my question was based on.

Looking back at your answer: could it logically work with 'physics' exchanged for 'PhysX'? Yes, it would still be a valid answer. So it did not help with the misuse of the word PhysX, as your answer itself could potentially be affected by the same issue.


Anyway, I managed to read up on it myself when I got home from work.
Sorry Sven, it was a bad day yesterday.
 
I don't think the OP was asking for a PhysX-hate thread; he just wanted to know if it'll work on his AMD card. I don't see any evidence that it can, so, for example, if he wants to enable options like GameWorks in Arkham Knight, he would require an Nvidia card.
 
Correct. Hardware PhysX is only available on Nvidia cards.

I think the point of the responses (or at least my response) is that not many new games use hardware PhysX, so we are only talking about old games like Batman.
 
If you look at the PhysX repo at https://github.com/NVIDIAGameWorks/PhysX, it says:
"
Requirements:
  • Python 2.7.6 or later
  • CMake 3.12 or later
"
Nowhere does it state that you need a GPU, or an Nvidia GPU for that matter. It also has a section about hardware acceleration:
"
PhysX GPU Acceleration:

  • Requires CUDA 10.0 compatible display driver and CUDA ARCH 3.0 compatible GPU
"

It also says it works on iPhone, Android, Linux, whatever. If I were to make a Jenga game using, say, the Unity game engine, which has out-of-the-box PhysX support, I could make ports to Android, PC, iOS, PS4... pretty much anything else, and it would run the same way. It's just an API.
 
If you look at the PhysX repo at https://github.com/NVIDIAGameWorks/PhysX, it says:
"
Requirements:
  • Python 2.7.6 or later
  • CMake 3.12 or later
"
Nowhere does it state that you need a GPU, or an Nvidia GPU for that matter. It also has a section about hardware acceleration:
"
PhysX GPU Acceleration:

  • Requires CUDA 10.0 compatible display driver and CUDA ARCH 3.0 compatible GPU
"

It also says it works on iPhone, Android, Linux, whatever. If I were to make a Jenga game using, say, the Unity game engine, which has out-of-the-box PhysX support, I could make ports to Android, PC, iOS, PS4... pretty much anything else, and it would run the same way. It's just an API.
CUDA acceleration is only available on Nvidia GPUs.
In theory it could be licensed for use by Intel, AMD ... drivers, but they haven't, so it is Nvidia only.

PS: you are reading the requirements for the SDK.
Using PhysX does not need Python or CMake once the code is already compiled.
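
For illustration, the GPU path in the SDK is opt-in and goes through a CUDA context manager, which is exactly why it's Nvidia-only. A hedged sketch against the PhysX 4.x API, assuming the foundation/physics objects from a normal CPU setup already exist:

Code:
// Try to get a CUDA context; on AMD or Intel hardware this fails and the
// scene silently stays on the CPU path.
PxCudaContextManagerDesc cudaDesc;
PxCudaContextManager* cudaMgr = PxCreateCudaContextManager(*foundation, cudaDesc);

PxSceneDesc sceneDesc(physics->getTolerancesScale());
sceneDesc.cpuDispatcher = PxDefaultCpuDispatcherCreate(2);
sceneDesc.filterShader = PxDefaultSimulationFilterShader;

if (cudaMgr && cudaMgr->contextIsValid())
{
    // A CUDA-capable (i.e. Nvidia) device was found: offload rigid-body work.
    sceneDesc.cudaContextManager = cudaMgr;
    sceneDesc.broadPhaseType = PxBroadPhaseType::eGPU;
    sceneDesc.flags |= PxSceneFlag::eENABLE_GPU_DYNAMICS;
}

PxScene* scene = physics->createScene(sceneDesc);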
 
CUDA acceleration is only available on Nvidia GPUs.
In theory it could be licensed for use by Intel, AMD ... drivers, but they haven't, so it is Nvidia only.

PS: you are reading the requirements for the SDK.
Using PhysX does not need Python or CMake once the code is already compiled.

The code is open source, and AMD has already created a tool to convert CUDA to portable C++: https://github.com/ROCm-Developer-Tools/HIP.
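
To give a feel for what that conversion looks like, here is a trivial made-up kernel in both dialects; the HIP version builds for AMD via ROCm and for Nvidia via CUDA (kernel name and launch sizes are hypothetical):

Code:
// CUDA original:
__global__ void scale(float* v, float s, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) v[i] *= s;
}
// launch: scale<<<blocks, threads>>>(d_v, 2.0f, n);

// HIP version (roughly what the hipify tools emit):
#include <hip/hip_runtime.h>
__global__ void scale(float* v, float s, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) v[i] *= s;
}
// launch: hipLaunchKernelGGL(scale, blocks, threads, 0, 0, d_v, 2.0f, n);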

Also, Unreal Engine and Unity do use PhysX, but neither of them uses hardware-accelerated PhysX.

Also, I believe PhysX is on its way out with Unreal's Chaos physics engine.

And Unity's high-end physics systems will make use of Havok: https://blogs.unity3d.com/2019/03/19/announcing-unity-and-havok-physics-for-dots/

If you're worried about physics, you're better off getting a multi-core CPU.
 
https://en.wikipedia.org/wiki/List_of_games_with_hardware-accelerated_PhysX_support

Most of the stuff on the list is old. PhysX was hardly the game changer that Nvidia promised; almost nothing uses it anymore now that multi-core CPUs are plentiful. I think Borderlands 2 was the only game where I noticed it, and it's really minor graphical stuff: mostly flags waving and dirt bouncing from explosions.

Side note: would a single GTX 980 be worth it over 770 2GB SLI? I can get one for about $100.

Do it. $100 for a 980 is a good deal, and it should beat 770 SLI. Just don't expect any miracle FPS in 4k.
 
Thanks, but I don't play at 4K... can't really see a difference between 1080p and 4K. NO ONE CAN, it's more than human eyes can see.
 
Really wasn't a troll, just my opinion :) But to each his own.

Thanks to all for the advice.
 