Current State of running Primary AMD GPU with secondary Nvidia PhysX GPU?

Zarathustra[H]

So, way back when I last read about this, Nvidia was blocking the practice, but there was a way to workaround it with hacked drivers.

Is this still possible?

I realize PhysX is mostly dead today, but I was just playing Borderlands 2, and it is a PhysX title.

My expectation was that, hey, I have a Threadripper with tons of cores and much higher per-core performance than anything available when this title launched, so rendering it on the CPU shouldn't be too bad. Boy, was I wrong.

Running PhysX at anything but the minimum settings causes CPU limitations to set in in a hurry.

I still have a couple of Nvidia GPUs kicking around, the most powerful of which is my old 2013 Kepler Titan.

I'm wondering if I could resolve this issue for my playthrough of the game by popping the Titan in there, or is this not likely to help?

Appreciate any input.
 
I don't think that it's worth it. Even for academic reasons. Certainly not for Borderlands 2.
Because it really only goes into effect when opening loot containers.
 
I don't think that it's worth it. Even for academic reasons. Certainly not for Borderlands 2.
Because it really only goes into effect when opening loot containers.

What do you mean only goes into effect when opening loot containers?

I was having odd non-GPU related framerate reductions in the game whenever certain enemies came on screen (primarily Goliaths) as well as when there are many explosions and when some water effects are on screen. Sometimes this would drop all the way down to ~20fps with very low GPU loads.
Lowering PhysX down to medium reduced the problem slightly, but it wasn't until I set PhysX to low that it went away altogether. Do I absolutely need high PhysX effects? No, but the cloth effects and flowing water are kind of nice...

As long as it is not overly convoluted, I figured it might at least be worth a try.

Up until now, I had only heard about how poorly optimized the PhysX code is when running on the CPU, but I had never experienced it in person as I was always using Nvidia GPUs. Now I'm on a 6900 XT, and while I had heard it was bad, I didn't realize it was this shockingly bad...
 
Borderlands 2 was one of the toughest PhysX titles I can remember. Pretty sure there was a lack of optimization or something just hammering the GPUs. I remember adding a 750 Ti to a pair of SLI'd 780 Tis to take the PhysX stress off the GPUs the last time I played around with that title, around 2014 or 2015. It didn't help much in that title. No other titles had the issues running off the installed Nvidia GPUs like BL2.

Do you have an old nvidia card you can install to handle physx only? Maybe it'd help.
 
Do you have an old nvidia card you can install to handle physx only? Maybe it'd help.

I still have my 2013 Kepler Titan sitting here collecting dust.

The thing is, Nvidia at least used to block running PhysX on the GPU if any other brand of GPU was installed. Not sure if this is still the case in modern drivers.

Way back there were hacked Nvidia drivers that allowed you to circumvent this, but I can't seem to find any info about if this is the case anymore.

If I can make it work, it might help, but I also read that BL2 shipped with a really old version of PhysX that does not like Windows 10. I should be able to remove the PhysX DLL files in the game directory to force it to use the version installed with the GPU drivers if I have to.
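
For reference, something like this little script is what I have in mind for sidelining the game's bundled PhysX DLLs so it falls back to the driver-installed runtime. The install path is just a placeholder and the exact DLL names may vary between installs, so treat it as a sketch rather than a tested fix:

Code:
# Sketch: rename the game's bundled PhysX/APEX DLLs so the game falls back
# to the PhysX runtime installed with the GPU drivers.
# The game path below is a placeholder; adjust it for your own install.
from pathlib import Path

GAME_DIR = Path(r"C:\Games\Borderlands 2")  # example path, not necessarily yours

def sideline_physx_dlls(game_dir: Path) -> None:
    for dll in game_dir.rglob("*.dll"):
        # The bundled libraries typically start with "PhysX" or "APEX";
        # this name pattern is an assumption, so review the output before trusting it.
        if dll.name.lower().startswith(("physx", "apex")):
            backup = dll.with_name(dll.name + ".bak")
            print(f"Renaming {dll} -> {backup.name}")
            dll.rename(backup)

if __name__ == "__main__":
    sideline_physx_dlls(GAME_DIR)

Renaming instead of deleting means I can just strip the .bak extension to undo it if the game refuses to launch.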
 
I still have my 2013 Kepler Titan sitting here collecting dust.

The thing is, Nvidia at least used to block running PhysX on the GPU if any other brand of GPU was installed. Not sure if this is still the case in modern drivers.

Way back there were hacked Nvidia drivers that allowed you to circumvent this, but I can't seem to find any info about if this is the case anymore.

If I can make it work, it might help, but I also read that BL2 shipped with a really old version of PhysX that does not like Windows 10. I should be able to remove the PhysX DLL files in the game directory to force it to use the version installed with the GPU drivers if I have to.
That's a lot of work for PhysX in one game; I'd just disable it.
 
Not related, but I was hoping we would see more of this: https://www.anandtech.com/show/9740/directx-12-geforce-plus-radeon-mgpu-preview

[Image: AotS_RTG+nVidia.jpg — Ashes of the Singularity benchmark with mixed Radeon + GeForce GPUs]


But then again, multi-GPU in general basically died out. Still, I thought it was really cool being able to use both a GeForce and a Radeon to render games.
 
I think this is cool in theory, but it seems like a lot of effort for minimal benefit. That being said, I don't see what you're seeing on the screen. If the performance is indeed tanking and you have idle hardware to throw at it, you may have a free solution so to speak.
 


Back in the day I experienced the opposite. Enable PhysX and you'd get a buggy, inconsistent, framerate-dropping mess. Of course, as hardware got better, the game could brute-force its way through it.

I would say it was less worth it back in 2012. It's up to the person putting effort into making it work to decide if it's worth it for the few brief moments you get to see it.

I remember Mirror's Edge being one of the first titles I owned that supported PhysX. Got all excited to check it out but damned if I could see anything in the game that looked like it. I had to actually look up a YouTube video to get an idea of what I was looking for. With BL2, it was obvious once you saw it. So it does have some merit.
 
Part of the problem here is that every time someone independently started developing possible solutions to mixing cards, they got bought out. Whether it's software mods or otherwise...

I remember a short time a while back when we were supposedly going to get a chipset that could mix ATI and NVIDIA GPUs together for performance at the same time. I don't recall who bought them out at the time to boost their own SLI/Crossfire performance.

BL2 is old enough that I would hope the old Titan could do OK alone at 1080p, and with GPU mining dying off, you could probably get a GTX 1080, 2070, or similar for dirt cheap soon too.
 
Part of the problem here is that every time someone independently started developing possible solutions to mixing cards, they got bought out. Whether it's software mods or otherwise...

I remember a short time a while back when we were supposedly going to get a chipset that could mix ATI and NVIDIA GPUs together for performance at the same time. I don't recall who bought them out at the time to boost their own SLI/Crossfire performance.

BL2 is old enough that I would hope the old Titan could do OK alone at 1080p, and with GPU mining dying off, you could probably get a GTX 1080, 2070, or similar for dirt cheap soon too.
The problem is you get all sorts of load balancing issues; for every combination of GPUs that scales well together, there's a ton more that simply won't, or worse, will tank performance due to improper load balancing.

Remember DX12 in theory allows multi-GPU since the API handles it, but (as I predicted) it actually killed multi-GPU entirely. Why? Because it's *impossible* to load balance between any arbitrary combination of GPUs. That's why SLI/CF was only ever really supported between two of the same GPU (since you can do AFR 50/50 and only have latency related to communication between the GPUs); if you start mixing and matching you get a hot mess that no software engineer in their right mind wants to be responsible for.
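
To put some made-up numbers on that (the frame times here are purely illustrative, not measurements from any real cards):

Code:
# Toy model of alternate-frame rendering (AFR) with two GPUs.
# All frame times are invented for illustration only.

def afr_fps(t_a_ms: float, t_b_ms: float) -> float:
    """AFR alternates frames between the two GPUs, so each card must deliver
    the same number of frames; throughput is paced by the slower one:
    2 frames per max(t_a, t_b) milliseconds."""
    return 2000.0 / max(t_a_ms, t_b_ms)

def ideal_split_fps(t_a_ms: float, t_b_ms: float) -> float:
    """A perfectly load-balanced split (which no driver can do for arbitrary
    GPU pairs) would simply add the two cards' throughputs."""
    return 1000.0 / t_a_ms + 1000.0 / t_b_ms

if __name__ == "__main__":
    # Matched pair at 20 ms per frame each: AFR scales cleanly (100 fps).
    print(afr_fps(20, 20), ideal_split_fps(20, 20))
    # Mismatched pair, 10 ms vs 30 ms: AFR manages ~66.7 fps, slower than the
    # fast card alone (100 fps), while an ideal split would hit ~133 fps.
    print(afr_fps(10, 30), ideal_split_fps(10, 30))

That's the whole problem in a nutshell: naive alternation between mismatched cards can end up slower than just running the faster card by itself.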

Also remember the other obvious downside that you have to reduce everything to the lowest standard feature set. So no DLSS, no special AA modes, no Ray Tracing, and so on. So in addition to all the headaches you introduce, you also have to accept a loss in visual quality.

With current top-tier GPUs able to handle native 4k (nevermind upscaling techniques like DLSS) there really isn't a need for multi-GPU. Frankly, the last multi-GPU setup that was probably worth it was twin 8800GTXs, mainly because NVIDIA refreshed the dang thing for about half a decade.
 