How to Activate Hardware PhysX to play with an ATI Video Card

Ok, more strangeness being caused by this hack. First of all, I still can't explain why my Fluidmark score dropped from 10823 (GTX260 alone) to 7100 (HD5850 + GTX260).

Now, some games are detecting my amount of video RAM incorrectly. Doom 3 just detected 896MB instead of 1024MB, which makes it look like it's reporting the RAM on the GTX260 rather than the HD5850.

I think the performance hit is due to the GTX260 being stuck at 2D clocks (haven't verified this yet). I'm not sure how the PhysX hack works, but it seems like it's spoofing the GTX260 as the primary graphics adapter in order to fool Nvidia's drivers; this has the side effect of some games detecting the GTX260's video RAM instead of the HD5850's. It also seems some games might be selecting the wrong renderpath based on the GTX260's presence (one optimized for Nvidia rather than ATi), which might hurt performance as well.
 
Weird. I've been running an ATI PhysX setup on my system (5870 + GTS250) since this hack was discovered and I haven't really had any issues. Granted, I haven't updated my Nvidia drivers in a month or two either. I wonder if it's the newer drivers that are causing problems?
 
What can I use to monitor my GPU usage to make sure this is actually working correctly? I'm running a 5870 and an 8800GT and I get the "Hardware PhysX" statement in Fluidmark, but if I try to run MSI Afterburner or EVGA Precision I cannot see more than one of the GPUs under any load at all -- even if I run Fluidmark while running FurMark. Any tips?
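(Side note: where the driver ships Nvidia's nvidia-smi command-line tool, you can poll per-GPU load with `nvidia-smi --query-gpu=name,utilization.gpu --format=csv`. Here's a minimal sketch that parses that CSV output -- the sample string and GPU name below are made up for illustration, not captured from a real system:)

```python
import csv
import io

# Hypothetical sample of `nvidia-smi --query-gpu=name,utilization.gpu --format=csv`
# output; on a real system you'd capture this with subprocess.check_output().
sample = """name, utilization.gpu [%]
GeForce 8800 GT, 87 %
"""

def gpu_utilization(csv_text):
    """Return {gpu_name: utilization_percent} from nvidia-smi CSV output."""
    reader = csv.reader(io.StringIO(csv_text.strip()))
    next(reader)  # skip the header row
    return {name.strip(): int(util.strip().rstrip(' %'))
            for name, util in reader}

print(gpu_utilization(sample))  # {'GeForce 8800 GT': 87}
```

If the PhysX card's utilization stays at zero while Fluidmark runs, the hack probably isn't engaging.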
 
do you still need to plug in another monitor to the NV card to get this working?

I had to in order to get Fluidmark to show hardware acceleration. I could select PhysX enabled without the monitor plugged in, but it didn't work in Fluidmark. I run a second cable to my main monitor, so it isn't actually displaying anything that I can see.
 
Yeah, I've got it working just fine without a monitor or an extended desktop on the GTX260.
 
It depends on the driver and PhysX version.

This.

Also, there is a VGA connector hack with resistors, though it works with bent paper clips as well. I think I posted a link in this thread somewhere. Basically you have to link some VGA pins together, three sets of two pins, and it tricks the drivers into seeing a monitor.
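(For reference, the usual VGA "dummy plug" wiring -- and it's my assumption that these are the three pairs meant above -- ties each RGB signal pin to its own return ground through a ~75 ohm resistor, so the card's load-detect circuit sees a terminated monitor. Sketched as data:)

```python
# Standard VGA dummy-plug pin pairs (assumption: these are the
# "3 sets of 2 pins" the post refers to). Each colour signal line
# is linked to its own ground return through a ~75 ohm resistor.
DUMMY_PLUG_PAIRS = {
    1: 6,  # red    -> red return
    2: 7,  # green  -> green return
    3: 8,  # blue   -> blue return
}

for signal, ground in DUMMY_PLUG_PAIRS.items():
    print(f"pin {signal} -> pin {ground} via ~75 ohm resistor")
```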
 
Haven't really done much research on this, so excuse the noob question. If I picked up an HD5870 as the main renderer for Eyefinity and added an Nvidia card for PhysX, would the Nvidia card be able to drive one or two extra monitors in 2D?
 
Yes, it will work. I've only tried it with one extra monitor, but I have heard that Windows has a hard limit on horizontal pixels -- I think it's 8k.
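(A quick back-of-the-envelope check, assuming the "8k" limit is 8192 pixels of total desktop width -- that number is my assumption, not confirmed above:)

```python
# Assumed horizontal desktop limit of 8192 px (the "8k" mentioned above).
MAX_SPAN = 8192

def fits_horizontal_span(widths, limit=MAX_SPAN):
    """Return True if monitors placed side by side stay within the limit."""
    return sum(widths) <= limit

# Three 2560-wide panels in an Eyefinity group:
print(fits_horizontal_span([2560, 2560, 2560]))        # 7680 <= 8192 -> True
# Add a fourth 1680-wide side monitor and the span overshoots:
print(fits_horizontal_span([2560, 2560, 2560, 1680]))  # 9360 > 8192 -> False
```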
 
I guess I should point out that I'm running Windows 7 -- it might not be easy, or even work at all, on another version of Windows. I will try to check tonight and make sure that Eyefinity grouping works correctly with all four monitors connected (I've been playing around without the 4th for a while). I do know that they all work in 2D mode, but I have not tried running an Eyefinity group with the 4th monitor displaying Ventrilo or the like during a game.

Edit: works fine with the other three monitors in an eyefinity group.
 
I'm confused. I'm not trying to be a smartass, but I see tons of posts ripping on Nvidia for PhysX, calling for heads to roll over their not supporting it on ATI, and just tons of bashing about how PhysX is a gimmick, it sucks, and it's pointless... so why such great efforts to obtain it if you are an ATI user? That doesn't make sense.

Just curious. Please don't flame me. I'm genuinely curious. This just seems weird to me.
 
People are only using hybrid physx as a stop-gap measure. PhysX is pointless in that it does not--and will not ever under current circumstances--add anything to actual gameplay except eye candy. There are exactly zero developers that will implement physx as a crucial gameplay feature unless Nvidia funds the game in its entirety, because the game will not run on ATI cards.

GPU physics is a fantastic idea, but brand-dependent isn't the way to go. There are open standards like OpenCL + Bullet Physics. There are closed (but still GPU-agnostic) standards like DirectCompute and Havok. Those do not require you to have a specific GPU brand. Problem is, developers don't get "help" (or "cash") to implement these. Nvidia has a monetary motivator for Physx becoming the standard thanks to licensing fees. Since there are fee-free alternatives available, we'll likely see physx dying a slow death over the coming months.



Take Battlefield 3, for example. If they were to implement an Nvidia-only physics solution for building destruction, it would eliminate 40% of their potential customers--ATI users. That's a little too hefty of a price to pay for any game publisher.
 

I don't debate any of that, but you didn't answer my question.

Why so much foaming at the mouth about it, but then a 9-page thread about how to get it? That doesn't make sense, and it makes ATI people look like hypocrites.

Again, I'm not being snarky, but when you bash something and then put in effort to get it, it makes you a hypocrite.

Again, saying that as lightly as I can. :D
 
I noticed myself that I forgot to address that in my original post...I was editing as you typed this haha. I added: People are only using hybrid physx as a stop-gap measure. They are the extremists who want every little feature they can get...even though there are like three or four games that extensively (key word) use physx. That is just fine. There's nothing wrong with wanting the best of everything as long as you can reasonably afford it.

I think what they are "foaming at the mouth" about is that they bought hardware specifically to do what it was originally made to do, and Nvidia actively worked to ensure the hardware that users paid real money for didn't work, just out of spite. In fact, they even inserted a time-bomb bug in a recent PhysX driver.
 
PhysX is pointless in that it does not--and will not ever under current circumstances--add anything to actual gameplay except eye candy. There are exactly zero developers that will implement physx as a crucial gameplay feature unless Nvidia funds the game in its entirety, because the game will not run on ATI cards.

GPU physics is a fantastic idea, but brand-dependent isn't the way to go. There are open standards like OpenCL + Bullet Physics. There are closed (but still GPU-agnostic) standards like DirectCompute and Havok. Those do not require you to have a specific GPU brand. Problem is, developers don't get "help" (or "cash") to implement these.

Take Battlefield 3, for example. If they were to implement an Nvidia-only physics solution to building destruction, it eliminate 40% of their potential customers--ATI users. That's a little too hefty of a price to pay for any game publisher.

Also, I'd like to point out that in one breath you say PhysX is pointless, but in the next you say it's a fantastic idea.

Let's be honest, as gamers we are always looking for ways for games to be more real, and as you said, physics is a fantastic idea and can do that.

I mean, to say it doesn't add to games is obviously nothing but butthurt speech. And it's proved by the fact that there are threads like this all over the internet on how to get physx running with an ATI setup.

When I had a 5870, I was annoyed that I didn't have PhysX, but I got over it in a second. I didn't go around bashing Nvidia, saying PhysX was garbage and pointless, and then search the internet looking for hacked drivers.

See what I'm getting at? I'm not calling you or anyone else out. I'm just trying to point out behavior that is filled with hypocrisy.

I would totally get it if everyone was like, "yeah, physx is pretty cool, let's see if we can port it to ATI". But that isn't what I see. I see, "fark Nvidia in the ass with aids and physx is worthless and I don't miss it one bit", and then I see those same people run to the thread telling them how to get physx running.

Just doesn't make sense.
 
Gotcha. Cool.

I guess most people aren't foaming at the mouth about it. I'm sure most people think it's cool and just want it, but you always see the people who complain on the forums.

Not that I'm calling them out. Certain things work me up.

Cheers man.
 
fattypants said:
PhysX is pointless in that it does not--and will not ever under current circumstances--add anything to actual gameplay except eye candy.
PhysX, the proprietary API from a single GPU company that locks out their competitor, that's pointless since nobody would ever make a game that absolutely required it. GPU physics is a fantastic idea.

Also, someone actually did make a wrapper (an editor named regeneration at a website that Hardforum wordfilters for some reason) that allowed Physx to work on ATI. Free of charge. Then Nvidia invited him to their developer program and Physx on ATI was never heard from again. Here's the last thing he said about it:
regeneration said:
I've been asked by a few sources to delay Update #2 for several days.

He ended up never releasing it.
 

I wonder if he sold out and now has a fat check written by Nvidia.
 