Anybody miss PhysX? Recently switched over from Nvidia, that is

dukenuke88

[H]ard|Gawd
Joined
Jun 22, 2011
Messages
1,924
To all AMD owners, do you miss PHYSX? Assuming you recently switched over from the nvidia crowd
 
I'm the opposite...I came from a 6970 to a GTX580 and fuck all those that are like "Mehhh, Physx is just a useless gimmick mehhh..."! Go play Batman Arkham City with Physx and tell me that shit ain't awesome!

I am loving Physx through and through...couldn't go back to AMD now!
 
I could never stand PhysX even using Nvidia cards; it's the first thing I turn off if a game has it. But then again, there aren't many games that have it, so it's not really an issue.
 
Not really. While physx is great in the arkham games, those are the only titles that I own that really make use of it. I threw a random nvidia card that I had lying around into my rig and boom, physx with my 4890 (admittedly it was slightly more complicated than that, but not much).
 
Same here. A very good friend sold me his GTX280 for really cheap just so I could play Batman: AA a second time with PhysX on, and I thoroughly enjoyed it. I didn't need the card at all as I already had a 4870X2, but I just had to see what PhysX is all about. I upgraded to a 6970 a couple of months ago, and I do miss it after seeing the side-by-side of Batman: AC from the [H] reviews. I don't know, I have always liked ATI more than Nvidia, so the Red logo is my first choice. Both Batman games, though, are the only games so far to make me wish I had an Nvidia card.
 
Thanks guys... yeah, I hear you on the whole Batman AA thing and PhysX... it seems like Batman AA/AC and Mafia 2 are the only real games that take full advantage of PhysX. I know Metro 2033 has a PhysX feature, but it doesn't really show as much as Batman.
 
When Batman AC goes to $5 I'll take the time to replug my old GTS 250 as a physx card in hybrid with my 5870 (7970 by then!). Like I did for the first Batman.

But otherwise I don't care about it.
 
I went from a pair of GTX 470's to my current 6950's. PhysX wasn't a big loss to me. Heck, not being able to use my 3DVision kit wasn't a big deal to me. What I miss the most is being able to fully utilize my 120Hz monitors. I went the cheap route and bought a single link DVI->DP ($30 vs $100x2) adaptor so I'm currently stuck on 60Hz.

I would have to do the same thing with the 7970, so for now, I'm waiting to see what Kepler offers.
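
(Side note on why the single-link adapter caps things at 60Hz: single-link DVI tops out around a 165 MHz pixel clock, and 1920x1080 at 120Hz needs roughly double that. Rough napkin math below, assuming 1080p panels and a ~25% blanking overhead as a ballpark, not exact CVT timings.)

```python
# Back-of-the-envelope pixel clock estimate. The 1.25 blanking factor is a
# rough assumption; real timings (CVT, CVT-RB) differ somewhat.
def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.25):
    return width * height * refresh_hz * blanking_overhead / 1e6

print(pixel_clock_mhz(1920, 1080, 60))   # ~156 MHz: fits single-link DVI (165 MHz max)
print(pixel_clock_mhz(1920, 1080, 120))  # ~311 MHz: needs dual-link DVI or DisplayPort
```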
 
I think Cryostasis was the best use of PhysX... cloth simulation, and to date it's the only game I've seen that actually uses water... and lots of it... definitely a game changer!

Plus it was actually a damn good horror/psychological first person shooter too!
 
Tried out Physx with my 8800gt.

Took it out when I realized only one game uses it, and I beat it already and rarely load it up.
 

Ignorant and/or AMD fanboy.

I love AMD, but even I can admit that it's not a gimmick... that's just stupid. That is, unless you mean it's a gimmick because only a limited number of games support it... in which case I guess DirectX 10/11 are only gimmicks too?

Tried out Physx with my 8800gt.

Took it out when I realized only one game uses it, and I beat it already and rarely load it up.

Was this like in 2006 or something? There are tons of games that support it...I mean really! Now maybe (I sincerely hope) you meant that you personally only had one game at the time that supported it?
 
Ignorant and/or AMD fanboy.

I love AMD, but even I can admit that it's not a gimmick... that's just stupid. That is, unless you mean it's a gimmick because only a limited number of games support it... in which case I guess DirectX 10/11 are only gimmicks too?



Was this like in 2006 or something? There are tons of games that support it...I mean really! Now maybe (I sincerely hope) you meant that you personally only had one game at the time that supported it?

Naw, it was the first Batman. And it did completely nothing other than make more heat in the case and send a few pieces of paper flying around.

The 8800gt still sits in a box in my closet....
 
The only reason the Batman and Mafia games look better with PhysX is because NVIDIA paid to lock them down. Of course PhysX "looks amazing" when the alternative is nothing. :rolleyes: Most PhysX effects can be replicated or mimicked by other techniques that have a fraction of the performance cost. PhysX is a gimmick and is thankfully dying a slow death. And before the NVIDIA fanboys get their panties in a knot, I had a GTX295 in '09, when PhysX was really being pushed, and was completely underwhelmed.
 
I don't think anyone can argue that the extra PhysX effects actually suck...

Whether or not it's worth using nVidia over AMD just because of PhysX, though, is another matter.
 
Crucify me for my ignorance, but what exactly does Physx bring to Batman titles? Moar cardboard boxes falling moar realistic-like?
 
The biggest downfall of PhysX is there's no reason for a dev to ever put it in their game themselves. It's only put in if Nvidia does it and pays for the privilege. Why would anyone want to take the time to put code in their game that half their customers can't use? If I were coding physics into my game, I'd put in something all of my customers can use.
 
I don't think anyone can argue that the extra PhysX effects actually suck...

Whether or not it's worth using nVidia over AMD just because of PhysX, though, is another matter.

To me it's like whether you would use NVIDIA over AMD just because of anti-aliasing. It's worth it for me to have the extra visual effects in games. It's kind of why I buy an expensive video card instead of using integrated graphics.

Here's a list of games using GPU PhysX
http://physxinfo.com/

Here's a list using "software" PhysX
http://physxinfo.com/index.php?p=gam&f=cpu
 
It's a gimmick when a video game with no Nvidia PhysX support looks better than a game with PhysX support.

It's called BF3.

Surely that depends entirely on how the Physx effects are implemented. For all you know, BF3 with Physx support might have looked even better.
 
Was this like in 2006 or something? There are tons of games that support it...I mean really! Now maybe (I sincerely hope) you meant that you personally only had one game at the time that supported it?

No, there are not.

http://physxinfo.com/

The Batmans, Mirror's Edge, Mafia 2, Metro 2033, Alice Returns, and Cryostasis is not a ton of games. I will admit that the effect in Batman AC is quite nice, but I honestly can't say that I miss it when I've already finished all of the other games that support it (using an 8800GT for hybrid physx btw).
 
Surely that depends entirely on how the Physx effects are implemented. For all you know, BF3 with Physx support might have looked even better.

If it would have looked better, DICE would have used it.

They even know it sucks.
 
the demos looked pretty, but it wasn't all that groundbreaking or even noticeable in the games I play.

I was too busy shooting things to notice all the pretty crap it delivered.

Now, I only did it for a month or so, so my sample size isn't enough to form a valid perspective.
 
I feel like we've beaten this horse to death many times already, but going to respond anyways.

I may catch flack for this, but my favorite PhysX effects thus far have been in Mirror's Edge. The cloth and shattering glass effects may be doable on a modern CPU today, but back in 1914 when we were all playing the game on our wood fired computers it was definitely a good use for GPU physics. I have not played Batman:AC, but Mr. Freeze's ice beam looked pretty impressive with all the shrapnel being thrown around, not to mention what I have seen of the game's volumetric fog.

All the ill will towards the technology is (rightfully) due to games that don't enable awesome visual effects via PhysX, but instead disable mundane effects that would work perfectly in other games without GPU physics. Catwoman's whip, for instance: the game does not need GPU physics enabled to show it being swung around. Give us a break.

PhysX needs more games to push its boundaries before it actually influences a buying decision for me. It's certainly nice icing on the cake if you have a card that can run it, but it's not worth buying into the technology just for the two or three games I've seen implement it well.
 
That is what I am saying. A 2500K processor can't handle it in software?

It's well known, and proven, that nvidia gimps the CPU version of physx to 1 thread or something similar on the PC version only...

Else there would be no point to gpu physx right?
 
I, personally, did not miss it moving from 8800's SLI to 6xxx Xfire. The only game I tried was UT3 at the time and the Physx pack I installed was a horrible piece of shit that made UT3 crash far more than it ever had. When it did run, it looked good, but I really can't say today I'd base a buying decision on it.
 
It's well known, and proven, that nvidia gimps the CPU version of physx to 1 thread or something similar on the PC version only...

Else there would be no point to gpu physx right?

Yup, before Nvidia took it over it was multithreaded; then it got gimped to a single thread.
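
(Purely to illustrate what "multithreaded" would even mean for a CPU physics path: a toy particle update farmed out to several worker processes. This is not PhysX SDK code, just a hypothetical sketch of an embarrassingly parallel step.)

```python
# Toy sketch only: a naive particle integration step split across CPU workers.
# Nothing here comes from the actual PhysX SDK; it just shows there's no
# inherent reason this kind of update has to run on a single thread.
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def step_chunk(args):
    pos, vel, dt = args
    vel[:, 1] -= 9.81 * dt          # gravity on the y axis
    return pos + vel * dt, vel      # simple Euler integration

def step_all(pos, vel, dt, workers=4):
    chunks = list(zip(np.array_split(pos, workers),
                      np.array_split(vel, workers),
                      [dt] * workers))
    with ProcessPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(step_chunk, chunks))
    return (np.concatenate([r[0] for r in results]),
            np.concatenate([r[1] for r in results]))

if __name__ == "__main__":
    n = 100_000
    pos = np.zeros((n, 3))
    vel = np.random.rand(n, 3)
    pos, vel = step_all(pos, vel, dt=1.0 / 60.0)
    print(pos[:3])
```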
 
I would have missed it if their software implementation were compiled to actually take full advantage of my CPU, but alas they didn't, and I am Red Team only in terms of GFX.
 
I pretty much gave up on PhysX when they killed off my PPU. The multiplayer Ageia tech demos were actually fun to play, but GRAW had a little extra dust - it was very gimmicky.

I did try out hybrid PhysX with a borrowed green team card, but I didn't think it was worth buying a midrange Nvidia card just for a few games, so I didn't. I don't feel like I'm missing out.
 
Battlefield 3 was the latest game to remind me that PhysX is a ripoff. I know all about the Havok engine and what adorable bugs it's got, but you know, without losing any performance or having to invest in anything or take up an extra card slot it does the job. PhysX is a ripoff and I pity the fools who get suckered into it. Certainly the shit looks pretty but those green glasses need to be taken off. They were purchased by Nvidia and successfully used as a cash-grab and gamer-fucker-over method. Batman games would have managed just fine with Havok, instead they have artificially reduced environmental detail because of the PhysX developer program Nvidia churned out that's made specifically to make PhysX modes be the only modes. Battlefield 3 didn't use PhysX because Havok does the job, and it also does the job on all platforms, all graphics cards.

PhysX is basically Hitler. Do you open up your eyes and see the reality, or do you listen to the brainwashing and propaganda until you get to smell the ashes?

Welcome to the red team. You'll soon look at your past self in disbelief.
 
Can't wait until AMD finally releases their version (like SLI, 3d and surround with nvidia) then it will become "accepted" :D
 
I love the idea of what PhysX can bring to games with more realism and all that good stuff. I enjoyed it on my 8800 and 9800 for the games I owned that supported it. However, right now I think my GPU has enough to do with 3 displays and a lot of eye candy, whereas my CPU is usually only 30% used by current games... I would rather see it split off into a CPU-driven option for those with the CPU power to spare.

If AMD puts it on their GPUs I won't complain, but I will probably turn it off to use that power on other eye candy... we'll see how / if it is implemented. I did like it in the older games I played with it.
 
I feel like we've beaten this horse to death many times already, but going to respond anyways.

I may catch flack for this, but my favorite PhysX effects thus far have been in Mirror's Edge. The cloth and shattering glass effects may be doable on a modern CPU today, but back in 1914 when we were all playing the game on our wood fired computers it was definitely a good use for GPU physics. I have not played Batman:AC, but Mr. Freeze's ice beam looked pretty impressive with all the shrapnel being thrown around, not to mention what I have seen of the game's volumetric fog.

All the ill will towards the technology is (rightfully) due to games that don't enable awesome visual effects via PhysX, but instead disable mundane effects that would work perfectly in other games without GPU physics. Catwoman's whip, for instance: the game does not need GPU physics enabled to show it being swung around. Give us a break.

PhysX needs more games to push its boundaries before it actually influences a buying decision for me. It's certainly nice icing on the cake if you have a card that can run it, but it's not worth buying into the technology just for the two or three games I've seen implement it well.

I love this post.
 
Can't wait until AMD finally releases their version (like SLI, 3d and surround with nvidia) then it will become "accepted" :D
I've seen posts like these before and they show a complete lack of understanding of the argument at hand. This is amplified by the fact that there's a comparison among SLI, 3D, and Surround, which are all in different technological realms. If AMD released their own proprietary GPU-accelerated physics with the same shoddy implementation, people would be just as pissed. However, if AMD's track record is worth anything, they'll develop something on OpenCL that can be used by both manufacturers. And yes, if a solution becomes available to both manufacturers, you can bet that game developers will take it more seriously, as will the community.
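
(To put a concrete picture on "something on OpenCL": a vendor-neutral GPU physics step could be as simple as a kernel like the one below. This is just a minimal sketch via pyopencl, assuming nothing about what AMD would actually ship; it runs on any GPU with an OpenCL driver, Red or Green.)

```python
# Minimal vendor-neutral GPU particle step via OpenCL (a sketch, not real
# physics middleware). Runs on any device exposing an OpenCL driver.
import numpy as np
import pyopencl as cl

ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)

kernel_src = """
__kernel void integrate(__global float4 *pos, __global float4 *vel, const float dt) {
    int i = get_global_id(0);
    vel[i].y -= 9.81f * dt;   // gravity
    pos[i] += vel[i] * dt;    // Euler step
}
"""
prg = cl.Program(ctx, kernel_src).build()

n = 1 << 16
pos = np.zeros((n, 4), dtype=np.float32)
vel = np.random.rand(n, 4).astype(np.float32)

mf = cl.mem_flags
pos_buf = cl.Buffer(ctx, mf.READ_WRITE | mf.COPY_HOST_PTR, hostbuf=pos)
vel_buf = cl.Buffer(ctx, mf.READ_WRITE | mf.COPY_HOST_PTR, hostbuf=vel)

prg.integrate(queue, (n,), None, pos_buf, vel_buf, np.float32(1.0 / 60.0))
cl.enqueue_copy(queue, pos, pos_buf)
print(pos[:3])
```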
 
Can't wait until AMD finally releases their version (like SLI, 3d and surround with nvidia) then it will become "accepted" :D

Not necessarily. When the 5xxx series launched, one of the key talking points was tessellation. However, when Fermi launched and did overwhelmingly better in tessellation, suddenly that feature was being downplayed.

PhysX makes games look better. Like Tessellation, AA, AF, etc. It's just that simple. You buy video cards for the eye candy and that's exactly what PhysX gives you. I think game developers should just stop waiting for AMD to come around and start using PhysX for gameplay as well. Destructible environments and such.
 