Too Bad PhysX Lowers FPS in Games...is it worth these numerous trade offs?

he's saying if havok didn't get bought out by intel, then things may have been different for havok fx.

If we hadn't evolved opposable thumbs, things might have looked different for us.
I deal with facts...not "guessing based on a hypothetical past that didn't happen" ;)
 
Didn't say that...so....?

You implied that everyone who does not share your purchasing power for technology somehow shirks advancement. That is wrong. I was providing an example that you were...wait for it...wrong. People care about PC gaming even if you do not think that they do.


Show me any game doing the same with CPU physics.
Havok has been around for years, so it should be easy.

Well shit, let me go to the game I play. I have smoke in DoD:Source. This game is YEARS old (probably 3+), and when I throw a smoke grenade I get smoke. It may not be completely accurately rendered smoke, but that was my point. You can have smoke that is not PhysX compatible that is a fair proxy, and even if it is not swirling around you, is still a fairly good approximation without needing a second GPU to process it. Your video comparison is SHIT. Compare non-PhysX smoke to PhysX smoke and you'll have a more fair comparison...don't compare PhysX smoke to nothing then say, "OMG PHYSX ROCKS THE WORLD!!!!"
 
I lol'd at the holodeck. nice one. Quoted for more to see. I agree 100%



Surely there have been some great to legendary games before all this PhysX came along, and immersion was enough...or maybe I'm wrong? Keep drinking what Nvidia marketing feeds you and tells you what you need to enjoy PC games.

there have, there is, and there will be. there have also been awesome games before dx10/11 (or gpu physx) came along, doesn't mean they can't make future games visually better or more immersive. and some quality big name games use physx like gears of war and mass effect, so physx certainly isn't preventing them from being great games either. same can be said for havok (and maybe bullet in the future) or any physics engine for that matter.

If we hadn't evolved opposable thumbs, things might have looked different for us.
I deal with facts...not "guessing based on a hypothetical past that didn't happen" ;)

i guess if that's what you believe, lol. speculating on alternative pasts/ futures can pass the time though.
 
Show me any game doing the same with CPU physics.
Havok has been around for years, so it should be easy.
I've never said that Havok was equal to hardware-based PhysX. Can you find where I said so?

You're lumping me into the group claiming that CPU-based PhysX = GPU-based PhysX. Sorry, but I'm not part of that crowd.

It adds immersion...if not more immersion, what word would you use?
Immersion is very much the wrong word to use, unless you are suddenly Batman himself. Think of it this way: if you were wearing a VR-style goggle system, plugged in so that anything you felt in-game, your body itself actually felt, that would be an example of immersion.

Simply sitting in your computer chair, looking at a monitor, and controlling a character using your keyboard/mouse/gamepad/whatever, is not immersion. At least, most wouldn't view it that way. If you have the ability to disassociate your consciousness from your body, and immerse yourself in whatever it is you're controlling, then I very much envy you.

It's the same as with AA and AF.
It adds more I.Q., giving more immersion...try playing without any AA or AF...it's the same game...but is the experience the same? Nope.
It all adds up, and where AA and AF are "static" additions, physics is an interactive addition...a whole different ballgame.
I agree, in terms of the game play experience, in terms of what each adds, physics adds more generally than AA and AF. AA and AF are concerned with image quality. Physics adds to image quality, but also adds to the experience itself when implemented correctly. Batman does this right. So did Havok with HF2, in terms of what physics added. Many implementations of physics, however, have been solely for image quality (smoke effects, particle effects), etc.. In that regard, it's far less useful.

ARMA2, WiC, Crysis and Doom4 are/will not be console ports.
A console has limitations...most RTS/RPG games simply don't cut it on a console.
FPS on consoles need to be dumbed down (autoaim etc.) due to poor controls.
CryEngine3 will show just how limited the 4-year-old consoles are now.
While I agree that RTS games are hard to do on a console, at the same time, look at what Blizzard is doing with a Havok-based Starcraft II to see what can be done with software implementations of physics. GPU-accelerated physics is hardly a requirement for PC RTS games.

And, uh, what? Most RPG games don't cut it on a console? Apparently you've never played some of the greatest RPGs in history. Almost all of them were console exclusive.

Well, DX11, GPU physics (no matter the PhysX naysayers, AMD and Intel are pushing hardware physics on the PC too) and the sheer amount of extra power in PCs (compared to 4-year-old console hardware) will (hopefully) show otherwise.
Of course AMD and Intel are pushing GPU physics on the PC. In fact, Intel will likely be pushing it in much the same way as nVidia is pushing PhysX: a proprietary API that Intel wants everyone to license from it.

AMD is the only one going for an open platform solution. Whether it's because that's what they truly want to see, or simply because they could never do their own proprietary standard properly, it doesn't matter. AMD is the closest to what a lot of enthusiasts want to see. Apparently you don't agree with it though.

Look at the OP, the OP is whining about the performance hit...
It's a legitimate concern. Sure, some are pushing it a bit too much. But what if PhysX resulted in a 60% performance hit? Would it still be so desirable to you?

DirectX is proprietary.
No one besides Microsoft can add features.
You can't get physics in DX...unless Microsoft decides to.
How is DirectX faring?
Yes, DirectX is proprietary. The difference is, DirectX is fully available for all developers and hardware companies to use/implement/etc. AMD can use it without restriction. nVidia can use it without restriction. There's none of this "Sorry, but if you have an ATI card, we won't let you use your GeForce card as a PhysX card" nonsense.

That would be like Microsoft saying "Sorry, we detected Linux on your system also, DirectX will not properly run anymore."

What I want is for AMD and Intel to get off their collective asses and get on the market.
It will level the playing field...and make threads like this funny to read.
Because I am dead certain both Intel's and AMD's solutions will have the same performance impact.
Uh, have you been living under a nVidia-sponsored rock for the last several months? AMD is on the market, and working to support Bullet.

Intel is still a ways off with Larrabee, but I have no doubt that when Larrabee does finally come out, you're going to see Havok being pushed with a passion. And if you think that nVidia getting a few developers to implement hardware PhysX is a coup d'état, just wait until Intel puts its marketing muscle behind Havok.

Excuse me, try searching [H] for threads about Crysis...and see the vast amount of people whining that Crysis won't run at +60FPS maxed out on their mainstream rig.
Hell, try looking at Dan_D's signature..."inspired" by people in this very thread, whining that if games don't run maxed out on their year-old rig...why should they buy them?
Apparently a few dozen complainers = the entire millions-strong PC enthusiast community.

Hmm, any chance you work for Fox News as a pollster?

To be blunt, most of the people whining about PhysX don't have the first clue about physics or the computations behind the scenes...like the OP, they whine about added I.Q. costing performance.
Uh, many of the people whining about PhysX, whine about how it's been implemented.

Once again, and for what likely won't be the last time, it's not an issue with *what* PhysX is. It's an issue with how it's being implemented. As in, it's subject to nVidia's control.

Then upgrade your hardware...or lower the settings
It is really that simple.
"the grapes are sour" mentality is not a valid argument.
"Lower the settings"? Really? That's your counter-point.

I'm sorry, but for someone who constantly talks about image quality, and all that physics adds to a game, to hear that person say "lower your settings", it's laughable at best.

In fact, I think I'll add it to my signature. Tomorrow. After I get some sleep.


You can always play PONG at +1000 FPS...better?
I like how you completely ignore the argument. Then again, that's par for the course with you.

Yes, let's play PONG at 1000 FPS... since everything over 60 FPS is useless.
The whole argument is that you would gladly trade playable performance for "PhysX effects". And that's probably the truth.


I tweak both my hardware (overclocking) and my software (games) to get the best experience.
Look at [H]'s real-world gameplay evaluations.
They don't strive for +60FPS, but for the settings that give the BEST I.Q. (and thus the best immersion)...at playable FPS.
But part of that is because the hardware can't handle it at max settings, at 60 FPS. So yes, they go for the overall most well-rounded game play experience.

However, the "goal" if you can understand that, is to be able to play some of these games, at max settings, at 60 FPS. Why not? For someone who harps about image quality, and physics effects, and "immersion", it's funny that you then say "hey, max settings don't matter. max viewable FPS doesn't matter."

Those physics effects you harp about. Yeah, they're nice to look at. They're nice to experience in game. You know what else is nice? Gaming at a steady 60 FPS, or close to it. I don't know about you, but I can see the difference between 30 FPS and 60 FPS, and ya know what? It adds a hell of a lot to the game also, when you can play it at close to 60 FPS.



The only thing the 4-year-old consoles are doing right now is slowing progress.
Quantity != quality.

How are they slowing down progress? You make it sound like the only good games that come out, are those on the PC, and only then, those that support stuff like PhysX.

You know how people talk about those "gamers" who only care about pretty graphics, and don't care as much for the experience of the game itself? Yeah, that's you.

Like the anti-PhysX crowd isn't driven by AMD fans? :rolleyes:
Um, it's not just AMD fans. You only see it that way, because you're so rabidly pro-nVidia.

I don't count myself as a fanboi of any particular company. Once upon a time? Sure. If anything, I was a fan of nVidia. I also used to fervently promote Sega, and despise Sony. You know what happened? I grew up, and matured. Now, I just want what performs the best. Right now, that's ATI's cards.

See, this is where it gets murky.
Some games (like Batman AA) show that even the new AMD cards don't give the best I.Q.
Instead of whining about PhysX, punk AMD for dragging their asses.
AMD (then ATI) made a lot of PR FUD about "we can do GPU physics too"...back in 2006.
It's now close to 2010...and they have nothing to show.

That is sad.
Really? That's what you go with? The *only* reason that AMD cards have reduced image quality in Batman: AA, is because of software restrictions put into the game at what was likely nVidia's behest (note: there is no evidence I currently know of that actually says nVidia told them to do it. However, it seems a bit... suspicious that developers would ever want to restrict the quality of a game on any particular hardware).

When those restrictions have been eliminated, such as in the case of AA/AF, the game has looked just as good (minus hardware PhysX of course).

You really need to try better in your examples.
 
What should they do?
Ask the pixie fairy for some magic?
It's not like Batman - AA is castrated.
It looks the same on the Xbox, PS3 and non-GPU-physics PC.
What GPU physics did was enable them to ADD more I.Q...because of the extra power available.
And that wasn't NVIDIA's doing...that was the doing of the game developers.
Ask them why they did as they did...instead of falsely blaming PhysX.

You're being an asshole. Look, man, I gave away an 8800GT on these forums because I did not need it anymore. Understand...I had what I needed to use GPU physics and I did not want it. I am not a fanboy. I have owned both brands.

And yes, Batman is castrated for non-PhysX users. Why not allow the physics, but CPU-processed? We have four cores now. I don't know how many times I can say this, BUT YOU ARE COMPARING GPU-OPTIMIZED PHYSICS WITH GENERAL PROCESSING THAT DOES NOT INVOLVE ANY PHYSICS. I do not know just how simple you can be, but you apparently lower the bar every time you post. Take X. That is our starting point. Add Y. That is PhysX. You are comparing X to X+Y, without adding the Z of CPU physics or anything else to the equation. You do not compare full PhysX with approximated physics, you compare full with nothing. That is a completely ignorant viewpoint to take, but hey, as long as you feel good about yourself...
 
AMD pays Intel for x86...where is the difference?
The difference is that in order to support software written for x86, at that time you essentially had to be making an x86 processor.

Thus, the *only* option AMD really had, was to license the technology. It's similar to the chipset issue nVidia is having with Intel and the DMI bus. There is "no alternative".

With physics APIs, however, there are plenty of options. Thus, why pay your competitor to license it, when you can support another API that will hopefully (one day, at least) provide all of the same functionality?
 
Well shit, let me go to the game I play. I have smoke in DoD:Source. This game is YEARS old (probably 3+), and when I throw a smoke grenade I get smoke. It may not be completely accurately rendered smoke, but that was my point. You can have smoke that is not PhysX compatible that is a fair proxy, and even if it is not swirling around you, is still a fairly good approximation without needing a second GPU to process it. Your video comparison is SHIT. Compare non-PhysX smoke to PhysX smoke and you'll have a more fair comparison...don't compare PhysX smoke to nothing then say, "OMG PHYSX ROCKS THE WORLD!!!!"

only thing is just because dod source has smoke type effects doesn't mean every other game has it. it's up to the devs whether it makes sense in the game or not. as for batman, they probably never thought of adding smoke/ fog (or any of the other additional effects) in their game prior to integrating gpu physx towards the very end of the development cycle prior to release. otherwise, they would have been on the console versions too. so the fact that there is a comparison with smoke on vs. off is the only valid comparison that can possibly be made. a game like graw had comparison videos showing static flags vs. dynamic flags in it probably because the devs already intended to have the flags in the game in the first place. that's why it's pointless to ask why they didn't add cpu "approximations" of said effects if the intention didn't exist to have said effects. gpu physx was just something tacked on probably after the game was essentially done right before the release on pc. and physx doesn't rock the world. marijuana does.
 
only thing is just because dod source has smoke type effects doesn't mean every other game has it. it's up to the devs whether it makes sense in the game or not. as for batman, they probably never thought of adding smoke/ fog (or any of the other additional effects) in their game prior to integrating gpu physx towards the very end of the development cycle prior to release. otherwise, they would have been on the console versions too. so the fact that there is a comparison with smoke on vs. off is the only valid comparison they can possibly make. a game like graw had comparison videos showing static flags vs. dynamic flags in it probably because the devs already intended to have the flags in the game in the first place. and physx doesn't rock the world. marijuana does.

Marijuana does. ;) For Batman, they were retarded for not adding effects other than PhysX. IT'S A BAD COMPARISON. That's the only thing that I want to point out. You cannot point out lazy non-effects with optimized PhysX. It's just wrong. The original thing that I quoted was a deceiving video. That's it, I only wanted to point out a horrible comparison.
 
Immersion is very much the wrong word to use, unless you are suddenly Batman himself. Think of it this way: if you were wearing a VR-style goggle system, plugged in so that anything you felt in-game, your body itself actually felt, that would be an example of immersion.

Simply sitting in your computer chair, looking at a monitor, and controlling a character using your keyboard/mouse/gamepad/whatever, is not immersion. At least, most wouldn't view it that way. If you have the ability to disassociate your consciousness from your body, and immerse yourself in whatever it is you're controlling, then I very much envy you.

It's a legitimate concern. Sure, some are pushing it a bit too much. But what if PhysX resulted in a 60% performance hit? Would it still be so desirable to you?

Those physics effects you harp about. Yeah, they're nice to look at. They're nice to experience in game. You know what else is nice? Gaming at a steady 60 FPS, or close to it. I don't know about you, but I can see the difference between 30 FPS and 60 FPS, and ya know what? It adds a hell of a lot to the game also, when you can play it at close to 60 FPS.

Really? That's what you go with? The *only* reason that AMD cards have reduced image quality in Batman: AA, is because of software restrictions put into the game at what was likely nVidia's behest (note: there is no evidence I currently know of that actually says nVidia told them to do it. However, it seems a bit... suspicious that developers would ever want to restrict the quality of a game on any particular hardware).

i think anything that adds to the game can boost immersion: better graphics, more realistic physics, surround sound, higher resolutions, stereoscopic 3d, haptics/ tactile feedback, etc. sure it's not "perfect" immersion like what virtual reality might be like, but it's still progress in that department.

as far as fps, i think it really depends from game to game. some games i don't even notice a difference from 60 to 30 fps. other ones i do. so i would say what might not be acceptable performance to someone might be acceptable to another.

yeah i wouldn't even mind a 100% performance hit if the game was still acceptably playable to me. for example, if a game went from 200 to 100 fps, i could care less. if it went from 10000 fps to 50 fps, i would still be okay with it. just depends up to what point for a particular game i might deem performance unacceptable. no lower than 30 is a good baseline for me, though i know that isn't always realistic given my penchant for visuals coupled with my non high end setup. so i make compromises if necessary. but at least i have the option to decide what is good enough or acceptable to me.

as for the aa in batman, i recall the ue3 engine (which batman uses), not having native aa support; so it is something that would have to be worked in, which is partly what twimtbp probably provided for. i know there are some other recent pc games that don't have any aa support whatsoever.
 
Marijuana does. ;) For Batman, they were retarded for not adding effects other than PhysX. IT'S A BAD COMPARISON. That's the only thing that I want to point out. You cannot point out lazy non-effects with optimized PhysX. It's just wrong. The original thing that I quoted was a deceiving video. That's it, I only wanted to point out a horrible comparison.

lol, yup. hard to type with all this real life smoke in the way :D. unlike the fake gpu physx smoke, lol. okay, this opinion is something i can agree with based on your perspective. point duly noted.
 
lol, yup. hard to type with all this real life smoke in the way :D. unlike the fake gpu physx smoke, lol. okay, this is something i can agree with based on your perspective. point duly noted.

UG, you're kind of all over the place, or maybe it's just my interpretation of your perspective. I would rather assume more than you've said if only because it means I'm covering more bases. First admitting to being familiar with smoke then accusing me of smoking something means that I have no idea what you're going for...let's stick to video games. It's really the same as it has always been. Show a normal smoke demo, then use Batman's swirly smoke. Then gauge reaction based on framerate. My bet is that you can get close enough to swirly smoke to not need a second GPU and perfect swirls.

Once again, I am not discouraging the use of PhysX. I am just saying that comparing PhysX to nothing at all is deceiving. You can certainly achieve smoke without PhysX, so why don't they do that? It's a biased game and not fair game for comparison. That is all.
 
I've never said that Havok was equal to hardware-based PhysX. Can you find where I said so?

So we are back at comparing chickens and motorbikes?

You're lumping me into the group claiming that CPU-based PhysX = GPU-based PhysX. Sorry, but I'm not part of that crowd.

The only difference is the level of performance achieved.
You CAN run Batman - AA with the same effects, just using the CPU via a "hack"...but then look at the performance...ask emosevil-something to run the built-in benchmark, using that hack.

Immersion is very much the wrong word to use, unless you are suddenly Batman himself. Think of it this way: if you were wearing a VR-style goggle system, plugged in so that anything you felt in-game, your body itself actually felt, that would be an example of immersion.

No, immersion is when you get immersed in a game....even PONG can have great immersion...I am still waiting for your explanation/substitute word.

Simply sitting in your computer chair, looking at a monitor, and controlling a character using your keyboard/mouse/gamepad/whatever, is not immersion. At least, most wouldn't view it that way. If you have the ability to disassociate your consciousness from your body, and immerse yourself in whatever it is you're controlling, then I very much envy you.

http://www.google.com/search?q=define%3Aimmersion&rls=com.microsoft:da&ie=UTF-8&oe=UTF-8&startIndex=&startPage=1



I agree, in terms of the game play experience, in terms of what each adds, physics adds more generally than AA and AF. AA and AF are concerned with image quality. Physics adds to image quality, but also adds to the experience itself when implemented correctly. Batman does this right. So did Havok with HF2, in terms of what physics added. Many implementations of physics, however, have been solely for image quality (smoke effects, particle effects), etc.. In that regard, it's far less useful.

Immersion...again, the word doesn't agree with your reasoning.


While I agree that RTS games are hard to do on a console, at the same time, look at what Blizzard is doing with a Havok-based Starcraft II to see what can be done with software implementations of physics. GPU-accelerated physics is hardly a requirement for PC RTS games.

I would love more physics in WiC...a 3D game, not just an old isometric 2D game.

And, uh, what? Most RPG games don't cut it on a console? Apparently you've never played some of the greatest RPGs in history. Almost all of them were console exclusive.

Fallout 1+2...those are RPGs...most of the RPGs I have seen for consoles are "anime"-type games, which I really don't care for.
But that is a subjective view, I agree.


Of course AMD and Intel are pushing GPU physics on the PC. In fact, Intel will likely be pushing it in much the same way as nVidia is pushing PhysX: a proprietary API that Intel wants everyone to license from it.

I also think Intel will take the crown in the end.
But until then, I take the best solution available.
Only NVIDIA has one right now.
But that is really guesswork...and irrelevant today.

AMD is the only one going for an open platform solution. Whether it's because that's what they truly want to see, or simply because they could never do their own proprietary standard properly, it doesn't matter. AMD is the closest to what a lot of enthusiasts want to see. Apparently you don't agree with it though.

False.
AMD first pushed for a Havok solution.
But I guess Intel didn't play "nice"...so they scrapped that...not out of will...but need.
They are now going for Bullet OpenCL physics.
But they still have nothing to show...not even a title to tease people with.
Same situation since 2006...lots of talk...nothing to show.



It's a legitimate concern. Sure, some are pushing it a bit too much. But what if PhysX resulted in a 60% performance hit? Would it still be so desirable to you?

Cannot answer that...it's game dependent, but I can say this:
Since my first PPU in 2006, I have not tried a single hardware PhysX game where I didn't like the added I.Q. or where the added I.Q. made the game unplayable.


Yes, DirectX is proprietary. The difference is, DirectX is fully available for all developers and hardware companies to use/implement/etc. AMD can use it without restriction. nVidia can use it without restriction. There's none of this "Sorry, but if you have an ATI card, we won't let you use your GeForce card as a PhysX card" nonsense.

Forgetting important facts doesn't make a good argument.
AMD was offered a PhysX license.
They rejected it...don't blame NVIDIA for that decision.

That would be like Microsoft saying "Sorry, we detected Linux on your system also, DirectX will not properly run anymore."

Can you run DX under linux?


Uh, have you been living under a nVidia-sponsored rock for the last several months? AMD is on the market, and working to support Bullet.

Really?
When?
What games?

It's the same empty talk they have been talking since 2006.
Coming up on 4 years now; I don't see why I should be clapping my hands just yet.

Intel is still a ways off with Larrabee, but I have no doubt that when Larrabee does finally come out, you're going to see Havok being pushed with a passion. And if you think that nVidia getting a few developers to implement hardware PhysX is a coup d'état, just wait until Intel puts its marketing muscle behind Havok.

I agree...Intel is likely to win in the end.
But does that mean that I have to forsake PhysX until then?
Why limit myself now...on a future hunch?


Apparently a few dozen complainers = the entire millions-strong PC enthusiast community.

Hmm, any chance you work for Fox News as a pollster?

The trend is clear...and it's more than a "few dozen".
You do know the meme:
"But will it play Crysis?"...don't you?
It's hardly an obscure notion, seldom mentioned.

Uh, many of the people whining about PhysX, whine about how it's been implemented.

Tell us how it should be implemented differently in Batman - AA...please?

Once again, and for what likely won't be the last time, it's not an issue with *what* PhysX is. It's an issue with how it's being implemented. As in, it's subject to nVidia's control.

So NVIDIA developed Batman - AA?


"Lower the settings"? Really? That's your counter-point.

Unless you have the number for the pixie fairy, then yes?
It's been a rule of thumb for OC gaming since Quake 1.
If your hardware is not up to the task (even when overclocked to the max)...you lower your settings. :confused:

I'm sorry, but for someone who constantly talks about image quality, and all that physics adds to a game, to hear that person say "lower your settings", it's laughable at best.

Yeah right...look above.

In fact, I think I'll add it to my signature. Tomorrow. After I get some sleep.

Please do...it's sound advice.
Any PC gamer should know this.
It would most likely cause less whining if it were common knowledge.


I like how you completely ignore the argument. Then again, that's par for the course with you.

See above.

Yes, let's play PONG at 1000 FPS... since everything over 60 FPS is useless.
The whole argument is that you would gladly trade playable performance for "PhysX effects". And that's probably the truth.

False.
I can play Batman - AA at 1600x1200 4xAA/16xAF/PhysX:High at playable FPS.
And don't lie about what I would or wouldn't do.
Ask me...that way you don't make false statements.



But part of that is because the hardware can't handle it at max settings, at 60 FPS. So yes, they go for the overall most well-rounded game play experience.

False again:
http://www.hardocp.com/article/2009/09/01/wolfenstein_gameplay_performance_iq/4
FPS as low as 41 FPS
http://www.hardocp.com/article/2009/08/10/arma_ii_gameplay_performance_image_quality/4
FPS as low as 9 FPS
http://www.hardocp.com/article/2009/07/21/call_juarez_bound_in_blood_gameplay_perf_iq/4
FPS as low as 27 FPS
http://www.hardocp.com/article/2009/07/01/ghostbusters_gameplay_performance_iq/4
FPS as low as 15 FPS
http://www.hardocp.com/article/2009/06/15/demigod_gameplay_performance_iq/4
FPS as low as 21 FPS

That was the latest 5 gameplay reviews; perhaps you should read them again?




However, the "goal" if you can understand that, is to be able to play some of these games, at max settings, at 60 FPS. Why not? For someone who harps about image quality, and physics effects, and "immersion", it's funny that you then say "hey, max settings don't matter. max viewable FPS doesn't matter."

You are bordering on lying now, please stop.
A lot of games don't need 60 FPS to be playable, see above, and I never stated what you are trying to put in my mouth now.

Those physics effects you harp about. Yeah, they're nice to look at. They're nice to experience in game. You know what else is nice? Gaming at a steady 60 FPS, or close to it. I don't know about you, but I can see the difference between 30 FPS and 60 FPS, and ya know what? It adds a hell of a lot to the game also, when you can play it at close to 60 FPS.

See above; you have painted yourself into a corner with your notion that +60FPS needs to be sustained in all games at all times to be playable.





How are they slowing down progress? You make it sound like the only good games that come out, are those on the PC, and only then, those that support stuff like PhysX.

Come on...4-year-old "DX9"-style hardware being the target of development is now called making progress?
Why do you think Crysis didn't come out for the consoles?
They pushed the envelope = consoles couldn't be in the game anymore.

You know how people talk about those "gamers" who only care about pretty graphics, and don't care as much for the experience of the game itself? Yeah, that's you.

False, again.
I enjoy games such as EVE Online, which hardly push the hardware...but scare people off, because EVE is a cruel world where the game doesn't hold your hand and you actually LOSE your stuff when killed.
You are so WRONG here...and getting way off-topic and into the "personal attack" realm.


Um, it's not just AMD fans. You only see it that way, because you're so rabidly pro-nVidia.

Yeah, because I really like to run an NVIDIA chipset in my rig :rolleyes:
You just crossed the line into "personal attack" country.

I don't count myself as a fanboi of any particular company. Once upon a time? Sure. If anything, I was a fan of nVidia. I also used to fervently promote Sega, and despise Sony. You know what happened? I grew up, and matured. Now, I just want what performs the best. Right now, that's ATI's cards.

Not in Batman - AA and other games.
Again, +60FPS is wasted.


Really? That's what you go with? The *only* reason that AMD cards have reduced image quality in Batman: AA, is because of software restrictions put into the game at what was likely nVidia's behest (note: there is no evidence I currently know of that actually says nVidia told them to do it. However, it seems a bit... suspicious that developers would ever want to restrict the quality of a game on any particular hardware).

You mean like how NVIDIA wrote the AA for Batman - AA...and the developers asked AMD for the same, but never got an answer back?

When those restrictions have been eliminated, such as in the case of AA/AF, the game has looked just as good (minus hardware PhysX of course).

You can't have your cake and eat it too...make up your mind.

You really need to try better in your examples.

Really?
 
UG, you're kind of all over the place, or maybe it's just my interpretation of your perspective. I would rather assume more than you've said if only because it means I'm covering more bases. First admitting to being familiar with smoke then accusing me of smoking something means that I have no idea what you're going for...let's stick to video games. It's really the same as it has always been. Show a normal smoke demo, then use Batman's swirly smoke. Then gauge reaction based on framerate. My bet is that you can get close enough to swirly smoke to not need a second GPU and perfect swirls.

Once again, I am not discouraging the use of PhysX. I am just saying that comparing PhysX to nothing at all is deceiving. You can certainly achieve smoke without PhysX, so why don't they do that? It's a biased game and not fair game for comparison. That is all.

lolwut? i was just joking about the marijuana. then i made a wisecrack about pretending to have problems typing due to marijuana smoke. and then brought it full circle by tying it back to the fact that we are "arguing" about smoke in video games. if you didn't get it, that's okay. as for everything else - yes i got your point. and they didn't do it possibly due to time constraints and more importantly cause no one gave them money to go back and add "approximations" of the effects in software (which would have to be done for the console versions as well). um btw, i thought you were sleeping? lol, guess not.
 
lolwut? i was just joking about the marijuana. then i made a wisecrack about pretending to have problems typing due to marijuana smoke. and then brought it full circle by tying it back to the fact that we are "arguing" about smoke in video games. if you didn't get it, that's okay. as for everything else - yes i got your point. and they didn't do it possibly due to time constraints and more importantly cause no one gave them money to go back and add "approximations" of the effects in software (which would have to be done for the console versions as well). um btw, i thought you were sleeping? lol, guess not.

Alright, sorry, hard to figure out subtleties online. Never know when someone might be serious. I'm glad you got my point, and that the developer only added the features due to time constraints does not say that one is better than the other. It only says that the developer was under time constraints. I feel like you're mocking "approximations" but that is what most effects are. You model dynamic effects with a generic effect and it is close to the actual thing...it just doesn't swirl when you run through it. ;)

And I never sleep muahahahahahahaha...*falls asleep*
 
Alright, sorry, hard to figure out subtleties online. Never know when someone might be serious. I'm glad you got my point, and that the developer only added the features due to time constraints does not say that one is better than the other. It only says that the developer was under time constraints. I feel like you're mocking "approximations" but that is what most effects are. You model dynamic effects with a generic effect and it is close to the actual thing...it just doesn't swirl when you run through it. ;)

And I never sleep muahahahahahahaha...*falls asleep*

no, not mocking. just using your phrasing since it is easier to communicate with. not really a common part of my vocab, lol. and i am going to sleep forealz.

edit: aahh, it was toastybread who was sleeping. lol, toastybread, truffles, same difference. :p
 
there have, there is, and there will be. there have also been awesome games before dx10/11 (or gpu physx) came along, doesn't mean they can't make future games visually better or more immersive. and some quality big name games use physx like gears of war and mass effect, so physx certainly isn't preventing them from being great games either. same can be said for havok (and maybe bullet in the future) or any physics engine for that matter.



i guess if that's what you believe, lol. speculating on alternative pasts/ futures can pass the time though.

This I can agree with; however, making it proprietary, in that you have to use Nvidia-only cards, is what these rabid Nvidia fans will never see as the wrong direction, because love is blind.
Physics should be for all gamers; it's not an Nvidia or AMD thing.
 
"Too Bad PhysX Lowers FPS in Games...is it worth these numerous trade offs? "

Yes, it lowers framerates, but not to the point where the games become unplayable. If it did, then there wouldn't be any point in including the feature. Someone posted that the performance drops in half and I would agree. I think there is some syncing/latency issue in play here. That's my guess. I find it interesting that in Batman, enabling PhysX seems to result in a minimum frame rate drop to about 27-30 FPS regardless of using a single or dual card setup. Although there is a difference as to how much processing can be done in each configuration. Is it worth these numerous trade-offs? The only real trade-off is the current requirement that you don't have any ATI video hardware co-mingling with your Nvidia video hardware in order to use GPU acceleration for PhysX. The frame rate issue is a non-issue in my opinion.

Everyone seems to weigh in on whether or not they think PhysX is worth it but I don't see any info as to why it works the way it does. What I'm hoping to read are some technical reasons as to why there is such a dip in framerate when the physics load has been moved to a card dedicated to the task. If the video card is rendering 60 FPS minimum, why does it dip to 30 FPS when a second card is doing the math for the physics?
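Here's one possible explanation, purely as a back-of-the-envelope sketch on my part (the 16.7 ms figures and the fully serialized physics step are assumptions, not anything Nvidia has documented): if the renderer has to stall waiting for the physics results before it can finish the frame, the two stages add up instead of overlapping, and a 60 FPS game lands right around 30 FPS even with a second card doing the physics math.

[code]
#include <cstdio>

int main() {
    // Hypothetical numbers: a frame that renders in 16.7 ms (60 FPS)
    // plus a PhysX step that also takes ~16.7 ms on the dedicated card.
    const double render_ms  = 16.7;
    const double physics_ms = 16.7;

    // If the two stages overlap perfectly (pipelined), the frame rate is
    // limited only by the slower stage.
    double overlapped_fps = 1000.0 / (render_ms > physics_ms ? render_ms : physics_ms);

    // If the renderer waits on the physics results each frame (a sync point),
    // the times add and the frame rate roughly halves.
    double serialized_fps = 1000.0 / (render_ms + physics_ms);

    std::printf("overlapped: ~%.0f FPS, serialized: ~%.0f FPS\n",
                overlapped_fps, serialized_fps); // ~60 FPS vs ~30 FPS
    return 0;
}
[/code]

That kind of sync stall would also explain why the dip looks about the same whether the PhysX work runs on the rendering card or on a second one: the wait is about when the results are needed, not where they are computed.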

I agree with those of you that feel the devs can approximate most of the effects on the CPU, but it wouldn't be the same. Depending on the types of effects, it probably won't be close. It would also require the PhysX libraries to support that, or the devs would have to include and code for another engine suited to that purpose. Those of you thinking that the PhysX effects can somehow scale on a multicore CPU and match GPU effects, I'd like to see some technical reasoning behind that. If the physics simulations require a lot of floating-point operations, then I don't see how a CPU could begin to tackle the issue when an Intel Core i7 is in the area of about 50 GFLOPS for the 3.2GHz model. A low-end GTS250 is about 700 GFLOPS. A 9800GTX is about 400-something GFLOPS. The Intel CPU at theoretical max is running at a fraction of the GPU computing capacity. Remember, the whole point of these advanced physics solutions is to introduce effects that are more realistic, which require multiple complex simulations for the various effects, especially the particle-based effects, which seem to cause the most trouble performance-wise.
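To put some rough numbers on that (every figure below is a made-up, illustrative guess, not a measurement of the actual PhysX solver): plain Euler integration is cheap per particle, but once you add SPH-style neighbor interactions for smoke or fluid and scale the particle counts up, the per-frame floating-point cost eats a large share of a CPU's theoretical peak before the game has done any rendering, AI or sound.

[code]
#include <cstdio>
#include <vector>

struct Particle { float px, py, pz, vx, vy, vz; };

// Naive Euler step: gravity plus position integration.
// Only about ten floating-point ops per particle -- this part is the cheap bit.
void step(std::vector<Particle>& ps, float dt) {
    for (Particle& p : ps) {
        p.vy -= 9.81f * dt;
        p.px += p.vx * dt;
        p.py += p.vy * dt;
        p.pz += p.vz * dt;
    }
}

int main() {
    // Made-up, illustrative figures for an SPH-style smoke effect:
    const double particles      = 250000.0; // guess at total particles on screen
    const double flops_per_part = 2000.0;   // guess: ~32 neighbors * ~60 ops each
    const double steps_per_sec  = 60.0;     // one physics step per frame

    double gflops = particles * flops_per_part * steps_per_sec / 1e9;
    std::printf("~%.0f GFLOPS for the particle effects alone\n", gflops); // ~30
    // Against a ~50 GFLOPS theoretical CPU peak (and far less in practice with
    // this kind of scattered memory access) that is most of the chip; against
    // a ~400-700 GFLOPS GPU it is a much smaller slice of the budget.
    return 0;
}
[/code]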

I think the technology needs to mature some more. I'm not confident that we'll see better implementations from the other camps in the immediate future. Who knows, maybe Intel or AMD will get it right the first time. In either case I'm not so excited about the possibility of having three competing physics engines all requiring specific brands of hardware for acceleration. I just want to see physics implemented in way that allows everyone to enjoy it regardless of the hardware that they are running (yeah, I know, probably not going to happen).

Just my $.02.
 
I haven't read this thread, so to answer the title, YES. Yes, it's worth it. The hit in FPS is not even noticeable.

Give it a rest already.
 
^ Neither have I, but Holy flamewar and bad communication batman, this is a ridiculous thread.
Here's the summary:

BAA Looks better with PhysX on
It does not change the gameplay
Some people wish there was a CPU-based middle ground between PhysX and none (hard to disagree)
Some people think it could all be done on the CPU (most disagree)
Some people want PhysX to be open (will never happen, and why should it)
Others are waiting for Bullet and OpenCL

Others really like moaning about how not fair it is.
Others say deal.

I say if you want the effects, get off your butt and buy a cheap nvidia card and enjoy them via physX because that's one of the only ways to get B:AA-style physics effects today.

If you don't want the effects, deal with not having them.

I wanted them, I paid $60 for a physX card. I'm glad I did.
 
Meh, PhysX's days are numbered. With DirectX 11, OpenCL and the open-source physics solution AMD is pushing (which works with any GPU/CPU combination and all 3 major platforms), which developer in their right mind would opt for an nVidia-only solution?
 
I say if you want the effects, get off your butt and buy a cheap nvidia card and enjoy them via physX because that's one of the only ways to get B:AA-style physics effects today.

If you don't want the effects, deal with not having them.

I wanted them, I paid $60 for a physX card. I'm glad I did.

I was going to buy a $60 card for PhysX and CUDA, but Nvidia decided to be whiny emos and prevent me from doing that, due to me having a 5870. Thus, company politics more than anything means PhysX sucks ass and can die in a fire. I'm not going to pay a company to try and think up ways of disabling my hardware.
 
Video of xbox 360 game using the open source, universal bullet physics engine here. xbox 360: That's the machine without an x86 CPU and with an ATI GPU.

EDIT: more here.

PS, I'm not 100% sure but I think the xbox is produced by that little known company Microsoft.

So it's AMD, Microsoft and the rest vs nVidia (there's already a CUDA version of Bullet btw)... I wonder who'll win out?
 
Video of xbox 360 game using the open source, universal bullet physics engine here. xbox 360: That's the machine without an x86 CPU and with an ATI GPU.

EDIT: more here.

PS, I'm not 100% sure but I think the xbox is produced by that little known company Microsoft.

So it's AMD, Microsoft and the rest vs nVidia (there's already a CUDA version of Bullet btw)... I wonder who'll win out?

i played trials hd on my 360. cool game, but i wasn't really impressed by it. just curious, what other games make use of bullet?

also, there are plenty of games that use havok & physx on the 360 compared to bullet, which has been around almost as long as physx.

ms is not even involved. they are more interested in making money with the xbox360. it's amd & nvidia & maybe intel. thing is, nvidia will support all three major physics engines (havok, physx, bullet). nvidia can easily port physx to opencl. bullet is actually using the opencl drivers nvidia provides as well.

http://icrontic.com/news/nvidias-take-on-amds-open-source-bullet-physics

dunno who will "win" out though. could be all three will continue to be used just like today. could be something even better eventually comes along. who knows?
 
I don't see a lot of physics there. At the end of one of the levels, a pile of crap fell, which was pretty nice, but everything else was pretty limited it seemed. Just ramps and a few panes of glass...
Any games use Bullet and make more use of it?

BTW, great article, Kum^^
 
You must be blind if you can't see the physics in Trials 3d. Watch the driver, his bike, the tyres, (some of) the track boards etc which all flex in true physics accelerated fashion (the second video shows this better). I'd also say it contributes with friction/momentum modelling where the biker is atop those big ball things. There aren't many things that fall over and tumble but that's not the only thing physics engines do.

I wasn't saying MS were involved, but the point is, it's a system that works on their hardware. I agree Havok might seem the obvious alternative to PhysX, but Intel now have control of it. Just like PhysX, it is going to be sidelined by a more open solution if they decide to make it dependent on their hardware for GPU acceleration.

And ... AMD (current flavour of the month) have switched allegiance to Bullet.

It supports all platforms equally well, and it's FREE, which must be attractive for developers.

Not sure how many other games currently support it, if any; I'd say it's early days, but the thought of a cross-platform demo is promising.

Currently I believe it is supported better by nVidia hardware than by AMD, but then again CUDA is nVidia's version of OpenCL, which is supported by ATI and DirectX 11, so I can see a port being relatively straightforward.

Whatever, I can't see either Havok (if it retains open hardware support) or Bullet losing out to the "PC and nVidia hardware only" system "PhysX".

Physx is a gimmick nVidia is trying to sell you to justify their underperforming, expensive video cards, which *may* be able to compete in terms of FPS, but we all know that comes at the expense of image quality. Never mind DX11, most of their cards don't even support 10.1 yet. I just hope they get their act together, stop relying on smoke and mirrors and produce something good soon. We all saw what effect the lack of competition has had on 5770 prices, and that sort of thing benefits no one (other than AMD)!
 
Physx is a gimmick nVidia is trying to sell you to justify their underperforming, expensive video cards, which *may* be able to compete in terms of FPS, but we all know that comes at the expense of image quality.

wtf? nvidia cards lack IQ? :confused: last time I checked, some of nvidias old, underperforming cards, are still competing head to head with ATi's latest and greatest. Until a title actually uses DX11, it's a worthless feature :)
 
wtf? nvidia cards lack IQ? :confused: last time I checked, some of nvidias old, underperforming cards, are still competing head to head with ATi's latest and greatest. Until a title actually uses DX11, it's a worthless feature :)

Yea, I cannot remember the last time either company's cards lacked IQ. It's been a long time...
MAYBE when l4d came out and ATi had that weird shadow issue that was soon fixed, but it wasn't that big of an issue...
 
Yea, I cannot remember the last time either company's cards lacked IQ. It's been a long time...
MAYBE when l4d came out and ATi had that weird shadow issue that was soon fixed, but it wasn't that big of an issue...
Agreed about the IQ comment...that goes to show you the amount of crap and nonsense we have to read on the internet every day...amazing :eek:
 
wtf? nvidia cards lack IQ? :confused: last time I checked, some of nvidias old, underperforming cards, are still competing head to head with ATi's latest and greatest. Until a title actually uses DX11, it's a worthless feature :)

Like Cryengine 2 games? Dirt2? Battleforge. The new STALKER. Oh, but I forgot, we only buy video cards for yesterday's games, not tomorrow's.

I want nVidia to succeed, even though I rarely buy their hardware, but they better get their fingers out! Rebranded 8800GT anyone?
 
You have not been around too long have you? This has happened many times before....the flavor of the month comes out and people go crazy with the amazing features that make everything else obsolete.....
 
Like Cryengine 2 games? Dirt2? Battleforge. The new STALKER. Oh, but I forgot, we only buy video cards for yesterday's games, not tomorrow's.

I want nVidia to succeed, even though I rarely buy their hardware, but they better get their fingers out! Rebranded 8800GT anyone?

By that definition the R800 is a rebadged R700 that is a rebadged R600...sure you want to go down that path?
 
I wasn't saying MS were involved, but the point is, it's a system that works on their hardware. I agree Havok might seem the obvious alternative to PhysX, but Intel now have control of it. Just like PhysX, it is going to be sidelined by a more open solution if they decide to make it dependent on their hardware for GPU acceleration.

And ... AMD (current flavour of the month) have switched allegiance to Bullet.

It supports all platforms equally well, and it's FREE, which must be attractive for developers.

Not sure how many other games currently support it, if any; I'd say it's early days, but the thought of a cross-platform demo is promising.

Currently I believe it is supported better by nVidia hardware than by AMD, but then again CUDA is nVidia's version of OpenCL, which is supported by ATI and DirectX 11, so I can see a port being relatively straightforward.

Whatever, I can't see either Havok (if it retains open hardware support) or Bullet losing out to the "PC and nVidia hardware only" system "PhysX".

Physx is a gimmick nVidia is trying to sell you to justify their underperforming, expensive video cards, which *may* be able to compete in terms of FPS, but we all know that comes at the expense of image quality. Never mind DX11, most of their cards don't even support 10.1 yet. I just hope they get their act together, stop relying on smoke and mirrors and produce something good soon. We all saw what effect the lack of competition has had on 5770 prices, and that sort of thing benefits no one (other than AMD)!

interesting reads about amd/ nvidia and bullet/ havok:

http://www.hitechlegion.com/our-news/1411-bullet-physics-ati-sdk-for-gpu-and-open-cl-part-3?start=1

"This made me question some things ATI has said about NVIDIA being “proprietary”, if there is an NVIDIA SDK for OpenCL, then they are developing it. Due to NDA, Erwin was unable to comment if an SDK for OpenCL for ATI (GPU). So is there an SDK that ATI has developed? These were my next line of questions for Dave Hoff of ATI."

"When I asked about an SDK and work on OpenCL, he went right back to marketing how great the future potential is going to be because it’s open source. No direct answer, yes or no, just we showcased it at GDC. I wasn’t there so, when I asked what, he responded that it was showcased at GDC."

"I decided to do a little more research and found that Havok and ATI had teamed up at GDC to show Havok cloth for physics (OpenCL). Great! But wait again! Didn’t ATI just announce that they were going to partner with Bullet? What happened to Havok? Now I also remembered reading an article that NVIDIA was also considering porting PhysX to OpenCL. Guess what, they already have a structured physics engine, and when I read Dave the quote from question number six above he agreed with the comment: “Trivial” to convert the kernel. So how open is this new physics going to be? Why do we need three different (NVIDIA, ATI, Havok) Open CL SDK’s for physics? Isn’t the beauty of OpenCL that everything will work and it’s not proprietary? Are we still looking at a programming code that, even though Open Source, will make it proprietary depending on the SDK the developer chooses to use, thus giving one video card an edge over the other?"



reflection on future of physx & physics (decent translation):

http://translate.google.com/transla...e-karty/15247-zamysleni-nad-budoucnosti-physx
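for anyone wondering what "trivial to convert the kernel" means in practice, here's a bare-bones sketch i threw together (my own toy example, nothing to do with bullet's or nvidia's actual code): the opencl c kernel is the vendor-neutral part and the host calls are the standard khronos api, so the only thing that changes between an nvidia box and an ati box is whose driver compiles and runs it. the sdks and tooling around this are where the article's "proprietary" worry comes in, not the kernel itself.

[code]
// Toy example only: one physics-style kernel run through the standard OpenCL API.
#include <CL/cl.h>
#include <cstdio>
#include <vector>

// The kernel source is plain OpenCL C; any vendor's driver can compile it.
static const char* kSrc = R"(
__kernel void integrate(__global float4* pos, __global float4* vel, float dt) {
    size_t i = get_global_id(0);
    vel[i].y -= 9.81f * dt;   // gravity
    pos[i]   += vel[i] * dt;  // Euler step
})";

int main() {
    cl_platform_id plat; cl_device_id dev; cl_int err;
    clGetPlatformIDs(1, &plat, nullptr);                        // whoever's driver is installed
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, nullptr);

    cl_context ctx     = clCreateContext(nullptr, 1, &dev, nullptr, nullptr, &err);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, &err);

    cl_program prog = clCreateProgramWithSource(ctx, 1, &kSrc, nullptr, &err);
    clBuildProgram(prog, 1, &dev, nullptr, nullptr, nullptr);   // vendor driver compiles the kernel
    cl_kernel k = clCreateKernel(prog, "integrate", &err);

    const size_t n = 4096;                                      // arbitrary particle count
    std::vector<float> zeros(n * 4, 0.0f);
    cl_mem pos = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                sizeof(float) * 4 * n, zeros.data(), &err);
    cl_mem vel = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                sizeof(float) * 4 * n, zeros.data(), &err);
    float dt = 1.0f / 60.0f;

    clSetKernelArg(k, 0, sizeof(cl_mem), &pos);
    clSetKernelArg(k, 1, sizeof(cl_mem), &vel);
    clSetKernelArg(k, 2, sizeof(float), &dt);

    size_t global = n;
    clEnqueueNDRangeKernel(q, k, 1, nullptr, &global, nullptr, 0, nullptr, nullptr);
    clFinish(q);

    std::printf("one physics step done on whatever OpenCL GPU was found\n");
    return 0;
}
[/code]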
 
By that definition the R800 is a rebadged R700 that is a rebadged R600...sure you want to go down that path?

lolwut... the directx 10.1 R700 was a rebadged directx 9 R600 and the R800 gained directx 11 with the mere addition of a badge? All the shaders and such are built in to the badge? nVidia has literally rebadged products again and again with little or no redesign, except maybe a die shrink.

Meh I'm not a fanboi, but hell, in for a penny in for a pound.

What about the defective laptop chipsets they supplied to Dell and HP? I hear there was a $100 mil price tag on that particular fiasco; great quality control, nVidia. Then there's the faked-up Fermi cards they showed to everyone at the product launch without telling them they were fake (they didn't actually have a card to show). I know 'paper' launches are common, but why not be honest with people? Smacks of desperation to me.

I have to admit, on the last nVidia card I owned, image quality (particularly the AA) was very bad. As for the buggy movie playback, which frequently required a reboot to put straight, a problem nVidia knew about for over a year, don't get me started. Sure, both chip manufacturers have problems with certain games and certain chipsets, but nVidia's are worse than most.

What about the current nVidia mid-to-high range of parts, currently selling at pretty much break-even? Can they bring the price down to compete with ATI? Nope.

Then there's that top nVidia exec who sold all his shares in the company. Insider trading anyone?

Worse than ATI/AMD even under Hector Ruiz.

And if the new Physx works on all hardware, then of course it will succeed.
 
By that definition, Atech is a rebadged troll.

*but* he lost some SP in the process. (lasered out).

Anyhow... how did the post you quoted even translate into your post? Because it doesn't.

Because Nvidia says DX11 is useless. And we need to buy Batman with PhysX!!!!!
 