How PhysX Makes Batman: Arkham Asylum Better

Most PhysX titles suck. They remove the ability to boost resolution and AA levels (as the added crap from the PhysX engine makes most of them unplayable).

I have a 9800GT dedicated to PhysX, so I am not bashing PhysX because I don't own a card that supports the wretched technology. None of the things we see in Batman: Arkham Asylum warrant needing a GPU. They would have been possible with a multi-core processor.

In fact, paper and cloth simulations would have been easily available to multi-core users.

Leave the GPU for graphics goodness and start using our multiple cores, damn it!
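Case in point: a basic cloth step parallelizes cleanly over cores. Here's a toy sketch (entirely hypothetical code, not from any game or the PhysX SDK) of a mass-spring cloth's Verlet integration split across CPU worker threads. CPython's GIL limits the real speedup of this particular version, but the partitioning is the point, and it maps directly onto native threads in C++.

```python
# Toy example: one Verlet integration step of a cloth grid,
# partitioned across CPU worker threads. Illustrative only.
from concurrent.futures import ThreadPoolExecutor

GRAVITY = -9.81
DT = 1.0 / 60.0  # one 60 fps frame

def make_cloth(w, h):
    # Flat grid of particles; Verlet needs current and previous positions.
    pts = [[x * 0.1, 0.0, y * 0.1] for y in range(h) for x in range(w)]
    return {"pos": [p[:] for p in pts], "prev": [p[:] for p in pts]}

def integrate_chunk(cloth, start, end):
    # Each chunk touches a disjoint slice of particles,
    # so chunks can safely run concurrently.
    for i in range(start, end):
        x, y, z = cloth["pos"][i]
        px, py, pz = cloth["prev"][i]
        cloth["prev"][i] = [x, y, z]
        cloth["pos"][i] = [2 * x - px,
                           2 * y - py + GRAVITY * DT * DT,
                           2 * z - pz]

def step(cloth, workers=4):
    n = len(cloth["pos"])
    chunk = (n + workers - 1) // workers
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for s in range(0, n, chunk):
            pool.submit(integrate_chunk, cloth, s, min(s + chunk, n))
    # leaving the 'with' block waits for all chunks to finish

cloth = make_cloth(16, 16)
step(cloth)
# every particle has now sagged under gravity (y < 0)
print(cloth["pos"][0][1])
```

A real solver would add spring-constraint relaxation and collisions on top of this, but those phases partition across cores the same way.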
 
It would be nice to see CPU/GPU/dedicated PhysX GPU load graphs during gameplay.

I really wasn't impressed by the physics effects in those videos, and I would agree with ElMolsEviL that most of the effects (with the exception of perhaps steam) could and should have been produced simply by multi-core usage.
 
It'll be interesting to see:

1: CPU usage in this game in general
2: Whether there is any "software" mode of PhysX that'll run on an i7
 
anyone notice the Sept 18 release date on the video? What happened to the 15th?

Not that it really matters, just need to reschedule my day off. :)
 
anyone notice the Sept 18 release date on the video? What happened to the 15th?
15 sept - North America, 18 sept - Europe and Australia, afaik
 
Ah. Just finished my pre-order on Steam - can't wait!

Using a 384MB 9600 GSO for PhysX (GTX 260 primary). Hope that helps offset the load.
 
Well, it's kinda obvious nVidia 'pressured' the developers to add Physx, and castrate the normal simulations.

Physx in Softimage doesn't need a nVidia GPU (not now, and hopefully, not ever).

I still think PhysX was a great idea until nVidia tried to make it exclusive.

There are plenty of other physics engines out there.
 
Well, it's kinda obvious nVidia 'pressured' the developers to add Physx, and castrate the normal simulations.

Physx in Softimage doesn't need a nVidia GPU (not now, and hopefully, not ever).

I still think PhysX was a great idea until nVidia tried to make it exclusive.

There are plenty of other physics engines out there.

Open for license? Do tell which, besides Havok and PhysX (we are talking about the physics API only, btw).

Also, PhysX is NOT exclusive. You and others like you, really need to stop spreading this nonsense around.
 
Open for license? Do tell which, besides Havok and PhysX (we are talking about the physics API only btw).

Also, PhysX is NOT exclusive. You and others like you, really need to stop spreading this nonsense around.

I'd like to know what your definition of exclusive is. This looks pretty exclusive to me.

It may purely be a developer issue, but from looking at the Batman video there are a few things there that definitely didn't need to be removed.

The cloth banners could have remained while physx just enhanced the flow of the cloth. The tiles on the floor didn't need to break into a million pieces, but a standard "dent" overlay would've been fine. We've had those since at least HL1.

Physx may provide better smoke and spark effects, but there's no reason to remove them outright from the game for people who don't use Nvidia as their primary video card.

My personal attempt has shown me that the latest drivers do block me from using an 8600GT with my 4850 on 186+ drivers. (I know it's a poor example, but it's still a proof of concept.)

Little things like that change the atmosphere of the game. I don't know who's responsible for removing that stuff, but it makes me a little disappointed in having an ATI card as my primary.

Windows 7 is the future, and Nvidia is only letting PhysX work on PCs that use nvidia as the primary video card, regardless of whether the secondary card is capable.
 
*Several great points*
I definitely agree. I think it's lame of the developer to be bought off by NVIDIA, and I refuse to purchase this game. Most of the effects seen in the video are easily produced without GPU hardware physics, but instead they're simply omitted. Whether this is by request of NVIDIA or simple laziness, who knows.
 
Dude really, why don't you research before you post nonsense like this?
It's far from nonsense. Maybe you should do your research (you can start by reading the link in this thread).

Nvidia excludes anybody who uses an ATI GPU from running physx on an nvidia GPU. Note the use of the word "excludes".
 
Maybe you should try reading how it ALL started... ATI is to blame here, not the other way around. And btw, I own cards from both companies, so I am not a fanboy, but I hate people just repeating the same nonsense they heard from someone else... it gets boring.
 
Link me to how it all started and how ATI is to blame. I'm new to the subject.
 
Most PhysX titles suck. They remove the ability to boost resolution and AA levels (as the added crap from the PhysX engine makes most of them unplayable).

Batman: Arkham Asylum has received very good reviews though.
 
Maybe you should try reading how it ALL started... ATI is to blame here, not the other way around. And btw, I own cards from both companies, so I am not a fanboy, but I hate people just repeating the same nonsense they heard from someone else... it gets boring.
I'm waiting :)
 
No need to wait, you can use google can't you?
oh wow. you got me there.....:p

Sounds to me like you don't have any facts to support your claims.

But, just because I'm bored, and just because you so politely suggested so, I did search google, and found nothing to suggest ATI prevents people from running physx on an nvidia GPU.

What I do know is I've heard from a couple different places now that trying to run Physx on an nvidia GPU with an ATI GPU powering the video doesn't work.

So, please, enlighten us with the facts that support your claims or risk being a deliciously ironic example of somebody spreading baseless nonsense :D
 
oh wow. you got me there.....:p

Sounds to me like you don't have any facts to support your claims.

But, just because I'm bored, and just because you so politely suggested so, I did search google, and found nothing to suggest ATI prevents people from running physx on an nvidia GPU.

What I do know is I've heard from a couple different places now that trying to run Physx on an nvidia GPU with an ATI GPU powering the video doesn't work.

So, please, enlighten us with the facts that support your claims or risk being a deliciously ironic example of somebody spreading baseless nonsense :D
Did I say ATI was preventing users from doing this?
Gotta read better so that researching shows you the correct info, my friend.
I said they started it all by not going with PhysX... then again, they are always like that, which is why I started using Nvidia again back when ATI wouldn't support 3D glasses...
 
Did I say ATI was preventing users from doing this?
by implication, yes :p

observe:


I still think PhysX was a great idea until nVidia tried to make it exclusive.

This looks pretty exclusive to me....Nvidia is only letting Physx work on PC's that exclusively use nvidia as the primary video card regardless of if the secondary card is capable.

Dude really, why don't you research before you post nonsense like this?


I do see what you were trying to say though. Nvidia has said they would support ATI using physx on ATI cards, if ATI wants to do it. So, nvidia isn't trying to keep physx running only on nvidia cards. I get that.

It's just that the "exclusive" discussion in this thread arose because nvidia locks out ATI users who want to run physx on a secondary nvidia card, which is really inexplicable.
 
I'd like to know what your definition of exclusive is. This looks pretty exclusive to me.

Well, as usual the PhysX naysayers like to take everything out of context...

When I replied to the other post, the implication was that PhysX as a physics API is exclusive, which is wrong. PhysX can be used by everyone, and they do not need to have an NVIDIA card to use it. PhysX calls sent directly to a GPU (GPU physics) are however tied to NVIDIA hardware, since no one else can do it at this point (not because NVIDIA blocked them; they just didn't license the tech). If you don't have an NVIDIA GPU, then obviously GPU physics won't work with PhysX, and the physics processing will be done on whatever is the default hardware component (typically the CPU).

As for what you linked to, it's pretty much standard practice, and I'll explain it in a very simple manner. Having one video card render graphics and process physics is "simple". There is no context change, and the data is present in the same GPU, which can act/react to what was calculated in both graphics and physics.
Having different video cards, one for graphics and another for physics, is quite another thing. Even more so when they are from different brands. NVIDIA can't guarantee proper functionality with cards from other manufacturers if they (NVIDIA cards) are only being used for physics.

People that complain either don't know how it works, or think that NVIDIA is in the business of charity or something, and that they need to invest in a tech and still support other manufacturers that will reap the benefits of that tech without effort. If other manufacturers want support, then they have to work with NVIDIA to get it.
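The fallback behaviour described above, GPU physics when a CUDA-capable card is present and CPU processing otherwise, is just a backend-selection pattern. A minimal sketch with made-up function names (this is not the real PhysX API):

```python
# Hypothetical backend selection, mimicking the described fallback.
def cuda_gpu_present():
    # A real driver stack would probe the hardware here;
    # hard-coded False to exercise the CPU path.
    return False

def simulate_step(scene, backend):
    return f"{scene}: physics step on {backend}"

def run(scene):
    # Prefer GPU physics; otherwise fall back to the default
    # hardware component (typically the CPU).
    backend = "GPU" if cuda_gpu_present() else "CPU"
    return simulate_step(scene, backend)

print(run("cloth banners"))  # cloth banners: physics step on CPU
```

The point of contention in this thread is what happens inside a check like `cuda_gpu_present()`: whether it tests only for a capable card, or also refuses when a competitor's card is the primary renderer.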
 
It's just that the "exclusive" discussion in this thread arose because nvidia locks out ATI users who want to run physx on a secondary nvidia card, which is really inexplicable.

It "arose" because some people like to confuse terms. PhysX is not exclusive. PhysX GPU physics is (for now). Whoever wants support has to work with NVIDIA to get it. Thus far, AMD doesn't want that support, and that certainly is not NVIDIA's fault...

As for the "inexplicable", read my previous post. It's not inexplicable, and it has a very simple reason...
 
Having different video cards, one for graphics and another for physics, is quite another thing. Even more so when they are from different brands. NVIDIA can't guarantee proper functionality with cards from other manufacturers if they (NVIDIA cards) are only being used for physics.

People that complain either don't know how it works, or think that NVIDIA is in the business of charity or something, and that they need to invest in a tech and still support other manufacturers that will reap the benefits of that tech without effort. If other manufacturers want support, then they have to work with NVIDIA to get it.
You're talking about two different things. As for your last paragraph, nobody is insisting that physx should work on non-nvidia hardware.

For your first paragraph, guaranteeing proper functionality and locking out potential users are two totally different things.

Using separate hardware for video and physx already exists in the software, in the form of the PPU. It's not a matter of software not being there, or not being properly supported. It's a matter of nvidia actively blocking potential users from using their hardware as intended.

Your "physx naysayers" comment reveals your bias. I've been a proponent of physx since it was announced, long before hardware was released. I also am an exclusively nvidia user at this point specifically because I use CUDA extensively. However, I think what nvidia has done to block physx use with non-nvidia video rendering is crap.
 
You're talking about two different things. As for your last paragraph, nobody is insisting that physx should work on non-nvidia hardware.

Really? You must've missed all the other threads, then...

jebo_4jc said:
For your first paragraph, guaranteeing proper functionality and locking out potential users are two totally different things.

No, they are not, because the "locking" is done to assure proper functionality with their hardware, which is the only thing they need to guarantee works properly.

jebo_4jc said:
Using separate hardware for video and physx already exists in the software, in the form of the PPU. It's not a matter of software not being there, or not being properly supported. It's a matter of nvidia actively blocking potential users from using their hardware as intended.

That's more or less true, but you can't compare the state of the software as it is now with the software used for PPUs. They are worlds apart, especially given the translation needed in drivers for PhysX -> CUDA. How exactly does NVIDIA guarantee proper functionality and send data from a CUDA-ready GPU to a non-CUDA-ready GPU?

And the other question is: why would they? That's exactly why licensing terms exist. A license guarantees support. If others don't want to license something that a company invests in, why would said company do all the work to guarantee proper functionality with every hardware combo possible, thus allowing others to reap the benefits of the technology without moving a damn finger?

jebo_4jc said:
Your "physx naysayers" comment reveals your bias. I've been a proponent of physx since it was announced, long before hardware was released. I also am an exclusively nvidia user at this point specifically because I use CUDA extensively. However, I think what nvidia has done to block physx use with non-nvidia video rendering is crap.

Sure it does. It's always a question of bias, isn't it? :rolleyes:
Maybe that's why most of the people complaining don't even know what PhysX is and still complain that NVIDIA should be doing anything and everything to accommodate other "players" in the industry, without those "players" moving a finger...

And naysayer is the proper word. Check this and the other threads and you'll see the "PhysX is about curtains" stuff and all that... it gets so tiresome that it's just not worth saying anything anymore...
 
haha, your points may be fair enough. I was obviously only referring to the points made in this thread. I actually hung out in this subforum a lot when it first opened because of my excitement for PhysX, but I was then turned off by the constant bickering and the general sad state of the PhysX project.

That all being said, I am personally excited to get this game tomorrow, and I may be moving a 9800GT into my dual GTX275 rig if the 275s can't handle PhysX + video at 60fps by themselves.

Getting back to the topic at hand, I think it is stupid for nvidia to lock PhysX away from ATI users. It's not a zero-return investment, as you seemed to indicate. Nvidia would obviously make money from additional GPU sales as ATI users snap up 9800GTs or 9600GTs to run PhysX. Nvidia has excluded (there's that word again) what, 40% to 50% of the discrete GPU users out there from using PhysX and buying additional GPUs.

Also, again, I would love to see how Batman with PhysX would run on an i7. I understand the architectural differences, but I want to know real world results.
 
No they are not, because the "locking" is done to assure proper functionality with their hardware, which is the only thing they need to guarantee to work properly.

That's a load of BS; we already went through this with SLI. Especially since, before the blocking, the combination was working without a hitch. What you are suggesting with the licensing is the equivalent of banning people from using ATI cards on NV chipset motherboards.

And a very valid complaint about the PhysX implementation in BAS was presented. Instead of having the effects scale down, they were outright removed. They aren't just declining to support users who don't have PhysX; they are outright punishing them.

NV: "You don't want it? HA! You can't have it!"
 
This is a bit more difficult than enabling SLI or Crossfire on a motherboard.
It's a morphing software driver that has to keep pace with the opposing team's driver revisions and GPU capabilities, which are NOT the same as NVidia's.

The quality of the PhysX driver has to be par excellence, as it is under heavy scrutiny.
The PhysX dev team won't have the same level of access to ATI engineers and won't be able to recommend driver and hardware changes to suit future PhysX features either; development for ATI cards would be expensive and slow.
This is what makes it so much more difficult: some features may need more than one pass to complete on a different architecture when the hardware/driver cannot be optimised.

Keeping AMD cards in the mix would hold NVidia back from developing PhysX in a number of ways (and more):
1) they would need to coerce information from ATI engineers, which may not be 100% correct, leading to longer development times.
2) they would have to reduce functionality or quality to cope with the lowest common subset of available features, increasing development time.
3) they would need to optimise for two hardware platforms, increasing development time.
By going NVidia-exclusive, they free up many engineers and so can improve speed, features, quality and/or development times.
After all, there have been complaints that PhysX doesn't do enough in hardware; this is how they will meet expectations.

It makes sense that NVidia keeps this in-house since AMD cannot use PhysX.
Sad, but there's not much NVidia or AMD can do about it unless Intel cuts AMD a break.
 
That's a load of BS; we already went through this with SLI. Especially since, before the blocking, the combination was working without a hitch. What you are suggesting with the licensing is the equivalent of banning people from using ATI cards on NV chipset motherboards.

And a very valid complaint about the PhysX implementation in BAS was presented. Instead of having the effects scale down, they were outright removed. They aren't just declining to support users who don't have PhysX; they are outright punishing them.

NV: "You don't want it? HA! You can't have it!"

Of course it is...:rolleyes:

NVIDIA must support everything and anything from competitors, without those competitors moving a finger, even though NVIDIA owns the tech... They even have to beta test competitors' drivers to test their own drivers and get proper functionality, without the competitors doing a thing... :rolleyes:

Is your "color"-bound fan reasoning taking you to places outside of the real world or something? Because if this makes sense to you, it must be...

CUDA is NOT supported on AMD's GPUs. Is that so hard to understand? Leave your "color" reasoning aside for a minute and think...
 
Of course it is...:rolleyes:

NVIDIA must support everything and anything from competitors, without those competitors moving a finger, even though NVIDIA owns the tech... They even have to beta test competitors' drivers to test their own drivers and get proper functionality, without the competitors doing a thing... :rolleyes:

Is your "color"-bound fan reasoning taking you to places outside of the real world or something? Because if this makes sense to you, it must be...

CUDA is NOT supported on AMD's GPUs. Is that so hard to understand? Leave your "color" reasoning aside for a minute and think...

They can very well fund that support through the extra sales to ATI users who want PhysX. And I'm pretty sure that during QA for AMD motherboards they test builds with NV cards in them. These things go both ways, considering that modern VGA drivers stick their hands all over the OS.

"CUDA is NOT supported in AMD's GPUs. Is that so hard to understand ?"
When did I bring up CUDA?
 
They can very well fund that support through the extra sales to ATI users who want PhysX. And I'm pretty sure that during QA for AMD motherboards they test builds with NV cards in them. These things go both ways, considering that modern VGA drivers stick their hands all over the OS.

Which is why you still don't understand how it works...

Hypernova said:
"CUDA is NOT supported in AMD's GPUs. Is that so hard to understand ?"
When did I bring up CUDA?

Again, you don't even know how it works, which gets very, very tiresome...
PhysX can be calculated on GPUs because its instruction set is translated to CUDA through drivers. How does NVIDIA guarantee that any necessary data is sent from one CUDA-ready GPU to a non-CUDA-ready GPU, without the proper support of the other hardware's manufacturer?

Does "magic" work in your world? Because it doesn't in software development...
And since AMD isn't interested in teaming up with NVIDIA on this front, they don't get the support that would be needed to at least translate PhysX to Brook+.

Actually, I find it hilarious that the same people who criticize every single game that uses PhysX (and supports GPU physics), just because it's owned by NVIDIA, blame NVIDIA for not supporting it on competitors' hardware (if it sucks, why do you want it?).

So it sucks from the point of view of being available in games (that are always bad, blah blah blah), but it's suddenly good to have from another point of view, and so NVIDIA sucks for not supporting it all over the place.

Fanboys are truly a very odd breed... the double standards are incredible, really...
 
Which is why you still don't understand how it works...



Again, you don't even know how it works, which gets very, very tiresome...
PhysX can be calculated on GPUs because its instruction set is translated to CUDA through drivers. How does NVIDIA guarantee that any necessary data is sent from one CUDA-ready GPU to a non-CUDA-ready GPU, without the proper support of the other hardware's manufacturer?

Does "magic" work in your world? Because it doesn't in software development...
And since AMD isn't interested in teaming up with NVIDIA on this front, they don't get the support that would be needed to at least translate PhysX to Brook+.

Actually, I find it hilarious that the same people who criticize every single game that uses PhysX (and supports GPU physics), just because it's owned by NVIDIA, blame NVIDIA for not supporting it on competitors' hardware (if it sucks, why do you want it?).

So it sucks from the point of view of being available in games (that are always bad, blah blah blah), but it's suddenly good to have from another point of view, and so NVIDIA sucks for not supporting it all over the place.

Fanboys are truly a very odd breed... the double standards are incredible, really...

I know very well how PhysX works. My point is that it's a matter of whether NV wants to spend the money and man-hours on it. I have not for a second suggested that ATI write a Brook+ implementation of PhysX (or that NV do it for them), simply that they keep the existing CUDA implementation working on W7 and XP. The key word is "keep".

And remember, NV needs stuff like PhysX to help push card sales. You don't expand your camp just by fortifying it. You also need to infiltrate the other camp.
 
Which is why you still don't understand how it works...



Again, you don't even know how it works, which gets very, very tiresome...
PhysX can be calculated on GPUs because its instruction set is translated to CUDA through drivers. How does NVIDIA guarantee that any necessary data is sent from one CUDA-ready GPU to a non-CUDA-ready GPU, without the proper support of the other hardware's manufacturer?

Does "magic" work in your world? Because it doesn't in software development...
And since AMD isn't interested in teaming up with NVIDIA on this front, they don't get the support that would be needed to at least translate PhysX to Brook+.

Actually, I find it hilarious that the same people who criticize every single game that uses PhysX (and supports GPU physics), just because it's owned by NVIDIA, blame NVIDIA for not supporting it on competitors' hardware (if it sucks, why do you want it?).

So it sucks from the point of view of being available in games (that are always bad, blah blah blah), but it's suddenly good to have from another point of view, and so NVIDIA sucks for not supporting it all over the place.

Fanboys are truly a very odd breed... the double standards are incredible, really...

I love the idea of a physics API enjoyed by all. However, we will never get that with Nvidia owning PhysX or Intel owning Havok. Personally I think Physics processing was harmed by both of them being bought up. What we'll get is another round of API wars which sucks for consumers.

No matter what you say Nvidia has direct control over PhysX. This is why I would prefer companies backing OpenCL but then you're left with Havok which is owned by Intel so we have the same problem. I think Havok is a little easier to swallow because it uses OpenCL versus the closed CUDA but it's still vomit inducing to think about especially with Larrabee on the horizon.

A major competitor having direct control over a physics API is bad, and this isn't the same as OpenGL or DirectX. Yes, DX is proprietary because it is an MS product, but ALL third parties follow the same standards and have no direct control over the tech. Plus, MS doesn't sell hardware to run DX, so their owning DX doesn't hurt NV, AMD, or Intel, as it doesn't affect competition. However, this doesn't hold true with Nvidia and Intel owning the physics APIs.
 
I love the idea of a physics API enjoyed by all. However, we will never get that with Nvidia owning PhysX or Intel owning Havok. Personally I think Physics processing was harmed by both of them being bought up. What we'll get is another round of API wars which sucks for consumers.

No matter what you say Nvidia has direct control over PhysX. This is why I would prefer companies backing OpenCL but then you're left with Havok which is owned by Intel so we have the same problem. I think Havok is a little easier to swallow because it uses OpenCL versus the closed CUDA but it's still vomit inducing to think about especially with Larrabee on the horizon.

A major competitor having direct control over a Physics API is bad and this isn't the same as OpenGL or DirectX. Yes, DX is proprietary because it is an MS product but ALL 3rd parties follow the same standards and have no control over the tech directly. Plus MS doesn't sell hardware to run DX so them owning DX doesn't hurt NV, AMD or Intel as it doesn't affect competition. However, this doesn't hold true with Nvidia and Intel owning the physics API's.

:rolleyes:

First, NVIDIA supports OpenCL (they even chair the group behind it).
Second, PhysX can, and certainly will be, ported to OpenCL eventually.
Third, Havok doesn't use OpenCL... AMD's GPU physics acceleration efforts (which thus far exist only in "video" and "slides" form... PR stuff) use OpenCL to call Havok's instructions and run them on the GPU.

Why do you treat OpenCL and Havok as two "entities" for one purpose? PhysX's instruction calls can be ported to OpenCL as well, and NVIDIA has far more experience there, given that they did just that with CUDA and PhysX already.

Lastly, yes, NVIDIA "controls" PhysX, but only the rights to the tech, because... well... they bought it, so they have every right to get royalties from its use by other parties. However, once a license agreement is set, both parties need to respect it. And if AMD wants to use it, they are welcome to it. They don't seem to want it, so they are the only ones to blame here.
 
And if AMD wants to use it, they are welcome to it. They don't seem to want it, so they are the only ones to blame here.
There are two sides to every negotiation. NVIDIA might be asking an unreasonable price for the license to accelerate PhysX on ATI GPUs.
 