PhysX on nVidia cards won't work if an ATI card is used

What does Matrox have to offer? Just kidding.
Whatever the new AvP runs best on gets my graphics money.
ATI anyone?
 
Flying paper, broken glass, broken rocks/tiles, real-time smoke, and water particles dripping down to collect in puddles are all neat, but they are used strictly for visual presentation, with nothing affecting gameplay.

Exactly my point... all of those things are already implemented without the use of PhysX in countless games.

These things do add realism and atmosphere to games, but PhysX is not required to create them and have them look great, which, again, continues to be done without the use of PhysX... with the exception of tearing cloth or "interactive smoke"... not a "big deal" by any means.
 
What about those people with the stand alone PhysX cards? Did Nvidia just disable their hardware when used with an ATI card?
 
Just out of curiosity, but isn't this considered monopolizing? And if it is, isn't that illegal in the States? I mean, they are forcing you to buy nothing but their hardware, and nobody else's, to use the PhysX system, which is separate from the rendering system.
 
nVidia bought Ageia, right? right.
nVidia owns PhysX, right? right!
People have the right to do whatever they want with what they own, right? right!
ATi is capable of making something like PhysX, right? right!

So why doesn't ATi just make their own version of PhysX?
 
Are you actually trying to argue that it's Nvidia's job to port PhysX to run on ATI's hardware??? Seriously??? :rolleyes::rolleyes::rolleyes::rolleyes:

If Nvidia wants PhysX to run on any hardware, then yes, it's their job to do so. PhysX is owned by Nvidia, and its source code as well. It runs on CUDA, which is a closed standard. If Nvidia were to put PhysX and its source code on sourceforge.net, saying "here it is, do what you wish with it", then I would agree that it's up to other companies whether they want to use it or not.


I use and love both vendors' systems. I still have a working 9500/9700 card, my work PC has a 9800 in it, and I have 7800GTs and 8800GTs. Neither company is superior, as each takes its own approach, and whoever is king one day is usually the joker the next. Just get what you think is the best bang at the time, and stop the madness!

Dedicated fanbois can be funny/insane with their drivel!

EDIT - I should add that I completely DISAGREE with what NV did here; this is just bad customer service. After all, if you own the hardware, who cares if it's the primary GPU/rendering device or not?

I also use ATI and Nvidia. Couldn't care less about the companies, but I do care about the products from a consumer point of view. I've talked ATI down a lot for not having proper widescreen support (scale to aspect ratio and custom resolutions) for years.

Calling me a fanboy would be way off, so I'd rather take it with some humor. For all we know, I'll have a GT300 next time (or a 5800 series and then a GT300 if it offers something I want; I can afford to buy a new card the next day anyway). :)

Besides, I'm not arguing ATI's point of view here. ATI has only recently entered the game with actual GPU physics (Bullet), and even that is OpenCL-based and just as much for Nvidia. I'm arguing the consumer's point of view, and I shake my head every time someone asks me to understand why a company (regardless of whether it's ATI or Nvidia) screws its customers, and which financial policies made it do so.

PhysX on GPU has done little to nothing for us gamers. We've seen the effects before, some in scripted form and others as real-time physics on the CPU. The comparison videos that are presented don't show us the effects scripted vs. GPU, or rendered on CPU vs. GPU.

As an example:
In Mirror's Edge, the PhysX comparison videos were presented with banners, while without PhysX there were no banners at all. Personally, I find that somewhat pathetic. I mean, I've seen banners wave in the wind scripted, as real-time physics, or totally static (even old Wolfenstein had those).

Or shattering glass. We've seen that before, both scripted and in real time.

Take a look at this video showing the physics in The Force Unleashed (and explaining the different effects):
http://www.youtube.com/watch?v=e2G7JRAH9LU&feature=related

Here they have actually taken the time to create good physics on the CPU. You'll probably recognise some of the effects as GPU-only PhysX effects in some of the PhysX games.

Another game is Far Cry 2:
http://www.youtube.com/watch?v=cfSBHJRU9_Q
Interactive weather physics that impacts how you actually play the game. Also run on the CPU.

Yet another game is Undisputed:
http://www.youtube.com/watch?v=PWhpRDQq3Ss&feature=PlayList&p=2B96F0F5E9204514&index=0
Real-time cloth effects rendered on the CPU.

I can go on forever.

PhysX has potential. But in its current state, where they simply remove effects from the non-GPU-PhysX versions of showcase titles and nothing special is added, I can't say I'm impressed.

As a consumer, I don't like the way PhysX is implemented: not giving gamers the effects that some find more immersive, but instead running them on GPU only; removing consumers' choice to run PhysX on a secondary card if the primary card is from another company, by actively blocking it; running it on a closed standard instead of an open standard like other middleware producers do.

As a consumer, I would prefer that developers spend their energy on physics built on open standards and optimize there first. Then they can add PhysX effects where it's NOT POSSIBLE to run the same or similar effects either scripted or on the CPU. Meaning: have scripted banners (or real-time banners) and just add PhysX GPU banners as an option.
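Something like the following is all I'm asking for. A minimal sketch with made-up names (ClothBackend, pickClothBackend; none of this is from a real engine), just to show the idea of shipping the effect for everyone and treating GPU PhysX as the optional top tier:

Code:
#include <iostream>

// Hypothetical backends for one effect (e.g. waving banners).
enum class ClothBackend { Scripted, CpuSimulated, GpuSimulated };

// Pick the richest backend the machine supports, but never drop the effect
// entirely: everyone at least gets the scripted version.
ClothBackend pickClothBackend(bool hasGpuPhysX, unsigned cpuCores)
{
    if (hasGpuPhysX)   return ClothBackend::GpuSimulated; // optional extra
    if (cpuCores >= 2) return ClothBackend::CpuSimulated; // realtime on CPU
    return ClothBackend::Scripted;                        // baseline for all
}

int main()
{
    ClothBackend b = pickClothBackend(false /*no GPU PhysX*/, 4 /*cores*/);
    std::cout << (b == ClothBackend::CpuSimulated ? "CPU cloth\n" : "other\n");
}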

As a consumer, Havok on OpenCL (when that comes) and Bullet on OpenCL (which comes this month) are better alternatives, since they don't limit any choice, and we don't have to deal with someone who actively blocks our other hardware if we buy one of their cards as a second GPU (as a PPU).

It's not against Nvidia or PhysX, but against what they've done with it.
 
[H] thread fail. too many fanboys.

One simple fact is that the PhysX API is getting widespread use. The other open source or OpenCL alternatives do not seem to be in use in any games.

The PhysX API is free to use. ATI turned it down. So who is to blame?

The argument is moot since the PhysX API runs in software mode for ATI card users, and most people are running multicore processors, so they get PhysX anyway, and likely just as good as if it ran on a GPU.

big effing deal.

And unless you are a GPU hardware/software engineer, you have no business stating whether or not it's feasible for Nvidia to run it while something else is rendering... I know I've had a lot of driver issues with ATI cards... so while it will usually work fine paired this way, (1) there's no need, since the CPU can do it, and (2) ATI can't go blaming Nvidia drivers when their shit isn't working right. (I would honestly point the finger for driver issues at ATI before Nvidia, based on my own experience over the last 7 years. It's not fanboi-ism, it simply is.)
 
Some of you may know that Creative not only did not properly support their older cards in Vista (they claimed that their older cards couldn't do this or that under Vista), but also threatened to sue a modder who was providing fully functional Vista drivers for these older cards!
There was a very severe backlash from the community, and Creative backed off and let that modder do his thing. Moreover, I just saw that they released Windows 7 drivers for one of those older cards that I have!

nVidia should learn from this and let its customers use the hardware that they paid for, instead of intentionally crippling it by means of drivers.

I'd say that it's a very fine line nVidia is walking!

I'm not a Creative defender; in fact, they are definitely on my list of sleazy corporations to avoid if possible. But what Creative did was different by a long shot. The features they stripped out of the Audigy 2 drivers for Vista had less to do with marketing the X-Fi than with third-party licensing for many of the third-party-owned features built into the Audigy 2. Creative licensed a load of third-party shit for the A2, but the licenses didn't cover Vista, which was not even on their radar at the time they purchased those licenses. They did not want to spend their own cash to re-license for Vista for an old card. So for a while they held off to milk the X-Fi's marketing value, and then for a time they sold those licensed features to A2 owners via a paid update, presumably to offset the license fees they would have to pay and pad their own pockets a bit. Now they do not require the fee for the fully functional drivers/software, and under Vista the A2 again has similar functionality to what it had in XP.

What Nv is doing now appears to have more to do with milking PhysX for marketing value, and perhaps an attempt at forcing a standard, and has little to nothing to do with any licensing issues or technical merits. After all, an AMD card for graphics with an Nv card for PhysX worked just fine till the latest drivers came out.
 
I think the key is to not purchase any Nvidia GPU for graphics, PhysX, or CUDA till they stop this BS.
 
[H] thread fail. too many fanboys.

One simple fact is that the PhysX API is getting widespread use. The other open source or OpenCL alternatives do not seem to be in use in any games.

The PhysX API is free to use. ATI turned it down. So who is to blame?

The argument is moot since the PhysX API runs in software mode for ATI card users, and most people are running multicore processors, so they get PhysX anyway, and likely just as good as if it ran on a GPU.

big effing deal.

And unless you are a GPU hardware/software engineer, you have no business stating whether or not it's feasible for Nvidia to run it while something else is rendering... I know I've had a lot of driver issues with ATI cards... so while it will usually work fine paired this way, (1) there's no need, since the CPU can do it, and (2) ATI can't go blaming Nvidia drivers when their shit isn't working right. (I would honestly point the finger for driver issues at ATI before Nvidia, based on my own experience over the last 7 years. It's not fanboi-ism, it simply is.)

I think everyone's argument is more about Nvidia screwing over the end user, not PhysX.
 
The PhysX API is free to use. ATI turned it down. So who is to blame?

Free to USE, not free to create your own implementation.

The argument is moot since the PhysX API runs in software mode for ATI card users, and most people are running multicore processors, so they get PhysX anyway, and likely just as good as if it ran on a GPU.

The problem is that people want to run PhysX on Nvidia cards but render with ATI cards, and Nvidia has disabled that. What complicates matters is that games like B:AA don't expose the advanced physics effects on multicore CPUs, even though the CPUs could handle them. They also didn't enable multicore scaling (which is literally a single flag you set in PhysX to enable multiple threads), so physics runs in a single thread anyway. Nvidia is trying to make GPU physics look more impressive. They are trying to force us to be impressed by it, when the tech can stand on its own.
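For reference, here is roughly what that looks like in the 2.8-era SDK. I'm writing this from memory, so treat the exact identifiers (NX_SF_ENABLE_MULTITHREAD, internalThreadCount) as approximate rather than gospel; the point is the scale of the change:

Code:
// Rough sketch, from memory, of the PhysX 2.8-era SDK -- identifier names
// may not be exact. Spreading the simulation over worker threads is a
// couple of lines in the scene descriptor, not an engine rewrite.
#include "NxPhysics.h"

NxScene* createMultithreadedScene(NxPhysicsSDK* sdk)
{
    NxSceneDesc desc;
    desc.gravity = NxVec3(0.0f, -9.81f, 0.0f);
    desc.flags |= NX_SF_ENABLE_MULTITHREAD;  // opt in to internal threading
    desc.internalThreadCount = 3;            // workers besides the caller's thread
    return sdk->createScene(desc);
}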

big effing deal.

It IS a big deal for everyone that was previously using an ATI and Nvidia card at the same time, as Nvidia just disabled hardware they purchased. I'd be fucking pissed if I had a setup like that.

And unless you are a GPU hardware/software engineer, you have no business stating whether or not it's feasible for Nvidia to run it while something else is rendering... I know I've had a lot of driver issues with ATI cards... so while it will usually work fine paired this way, (1) there's no need, since the CPU can do it, and (2) ATI can't go blaming Nvidia drivers when their shit isn't working right. (I would honestly point the finger for driver issues at ATI before Nvidia, based on my own experience over the last 7 years. It's not fanboi-ism, it simply is.)

PhysX runs completely independently of whatever is doing the rendering. There is no driver cross talk, none of that crap. That is something fanboys made up to try and defend Nvidia's obvious decision to fuck over consumers.
 
I think everyone's argument is more about Nvidia screwing over the end user, not PhysX.

Exactly. If I bought an Nvidia card and I want to use it just for PhysX, why can't I? Did I not spend my good money on it?
 
Wonder if Nvidia could be sued for this? :D

They do market their cards as being able to accelerate PhysX, and since they do actually accelerate PhysX, they would probably skate, even if they don't do so while an AMD card is present. Plus, there may not be enough people who actually use an AMD card for rendering plus an Nv card for PhysX to get a class action going. A card dedicated to hardware-accelerated PhysX with another card doing rendering is still a niche inside a niche, after all. I did try it out at one point, but I removed it because few of the games I play used it, and my case at the time was not the greatest for keeping everything cool.
 
I'm a diehard Nvidia fan, but they haven't been looking too good lately. I really would like a good Nvidia card to replace my 8800 GT SLI... but I see nothing except the 5870!
 
Woot! My thread made it to the front page; no wonder it has grown so fast in one day.
 
[H] thread fail. too many fanboys.

One simple fact is that the PhysX API is getting widespread use. The other open source or OpenCL alternatives do not seem to be in use in any games.

The PhysX API is free to use. ATI turned it down. So who is to blame?

The argument is moot since the PhysX API runs in software mode for ATI card users, and most people are running multicore processors, so they get PhysX anyway, and likely just as good as if it ran on a GPU.

big effing deal.

And unless you are a GPU hardware/software engineer, you have no business stating whether or not it's feasible for Nvidia to run it while something else is rendering... I know I've had a lot of driver issues with ATI cards... so while it will usually work fine paired this way, (1) there's no need, since the CPU can do it, and (2) ATI can't go blaming Nvidia drivers when their shit isn't working right. (I would honestly point the finger for driver issues at ATI before Nvidia, based on my own experience over the last 7 years. It's not fanboi-ism, it simply is.)

the problem with your argument is that it's based on some fairly wrong assumptions.

1. physx is getting next to no support whatsoever.
2. physx enabled titles, whether they be gpu accelerated or just have physx period, number in the single digits.
3. of those titles only 2 are of any interest to anyone: unreal 3 and arkham
4. havok has been around for a long time, in fact let's see...
source engine games
doom 3 engine games
quake 3 engine games

etc...

the list of 'a' titles with havok in them is staggering compared to physx.

the reason ati decided to go the way they did is to support the mainstream games out there. havok is a licensed physics engine used by many development houses in their own game engines. if ati does actually accelerate havok on gpu under opencl, you will be very happy they did, as it will mean support for the majority of games that actually have physics in them and easy patches to allow several more objects and more interactivity in those titles.
 
how's this wrong?

How is it NOT wrong? Nvidia just pushed out an update that disables their video cards if a competitor's card is installed as well; I have no idea how anyone can justify that.

If WD pushed out an update that disabled WD hard drives when RAIDed with a Seagate drive, would you support that? Fuck no, and this isn't any different.
 
the problem with your argument is that it's based on some fairly wrong assumptions.

1. physx is getting next to no support whatsoever.
2. physx enabled titles, whether they be gpu accelerated or just have physx period, number in the single digits.
3. of those titles only 2 are of any interest to anyone: unreal 3 and arkham
4. havok has been around for a long time, in fact let's see...
source engine games
doom 3 engine games
quake 3 engine games

etc...

the list of 'a' titles with havok in them is staggering compared to physx.

the reason ati decided to go the way they did is to support the mainstream games out there. havok is a licensed physics engine used by many development houses in their own game engines. if ati does actually accelerate havok on gpu under opencl, you will be very happy they did, as it will mean support for the majority of games that actually have physics in them and easy patches to allow several more objects and more interactivity in those titles.

#1 is completely false and #2 is partially false

physx is being used by as many devs as havok. actually, it is now used more than havok by well-established developers, with multiplatform support across the pc, all the current consoles, and the iphone, and with many publishers backing it, such as ea, sega, capcom, take2, etc. it also happens to be used in two of the most used game engines around, ue3 and gamebryo.

physx games with gpu acceleration support are only a few like you said, but software physx based games number in the low hundreds now.

#3 is a matter of your opinion.

#4 is completely true; though given that physx has overtaken havok as the leading physics engine in about a third of havok's lifetime, that fact is irrelevant.

havok has more "a" titles like you said only because it's been around much longer than physx. that's the main reason. ati went with havok (and also bullet) because they don't want to support nvidia's tech, which they are completely justified in doing. it is highly unlikely gpu accelerated havok will be backwards compatible to improve older games unless the developers are willing to go back and patch them with that kind of support. many quality titles are currently in development using software physx (including some with additional gpu physx support), such as mafia 2, dark void, mass effect 2, borderlands, aliens colonial marines, dragon age origins, etc.

physx and havok are both great engines with mainstream support that will still be used for some time to come. gpu based physics acceleration is still in its infancy, and it will take time before it matures to the point where it is not only relevant but well established.
 
I don't care for it or like it at all. I hope it dies, and it probably will if Nvidia keeps it closed like this.
 
Proprietary standards suck smelly fart juice ass. I hope physx dies quickly and DirectX Compute is used for physics calculations in future DX11 games.

1. Call proprietary standards 'sucky'.
2. Declare your love for the pinnacle of proprietary standards.
3. Priceless ;)

Also, DirectCompute is no bloody physics library. It's more akin to GLSL in OpenGL, with a bit of CUDA/Stream (Brook+). Someone would still have to implement a bloody physics library on top of bloody DC.

Oh wait, you were just being sarcastic, right? :)
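To make the distinction concrete, here is the sort of thing the physics library layer has to provide. This toy explicit-Euler step for point masses is my own illustration, not from any SDK; a real engine adds collision detection, constraint solvers, sleeping, authoring tools and so on, while DC/CUDA/OpenCL would merely be the substrate that runs kernels like it on the GPU:

Code:
// Toy illustration (mine, not from any SDK): the simulation math a physics
// library supplies and a compute API does not.
struct Particle { float px, py, pz, vx, vy, vz; };

void integrate(Particle* p, int n, float dt)
{
    const float g = -9.81f;           // gravity along y
    for (int i = 0; i < n; ++i) {
        p[i].vy += g * dt;            // accumulate acceleration
        p[i].px += p[i].vx * dt;      // explicit Euler position update
        p[i].py += p[i].vy * dt;
        p[i].pz += p[i].vz * dt;
    }
}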
 
And this is where we see the benefit of AMD being stubborn and not offering PhysX on their cards, despite the fact that they are perfectly capable of doing so.

Because Nvidia owns the technology and will be prone to tantrums like this that just happen to put them at a massive advantage as a business and put all of their competitors at a disadvantage.

It's all bullshit anyway. I ran the Batman demo with PhysX on full and was getting 25-30 fps on my Q9450 @ 3.6GHz, with CPU usage at about 30%.

If PhysX actually used the CPU the way the new Ghostbusters physics engine does (i.e. capable of almost exact load balancing across all 4 cores, up to 100% CPU usage), then people with quad cores would happily be running PhysX without the need for graphics cards in the first place, and in 2-3 years all the average joes would be doing it as well, as quad core becomes the average gaming CPU.

But Nvidia doesn't sell CPUs, so why bother, right? :rolleyes:
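The kind of per-core load balancing I mean isn't exotic either. A minimal sketch of the idea, my own illustration in modern C++ rather than anyone's engine code: give each core a disjoint slice of the bodies and keep every core busy until the step completes.

Code:
#include <thread>
#include <vector>

// Advance one disjoint slice of the particles; no locking is needed
// because each worker owns its own range.
static void stepSlice(std::vector<float>& y, std::vector<float>& vy,
                      std::size_t begin, std::size_t end, float dt)
{
    for (std::size_t i = begin; i < end; ++i) {
        vy[i] += -9.81f * dt;
        y[i]  += vy[i] * dt;
    }
}

// Split the work evenly across N cores, one thread per core.
void stepAll(std::vector<float>& y, std::vector<float>& vy,
             float dt, unsigned cores)
{
    std::vector<std::thread> workers;
    const std::size_t slice = y.size() / cores;
    for (unsigned c = 0; c < cores; ++c) {
        std::size_t b = c * slice;
        std::size_t e = (c + 1 == cores) ? y.size() : b + slice;
        workers.emplace_back(stepSlice, std::ref(y), std::ref(vy), b, e, dt);
    }
    for (std::thread& w : workers) w.join();  // all cores busy until done
}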
 
Proprietary standards suck smelly fart juice ass. I hope physx dies quickly and DirectX Compute is used for physics calculations in future DX11 games.

And this is why most (not all) anti-PhysX people are clueless about anything regarding this tech.

1) Did you know that DirectX Compute is proprietary?
2) Did you also know that DirectX Compute is NOT a physics API?
3) Did you know that PhysX and Havok (both proprietary, Havok being more expensive to license) have the biggest percentage of use among developers?
4) Did you know that PhysX is actually the number one physics API at this point in time?
 
And this is why most (not all) anti-PhysX people are clueless about anything regarding this tech.

1) Did you know that DirectX Compute is proprietary?

Yes, it's proprietary. It belongs to Microsoft. It's a proprietary open standard (as opposed to PhysX on GPU, which is a proprietary closed standard on CUDA).

2) Did you also know that DirectX Compute is NOT a physics API?

DirectCompute can be used as an API for physics, but it doesn't have physics libraries itself.

3) Did you know that PhysX and Havok (both proprietary, Havok being more expensive to license) have the biggest percentage of use among developers?

Yes. And Havok is the most commonly used among the major developer studios.

4) Did you know that PhysX is actually the number one physics API at this point in time?

Number one according to one survey by a magazine, as shown here:
http://www.bulletphysics.com/wordpress/?p=88

or in rewards?

http://games.ign.com/articles/987/987463p1.html

or number one as the most used in the developer studios?

Havok is just growing and growing. For each AAA title that comes with PhysX, there are like 10 on Havok.

THQ licensed PhysX, but 9 out of 10 of their studios use Havok:
“As a part of our long-standing partnership with Havok, nine out of our ten internal studios, including Relic, Rainbow and Volition, are actively using Havok Physics and other Havok products in development today. We have found unique value in Havok's cross-platform physics solution, and in Havok's industry-leading support, especially in critical franchises like Saint's Row, Warhammer 40,000: Dawn of War, Red Faction, Smackdown vs. Raw, and UFC (Ultimate Fighting Championships). Given our confidence and success working with Havok over the years, we are expanding our use of Havok products in seven new multi-platform titles slated for development over the course of the next two years.” –Roy Tessler, THQ
http://www.havok.com/index.php?page=customer-quotes
 
Sure, DirectX is also proprietary, and that's a good thing in some ways, but it's not owned by a company that designs GPUs, which are essential for mainstream gaming. The problem is that Nvidia makes PhysX, and they have a very strong interest in being able to control industry standards.

In the long run it's bad for gamers to have one company like Nvidia in control of an API that becomes a standard in games development; they will ultimately abuse the position to make more money in ways designed to crush the competition.

The very fact that PhysX is incredibly unoptimised for the CPU is an early example of why it's a bad thing. Who wants to be forced into buying new hardware like a second GPU when we could run a lot of these effects on the CPU?
 
Yes, it's proprietary. It belongs to Microsoft. It's a proprietary open standard (as opposed to PhysX on GPU, which is a proprietary closed standard on CUDA).

Hehe "proprietary open standard"...you know..."proprietay" and "open standard" aren't really supposed to be used in the same sentence :)

And you're still confusing PhysX as a physics API with physics computed on the GPU. PhysX is nothing more than a physics API, and it really doesn't care where its instructions are executed. It's open for license, and besides that, all developers care about is the flexibility and robustness of its API, along with the tools available to work with it. The fact that GPU physics is a GeForce-only thing for now is highly irrelevant to them, because a physics API doesn't determine whether a game is going to be successful. As I said before, from the developers' point of view, they want something that's easy to work with, and PhysX surely provides that. And they can even add extra options/effects (for those who can use GPU physics) to improve the realism of some of the effects in game.

Tamlin_WSGF said:
DirectCompute can be used as an API for physics, but it doesn't have physics libraries itself.

"API for physics" ? Are you trying to play with words ? Direct Compute has nothing to do with physics, just like CUDA and OpenCL don't either. You'll always have to use a physics API. Direct Compute = CUDA = OpenCL.

Tamlin_WSGF said:
Yes. And Havok is the most commonly used among the major developer studios.

Ah, so now it's "major" studios that matter. Actually, EA licensed PhysX not long ago for some of their upcoming games, and EA is, well, pretty big. And 2K as well, for that matter. Mafia 2 is probably one of those titles.

Tamlin_WSGF said:
Number one according to one survey by a magazine, as shown here:
http://www.bulletphysics.com/wordpress/?p=88

Yes, that one. An independent source at that. It's not Havok.com saying it's the best and it's not PhysX.com saying it's the best. It's hosted by the third-place engine, showing where it stands in the grand scheme of things.

Tamlin_WSGF said:

What do rewards mean anyway? They can win whatever they want, but if developers don't use their API, those "rewards" (which I'm pretty sure you meant to say "awards") are meaningless.

It seems you are, as usual, trying to imply that anyone who may somehow "defend" PhysX says that Havok is crap. Not at all (unlike the anti-PhysX crowd, who openly say "as long as it's not PhysX"). Havok has been in the industry far longer than PhysX, yet its use brought nothing new to the table after the initial boom of popularity.

Tamlin_WSGF said:
or number one as the most used in the developer studios?

Havok is just growing and growing. For each AAA title that comes with PhysX, there are like 10 on Havok.

THQ licensed PhysX, but 9 out of 10 of their studios use Havok:

http://www.havok.com/index.php?page=customer-quotes

That quote is pretty obvious. They licensed the tech for a period and obviously they'll keep it while the license is valid. Given how expensive it is to license Havok anyway, I would do the same.

As for Havok "growing and growing" and your 10:1 ratio of Havok over PhysX, I'm sure you didn't base that solely on your THQ quote, did you? Seems like you did... :rolleyes:
 
Hehe "proprietary open standard"...you know..."proprietay" and "open standard" aren't really supposed to be used in the same sentence :)

LOL! I think you are confusing open standard with open source.

And you're still confusing PhysX as a physics API with physics computed on the GPU. PhysX is nothing more than a physics API, and it really doesn't care where its instructions are executed.

PhysX itself is nothing but a lot of libraries. Its API was ported to CUDA. It does care where the instructions are executed.



"API for physics" ? Are you trying to play with words ? Direct Compute has nothing to do with physics, just like CUDA and OpenCL don't either. You'll always have to use a physics API. Direct Compute = CUDA = OpenCL.

Physics is a bunch of libraries where calls are made through an API; physics is not an API itself. There have been several considerations of porting physics libraries to DX11. As I said, DirectCompute can be used as an API for physics.


Ah, so now it's "major" studios that matter. Actually, EA licensed PhysX not long ago for some of their upcoming games, and EA is, well, pretty big. And 2K as well, for that matter. Mafia 2 is probably one of those titles.

Blockbuster titles matter most, yes. Indie games, not so much. Very few upcoming titles actually use PhysX (on either CPU or GPU) compared to Havok, and even fewer have a GPU acceleration path (NFS: Shift, for example, doesn't have GPU acceleration).

Yes, that one. An independent source at that. It's not Havok.com saying it's the best and it's not PhysX.com saying it's the best. It's hosted by the third-place engine, showing where it stands in the grand scheme of things.

And you don't consider the Frontline Awards an independent source? I'll take what the major studios use over a survey anytime.

What do rewards mean anyway? They can win whatever they want, but if developers don't use their API, those "rewards" (which I'm pretty sure you meant to say "awards") are meaningless.

And you feel that a survey asking people what they think is best beats what the developer studios actually use? :rolleyes:

It seems you are, as usual, trying to imply that anyone who may somehow "defend" PhysX says that Havok is crap. Not at all (unlike the anti-PhysX crowd, who openly say "as long as it's not PhysX"). Havok has been in the industry far longer than PhysX, yet its use brought nothing new to the table after the initial boom of popularity.

No, I am implying (actually, I directly mean) that you are hyping PhysX's support and size into something it isn't.
And Havok is industry leading as well. Nothing new? Havok managed to run cloth in real time on the CPU and introduced Havok AI, also on the CPU (for everyone). Didn't you pay attention?


That quote is pretty obvious. They licensed the tech for a period and obviously they'll keep it while the license is valid. Given how expensive it is to license Havok anyway, I would do the same.

They have re-licensed Havok, even though they have to pay for it while PhysX is given away for free. That should tell you something.

As for Havok "growing and growing" and your 10:1 ratio of Havok over PhysX, I'm sure you didn't base that solely on your THQ quote, did you? Seems like you did... :rolleyes:

No, I took a survey... :D
 
Just goes to show you that Nvidia can be a bunch of corporate pricks who don't care about the consumer. And there are valid reasons they could be sued for doing this. You buy their card strictly for PhysX before their 185 drivers, and then they disable it.
 
LOL! I think you are confusing open standard with open source.

No. But it seems I have to explain why "proprietary" and "open standard" can't be used freely in the same sentence. Being a standard implies that everyone can, and most likely will, use it without any kind of license fee and/or any kind of "tie-in" to the tech. That isn't the case with DirectCompute, since it's tied to a specific API, which is proprietary. With OpenCL, however, no such limitations are imposed. It's truly set to be an "open standard", and all that's needed is for the hardware manufacturers to provide the proper driver translation for its instructions.

Tamlin_WSGF said:
PhysX itself is nothing but a lot of libraries. Its API was ported to CUDA. It does care where the instructions are executed.

The first part is correct. It's a bunch of libraries. The second, however, is not entirely. Its instruction set was ported to CUDA so that NVIDIA's GPUs can compute the data on their processing units. PhysX itself, as a physics API, doesn't "care" where it's executed, although it defaults to the common denominator, the CPU, and thus has a path to x86 instructions.

Tamlin_WSGF said:
Physics is a bunch of libraries where calls are made through an API; physics is not an API itself. There have been several considerations of porting physics libraries to DX11. As I said, DirectCompute can be used as an API for physics.

There are? That's great. Proof? Surely the major physics API holders are keen to work with Microsoft on that front... except that Microsoft doesn't want to... :p

Tamlin_WSGF said:
Blockbuster titles matter most, yes. Indie games, not so much. Very few upcoming titles actually use PhysX (on either CPU or GPU) compared to Havok, and even fewer have a GPU acceleration path (NFS: Shift, for example, doesn't have GPU acceleration).

And where's this list? It must be public for you to know all this. The PhysX list is public and easily checked, along with news of major studios doing deals with NVIDIA and PhysX.

And for PhysX to be used, you don't really need GPU physics support... but keep insisting on that (for the millionth time or something like that)... Maybe if you keep saying it, it will become true? :rolleyes:

Tamlin_WSGF said:
And you don't consider the Frontline Awards an independent source? I'll take what the major studios use over a survey anytime.

I didn't put it aside (as you are trying to discredit the survey from Bullet Physics). All I did was mention that awards mean nothing if developers don't use the tech. Havok still has a big chunk of the physics market, but it's diminishing. So unless these awards automatically make developers want to license Havok, they don't have much importance, do they?

Tamlin_WSGF said:
And you feel that a survey asking people what they think is best beats what the developer studios actually use? :rolleyes:

When the survey involves senior developers from various development houses, who clearly chose one over the other because it's easier to work with, yes. It's quite valuable info. And being a developer myself, I can only relate to that.

They surveyed over 100 senior developers from various development houses, mainly working on PC, PlayStation 3 or Xbox 360.

Tamlin_WSGF said:
No, I am implying (actually, I directly mean) that you are hyping PhysX's support and size into something it isn't.
And Havok is industry leading as well. Nothing new? Havok managed to run cloth in real time on the CPU and introduced Havok AI, also on the CPU (for everyone). Didn't you pay attention?

Here comes the tiresome argument again :rolleyes:
I'm not hyping anything. The most I do in these threads is correct the misconceptions some have over what PhysX is, how it is used, and why it is used by developers. Developers choose what's best for their product and, as can be seen in that survey, some developers (the majority of the survey) prefer PhysX over Havok. These are people who know what they are talking about.

And show me a game that uses those heavy Havok effects on a CPU. Are we talking about a tech demo? A real game? What?

Tamlin_WSGF said:
They have re-licensed Havok, even though they have to pay for it while PhysX is given away for free. That should tell you something.

PhysX is not given away for "free", but it is open for license. And what does that tell you exactly? That companies that use the same engine from one game to the next will not change the physics engine in it, because that's a major undertaking that would endanger their schedules. This is even clearer when you think about how most development studios work nowadays, i.e. cross-platform development.

Tamlin_WSGF said:
No, I took a survey... :D

:rolleyes:
 
PhysX is not given away for "free", but it is open for license. And what does that tell you exactly? That companies that use the same engine from one game to the next will not change the physics engine in it, because that's a major undertaking that would endanger their schedules. This is even clearer when you think about how most development studios work nowadays, i.e. cross-platform development.


A physics standard should be free. The CrossFire license is free, which is why Intel always supported the platform, and that has helped sell a lot more ATI graphics cards than if ATI had made money from the license. If Nvidia hadn't shut down PhysX on ATI cards, I would have bought an Nvidia card for physics.
 
No. But it seems I have to explain why "proprietary" and "open standard" can't be used freely in the same sentence. Being a standard implies that everyone can, and most likely will, use it without any kind of license fee and/or any kind of "tie-in" to the tech. That isn't the case with DirectCompute, since it's tied to a specific API, which is proprietary. With OpenCL, however, no such limitations are imposed. It's truly set to be an "open standard", and all that's needed is for the hardware manufacturers to provide the proper driver translation for its instructions.

Are you seriously saying that DirectX 11 isn't an open standard? :eek:



The first part is correct. It's a bunch of libraries. The second, however, is not entirely. Its instruction set was ported to CUDA so that NVIDIA's GPUs can compute the data on their processing units. PhysX itself, as a physics API, doesn't "care" where it's executed, although it defaults to the common denominator, the CPU, and thus has a path to x86 instructions.

PhysX does care where it's executed. Did you even read the title of this thread?
For PhysX to be executed at all, it needs an API. Ageia had its own API through which it accessed its libraries. Nvidia ported that API to CUDA.



There are? That's great. Proof? Surely the major physics API holders are keen to work with Microsoft on that front... except that Microsoft doesn't want to... :p

If you wish, you can look up Havok FX, which was supposed to run on DirectX SM3 (on both ATI and Nvidia cards).


And where's this list? It must be public for you to know all this. The PhysX list is public and easily checked, along with news of major studios doing deals with NVIDIA and PhysX.

A comprehensive list of ALL the Havok games hasn't been made, but here is an extensive one:
http://www.havok.com/index.php?page=available-games

And for PhysX to be used, you don't really need GPU physics support... but keep insisting on that (for the millionth time or something like that)... Maybe if you keep saying it, it will become true? :rolleyes:

No, you don't. I have never disagreed on that. I do, however, contest that you need GPU physics support to get features like destruction in PhysX.


I didn't put it aside (as you are trying to discredit the survey from Bullet Physics). All I did was mention that awards mean nothing if developers don't use the tech. Havok still has a big chunk of the physics market, but it's diminishing. So unless these awards automatically make developers want to license Havok, they don't have much importance, do they?

Awards are recognition. That most major studios actually prefer to pay for Havok rather than get PhysX for free says even more. :cool:



When the survey involves senior developers from various development houses, who clearly chose one over the other because it's easier to work with, yes. It's quite valuable info. And being a developer myself, I can only relate to that.

It really depends on the questions and what actions are taken afterwards. Since most major development houses use Havok physics (I can provide a lot of links, but it's Saturday, so maybe Sunday I will), I wonder whom they asked.



Here comes the tiresome argument again :rolleyes:
I'm not hyping anything. The most I do in these threads is correct the misconceptions some have over what PhysX is, how it is used, and why it is used by developers. Developers choose what's best for their product and, as can be seen in that survey, some developers (the majority of the survey) prefer PhysX over Havok. These are people who know what they are talking about.

You really are hyping PhysX, and not only in this thread. You take one survey (a bunch of questions, something we see every election) and use that as a final conclusion about how big PhysX is. Ageia tried to give PhysX away for free earlier, and Nvidia did so last year as well. Still, the major dev houses are PAYING to get Havok instead.

And show me a game that uses those heavy effects on a CPU. Are we talking about a tech demo? A real game? What?

I've given several. Check out this post (from this very thread):
http://hardforum.com/showpost.php?p=1034670849&postcount=86


PhysX is not given away for "free", but it is open for license. And what does that tell you exactly? That companies that use the same engine from one game to the next will not change the physics engine in it, because that's a major undertaking that would endanger their schedules. This is even clearer when you think about how most development studios work nowadays, i.e. cross-platform development.

You get a license for free, according to Nvidia. Are you accusing them of lying?
http://developer.nvidia.com/object/physx_downloads.html

:rolleyes:
 
I have to disagree. First, because the cards would most likely be low-margin cards, and once you factor in the opportunity cost, it's a no-brainer for them to do this.
1. They would be held responsible for any driver interactions caused by ATI's driver.
2. ATI gets a free pass on GPU physics development, since they could offload PhysX acceleration to Nvidia cards.
3. ATI would never have any reason to support PhysX. If PhysX got popular enough (a big if, I know), ATI would probably have to support it.

4. Most people would be using their old Nvidia card. Why? Because Nvidia has 70% market share, so by far the most likely statistical situation would be someone buying a new ATI card to use with an old Nvidia card.

So, in sum, the losses for Nvidia easily dwarf any material gains from allowing the practice.
It's just good business practice by Nvidia; no company worth its salt allows free riding when it doesn't have to.

Sadly, you totally miss the point that GPU physics acceleration is not the same market as GPU 3D graphics acceleration. And I don't think you have a solid grasp of opportunity costs, because Nvidia can choose which cards it manufactures and force the market to choose which cards it uses for PhysX acceleration. The re-hashed 8800GT probably has a manufacturing cost of $5 (the GPU processor itself) now, and it probably sells to manufacturers for close to 5-10x this amount. Nvidia can continue to manufacture these, or phase them out and push the market to their next midrange card, which will probably have low manufacturing costs as well compared to the high-end cards.

1. The Nvidia PhysX drivers run independently of the graphics drivers, and their interaction causing problems is about as likely as soundcard drivers not working well with graphics drivers. It can happen, but it would be more of a quirky, uncommon thing than a major problem, and at worst GPU-accelerated physics would have to be disabled in that particular game. If ATI's drivers were borking PhysX in a particular game and it was fixable, they would fix it much like they would any other driver interaction problem.

2. ATI chose not to hop on PhysX as its choice for physics acceleration. If the market chooses PhysX, then ATI will either have to license the technology from Nvidia or allow customers to use an Nvidia card for physics acceleration. And since ATI never blocked this from happening, I don't see why they would if the market goes that way and they choose not to license.

3. Same answer as #2.

4. If PhysX continues to develop, there may be a need for faster GPUs to drive it. So an old card may work well for some time, but it may reach a point where a more powerful card is needed. And if Nvidia's graphics cards are not the best choice for a person, why not keep an Nvidia PhysX card running in their system? Because when it comes down to it, very few people are going to buy a lower-performing graphics card plus a physics card from Nvidia for the same price as a higher-performing ATI card.
 
Are you seriously saying that DirectX 11 isn't an open standard? :eek:

Haha! You mean *Microsoft* DirectX? The one that works on Microsoft products?

As compared to, say, OpenGL, which works on everything from Solaris, Linux, OS X, Windows, even BeOS, to the PS3 (well, sorta; it uses a subset called OpenGL ES) to handhelds like the iPhone 3G or Nokia phones... Microsoft DirectX isn't an open standard at all. Just because lots of developers use it doesn't mean it's open.
 
Microsoft made DX as a way to let applications on their operating systems access gaming hardware. What business does DX have on any other OS?

It just happens to be popular because Windows is popular. Get over it.
 
Who the fuck cares which one is better right now (PhysX vs Havok)? You guys bickering back and forth miss the real travesty and argument: Company A rendering something you bought, or plan to buy, and which is touted for exactly this purpose, useless in its utility because it finds you have Company B's product. :rolleyes: :mad: This is bullshit and it is stupid.
 
Agreed. I bought a GTX 275; I should be able to use it for PhysX regardless of whether I switch to an ATI DX11 card for graphics rendering or not...
 
Who the fuck cares which one is better right now (PhysX vs Havok)? You guys bickering back and forth miss the real travesty and argument: Company A rendering something you bought, or plan to buy, and which is touted for exactly this purpose, useless in its utility because it finds you have Company B's product. :rolleyes: :mad: This is bullshit and it is stupid.


That is the true issue here.
We do tend to slide into arguing the technical merits of the respective APIs, standards, libraries, dev tools, etc. when this sort of thread comes up.

I don't really know why anyone is surprised by this behavior. Nv did this with SLI for the longest time, just to sell their less-than-stellar, generally overpriced SLI mobo chipsets, even when it was painfully obvious there was no technical reason for it.
I am actually surprised they did not do this sooner, to be honest.
 
Agreed. I bought a GTX 275; I should be able to use it for PhysX regardless of whether I switch to an ATI DX11 card for graphics rendering or not...
I feel for you. Nvidia could have been getting extra money from me, because I wanted a GPU from them to run PhysX alongside my 4870X2; now they lose out on a prospective customer like myself and shun all others who would be willing to use their tech for PhysX. Lost sales????? Imagine if they detected you used an AMD CPU or MOTHERBOARD! :confused: :eek: Where does it end???
 