Dark Void PhysX Gameplay Performance and IQ @ [H]

Swimming in money for an 8800GT at $50 on the used market, or a new but slower card (still better than nothing) on the new market? Perhaps someone needs to look for a new job, because if you can't afford that you are in the wrong hobby... look at the 360 or PS3 for budget action. ;)

Umm, you do know that nVidia in their driver updates removed the ability to run an 8800 GT for PhysX in tandem with an ATI card, even though it was working, right? In fact, they went further and disabled PhysX any time any ATI card is present in the system, including the ATI TV tuner cards. So some customers are now forced to choose between TV tuning on their PC or using PhysX. Seriously, wth is up with that?

PhysX - Bringing physics to games, at the cost of watching TV on your PC -- It's worth it (tm).

So right now, I could have a very expensive 5970 and be unable to use an nVidia 8800GT, or even a more expensive card for that matter. PhysX might be too taxing for an 8800GT anyway, since it HALVED the frame rates of a more powerful GTX 260. That hit might be 100% or more of the power of an 8800GT.

The sooner the industry uses a more open, no-licensing-fee technology like Bullet, DirectCompute with a physics engine, or Havok, the better. It's interesting that the CryEngine 2/3 are being built with their own non-PhysX engine in mind. I imagine the same will be done with Half-Life 3's engine, judging by some posts by Gabe Newell and his thoughts on that issue; it's more likely to use an open-standard CPU/GPU combo. The Rage3D engine has its own physics, etc. Really, it's just Unreal Engine that needs to build its own physics engine in UE4, and then PhysX will be kind of useless, as the most popular game development engines will all have their own physics engines.
 
Right, the reason there are never (to my knowledge) any ATI-branded games is simply because publishers want to put the nVidia logo on their box cover art. I'll buy that.

All I can think of recently is DiRT 2 and... hmm, definitely one other title, with AMD splash screens at the start.
 
The other question would be why game developers are using PhysX and not an open standard that can produce the same or better effects.

Simple. This fabled open standard of development that can do the same thing doesn't exist. Bullet is the closest thing but that's not yet ready for prime time and it's also CUDA based.

With games like this coming out more and more, it will become a larger issue of NVIDIA forcing devs to screw us out of a solid game experience just for choosing a competing product.

Don't be silly. Nvidia doesn't make a dollar if they screw you out of a solid game experience. They make a dollar if they convince you that their cards are superior because they support PhysX. I'm not sure how you can objectively analyze this issue if that's the angle you're taking.

A. I'm sure some other company can make sparks, water, and cloth.

Yeah I'm sure too. Still waiting for "some other company" to step up though.
 
Lots of nonsense crap being spewed in this thread.

So nvidia goes "hey, our GPUs can do physics, here is the engine, its free. we'll even help you add it to your game".

Developers say "Ok cool, but not everybody has your GPUs, so we gotta make it an optional eye-candy thing"

Result: people who own nvidia GPUs get some extra shit. People without nvidia GPUs get the same game they would have gotten anyways.

If you think those effects are freaking awesome you should go buy an nvidia GPU.
If you think they are stupid, then don't buy nvidia GPUs.

Bitching at nvidia for not adopting nonexistent standards is absolutely ridiculous.
 
Lots of nonsense crap being spewed in this thread.

So nvidia goes "hey, our GPUs can do physics, here is the engine, its free. we'll even help you add it to your game".

Developers say "Ok cool, but not everybody has your GPUs, so we gotta make it an optional eye-candy thing"

Result: people who own nvidia GPUs get some extra shit. People without nvidia GPUs get the same game they would have gotten anyways.

If you think those effects are freaking awesome you should go buy an nvidia GPU.
If you think they are stupid, then don't buy nvidia GPUs.

Bitching at nvidia for not adopting nonexistent standards is absolutely ridiculous.

But if you own an nvidia GPU and want to use your ATI card as your main card you can't use physx, effectively crippling the card you paid money for.
 
Lots of nonsense crap being spewed in this thread.

So nvidia goes "hey, our GPUs can do physics, here is the engine, its free. we'll even help you add it to your game".

Developers say "Ok cool, but not everybody has your GPUs, so we gotta make it an optional eye-candy thing"

Result: people who own nvidia GPUs get some extra shit. People without nvidia GPUs get the same game they would have gotten anyways.

If you think those effects are freaking awesome you should go buy an nvidia GPU.
If you think they are stupid, then don't buy nvidia GPUs.

Bitching at nvidia for not adopting nonexistent standards is absolutely ridiculous.

Well said!

You pretty much hit the nail on the head. Fact remains PhysX adds effects to our PC games. But chances are, if you were anti-Ageia-PPU before, you will have the same attitude today with Nvidia's PhysX, and it doesn't seem to matter what effect pops them in the eye, they will not like it, period. What do these idiots want... a blow job at the end of the game too? LOL
 
Fact remains PhysX adds effects to our PC games.

It doesn't add effects to games. It prevents others from having those effects.

IMO PhysX enabled hardware should offload the physics from the CPU, not prevent it from existing. There is more than ample CPU power to spare for at least minimal effects. There should be a toggle to enable hardware processing.
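As a rough illustration of that toggle idea, a game could pick a physics backend at startup and fall back to a smaller CPU particle budget instead of disabling effects entirely. This is a hypothetical sketch; the function name and budget numbers are made up, not a real PhysX API:

```python
# Hypothetical backend toggle: keep minimal effects on the CPU
# instead of turning physics off when no accelerator is present.

def choose_physics_backend(gpu_accel_available, hw_toggle_enabled):
    """Return (backend_name, particle_budget) for effect simulation."""
    if gpu_accel_available and hw_toggle_enabled:
        # Dedicated hardware can afford the full effect load.
        return ("gpu", 20000)
    # Spare CPU cycles still handle a reduced particle budget,
    # so nobody loses the effects outright.
    return ("cpu", 2000)

backend, budget = choose_physics_backend(gpu_accel_available=False,
                                         hw_toggle_enabled=True)
print(backend, budget)  # cpu 2000
```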
 
It doesn't add effects to games. It prevents others from having those effects.

IMO PhysX enabled hardware should offload the physics from the CPU, not prevent it from existing. There is more than ample CPU power to spare for at least minimal effects. There should be a toggle to enable hardware processing.

No, it doesn't. Again, this is more misleading BS, as others have tried to point out. These physics effects cannot be reproduced in the same way without using *gasp* PHYSICS! How much of a genius do you have to be to understand that physics effects require physics engines? The effects that can be made to work without the engine, they do make work without it. The physics effects are individual polygons that act in an independent way when they go off. This requires extra computing, something that cannot be done without a physics engine. Otherwise you would get the same old tired effects we have gotten for years. You don't NEED this extra eye-candy, but it definitely makes the game more enjoyable for many people. Also, some of these effects will eventually be reproducible in some way using DX11. But DX11 is not as complete as the PhysX engine and CUDA development.
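To see why those independent fragments need extra computing, here is a toy sketch (not PhysX code; every name and number is illustrative): each debris particle carries its own position and velocity and gets integrated every frame, so the cost grows with the particle count, which is exactly the work a hardware physics engine takes on.

```python
# Toy per-particle integration: each fragment is simulated
# independently every frame, so cost scales with particle count.

GRAVITY = -9.8  # m/s^2, straight down

def step_particles(particles, dt):
    """Advance a list of [height, velocity] fragments by one timestep."""
    for p in particles:
        p[1] += GRAVITY * dt      # accelerate under gravity
        p[0] += p[1] * dt         # move by the new velocity
        if p[0] < 0.0:            # hit the floor: bounce, lose energy
            p[0] = 0.0
            p[1] = -p[1] * 0.5
    return particles

debris = [[1.0, 0.0], [2.0, 0.0]]  # two fragments at rest
step_particles(debris, 0.1)
```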

But if you own an nvidia GPU and want to use your ATI card as your main card you can't use physx, effectively crippling the card you paid money for.

Cry me a freaking river. Seriously! How much do people have to nitpick? You can do it with hacked drivers if you really want to. But why should Nvidia help support ATI, when they have reached across the table a few times just to have their hand slapped away? Nvidia is putting up all the money to develop these technologies and enable their cards to run them. They offer licensing to AMD/ATI which they refuse. Why should they share 'anything' with them then? No amount of contrary reasoning is going to change this fact. They are businesses, you don't help out your competition if they are going to spit in your face every time you offer something to them.

Now if you were to ask me if I think AMD/ATI should license PhysX, I would say I don't really see the need for them to do so. But neither do I fault Nvidia for not allowing them to take advantage of that ability at all.
 
No, it doesn't. Again, this is more misleading BS, as others have tried to point out. These physics effects cannot be reproduced in the same way without using *gasp* PHYSICS! How much of a genius do you have to be to understand that physics effects require physics engines? The effects that can be made to work without the engine, they do make work without it. The physics effects are individual polygons that act in an independent way when they go off. This requires extra computing, something that cannot be done without a physics engine. Otherwise you would get the same old tired effects we have gotten for years. You don't NEED this extra eye-candy, but it definitely makes the game more enjoyable for many people. Also, some of these effects will eventually be reproducible in some way using DX11. But DX11 is not as complete as the PhysX engine and CUDA development.
You missed his point. PhysX on the GPU should supplement PhysX on the CPU, not add effects to the games.

Cry me a freaking river. Seriously! How much do people have to nitpick? You can do it with hacked drivers if you really want to. But why should Nvidia help support ATI, when they have reached across the table a few times just to have their hand slapped away? Nvidia is putting up all the money to develop these technologies and enable their cards to run them. They offer licensing to AMD/ATI which they refuse. Why should they share 'anything' with them then? No amount of contrary reasoning is going to change this fact. They are businesses, you don't help out your competition if they are going to spit in your face every time you offer something to them.

Now if you were to ask me if I think AMD/ATI should license PhysX, I would say I don't really see the need for them to do so. But neither do I fault Nvidia for not allowing them to take advantage of that ability at all.
They disabled the functioning of a product you paid for simply because you're also using a product from a competitor. That's about as close as you can get to anti-competitive practices without crossing into illegal. But go ahead and keep being an apologist. Seems like everyone is hellbent on expressing his/her stupidity in this thread.
 
Nvidia disabled this option in a driver update. They don't want you mixing and matching cards, even though it works, because it could cause 'unexpected' results... like a crash? Yet when games don't crash, it's removed anyway.

Basically, nVidia doesn't want technology that can work on ATI/nVidia combos to work, as heaven help us if we all bought 5970s and then a cheap-o 9800 GT for PhysX to get the best of both worlds while giving most of our money to ATI. The world would end!! ohh noes!

He is actually referring to a workaround that modifies drivers to enable the use of an ATI card for video and an NVIDIA card for PhysX acceleration. I currently have a 5870 for my display and an 8800GTS for PhysX. For the most part it works well, although driver updates become an ordeal.

I think it would be interesting to see a hybrid in one of these evaluations.
 
You missed his point. PhysX on the GPU should supplement PhysX on the CPU, not add effects to the games.

Well what you think it should do isn't really relevant. What's relevant is the relative capabilities of GPUs and CPUs at producing these effects. I'll be the first one to cheer if someone brings out a CPU based physics engine that does particle and fluid simulations. Still waiting.
 
Well what you think it should do isn't really relevant. What's relevant is the relative capabilities of GPUs and CPUs at producing these effects. I'll be the first one to cheer if someone brings out a CPU based physics engine that does particle and fluid simulations. Still waiting.
Yeah, heaven forbid we state our opinion in a discussion, what kind of spineless apologist response is that? And we have had CPU based physics engines that do particle and fluid simulations for almost a decade anyway.
 
Well boys, I won't even try to reply to some of the silly stuff I am reading, because as usual this comes down to the same old CRAP.
RED VS GREEN.
I wish one day computer users could stop this nonsense of taking sides as if these were sports teams or something...
It's like no matter what the other team does, it's always a crappy feature, worthless, etc....
I own Nvidia for my main rig and ATI for my HTPC and my wife's PC, so I buy whatever delivers the stuff I want at the price I need.
I went with Nvidia because of 3D support... then they dropped it for a long time and finally got it back.
To me, playing all those awesome 3D games on a 2D monitor is like using my TV speakers to enjoy a DTS-HD 7.1 Blu-ray soundtrack... worse, a mono speaker at that.
It just does not make sense.
Is Eyefinity better than one screen? Heck yeah... but it is still flat.
I wish ATI would do 3D as well to give me another choice when buying cards, but it's not happening so far.
But when I went shopping for a card for my wife I found a 4870 that performed great at a very low price, so I got that, because guys... it is just a video card... nothing else, nothing more... leave fanaticism to sports... go Pens!
Oh, and someone said that ATI did not want to get PhysX because they did not want to support a closed format... well, if ATI were supporting it and Nvidia too, that wouldn't be such a closed format anymore, don't you think? ;)
Peace!
 
Well boys, I won't even try to reply to some of the silly stuff I am reading, because as usual this comes down to the same old CRAP.
RED VS GREEN.
I wish one day computer users could stop this nonsense of taking sides as if these were sports teams or something...
It's like no matter what the other team does, it's always a crappy feature, worthless, etc....
I own Nvidia for my main rig and ATI for my HTPC and my wife's PC, so I buy whatever delivers the stuff I want at the price I need.
I went with Nvidia because of 3D support... then they dropped it for a long time and finally got it back.
To me, playing all those awesome 3D games on a 2D monitor is like using my TV speakers to enjoy a DTS-HD 7.1 Blu-ray soundtrack... worse, a mono speaker at that.
It just does not make sense.
Is Eyefinity better than one screen? Heck yeah... but it is still flat.
I wish ATI would do 3D as well to give me another choice when buying cards, but it's not happening so far.
But when I went shopping for a card for my wife I found a 4870 that performed great at a very low price, so I got that, because guys... it is just a video card... nothing else, nothing more... leave fanaticism to sports... go Pens!
Oh, and someone said that ATI did not want to get PhysX because they did not want to support a closed format... well, if ATI were supporting it and Nvidia too, that wouldn't be such a closed format anymore, don't you think? ;)
Peace!
No, that's what apologists turn these threads into, because there's no good explanation for the pathetic tactics NVIDIA is using. Just because you call out NVIDIA doesn't mean you support AMD or any other company; you just don't support NVIDIA's bullshit. Proprietary technologies stagnate development; the less support they get, the faster we will progress.
 
Cry me a freaking river. Seriously! How much do people have to nitpick? You can do it with hacked drivers if you really want to. But why should Nvidia help support ATI, when they have reached across the table a few times just to have their hand slapped away? Nvidia is putting up all the money to develop these technologies and enable their cards to run them. They offer licensing to AMD/ATI which they refuse. Why should they share 'anything' with them then? No amount of contrary reasoning is going to change this fact. They are businesses, you don't help out your competition if they are going to spit in your face every time you offer something to them.

Now if you were to ask me if I think AMD/ATI should license PhysX, I would say I don't really see the need for them to do so. But neither do I fault Nvidia for not allowing them to take advantage of that ability at all.

They aren't supporting ATI, and they aren't supporting your nvidia card. How do you think all the people who had PhysX cards with their ATI cards felt when nvidia decided to disable the cards they paid hard-earned money for? Sorry man, but it's bullshit any way you spin it.
 
You missed his point. PhysX on the GPU should supplement PhysX on the CPU, not add effects to the games.

Umm, why? That makes no sense whatsoever. There is no reason for the GPU to "supplement" anything. It already does the graphics, the CPU does not do the graphics, so why would it supplement another graphic process when the sole purpose of the GPU is for *gasp* Graphics Processing.

They disabled the functioning of a product you paid for simply because you're also using a product from a competitor. That's about as close as you can get to anti-competitive practices without crossing into illegal. But go ahead and keep being an apologist. Seems like everyone is hellbent on expressing his/her stupidity in this thread.

They are not disabling a function of a product. Last I checked, PhysX is Nvidia technology. They are "enabling" it for their devices. It is a supplemental effect that Nvidia helped design and made hardware for. The sooner you understand that, the better. They are giving you something EXTRA, not taking something away.

An apologist is: a person who makes a defense in speech or writing of a belief, idea, etc.

You are the one trying to defend the idea that PhysX should be available to everyone and that it is a core part of the game that should be open to everyone. That is not what it is, though. I am a realist: a person who tends to view or represent things as they really are.

PhysX was developed by Nvidia to work with hardware that is licensed for it. AMD/ATI can license it if they want, but they don't. So guess what? They don't get to use it, it is as simple as that. That is how things are, not the idealistic view you have. So the apologist here is you and the stupidity is yours.
 
No, that's what apologists turn these threads into, because there's no good explanation for the pathetic tactics NVIDIA is using. Just because you call out NVIDIA doesn't mean you support AMD or any other company; you just don't support NVIDIA's bullshit. Proprietary technologies stagnate development; the less support they get, the faster we will progress.

And the signature shows:
HD5870 @ 900/1250MHz
I rest my case ;)
 
No, that's what apologists turn these threads into, because there's no good explanation for the pathetic tactics NVIDIA is using. Just because you call out NVIDIA doesn't mean you support AMD or any other company; you just don't support NVIDIA's bullshit. Proprietary technologies stagnate development; the less support they get, the faster we will progress.

Such as DirectX? Or how about x86 architecture? Or how about DDR technology? Or how about SLI and Crossfire tech? Or how about Multi-touch? Or any of the myriad other proprietary technologies? Almost all technology is proprietary. Your post is simple FUD. AMD also has pathetic tactics, like their whole marketing schemes and AMD Game! and other useless marketing points. Both companies do their share of stupid things. PhysX is not one of those things. You are merely trying to make a mountain out of an ant hill.

Now if you were to argue about AA like in Batman, I would understand. But that was also an isolated event and is not indicative of anything else, especially the degraded CPU performance of physics. It has been shown over and over again that the GPUs are far superior in Physics processing than the CPU (using either Nvidia OR ATI technology). So you really have no leg to stand on there. Again you keep showing your apologetic nature here.
 
They aren't supporting ATI, and they aren't supporting your nvidia card. How do you think all the people who had physx cards with their ATI cards felt when nvidia decided to disable the cards they paid hard earned money for? Sorry man, but it's bullshit any way you spin it.

Incorrect, they do support all of my Nvidia cards. They just won't support a system that uses both ATI and Nvidia. The reason for this is that ATI would get to profit some off of Nvidia's PhysX technology. That is a valid business case no matter how you try to skew it. And even so, you can STILL use older drivers and have both ATI and Nvidia in your system. The only BS here is you trying to force Nvidia to support using one of their cards in conjunction with their competitor's cards. Why should they spend money supporting their competitors? Likewise, ATI doesn't have to support Nvidia; they can just say, well, we don't support PhysX, and that's that. But you are too quick to blame Nvidia for the problems even if it is an ATI card that is causing the mix-up.
 
Incorrect, they do support all of my Nvidia cards. They just won't support a system that uses both ATI and Nvidia. The reason for this is that ATI would get to profit some off of Nvidia's PhysX technology. That is a valid business case no matter how you try to skew it. And even so, you can STILL use older drivers and have both ATI and Nvidia in your system. The only BS here is you trying to force Nvidia to support using one of their cards in conjunction with their competitor's cards. Why should they spend money supporting their competitors? Likewise, ATI doesn't have to support Nvidia; they can just say, well, we don't support PhysX, and that's that. But you are too quick to blame Nvidia for the problems even if it is an ATI card that is causing the mix-up.

Bes be trollin'.
 
Oh, and someone said that ATI did not want to get PhysX because they did not want to support a closed format... well, if ATI were supporting it and Nvidia too, that wouldn't be such a closed format anymore, don't you think? ;)
Peace!

It would still be a closed standard, because it would prevent other companies from entering the market. This is what is known as an oligopoly, and it can be just as bad as a monopoly.
 
Umm, why? That makes no sense whatsoever. There is no reason for the GPU to "supplement" anything. It already does the graphics, the CPU does not do the graphics, so why would it supplement another graphic process when the sole purpose of the GPU is for *gasp* Graphics Processing.
You obviously have no idea how physics are produced in most games. Wrestle with this one - how are physics effects produced in games run on non-NVIDIA graphics cards, or on cards earlier than NVIDIA's 8800 series?
They are not disabling a function of a product. Last I checked, PhysX is Nvidia technology. They are "enabling" it for their devices. It is a supplemental effect that Nvidia helped design and made hardware for. The sooner you understand that, the better. They are giving you something EXTRA, not taking something away.
An apologist is: a person who makes a defense in speech or writing of a belief, idea, etc.

You are the one trying to defend the idea that PhysX should be available to everyone and that it is a core part of the game that should be open to everyone. That is not what it is, though. I am a realist: a person who tends to view or represent things as they really are.

PhysX was developed by Nvidia to work with hardware that is licensed for it. AMD/ATI can license it if they want, but they don't. So guess what? They don't get to use it, it is as simple as that. That is how things are, not the idealistic view you have. So the apologist here is you and the stupidity is yours.
You are either completely ignorant on how technology works or are playing dumb in the hope that someone won't call you on your bullshit. If one has a system with say, a GTX285 and a GTX260, one can use the GTX285 as the main graphics adapter and the GTX260 solely for PhysX processing. However, replace that GTX285 with an HD5870 and the GTX260 can no longer be used to process PhysX, despite the hardware being completely compatible. It's blocked on the driver level because NVIDIA doesn't like being outshined in the graphics department. Who's an apologist again?
And the signature shows:
HD5870 @ 900/1250MHz
I rest my case ;)
And that's the only rebuttal you have? Thank you for proving my point; it's easy to debate when you defeat yourself.
Such as DirectX? Or how about x86 architecture? Or how about DDR technology? Or how about SLI and Crossfire tech? Or how about Multi-touch? Or any of the myriad other proprietary technologies? Almost all technology is proprietary. Your post is simple FUD. AMD also has pathetic tactics, like their whole marketing schemes and AMD Game! and other useless marketing points. Both companies do their share of stupid things. PhysX is not one of those things. You are merely trying to make a mountain out of an ant hill.

Now if you were to argue about AA like in Batman, I would understand. But that was also an isolated event and is not indicative of anything else, especially the degraded CPU performance of physics. It has been shown over and over again that the GPUs are far superior in Physics processing than the CPU (using either Nvidia OR ATI technology). So you really have no leg to stand on there. Again you keep showing your apologetic nature here.
Considering I'm on the attack, it's tough to be an apologist, no? Sad that you read the definition and still don't understand it. What's a proprietary technology that AMD is using to cripple games and buy off developers while blocking NVIDIA out?
 
On a completely unrelated topic referring to the game...

...I just got it and it sucks.
 
And that's the only rebuttal you have? Thank you for proving my point; it's easy to debate when you defeat yourself.
Well, say something worth my time and I will reply back but so far you are not even close.
It would still be a closed standard, because it would prevent other companies from entering the market. This is what is known as an oligopoly, and it can be just as bad as a monopoly.
So if the two major video card companies are supporting the format, who else do you care for out there? I mean really....
 
Yeah, heaven forbid we state our opinion in a discussion, what kind of spineless apologist response is that? And we have had CPU based physics engines that do particle and fluid simulations for almost a decade anyway.

So anyone who doesn't agree with your baseless opinion is an apologist? Ha, nice try. Why in the world would anyone need to apologize for a company trying to profit and gain an advantage from technology they invested in?

Which engines are those by the way and which games use them?

On a completely unrelated topic referring to the game...

...I just got it and it sucks.

Don't know about the full game but yeah the demo was balls.
 
Well, say something worth my time and I will reply back but so far you are not even close.
You already tried and it was laughable. So are you just running away from proving your asinine argument, or is this, sadly, the best you can do? You're already down to generic schoolyard responses; my, you're a formidable debater. :rolleyes:
So anyone who doesn't agree with your baseless opinion is an apologist? Ha, nice try. Why in the world would anyone need to apologize for a company trying to profit and gain an advantage from technology they invested in?

Which engines are those by the way and which games use them?
You prove my point even in your response; it's great when you trolls do the work for me. What's baseless about my opinion? That it doesn't kiss NVIDIA's ass? Yes, let's make sure we denounce any hypothetical situation that makes NVIDIA look bad, otherwise we won't get our monthly Green Team Awesome Star. And here's your homework assignment: how about every computer game in the last decade that has had particle or fluid physics effects and didn't use PhysX? Off the top of my head, go look up Crysis and Unreal 2. You get bonus points for writing up a summary of one more game that you find all by yourself.
 
There is nothing in that game that could not have been done to 95% the same effect without using GPU-accelerated physics. Well, it is not a very good game and I doubt I will play it any more than I have, so let me amend that to: in the first 2 hours of play, I witnessed nothing visually that I had not seen in the last 4 years. It may have arrived at those effects in a way the CPU could not handle, but those effects were nothing I had not already seen without GPU-accelerated PhysX.

At this point I don't care who is to blame; I no longer care why it is this way. I am done buying games for more than $10, and my current rig will be my last gaming rig until these devs grab their god damn nuts, get off their fat asses, and code games for the PC again. You don't leave out features that have been standard in PC games since 2002, like AA; you don't tie mundane effects to a proprietary API only half the vid cards out there can use; you don't turn over coding of standard features to Nv; you don't let a vid card company decide what your final product looks like; and you don't straight-port console games to PC and then cry when they don't sell well. Fuck you devs, fuck you pubs, fuck you pirates, fuck you retailers, and fuck you Nv/AMD.
 
What's baseless about my opinion?

Too easy. Nvidia doesn't have a monopoly, and neither is PhysX an essential commodity. So you have no basis for running around acting like you're entitled to something. Want PhysX? Go Nvidia. Don't want PhysX? No problem. That's how it works for every single product. Calling anyone who understands simple business strategy a troll or apologist isn't really a strong basis for an argument. Also, if you really believe CryEngine does fluid simulations, then you're very much out of your depth on this topic and should do some more research before throwing around insults.

I witnessed nothing visually that I had not seen in the last 4 years.

Not quite. It's very easy to spot the difference between fluid/particle effects that interact with the environment and those that don't. I agree though that visually it's easy to come up with precanned effects that provide a reasonable approximation. But there's a limit and the next step is actual physical simulation. Progress is good no?
 
Not quite. It's very easy to spot the difference between fluid/particle effects that interact with the environment and those that don't. I agree though that visually it's easy to come up with precanned effects that provide a reasonable approximation. But there's a limit and the next step is actual physical simulation. Progress is good no?


Problem is that in the last few years, whether under Ageia's ownership or Nv's, we have still not seen any progress. Why? That's easy: because it's a proprietary API only used by one hardware vendor. It has gained almost no traction in all this time and I find it unlikely that it ever will. So, we are stuck with effects physics until an open standard, or a proprietary one not owned by one of the vid card makers that the other GPU makers will also use, comes along. PhysX is free, so a shitload of devs use the CPU version of it, but we are still getting only a handful of games a year that use GPU-accelerated PhysX. And almost none of them are what I would call triple-A titles.

The next step is a proprietary GPU-accelerated physics solution not owned by Nv, AMD, or Intel, or an open source solution that runs on OpenCL. Once we have a standard AMD/Nv can and will use, we will see advancement.
 
Problem is that in the last few years, whether under Ageia's ownership or Nv's, we have still not seen any progress. Why? That's easy: because it's a proprietary API only used by one hardware vendor. It has gained almost no traction in all this time and I find it unlikely that it ever will. So, we are stuck with effects physics until an open standard, or a proprietary one not owned by one of the vid card makers that the other GPU makers will also use, comes along. PhysX is free, so a shitload of devs use the CPU version of it, but we are still getting only a handful of games a year that use GPU-accelerated PhysX. And almost none of them are what I would call triple-A titles.

The next step is a proprietary GPU-accelerated physics solution not owned by Nv, AMD, or Intel, or an open source solution that runs on OpenCL. Once we have a standard AMD/Nv can and will use, we will see advancement.

This is absolutely correct.
 
If I remember correctly one GPU company was pushing an open source physics engine... hmmm who was it?
 
Both nVidia and ATi have worked on Bullet. It's open source, but will game devs start using it?
 
Problem is that in the last few years, whether under Ageia's ownership or Nv's, we have still not seen any progress. Why? That's easy, because it's a proprietary API only used by one hardware vendor. It has gained almost no traction in all this time and I find it unlikely that it ever will.

That's not the whole picture. It has indeed made progress, especially in terms of integration into existing tool chains and broad platform support. Apex, for example, is a big step toward allowing artists and designers to integrate advanced physics effects into their games.

http://www.youtube.com/watch?v=VB4rZws3jDU&feature=player_embedded

So we are stuck with effects physics until an open standard comes along, or a proprietary one not owned by one of the video card makers that the other GPU makers will also use. PhysX is free, so a shitload of devs use the CPU version of it, but we are still getting only a handful of games a year that use GPU-accelerated PhysX, and almost none of them are what I would call triple-A titles.

Agreed, there isn't going to be anything beyond effects physics as long as it's proprietary. However, the same can be said of DirectX - it's all just non-gameplay affecting effects so that on its own isn't a problem. Nvidia's strategy is to develop and promote the PhysX platform and establish mindshare with developers - both on consoles and on the PC. That's the only way they can get developers to pay attention to a proprietary GPU accelerated solution.

If I remember correctly one GPU company was pushing an open source physics engine... hmmm who was it?

Only open source physics engine I know of is Bullet. And that is written in CUDA with an OpenCL port supposedly on the way. AMD isn't pushing open source any more than Nvidia is, it just seems that way to some people because AMD has no solution of their own.
 

D3D is not controlled by Nvidia or AMD, so either is able to use it without worrying that the other has undue influence. Proprietary PhysX on proprietary CUDA, running only on Nvidia hardware, is not even remotely comparable to DX/D3D, if only because AMD will NEVER, EVER license GPU-accelerated PhysX while Nvidia has control of it. In short, it will always be an Nvidia-only thing, never universally embraced. As it stands now, if Nvidia wants a game to have GPU-accelerated PhysX effects, it appears they have to code it themselves, which seems to have been the case most of the time so far.

I guess it comes down to "show me": show me the games. Tech demos and 3DMark don't count. I can only remember one AAA title from last year that used GPU PhysX, Batman: AA. Cryostasis was not, in my opinion, a AAA title, but I could be convinced to give you that one, raising the total to two. This year so far we have one B-grade title in Dark Void, which Nvidia apparently coded the AA and GPU PhysX for. Show me the games and I will be convinced. Show me effects that cannot be 90% mimicked without GPU PhysX and I will believe. Until then it's going to be pretty hard to convince me.
 
The fact that I own 3x GTX 260s and can't use even one for PhysX because I tossed in two 5870s for primary rendering is BS.

No fanboy craptalk is going to change that, period.

I addressed this crap by buying another 5870 and making a firm decision not to buy Nvidia's next-gen cards no matter what the performance is.

Big business win there for Nvidia, huh? That's three top-end GPU sales they lost, $1400 they would have had otherwise. I'd be willing to bet I'm not the only one.
 
There is nothing in that game that could not have been done to 95% the same effect without GPU-accelerated physics. Well, it is not a very good game and I doubt I will play it any more than I have, so let me amend that: in the first 2 hours of play, I witnessed nothing visually that I had not seen in the last 4 years. It may have arrived at those effects in a way the CPU could not handle, but those effects were nothing I had not already seen without GPU-accelerated PhysX.

At this point I don't care who is to blame; I no longer care why it is this way. I am done buying games for more than $10, and my current rig will be my last gaming rig until these devs grab their goddamn nuts, get off their fat asses, and code games for the PC again. You don't leave out features that have been standard in PC games since 2002 like AA, you don't tie mundane effects to a proprietary API only half the video cards out there can use, you don't turn over coding of standard features to Nvidia, you don't let a video card company decide what your final product looks like, and you don't straight-port console games to PC and then cry when they don't sell well. Fuck you devs, fuck you pubs, fuck you pirates, fuck you retailers, and fuck you Nv/AMD.

Totally agree, and there is only one good game using this feature, BFD. BTW, PhysX can be run in that particular game using hacked drivers, with an ATi card as the main card and an Nvidia card as secondary.

Honestly, I personally feel like I'm being force-fed half-assed games. Even the AAA titles are affected by porting effects from titles developed primarily around consoles, be it graphics, mechanics, load times, optimisation, etc. If the PC versions were truly coded for the PC we would not be seeing a lot of these issues. But what do I know? I'm just a customer, or at least I was.
 