Dark Void PhysX Gameplay Performance and IQ @ [H]

D3D is not controlled by Nv or AMD, thus either is able to use it without needing to worry that the other has undue influence.

Of course, but the similarity is that both D3D and PhysX are just eye-candy (in response to your comment about being "stuck" with effects physics).

I'd be willing to bet I'm not the only one.

No, you're not. But I suspect the number of "enthusiasts" who take up a cause against Nvidia over this is smaller than the number who will just see PhysX as some extra thing only Nvidia has. The fact that people are pissed off goes to show that they feel like they're missing out on something; if they didn't care, they wouldn't be mad in the first place. Sucks for people who prefer AMD hardware, but that's the whole point of product differentiation, isn't it?
 
Too easy. Nvidia doesn't have a monopoly, and PhysX isn't an essential commodity, so you have no basis for running around acting like you're entitled to something. Want PhysX? Go Nvidia. Don't want PhysX? No problem. That's how it works for every single product. Calling anyone who understands simple business strategy a troll or apologist isn't a strong basis for an argument. Also, if you really believe CryEngine does fluid simulations, then you're very much out of your depth on this topic and should do some more research before throwing around insults.
And yet again you proved my point about being an apologist. Rather than expect something better or say, "you know, there has been no progress in this arena, and these games still suck with this proprietary crap," you're replying with excuses for why NVIDIA's PhysX is still great. You don't understand simple business strategy; you're just spineless. I know it's over something silly like a video game physics engine, but really, if you can't have an opinion about something so trivial in the anonymity of the internet, you must be a stepping stone for everyone else in real life. Sad.

And CryEngine does fluid simulations (look at the water algorithms). Do better research next time before you troll.
Problem is that in the last few years, whether under Ageia's ownership or Nv's, we have still not seen any progress. Why? That's easy: it's a proprietary API used by only one hardware vendor. It has gained almost no traction in all this time, and I find it unlikely that it ever will. So we are stuck with effects physics until an open standard comes along, or a proprietary one not owned by one of the video card makers that the other GPU makers will also use. PhysX is free, so a shitload of devs use the CPU version of it, but we are still getting only a handful of games a year that use GPU-accelerated PhysX, and almost none of them are what I would call triple-A titles.

The next step is a proprietary GPU-accelerated physics solution not owned by Nv, AMD, or Intel, or an open-source solution that runs on OpenCL. Once we have a standard AMD and Nv can and will use, we will see advancement.
Exactly. Competition in an open platform produces results. Unfortunately, until we see this happen I don't think there will be much progress.
The fact that I own 3x GTX 260s and can't use even one for PhysX, because I tossed in two 5870s for primary rendering, is BS.

No fanboy craptalk is going to change that, period.

I addressed this crap by buying another 5870 and making a firm decision not to buy nvidia's next-gen cards no matter what the performance is.

Big business win there for nvidia, huh. That's three top-end GPU sales they lost, $1400 they would have had otherwise. I'd be willing to bet I'm not the only one.
More or less my opinion as well, extended to games too (like how I didn't buy Batman: AA until it was $20).
 
And yet again you proved my point about being an apologist. Rather than expect something better or say, "you know, there has been no progress in this arena, and these games still suck with this proprietary crap," you're replying with excuses for why NVIDIA's PhysX is still great. You don't understand simple business strategy; you're just spineless. I know it's over something silly like a video game physics engine, but really, if you can't have an opinion about something so trivial in the anonymity of the internet, you must be a stepping stone for everyone else in real life. Sad.

Lol, cry me a river. Yet again you have no point; hiding behind immature insults seems to be your only angle. Good luck with your campaign to free us from PhysX's tyranny. The rest of us normal people will just play the games we like and avoid those we don't.

And CryEngine does fluid simulations (look at the water algorithms). Do better research next time before you troll.

Nope, they do waveform simulations. Again, you're out of your depth. Guess you didn't take my earlier advice about reading before posting.
 
And yet again you proved my point about being an apologist. Rather than expect something better or say, "you know, there has been no progress in this arena, and these games still suck with this proprietary crap," you're replying with excuses for why NVIDIA's PhysX is still great. You don't understand simple business strategy; you're just spineless. I know it's over something silly like a video game physics engine, but really, if you can't have an opinion about something so trivial in the anonymity of the internet, you must be a stepping stone for everyone else in real life. Sad.

And CryEngine does fluid simulations (look at the water algorithms). Do better research next time before you troll.

Exactly. Competition in an open platform produces results. Unfortunately, until we see this happen I don't think there will be much progress.
More or less my opinion as well, extended to games too (like how I didn't buy Batman: AA until it was $20).

Quite frankly, every post you make is juvenile. First of all, describe for me what you mean by "open platform." Next, describe for me exactly all these advanced physics effects you are getting generated on the CPU, and how that is somehow better than those produced on a GPU. Third, explain to me how competition on open anything produces results. Last I checked, closed development still far exceeds open development in R&D and widespread use. Many people try to tout anything "open" as better, but the market says otherwise.
 
Lol, I swear some people are such drama queens, getting all pissy about details as if this shit even really matters.
 
Lol, cry me a river. Yet again you have no point; hiding behind immature insults seems to be your only angle. Good luck with your campaign to free us from PhysX's tyranny. The rest of us normal people will just play the games we like and avoid those we don't.
Hmmm... looks like I hit too close to home :). Again, it's just a physics engine, no need to get defensive.
Nope, they do waveform simulations. Again, you're out of your depth. Guess you didn't take my earlier advice about reading before posting.
Yeah, if I was ever this wrong, I'd start arguing semantics too. I don't label myself as a game development expert or anything beyond an amateur enthusiast, but give me a break.
Quite frankly, every post you make is juvenile. First of all, describe for me what you mean by "open platform." Next, describe for me exactly all these advanced physics effects you are getting generated on the CPU, and how that is somehow better than those produced on a GPU. Third, explain to me how competition on open anything produces results. Last I checked, closed development still far exceeds open development in R&D and widespread use. Many people try to tout anything "open" as better, but the market says otherwise.
Let me guess: it's juvenile because I point out how ludicrous your posts and arguments are in a few sentences. I'd be embarrassed too. Let me continue by saying it's a physics engine, no need to get your panties in a knot over some PhysX jihad.

Having an open platform opens a market: the more people that can use your product, the more worthwhile an investment it is. If you have a physics engine that runs on all graphics hardware (AMD, Intel, NVIDIA), developers are more likely to make use of it, because any person who buys their product (a game, for instance) will be able to see the benefits. As it stands, PhysX on the GPU still sucks because no developer is going to waste their time on PhysX for the small minority of customers who will actually be able to use it. You have two or three GPU PhysX releases a year, usually they suck (Batman: AA was an exception), and the physics shown in the game is some gimmicky, unimpressive crap.

Look at Dark Void - what do you have that's interesting there? Some dynamic smoke and some particle effects when you blow some guy away? Too bad they still look like shit and do nothing for the gameplay (which really needs improvement). On top of that, performance is shit. There is nothing they have done well or even right with this technology. They could have had the same effects done with textures and actually worked on making a decent game.

As I said, the reason this happens is because no developer is going to base a game on something a very small minority can use. So PhysX is going to be gimmicky, unimpressive effects until an open solution snuffs it out. Physics needs development, I'm not knocking that, but there are much better ways to do it. For example, check out the Infernal Engine used in the Ghostbusters video game - awesome, game-changing physics effects that everyone playing the game could see and use. Furthermore, the game ran very well with them cranked, unlike PhysX. CPUs can do the same physics effects as GPUs; it just has to be coded and developed properly for them. Most gamers have at least a dual core now, and many have a quad core. Now, with Intel pushing hyperthreading on their new CPUs, that can be pushed to four or even eight logical cores. Still we have games that are at most dual-core optimized and many times just run on a single core; there's a lot of CPU power going to waste that could easily be tapped.
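The untapped-cores point above can be sketched in a few lines. This is a hypothetical toy illustration (not code from any game, engine, or PhysX itself) of how effects physics could be chunked across CPU workers:

```python
# Toy sketch: splitting effects physics (a particle burst) across CPU
# worker threads, one chunk per core. Hypothetical illustration only.
from concurrent.futures import ThreadPoolExecutor

GRAVITY = -9.81
DT = 1.0 / 60.0  # one frame at 60 fps

def step_chunk(chunk):
    """Advance one chunk of (x, y, vx, vy) particles by a frame (Euler)."""
    out = []
    for x, y, vx, vy in chunk:
        vy += GRAVITY * DT
        out.append((x + vx * DT, y + vy * DT, vx, vy))
    return out

def step_particles(particles, workers=4):
    """Split the particle list into one chunk per worker, then recombine."""
    size = max(1, len(particles) // workers)
    chunks = [particles[i:i + size] for i in range(0, len(particles), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return [p for chunk in pool.map(step_chunk, chunks) for p in chunk]

debris = [(0.0, 2.0, 1.0, 3.0)] * 10_000  # a burst of identical particles
debris = step_particles(debris)
print(len(debris))                        # all 10,000 survive the frame
```

In pure Python the GIL keeps threads from actually speeding up a loop like this; a real engine would do the same chunk-per-core split with native threads or SIMD. The structure, not the language, is the point.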

It's interesting to see NVIDIA keep digging itself into a hole with PhysX: I think disabling PhysX on systems that don't use an NVIDIA graphics card for rendering is probably what will kill it. Personally, I see the decisions being made as too arrogant for their own good. Whether the market will change and some other company will come along and rip the rug out from under them remains to be seen.
 
Yeah, if I was ever this wrong, I'd start arguing semantics too. I don't label myself as a game development expert or anything beyond an amateur enthusiast, but give me a break.

Well that much was clear from the beginning :)

It's interesting to see NVIDIA keep digging itself into a hole with PhysX: I think disabling PhysX on systems that don't use an NVIDIA graphics card for rendering is probably what will kill it. Personally, I see the decisions being made as too arrogant for their own good. Whether the market will change and some other company will come along and rip the rug out from under them remains to be seen.

Yes, which is precisely why incessantly railing against it is a waste of time and energy. They will keep doing what they're doing as long as there's no competition to keep them honest. The onus isn't on them to play nice. They'll keep riding the PhysX train as long as they can, and if/when some third party decides to show up with an alternative, they'll be there with full OpenCL support as well. From their perspective there's no downside.
 
Well that much was clear from the beginning :)
So far you've shown nothing different (actually worse), so you're in no place to judge.
Yes, which is precisely why incessantly railing against it is a waste of time and energy. They will keep doing what they're doing as long as there's no competition to keep them honest. The onus isn't on them to play nice. They'll keep riding the PhysX train as long as they can, and if/when some third party decides to show up with an alternative, they'll be there with full OpenCL support as well. From their perspective there's no downside.
Of course there is; what kind of market do you think this is? PhysX is only worth anything to them if they can use it to sell graphics cards. Personally, I couldn't care less what NVIDIA does: I don't work for them, don't know anyone who works for them, hell I don't even own any of their stock. However, as an enthusiast and a gamer, I see their tactics as inhibiting the growth and development of the industry, which directly affects my hobby of gaming. The more people are aware of the situation and what's wrong with it (as in average Joe says "hey, I don't need to spend extra money on xxxxx NVIDIA card, PhysX really is a gimmick"), the quicker this bullshit dies and we can move on to an open solution and make some progress.
 
So far you've shown nothing different (actually worse), so you're in no place to judge.

Sure, when it comes to the depth of my graphics knowledge I'll take your word for it.

PhysX is only worth anything to them if they can use it to sell graphics cards.

Yep, that's the endgame.

However, as an enthusiast and a gamer, I see their tactics as inhibiting the growth and development of the industry, which directly affects my hobby of gaming. The more people are aware of the situation and what's wrong with it (as in average Joe says "hey, I don't need to spend extra money on xxxxx NVIDIA card, PhysX really is a gimmick"), the quicker this bullshit dies and we can move on to an open solution and make some progress.

That's a unique perspective. How exactly is PhysX preventing open solutions from coming to market today? I don't know of any open source projects that are being suppressed by PhysX's presence. Btw, have you seen AMD's latest comments on the state of their OpenCL support? That should give you an idea of how close we are to an open solution.

“If we’re not getting the requests from our users then we’re not putting in engineering resources until the community says that they need them,” he said. Which, to be fair, sounds sensible enough.

But Hook also argued that there really was “No real point of having OpenCL in your standard driver until there’s a large volume of applications that are widely available.”

http://www.techeye.net/software/amd-defends-itself-from-nvidia-opencl-attack
 
Great article!

It would be more relevant to me if it included information about using an ATI HD5850 for video and an Nvidia 9600GT for PhysX (via hack).

I'm loving my setup like this under Windows 7 Ultimate x64 on a 4GHz Q9550 for Unreal Engine games. I'm considering moving from my Dell 2707WFP to a 3-monitor Eyefinity setup (like the NEC e-IPS 23" panels).

Anyways... is it off-limits to include this hardware config in testing?
 
Let me guess: it's juvenile because I point out how ludicrous your posts and arguments are in a few sentences. I'd be embarrassed too. Let me continue by saying it's a physics engine, no need to get your panties in a knot over some PhysX jihad.
It's juvenile because all you ever do is attack people, and you have very little knowledge of the actual topic, the business, or development. You are grasping at straws, trying to make comments that sound reasonable but have no basis in how the world really works.

Having an open platform opens a market: the more people that can use your product, the more worthwhile an investment it is. If you have a physics engine that runs on all graphics hardware (AMD, Intel, NVIDIA), developers are more likely to make use of it, because any person who buys their product (a game, for instance) will be able to see the benefits. As it stands, PhysX on the GPU still sucks because no developer is going to waste their time on PhysX for the small minority of customers who will actually be able to use it. You have two or three GPU PhysX releases a year, usually they suck (Batman: AA was an exception), and the physics shown in the game is some gimmicky, unimpressive crap.

Not entirely true; there is still a lot of physics being done on the GPU. You are arguing PhysX and then trying to lump all physics into it. That is entirely untrue. You can't mix and match to make a point. Also understand that an open platform standard forces everyone to use the same algorithms, which may be faulty, less efficient, or not as good as others. Having more choices is better. A closed standard does not mean other people cannot develop for it; we have already shot down your assertions there. Anyone can code for and license PhysX, and Nvidia has been happy to help them. CUDA is also the best form of physics processing out there right now; the other options aren't quite as good yet.

Look at Dark Void - what do you have that's interesting there? Some dynamic smoke and some particle effects when you blow some guy away? Too bad they still look like shit and do nothing for the gameplay (which really needs improvement). On top of that, performance is shit. There is nothing they have done well or even right with this technology. They could have had the same effects done with textures and actually worked on making a decent game.

Again here you show your total lack of insight. You try to use an argument of opinion to base your statements on, which already means you are wrong. Also note that many more things can be done with PhysX and CUDA, but that would deter users who have other cards, like ATI. So they can only do some supplemental coding and cannot fully incorporate PhysX into their game. If PhysX were licensed by ATI and Intel, then developers would be free to do a lot more with it.

As I said, the reason this happens is because no developer is going to base a game on something a very small minority can use. So PhysX is going to be gimmicky, unimpressive effects until an open solution snuffs it out. Physics needs development, I'm not knocking that, but there are much better ways to do it. For example, check out the Infernal Engine used in the Ghostbusters video game - awesome, game-changing physics effects that everyone playing the game could see and use. Furthermore, the game ran very well with them cranked, unlike PhysX. CPUs can do the same physics effects as GPUs; it just has to be coded and developed properly for them. Most gamers have at least a dual core now, and many have a quad core. Now, with Intel pushing hyperthreading on their new CPUs, that can be pushed to four or even eight logical cores. Still we have games that are at most dual-core optimized and many times just run on a single core; there's a lot of CPU power going to waste that could easily be tapped.

Understand that pre-rendered effects and responses are not the same as on-the-fly dynamic physics. So to make an argument saying it is 'better' is pretty lame. Anyone who spends enough time pre-rendering and compiling effects to run in response to certain stimuli can make a game with decent effects. That is more a testament to the time the developer spent on the game than to the physics itself. Also understand that it is far more time-consuming and much harder to program multi-core games. If it were simply a matter of doing a few things differently, a lot more developers would do it. You make arguments that make no sense, saying "if only they did this it would be better!", except it is not better, which is why they are not doing that.

It's interesting to see NVIDIA keep digging itself into a hole with PhysX: I think disabling PhysX on systems that don't use an NVIDIA graphics card for rendering is probably what will kill it. Personally, I see the decisions being made as too arrogant for their own good. Whether the market will change and some other company will come along and rip the rug out from under them remains to be seen.

What hole? You do realize they are making money hand over fist with their CUDA development? This is why AMD/ATI are trying to develop a new GPU server product to compete in a market that Nvidia has already pretty much run away with, and with great success. So again, you make a comment that is entirely without any basis in reality whatsoever.
 
Oh goodie I love these!

It's juvenile because all you ever do is attack people, and you have very little knowledge of the actual topic, the business, or development. You are grasping at straws, trying to make comments that sound reasonable but have no basis in how the world really works.
You guys are both using personal attacks. Nobody's a saint here.



Not entirely true; there is still a lot of physics being done on the GPU. You are arguing PhysX and then trying to lump all physics into it.
What? His point was that we don't NEED PhysX to see a cup of water being spilled over a breaking building. Let's look at the last 3 games I've played.

STALKER CoP
-Impressive lighting
-Full dynamic physics for particles
-Wave rendering, water simulation

Playing Bioshock 2:
-Full fluid movement
-Flame simulation
-Physics, ragdoll

BFBC2
-Particle effects
-Explosions
-Breaking buildings

You can argue all you want that effects like the ones seen in PhysX aren't reproducible, but playing through those 3 games I've seen plenty of technical merit that did not need any help from PhysX to do its job properly. As seen in the game reviewed in the first post, PhysX takes a large toll on the GPU even on the flagship 295.

It's slow and inefficient.
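For what it's worth, the kind of effect being compared across these games (drifting smoke with drag and lift) comes down to a few lines of vendor-neutral math. A hypothetical sketch, not taken from any of the titles above:

```python
# Hypothetical sketch of a vendor-neutral "effects physics" particle:
# smoke that rises under buoyancy while drag bleeds off its velocity.
# Illustration only -- not code from any of the games discussed here.
import math

DRAG = 2.0       # per-second velocity damping
BUOYANCY = 0.5   # gentle upward lift on the smoke
DT = 1.0 / 60.0  # one frame at 60 fps

def step_smoke(y, vy, age):
    """Advance one smoke particle by a frame: damp velocity, add lift, move."""
    vy = vy * math.exp(-DRAG * DT) + BUOYANCY * DT
    return y + vy * DT, vy, age + DT

y, vy, age = 0.0, 1.0, 0.0
for _ in range(60):          # simulate one second of drift
    y, vy, age = step_smoke(y, vy, age)
print(round(age, 2))         # the puff has aged one second
```

Nothing in an update step like this cares what GPU is installed; it runs identically on any hardware, which is the poster's point.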


That is entirely untrue. You can't mix and match to make a point. Also understand that an open platform standard forces everyone to use the same algorithms, which may be faulty, less efficient, or not as good as others.
News at 11: Open Source definition changed to one entity.

OpenCL wouldn't be the only project out there, and how can you justify saying that it "might be faulty"? Did nVidia ship you a GTX Co-op edition Crystal Ball?

Also, I think something that is "efficient" and "not faulty" would run on the hardware it's designed for without needing $100 worth of additional purchases (inb4 that guy who said you can get an 8800GT for 5 cents and a hot dog).

Having more choices is better. A closed standard does not mean other people cannot develop for it; we have already shot down your assertions there.
I'd love some choice! Hmm, should I go for the nVidia GTX 671.9 hydro copper co op XTREME edition or the nVidia 710 GTX dual slot 8gb ram SPECIAL:$2,000 edition?

So hard to choose. Oh wait, I should get the 671.9, because I'll probably need another $200 just to play it The Way It's Meant To Be Paid.

A closed standard does not mean other people cannot develop for it; we have already shot down your assertions there.

Durr. Let's see how PhysX might affect someone trying to interact with a market!

"Hi we're Company A developing physics for about a year and a half do you mind if we come in"

"Sure but can you pay us money to use your physics"

"No lol"

"No you cannot come in Company A father nVidia will be mad"

Plus, nVidia could simply go "nope" and not include support for those physics in their drivers. You can argue that if it became big then they would "have" to, but how would open-source projects get support like that if it wasn't a universal standard? How would they enter the market with PhysX still getting devs' attention?

Anyone can code for and license PhysX, and Nvidia has been happy to help them. CUDA is also the best form of physics processing out there right now; the other options aren't quite as good yet.
This has absolutely nothing to do with CUDA. We're talking about PhysX.

Also, why would anyone work on PhysX aside from nVidia when nVidia acts the way it does? Disabling PhysX when an ATI card is present? These are clearly market tactics, and nVidia's happy-go-lucky "anybody can license this derp" attitude is clearly not the case when it favors the idea of blocking the cooperation of hardware.


Again here you show your total lack of insight. You try to use an argument of opinion to base your statements on, which already means you are wrong.
I think that the Nazis incinerating the Jews was a sad ordeal. This is an argument of opinion. Now, according to your theory, this means I'm automatically wrong.

Those Nazis sound better now that you've phrased it that way. Thanks.


Also note that many more things can be done with PhysX and CUDA
Maybe they can have a building made of water fall down. Oh man. Also, stop mentioning CUDA in a PhysX argument. It's not used for games.

If it were so great, then there would be applications that use it exclusively. I'm sure nVidia's wet dream is to have every corporation in the world running a PhysX OS using PhysX to play pong.

PhysX is not some magic thing. They're not limiting it.

but that would deter users who have other cards, like ATI.
That's the entire point of PhysX, as well as what they're doing in the market.

If they wanted people to buy an nVidia card for PhysX, they wouldn't have disabled it. It deters people from using ATI.


So they can only do some supplemental coding and cannot fully incorporate PhysX into their game. If PhysX were licensed by ATI and Intel, then developers would be free to do a lot more with it.
If anything, Dark Void should serve as a reminder that nVidia will have devs force as much PhysX into a game as humanly possible until it's barely playable on their highest-spec'd hardware on the ancient Unreal Engine.

Gee, I wonder why Intel and AMD don't want to license something that inefficient. I wonder why every company doesn't bend to nVidia's will and pay a corporation for the rights to develop something that they may or may not use.

If AMD licensed PhysX, all nVidia would have to do is let PhysX run absolutely god-awful on ATI GPUs, like they did with CPU utilization, and it would be another marketing ploy to make AMD's lineup look inferior to the space heater that is Fermi.

It would be basically open source, but with someone controlling the whole thing while profiting off of it. Sounds great until you think about it objectively for more than eight seconds.


Understand that pre-rendered effects and responses are not the same as on-the-fly dynamic physics. So to make an argument saying it is 'better' is pretty lame.
My opinions>your opinions. Also, you're implying that you can't do anything other than pre-rendered effects on non PhysX platforms. Which is not only lame, but entirely untrue.

Anyone who spends enough time pre-rendering and compiling effects to run in response to certain stimuli can make a game with decent effects. That is more a testament to the time the developer spent on the game than to the physics itself.
I don't see your point. Dynamic effects still respond to stimuli, even on the most basic level.


Also understand that it is far more time-consuming and much harder to program multi-core games. If it were simply a matter of doing a few things differently, a lot more developers would do it.
This guy seems legit. Are you a game developer? When is the Star Wars MMO coming out?

Good to know that there are [H]ardcore people that understand the intimacy between a game and multiple cores. I get that it is better to use 1 core at 50% now.

Also, your argument is flawed. Watch me do it.

If it was simply a matter of PhysX being anything other than a huge pile of shit, AMD and Intel would use it.

You make arguments that make no sense, saying "if only they did this it would be better!", except it is not better, which is why they are not doing that.
Utilizing the CPU is NOT better? You're right, I hope my i5's 3 other cores never see the light of day. I agree, that is objectively better. Also, nice personal attack at the beginning. Really makes the rest of the meal go down smooth.


What hole? You do realize they are making money hand over fist with their CUDA development?
CUDA CUDA CUDA CUDA CUDA CUDA CUDA CUDA CUDA CUDA. This topic is about PhysX.

This is why AMD/ATI are trying to develop a new GPU server product to compete in a market that Nvidia has already pretty much run away with, and with great success. So again, you make a comment that is entirely without any basis in reality whatsoever.
Logical Fallacies! My favorite.

AMD is losing in the GPGPU market therefore PhysX is a wonderful closed standard.
compare to
My grilled cheese sandwich is great therefore gay marriage should be banned in all states.


Man, you guys are so easy to pick apart. It's hard justifying something that runs like shit and limits the market, isn't it?

Stop trying to justify your purchases. :)

Oh, and I fully expect a response picking and choosing which arguments you want to contend with, and using opinions and personal attacks as justification all the while.

Don't disappoint me!
 
Hilarious. I'm sure they will not disappoint you. At this point those two are just posting whatever to try and make it seem like the thread ended with an outcome in Nv's favor. In other words, they want the last word. We should give it to them. Pretty sure they are unpaid shills anyway.
 
What I locked onto is the words the author used to describe things:

There is no doubt about it, PhysX makes a tremendous visual difference in Dark Void. The advanced particle effects enabled by PhysX make land combat an exciting and richly detailed experience. Between the shards of broken enemies, the dust clouds from bullet strikes, and the awesome torrent of particle streams from the Disintegrator cannon, combat in Dark Void is just plain awesome with NVIDIA’s PhysX technology running the show. Even Will’s jetpack is made more awesome with PhysX. That isn’t to say that the jetpack wasn’t cool without it, but with PhysX running on High, the stream of smoke coming out of the little jet engines is thick and dense and cool.

alanstein said:
What? His point was that we don't NEED PhysX to see a cup of water being spilled over a breaking building.

It's not that we all need it or desire it, or that it has to be the end-all-be-all that improves gameplay no-doubt-about-it; it's simply a choice for the GeForce brand family that can improve someone's experience, if they choose, and add some gaming-experience value.

So many times it has to be extremes, or what one individual thinks gets translated into "all" and "we."
 
It's not that we all need it or desire it, or that it has to be the end-all-be-all that improves gameplay no-doubt-about-it; it's simply a choice for the GeForce brand family that can improve someone's experience, if they choose, and add some gaming-experience value.

So many times it has to be extremes, or what one individual thinks gets translated into "all" and "we."

Oh christ. Did you even read what I was saying?

I meant that you don't need the PhysX brand in order to render the exact same thing.

Hiding behind clipped words only makes my argument stronger. Also-use-more-hyphens.
 
Simple points:

Yes, I read your post; it was difficult! It seems PhysX has to be the end-all-be-all, no-doubt-about-it feature to garner anything positive from some. Was there a single positive in your post about PhysX? Let me reread. No, just emotional, one-sided extremism. I guess that makes me the voice of reason somehow?

Simple premise:

Some tend to look at PhysX as a feature that may enhance certain titles if a gamer desires. That's all I am saying and was my point.


This is why you place it in the hands of a site with a reputation for evaluating and testing the experience. It's their findings, with some positives to share, like:

There is no doubt about it, PhysX makes a tremendous visual difference in Dark Void. The advanced particle effects enabled by PhysX make land combat an exciting and richly detailed experience. Between the shards of broken enemies, the dust clouds from bullet strikes, and the awesome torrent of particle streams from the Disintegrator cannon, combat in Dark Void is just plain awesome with NVIDIA’s PhysX technology running the show. Even Will’s jetpack is made more awesome with PhysX. That isn’t to say that the jetpack wasn’t cool without it, but with PhysX running on High, the stream of smoke coming out of the little jet engines is thick and dense and cool.

This speaks much louder than you - no matter how much emotion or extremism you desire to use.

Why do I like GPU physics? Because this feature has the possibility to raise the bar for immersion and realism. Also, through the chaos and division of proprietary solutions there is choice and innovation for some, which hopefully leads to more awareness and may help pave the way for open standards to be agreed upon, offered, and matured.
 
Only after all the fake, bullshit-looking wannabe physics like Bullet and Havok go first.

lol, PhysX is way more fake-looking. Every effect in PhysX-accelerated games is so exaggerated, just to make sure you know it's on, especially in Batman. PhysX is just as bad; the only thing it has going for it is the number of effects it can do with GPU acceleration.
 
lol, PhysX is way more fake-looking. Every effect in PhysX-accelerated games is so exaggerated, just to make sure you know it's on, especially in Batman. PhysX is just as bad; the only thing it has going for it is the number of effects it can do with GPU acceleration.

BC2 is fake; B:AA at least looks real. Havok and Bullet physics are about as fake as you can get.
 
Excellent post. I think [H]'s findings show just how much NVIDIA has gimped PhysX to run best on their hardware, except that their hardware can't handle it, which, to be honest, is pathetic. What a waste of a perfectly good physics engine.

What's pathetic is the attempt by all the fanATIcs on this site to downplay more eye-candy gameplay immersion.
 
Simple points:

Yes, I read your post; it was difficult! It seems PhysX has to be the end-all-be-all, no-doubt-about-it feature to garner anything positive from some. Was there a single positive in your post about PhysX? Let me reread. No, just emotional, one-sided extremism to me... the voice of reason, I guess, somehow?

Simple premise:

Some tend to look at PhysX as a feature that may enhance certain titles if a gamer desires. That's all I am saying and was my point.


This is why you place it in the hands of a site with a reputation for evaluating and testing the experience. Their findings include some positives to share, like:



This speaks much louder than you - no matter how much emotion or extremism you desire to use.

Why do I like GPU physics? Because this feature has the possibility to raise the bar for immersion and realism. Also, through the chaos and division of proprietary solutions there is choice and innovation for some, which hopefully leads to more awareness and may help pave the way for open standards to be agreed upon, offered, and matured.

I think we're still arguing the same side, however. I just like to use the word "shit" and you like-to-use-hyphens.

Of course, if there were a choice between PhysX and no physics at all, the obvious answer would be to let nVidia continue doing its thing. My argument, simply, is that it could be done... let's use the dangerous word "better"... under an open-source platform, or as a byproduct of people using physics native to whatever engine they are running.

Regardless of how extreme my points may be, you cannot simply dismiss them just because they're written in a way that would make it seem otherwise. To summarize my argument as "pish posh, cheerio, good show, my lad" and completely disregard what I was getting at is quite disheartening, considering I'm trying to prove a point. Yes, I write with the fervor of a thousand sparrows picking at my eyes, but I'd like you to look at what I'm actually saying, because I really don't think our viewpoints are far off at all.

While PhysX had its place in a few key titles, I think games like BFBC2 show the potential of something that's not PhysX doing something rather cool to buildings and objects, as well as lighting.

Also, I'd like to point out that i was responding to someone's post, not necessarily making a claim. I thought the post was rather ridiculous, and the thread had been sitting on it marinating for at least a day.
 
What's pathetic is the attempt by all the fanATIcs on this site to downplay more eye-candy gameplay immersion.

The argument is not against eye candy, but rather the market interaction revolving around said eye candy. Goodness gosh.
 
The argument is not against eye candy, but rather the market interaction revolving around said eye candy. Goodness gosh.

Coulda fooled me with the likes of Mr. K6, 60Wman, and a few others all saying PhysX is a gimmick, junk, adds nothing to the game. Hell, 60Wman is even repeating the bullshit claim, debunked time and again by the dev, about Nvidia locking ATI out of AA.
 
I think we're still arguing the same side, however. I just like to use the word "shit" and you like-to-use-hyphens.

And if you get really talented, you can even use the word shit with hyphens.


It's hard to have a conversation with someone with this view:

alanstein said:
I hope PhysX burns to the ground

http://www.hardforum.com/showpost.php?p=1035310349&postcount=50

I really like the prospect of GPU Physics or any vehicle that offers the ability so we're going to be like oil-and-water. It seems that ATI is going to introduce some work with Bullet at this up-and-coming GDC -- hopefully this will be promising.

Dave Hoff said:
We're continuing to make good progress with the OpenCL physics work with Erwin and his Bullet Physics team at Sony and the Bullet enhancements we're creating for them. We'll have more to show at the upcoming Game Developer's Conference including some other pieces that augment the Bullet ecosystem for Developers.

http://www.rage3d.com/reviews/video/ati_hd5570/index.php?p=8
 
And if you get really talented, you can even use the word shit with hyphens.


It's hard to have a conversation with someone with this view:



http://www.hardforum.com/showpost.php?p=1035310349&postcount=50

I really like the prospect of GPU Physics or any vehicle that offers the ability so we're going to be like oil-and-water. It seems that ATI is going to introduce some work with Bullet at this up-and-coming GDC -- hopefully this will be promising.



http://www.rage3d.com/reviews/video/ati_hd5570/index.php?p=8

It's funny they are working with Bullet, which is CUDA-based. Guess they really have no plans for an OpenCL version for physics.
 
Coulda fooled me with the likes of Mr. K6, 60Wman, and a few others all saying PhysX is a gimmick, junk, adds nothing to the game. Hell, 60Wman is even repeating the bullshit claim, debunked time and again by the dev, about Nvidia locking ATI out of AA.

I'm sorry, but where did I even say anything about AA in Batman in my post? Oh, what's that? PhysX does look fake? When you run past paper it's like you're fucking Pecos Bill or some shit, riding a tornado. Havok, although limited by the CPU, looks a lot more real to me.
 
I'm sorry, but where did I even say anything about AA in Batman in my post? Oh, what's that? PhysX does look fake? When you run past paper it's like you're fucking Pecos Bill or some shit, riding a tornado. Havok, although limited by the CPU, looks a lot more real to me.

Only in the mind of someone who likes preprogrammed, rail-based, nauseating physics. Real, my ass.
 
Coming from someone who doesn't know good game physics from bad and worse, I'll ignore you from here on out.
Depends on what you mean by "good" game physics. Realistic modeling of the world, or overdone flair?
 
Depends on what you mean by "good" game physics. Realistic modeling of the world, or overdone flair?

How about game physics where objects react the way they would if they were real when interacted with, as opposed to preprogrammed, nauseating, rail-induced crap.
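The distinction being argued over here, scripted "rail" effects versus simulation that actually responds to input, can be sketched with a toy example (illustrative Python, not any real engine's API; all names are hypothetical):

```python
# Toy contrast between the two approaches the thread is debating:
# a "rail" animation replays fixed keyframes no matter what the player
# does, while a simulated object integrates forces, so its outcome
# depends on how it was interacted with.

def rail_position(t, keyframes):
    """Scripted motion: look up the nearest keyframe; input is ignored."""
    index = min(int(t), len(keyframes) - 1)
    return keyframes[index]

def simulated_position(t, impulse, gravity=-9.8, dt=0.1):
    """Simulated motion: integrate velocity from an initial impulse."""
    pos, vel = 0.0, impulse
    for _ in range(int(t / dt)):
        vel += gravity * dt   # gravity changes velocity each step
        pos += vel * dt       # velocity changes position each step
    return pos

keyframes = [0.0, 1.0, 2.0, 3.0]
# The scripted object always ends up at the keyframed position:
print(rail_position(3, keyframes))
# The simulated object's final position depends on how hard it was hit:
weak = simulated_position(1.0, impulse=2.0)
strong = simulated_position(1.0, impulse=10.0)
print(strong > weak)
```

Real engines (PhysX, Havok, Bullet) all do the second kind of integration for rigid bodies; the thread's disagreement is really about how much of each game's eye candy is simulated versus canned.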
 
How about game physics where objects react the way they would if they were real when interacted with, as opposed to preprogrammed, nauseating, rail-induced crap.

lol, there is no such engine, not even PhysX. PhysX is preprogrammed, nauseating, rail-induced crap.
 
Nvidia PhysX is reactive: objects do and will behave based on what is done to them. Havok and Bullet are preprogrammed, rail-induced nausea.
 
Nvidia PhysX is reactive: objects do and will behave based on what is done to them. Havok and Bullet are preprogrammed, rail-induced nausea.

I don't think you could have possibly worded that any more objectively.

How much is nVidia paying you?
 
How about game physics where objects react the way they would if they were real when interacted with, as opposed to preprogrammed, nauseating, rail-induced crap.
Nvidia PhysX is reactive: objects do and will behave based on what is done to them. Havok and Bullet are preprogrammed, rail-induced nausea.
I was hoping you'd say something along those lines; thanks for showing everyone your true objective here. So how much is NVIDIA paying you?
 
I was hoping you'd say something along those lines; thanks for showing everyone your true objective here. So how much is NVIDIA paying you?

What objective would that be? Wanting physics that are more true to life than preprogrammed, rail-based, never-changing, always-the-same-no-matter-what-you-do physics? Then yeah, that's my objective.
 
What objective would that be? Wanting physics that are more true to life than preprogrammed, rail-based, never-changing, always-the-same-no-matter-what-you-do physics? Then yeah, that's my objective.

Now you're just blatantly trolling. If you're not, then there's no point in continuing this conversation.
 
Now you're just blatantly trolling. If you're not, then there's no point in continuing this conversation.

The only ones trolling this thread are those advocating for preprogrammed, non-reactive physics where, no matter what happens, it's always the same and never changes. Hmmmm, kinda like you, Mr. K6, and that other nobody.
 
It's a pretty safe bet that many may agree that things are not ideal right now. But things are starting to heat up; the GPU physics ball is rolling, at least, and it's in the discussion -- some of it good, some negative, some extreme and emotional; it's all good over-all to me. Sure, many may have different views, but hopefully we end up on the same path, where open standards may be agreed upon so things can move quicker for developers and the consumer from an entire-industry point-of-view.

Politics aside, that is why I enjoyed HardOCP's evaluation: it was an evaluation of a specific gaming title on this lightning rod that is PhysX, with most of the politics left outside the door so-to-speak. How did the effects enhance the title? That's what is most important to me. That's all!
 