Too Bad PhysX Lowers FPS in Games...is it worth these numerous trade-offs?

You need Physx to play Batman: confirm/deny?

Confirmed.
Even the console versions use the PhysX middleware...but on the PC (with an NVIDIA GPU) they added way more physics for better IQ/immersion.
 
Confirmed.
Even the console versions use the PhysX middleware...but on the PC (with an NVIDIA GPU) they added way more physics for better IQ/immersion.
Can you play Batman: AA without an nVidia card or a PhysX hardware solution?

Yes.

Thanks.
 
I rest my case. Keep drinking the Kool-Aid.

ps. Consoles suck and immersion is for fat mmo nerds.
 
And I've shown you proof that the performance you get with CPU physics performing comparable effects is often unacceptable. PhysX can be run on CPUs. So far the performance in even the tech demos is pretty awful. That's not even adding in a game engine, audio effects, player input, etc.

Ah, but therein lies the problem. You can't compare GPU PhysX to CPU PhysX because Nvidia wants GPU PhysX to succeed, *not* CPU PhysX. Thus, they have no motivation to make CPU PhysX anything other than "good enough" for devs to use it.

You base this on what? Games are mostly GPU bound rather than CPU bound. Given the massive difference in performance between CPU and GPUs in terms of running games, your math doesn't remotely check out. Again you base this on a very vague notion and understanding of how game engines work.

As you said, games are mostly GPU bound. Thus, there isn't a technical reason why the devs couldn't have exposed the physics quality levels for those without GPU PhysX. Those of us with OC'd i7s would certainly be able to handle at least some of the extra effects.

Given that the game recommends a dedicated Geforce GTX 260 for PhysX processing, I'm inclined to believe that the PhysX effects are quite demanding. I also base this statement on the performance differences between GPU and CPUs in regard to PhysX processing.

I don't know where you see that the game recommends a dedicated GTX 260 (couldn't find any reference searching google, anyway), but that would be because Nvidia wants people to buy expensive cards, not cheap ones.

A dedicated 9600GSO performs the same as a dedicated 9800GT - a dedicated GTX 260 is definitely *NOT* needed.
http://www.elitebastards.com/index....rticle&id=842&catid=14&Itemid=27&limitstart=3

The lack of CPU usage in Batman Arkham Asylum is NOT necessarily due to PhysX implementation. The game, like most others, is far more GPU bound than CPU bound.

However, turn up the PhysX effects without an nvidia GPU and FPS drops while CPU usage doesn't spike, THAT is due to horrid CPU scaling in PhysX.

I don't know what could be done on the CPU without taking a massive performance hit and frankly neither do you. All we can do is speculate. You seem to believe more can be done in this regard than I do. At least in regard to Batman Arkham Asylum. I say the performance hit for some of it would be too massive. If you are only getting 30FPS now then I think almost any performance hit would be unacceptable. You can disagree all you like and this is really a matter of opinion. Neither one of us can pretend to present facts here. I showed data that I believe supports my opinion and you did the same.

We certainly know what CAN be done - just look at what other physics engines and games can do. You call it apples and oranges and say they aren't comparable, but they most certainly are comparable. If engine A can do what engine B can't, it's because of problems with engine B - not the hardware.
 
The baseline for HIGH PhysX in Batman is a 9800GTX, as per the options menu. The GTX260 was for running the game with one card for graphics and LOW physics (or did they call it normal, I can't remember).
 
ps. Consoles suck and immersion is for fat mmo nerds.

yeah i guess immersion must be for fat mmo nerds who want better graphics, sound, physics, a.i., realism, etc. since all those factors improve the immersion in a game. so we should all just stick with 2d games (since the shift to 3d obviously boosts the immersion factor) so none of us will be labeled as a fat mmo nerd.

p.s. - being somewhat of a casual mmo player myself, i take slight offense to being stereotyped as a fat nerd, of which i am neither.

p.p.s. - and consoles don't suck for me if i can't get my tekken 6, uncharted 2, brutal legend, and ninja gaiden 2 sigma fix on the pc.
 
I'm running a "single" GTX260 and found that Batman is very playable with PhysX on High.
The card is overclocked by approximately 15%.
These are my figures at 1080p with an E8400 @ 4.1GHz:

AA Min Max Avg
0xAA 34, 76, 56
16xAA 27, 64, 47
(16QAA: not enough video memory, chugs)

It's possible that you need a fast CPU to really push the GTX260, but I haven't verified whether that matters yet.
As mine is at 4.1GHz, this could be the reason why I'm getting decent results.

Plugging in an 8800GT (a mildly overclocked version running at stock speed) for PhysX nets a 35% min framerate increase at 0xAA and a 52% min framerate increase at 16xAA.
An 8800GT really will help if you have speed issues.

So this demonstrates that at 1080p, a single GTX260 can give a very playable experience with PhysX on high, no need for an extra PhysX card.
But if you have a slower CPU and framerates are poor, using a second card for PhysX could help tremendously.

edit:
Just to clarify, in my tests the CPU isn't doing the High PhysX effects, the GPU is.
My post is not about using the CPU for PhysX, it's about using the GPU for PhysX and how GPU PhysX may be helped by a faster CPU :)

sod it I'm gonna test it now...
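
For anyone wondering how the Min/Max/Avg figures above are produced: they come from a frame-time capture rather than an on-screen FPS counter. A minimal sketch of the maths, assuming a simple capture file (frametimes.csv with one frame timestamp in milliseconds per line - the file name and layout here are placeholders, not any particular tool's exact output):

Code:
// Minimal sketch: derive Min/Max/Avg FPS from a frame-time capture.
// Assumes a file named frametimes.csv with one frame timestamp (in
// milliseconds) per line - the name and layout are placeholders.
#include <algorithm>
#include <fstream>
#include <iostream>
#include <vector>

int main() {
    std::ifstream log("frametimes.csv");
    std::vector<double> stamps;
    double t;
    while (log >> t) stamps.push_back(t);
    if (stamps.size() < 2) { std::cerr << "not enough samples\n"; return 1; }

    double slowest = 0.0, fastest = 1e9;            // frame times in ms
    for (std::size_t i = 1; i < stamps.size(); ++i) {
        double dt = stamps[i] - stamps[i - 1];
        slowest = std::max(slowest, dt);
        fastest = std::min(fastest, dt);
    }
    double seconds = (stamps.back() - stamps.front()) / 1000.0;
    std::cout << "Min FPS: " << 1000.0 / slowest << "\n"
              << "Max FPS: " << 1000.0 / fastest << "\n"
              << "Avg FPS: " << (stamps.size() - 1) / seconds << "\n";
}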
 
no one said it is necessary to have a hardware physx capable gpu to enjoy batman. pretty much anyone with a decent pc setup or a console obviously can enjoy the game. having a hardware physx capable gpu just allows for added eyecandy as an option, just like having a dx10 gpu and vista allowed for added eyecandy in some games as a bonus.

as far as cpu scaling, well none of the primarily used physics engines seem to do well in that department, so i guess just wait for things to possibly improve in that area or until maybe something better comes along to rectify that problem.

all i know is regardless of the physics engines being used, not very much has changed for quite some time in terms of general cpu based physics for whatever reasons. seems to me to be somewhat of a dead end given the slow growth and adoption rate of multicore cpus, where a whole lot more cores would be needed for more efficient parallel processing for physics, especially with something like fluid simulations. i guess amd seems to realize this as well as nvidia - hence their push for more advanced gpu physics instead of just relegating physics to their own cpus. this will only become even more apparent as more advanced physics processing that will improve gameplay beyond just mere eyecandy eventually makes its way into games in the future.
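
to put a rough picture on the "parallel processing for physics" point: the work that scales with core count (or maps onto a gpu) is data-parallel stuff like updating thousands of independent particles. a minimal sketch, purely illustrative and not taken from physx or any real engine (all names here are made up for the example):

Code:
// Illustrative sketch only - not taken from PhysX or any shipping engine.
// Splits a simple particle update across CPU threads to show the kind of
// data-parallel work that scales with core count. All names are made up.
#include <algorithm>
#include <functional>
#include <thread>
#include <vector>

struct Particle { float px, py, pz, vx, vy, vz; };

// Integrate one slice of the particle array for a single timestep.
void integrate(std::vector<Particle>& p, std::size_t begin, std::size_t end, float dt) {
    for (std::size_t i = begin; i < end; ++i) {
        p[i].vy -= 9.81f * dt;       // gravity
        p[i].px += p[i].vx * dt;
        p[i].py += p[i].vy * dt;
        p[i].pz += p[i].vz * dt;
    }
}

// Hand each core an equal slice of the array and wait for all of them.
void step(std::vector<Particle>& particles, float dt, unsigned cores) {
    if (cores == 0) cores = 1;
    std::vector<std::thread> workers;
    std::size_t chunk = particles.size() / cores + 1;
    for (unsigned c = 0; c < cores; ++c) {
        std::size_t begin = c * chunk;
        std::size_t end   = std::min(particles.size(), begin + chunk);
        if (begin >= end) break;
        workers.emplace_back(integrate, std::ref(particles), begin, end, dt);
    }
    for (auto& w : workers) w.join();
}

a gpu runs that same per-particle loop across hundreds of threads at once, which is why effects like fluids and debris scale so much better there than on a dual-core cpu.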
 
CPU scaling results are in!
GTX260 clocked (720/1475, 1208), no PhysX card, PhysX set to highest.

CPU @ 4.1GHz
AA Min, Max, Average
0xAA 30, 79, 56
16xAA 26, 65, 47

CPU @ 3.0GHz
AA Min, Max, Average
0xAA 30, 82, 57
16xAA 27, 65, 47

CPU @ 2.5GHz
AA Min, Max, Average
0xAA 29, 84, 55
16xAA 26, 65, 47

CPU @ 2.0GHz
AA Min, Max, Average
0xAA 24, 77, 50
16xAA 11, 73, 45


A 3GHz E8400 gives the same performance as 4.1GHz, pretty good!
At 2.5GHz it's too slow to use any AA but is playable.
2.0GHz stutters a bit without AA.

So a large range of older PCs can get playable framerates with a single GTX260, PhysX on max at 1080p.
NVidia bent the truth a bit ;)
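
One way to read those minimum numbers is to convert FPS into frame time. A quick sketch of the conversion (nothing game-specific, just the arithmetic):

Code:
// Convert an FPS reading into the time one frame takes, in milliseconds.
#include <initializer_list>
#include <iostream>

double frameTimeMs(double fps) { return 1000.0 / fps; }

int main() {
    // Minimum FPS values from the 2.0GHz run above.
    for (double fps : {24.0, 11.0})
        std::cout << fps << " FPS = " << frameTimeMs(fps) << " ms per frame\n";
}

A 24 FPS minimum means the worst frames take around 42ms, and 11 FPS means roughly 91ms, which is why the 2.0GHz run reads as a stutter even though the averages hold up reasonably well.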
 
Those 16xAA numbers don't look right at all. No game takes such a low FPS hit at that level of AA, the UE3 engine especially.

Also are you running with or without ambient occlusion?
 
You need Physx to play Batman: confirm/deny?

PhysX support is built into the game engine. It simply uses it. It runs both CPU/Software PhysX and has support for optional hardware accelerated PhysX for additional effects.
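
For the curious, the "software with optional hardware acceleration" split is visible right at the API level. Here's a rough sketch based on the PhysX 2.x SDK as I remember it - the identifiers may not be exact, so treat it as pseudocode rather than copy-paste material:

Code:
// Rough sketch of picking hardware vs. software PhysX for a scene.
// Based on the PhysX 2.x SDK as I remember it - identifiers may not be
// exact, so treat this as pseudocode rather than copy-paste material.
#include "NxPhysics.h"

NxScene* createPhysicsScene(NxPhysicsSDK* sdk, bool wantHardware) {
    NxSceneDesc sceneDesc;
    sceneDesc.gravity = NxVec3(0.0f, -9.81f, 0.0f);

    // Ask for a hardware (GPU/PPU) scene only if the user wants it and the
    // SDK reports an accelerator; otherwise stay on the CPU solver.
    bool hwAvailable = (sdk->getHWVersion() != NX_HW_VERSION_NONE);
    sceneDesc.simType = (wantHardware && hwAvailable) ? NX_SIMULATION_HW
                                                      : NX_SIMULATION_SW;

    NxScene* scene = sdk->createScene(sceneDesc);
    if (!scene && sceneDesc.simType == NX_SIMULATION_HW) {
        // Hardware scene creation can still fail; fall back to software.
        sceneDesc.simType = NX_SIMULATION_SW;
        scene = sdk->createScene(sceneDesc);
    }
    return scene;
}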

I honestly don't think that we're going to see a single mainstream implementation of a physics API any time soon (unless someone like Microsoft does step in and finally helps to make a decision on it).

The thing I fear the greatest (and sadly that which I think is most likely to come to fruition), is that we'll actually see three competing types: PhysX with nVidia's support, Havok with Intel's support, and then Bullet (or whatever AMD is pushing at that point) with AMD's support.

It's funny, people complain now about the nightmare that is nVidia controlling PhysX and trying to push that as the top standard (and I'm in this camp, as I don't want any one company to have complete control over whatever the top physics API ends up being). But just imagine what type of cross-linking Intel can do in regards to pushing Havok as the advantageous physics API to use (combining an Intel GPU, with an Intel-supported physics API, on an Intel processor). A lot of people here (seemingly mostly nVidia fans) scoff at the idea of Larrabee competing against the top GPUs from nVidia or AMD. While I agree it's still a bit of a ways off, I think a lot of people underestimate Intel's determination or ability to compete, lol.

I completely agree.

Confirmed.
Even the console versions use the PhysX middleware...but on the PC (with an NVIDIA GPU) they added way more physics for better IQ/immersion.

Correct.

Can you play Batman: AA without an nVidia card or a PhysX hardware solution?

Yes.

Thanks.

Correct.

Ah, but therein lies the problem. You can't compare GPU PhysX to CPU PhysX because Nvidia wants GPU PhysX to succeed, *not* CPU PhysX. Thus, they have no motivation to make CPU PhysX anything other than "good enough" for devs to use it.

This is simply a conspiracy-theory type of argument. NVIDIA owns the PhysX API/physics engine, which they can potentially make money off of in the console market and even in PC games. They do indeed have motivation to make CPU PhysX processing succeed, given the situation with consoles.

So until you provide some proof that this "theory" is true, I'm going to have to disagree.

As you said, games are mostly GPU bound. Thus, there isn't a technical reason why the devs couldn't have exposed the physics quality levels for those without GPU PhysX. Those of us with OC'd i7s would certainly be able to handle at least some of the extra effects.

How many times do I have to say it? The difference between GPUs and CPUs in regard to physics processing is HUGE. Yes, there is no doubt in my mind that some increased level of physics effects could have been handled by the CPU. However, the devs made certain choices, and their reasoning isn't necessarily something we would agree with, or even understand. They may simply be (as dev studios often do) pandering to the lowest common denominator. I think they really opted for an easier route with Batman Arkham Asylum. Rather than taking advantage of extra CPU cores (many people don't have anything beyond a dual core at this point), they opted to use standard PhysX for the regular game physics and then leverage GPU-based PhysX for added effects only for those people with NVIDIA cards. I suspect this was simply an easier way to go than doing things the way you wanted them done. Also, don't lay the blame on NVIDIA for this one. They didn't code the game. Rocksteady Studios did.

I get it. The PhysX effects are cool and it would be nice if some of the wasted CPU cycles could have been used to give non-NVIDIA card owners at least a taste of some of these effects.

I don't know where you see that the game recommends a dedicated GTX 260 (couldn't find any reference searching google, anyway), but that would be because Nvidia wants people to buy expensive cards, not cheap ones.

You are correct. A dedicated 9800GTX+ is recommended for "High" PhysX settings.

[Attached screenshot: Youarewrong.gif]

A dedicated 9600GSO performs the same as a dedicated 9800GT - a dedicated GTX 260 is definitely *NOT* needed.
http://www.elitebastards.com/index....rticle&id=842&catid=14&Itemid=27&limitstart=3

Look at my above screenshot. At 2560x1600 I doubt a 9600GSO would get the job done. Though to be fair, I haven't tested that. I don't have one lying around. I can say that at 1680x1050 the game runs great with PhysX effects on medium with a single 9800GX2.

However, turn up the PhysX effects without an nvidia GPU and FPS drops while CPU usage doesn't spike, THAT is due to horrid CPU scaling in PhysX.

This may be true. The CPU scaling may suck, but unless you are an experienced game developer with PhysX programming/implementation experience, I'm going to venture a guess that you don't know enough about it to say definitively that there is something that can be done about it. Your argument is theoretical at best. (So is mine, we just have differing viewpoints.)

We certainly know what CAN be done - just look at what other physics engines and games can do. You call it apples and oranges and say they aren't comparable, but they most certainly are comparable. If engine A can do what engine B can't, it's because of problems with engine B - not the hardware.

Bullshit. Things aren't that simple. Engines can't be compared that way. Sure you can compare the physics effects themselves, but you can't compare the rest of the game engines the same way and you should damn well know that. Running Unreal Tournament 2003 isn't directly comparable to running Rainbow Six games or anything else. Even when you talk about games that use the same engines, textures, physical models, and other factors change things. You are comparing celery to grapefruits.

The baseline for HIGH PhysX in Batman is a 9800GTX, as per the options menu. The GTX260 was for running the game with one card for graphics and LOW physics (or did they call it normal, I can't remember).

True.
 
Those 16xAA numbers don't look right at all. No game takes such a low FPS hit at that level of AA, the UE3 engine especially.

Also are you running with or without ambient occlusion?

Those are the results :)

Oddly I haven't got the ambient occlusion entry in the NVidia driver, so it's probably off.
 
You still need the PhysX middleware installed or it's a no go ;)
I don't care whether the PhysX *software* middleware is installed on my system or not, in order to run a game. At that point, it's no different than any other physics software middleware implementation.

For all of the games that utilize Havok, you have to let it install as part of the game installation.

For all of the games that use Bullet, you have to let it install as part of the game installation.

And if a game utilizes PhysX, I'll let it install, just as I would any of the other competing implementations.

However, the entire essence of my comment, which you overlooked, is that you do *not* need an nVidia GPU in order to run Batman: AA.

You also do *not* need to be running a hardware implementation of PhysX (and thus *having* to use an nVidia GPU for PhysX) in order to play Batman: AA.

Will the game have less eye candy? Sure. And no one is disputing that. However, Batman: AA is still very much playable without that little extra. You seem to be having a hard time understanding that.

It reminds me of the old infamous console debates, where you'd have two next generation systems released a year apart. The latter system was usually the more powerful, and would end up looking *slightly* better in some regard or another. The supporters would yell "Hey look, games on our system look so much better".

Was it noticeable? Sure. About as noticeable as the difference between 720p and 1080p. At the end of the day, I really just didn't care. lol. Some people do though. Good for them. I don't feel like I should rush out, and a) support one company blindly just because they offer a semi-nice feature, or b) go out and spend a considerable amount of money to replace my GPU (if it isn't an nVidia GPU), just to be able to take advantage of it.

Sure, you can throw in a cheap nVidia GPU and use the driver hack to get around nVidia's bullshit. But what if I don't want to do that? It means I'm probably out a few hundred dollars, just to get some extra eye candy in a game. Sorry, I just don't see it being worth it.
 
I don't care whether the PhysX *software* middleware is installed on my system or not, in order to run a game. At that point, it's no different than any other physics software middleware implementation.

For all of the games that utilize Havok, you have to let it install as part of the game installation.

For all of the games that use Bullet, you have to let it install as part of the game installation.

And if a game utilizes PhysX, I'll let it install, just as I would any of the other competing implementations.

However, the entire essence of my comment, which you overlooked, is that you do *not* need an nVidia GPU in order to run Batman: AA.

You also do *not* need to be running a hardware implementation of PhysX (and thus *having* to use an nVidia GPU for PhysX) in order to play Batman: AA.

Will the game have less eye candy? Sure. And no one is disputing that. However, Batman: AA is still very much playable without that little extra. You seem to be having a hard time understanding that.

It reminds me of the old infamous console debates, where you'd have two next generation systems released a year apart. The latter system was usually the more powerful, and would end up looking *slightly* better in some regard or another. The supporters would yell "Hey look, games on our system look so much better".

Was it noticeable? Sure. About as noticeable as the difference between 720p and 1080p. At the end of the day, I really just didn't care. lol.

Well have fun at Console-port-land then.
Stuck at a 1080p port with DX9 IQ.

I will have fun in PC land then, where I get way more immersion.
I tried playing Batman AA again without PhysX...it lasted for 10 mins....then I turned off the "4-year-old console PC port" and started my "PC game" again.

Batman AA is a game where the mental state of Batman and immersion go hand in hand...due to GPU physics.
 
it is understood that the physics engine being used in batman isn't the basis for the argument of whatever this thread is supposed to be about, lol. it is the gpu implementation/ acceleration of some optional eyecandy added in the game that is the issue of contention in relation to performance. and in the end, it is up to the end user to decide what trade-offs are acceptable whether it be graphics, physics, resolution, etc. depending on the hardware they own. everyone has their own opinions in terms of what level of performance they deem acceptable and whether enabling higher levels of eyecandy is worth it to them and that's that.

p.s. i have a feeling this thread will pop up again when dark void and mafia 2 both come out in a few months...

p.p.s. try not to feed the trolls, regardless of color.
 
You must really hate Crysis?
Or ARMAII?

These effects are worth losing the +30 FPS I can't see anyways:
http://www.youtube.com/watch?v=6GyKCM-Bpuw

FYI....not all games need 60 FPS.
The first Call of Duty worked fine at 30 FPS.
Crysis.

Your premise is flawed...but that is no news :rolleyes:

EDIT: they call a Core 2 dual-core HIGH-END? *ROFL*
I wonder how much that CPU limited that DUAL-GPU card...

I haven't read the whole thread, but is that really valid? I mean, I have seen smoke and explosion effects (when a projectile hits a wall) before. While they may not be 100% physically valid, they are close enough approximations. So the smoke may not swirl when you run through it. Batman seems to be either all or nothing. Shouldn't it be appropriate to compare the best non-PhysX effects to those using it? Just because a developer chose to omit effects because you do not have PhysX does not mean that they cannot be closely emulated. That seems more like a failure on the developer's side than anything else.
 
I haven't read the whole thread, but is that really valid? I mean, I have seen smoke and explosion effects (when a projectile hits a wall) before. While they may not be 100% physically valid, they are close enough approximations. So the smoke may not swirl when you run through it. Batman seems to be either all or nothing. Shouldn't it be appropriate to compare the best non-PhysX effects to those using it? Just because a developer chose to omit effects because you do not have PhysX does not mean that they cannot be closely emulated. That seems more like a failure on the developer's side than anything else.

Once upon a time, geeks marveled when new effects/physics/features were introduced.
(Remember the buzz over HL2's physics?)
They would spend hours and hours overclocking their hardware to squeeze more FPS out of games...so they could run them at max IQ....and looked forward to new hardware being released, so they could add more IQ.
That would be apparent to anyone who has been a gamer since the 3Dfx days and followed the growth of power in GPUs to add more AA...to add more AF.
Then C.G. (console-generation) happened.

Now people whine every time new effects/physics/features are introduced (just look at PhysX).
Now people whine over games that really push the hardware (Crysis anyone?)
Seems people want to live in a world where gaming is static and dead and all that matters is FPS e-peen...I have 5904 FPS in "Call of Duty 45"...WOHHHOOOO this game ROCKS.

I fear for the further advancement of gaming if the whiners win.

EDIT:
If you look at my sig you will see that I like new hardware.
I always have....I like to toy with it..and even if I only gain 0.0000.....1% with the hardware...it's still a GAIN.
But showing that today will give you some flak.
"Did you rally buy X?
"Ha, you have Y in your sig...your words have no vlaue, n00b"
"Why did you waste your money on Z?"
On a forum.
For hardware geeks.
In the last couple of years something changed.
I blame consoles.
Because on a console everyone is equally limited.
Even a small kid can figure out to hit the powerbutton.
Creating gamers that really are not into hardware...or technology...just as long as they get spoon-fed and don't have to tinker.
What I hate is to see that mentality spread over to PC's.
But it seems like we are getting there.

Just look at the OP's nick....I rest my case.
 
I fear for the further advancement of gaming if the whiners win.

I'm not whining. Once again, I have seen approximations of those effects. It really is best to compare approximations to the full PhysX deal. When you compare full PhysX to nothing at all (and ignoring what is out there) you are being disingenuous. I fully support a move to advance physics effects, especially if it is an open standard. Better physics is a good thing.

However, comparing full physics to nothing (which is what your clip did, thanks to the developers) is not really honest. Other developers manage to create something. Why should this developer be exempt from criticism for not doing something for non-PhysX hardware, when the majority of people do not use PhysX?
 
Well have fun at Console-port-land then.
Stuck at a 1080p port with DX9 IQ.

I will have fun in PC land then, where I get way more immersion.
I tried playing Batman AA again without PhysX...it lasted for 10 mins....then I turned off the "4-year-old console PC port" and started my "PC game" again.

Batman AA is a game where the mental state of Batman and immersion go hand in hand...due to GPU physics.
I fail to see how you get so much more "immersion" simply because of hardware-accelerated physics. It's nice, don't get me wrong, and yes, it does add something to the game. That's not in denial.

But the extent at which you seem to be taking "immersion", we'd almost think you're standing on a holodeck. If you really are getting that much enjoyment out of it though, then more power to you.

I've played Batman: AA on both my PS3 and my PC with hardware PhysX enabled. I can see the extra little niceties of hardware PhysX acceleration. But it's in no way revolutionary. It wouldn't make me rush to get an nVidia GPU simply to have it. The PS3 version was just fine, and to me, just as much fun.

And I hate to break it to you, but within a few years, almost all of the games you will be playing will likely be ports. It's far more wise for a company to first release a game on consoles, and then port it to the PC.

You'll still have a few companies, like Blizzard, who will largely focus on the PC market, and not do ports from console-first games. But it'll be extremely rare.
 
EDIT:
If you look at my sig you will see that I like new hardware.
I always have....I like to toy with it..and even if I only gain 0.0000.....1% with the hardware...it's still a GAIN.
But showing that today will give you some flak.
"Did you rally buy X?
"Ha, you have Y in your sig...your words have no vlaue, n00b"
"Why did you waste your money on Z?"
On a forum.
For hardware geeks.
In the last couple of years something changed.
I blame consoles.
Because on a console everyone is equally limited.
Even a small kid can figure out to hit the powerbutton.
Creating gamers that really are not into hardware...or technology...just as long as they get spoon-fed and don't have to tinker.
What I hate is to see that mentality spread over to PC's.
But it seems like we are getting there.

Just look at the OP's nick....I rest my case.

Yeah, I caught that edit.

I bought a 4870x2 to play DoD:Source better (the only game I play competitively), and try out other games. I am not thrifty with my hardware purchases; in fact, the only reason I have not bought into the 5800 family is that I am waiting for the 5870x2. Not everybody is stupid. Not everybody needs the simplicity of a console. Not everybody needs to be looked down on.

ALL that I was saying was that Batman does not seem to be a very good example, but it's what you chose. In practice, perhaps, and I'll give you that, it shows an advantage in certain games. But it seems like the developers coded it for PhysX and then said, "Fuck it." after that. Better developers would have done some approximation rather than leave things like smoke and fog out of the game completely if PhysX was not used. That is not a testament to the power of PhysX, it's a critique of a lazy developer who would not do anything but that alternative. It is a bad comparison. Show me PhysX and then an approximation that can run on other video cards, and you have a basis for a tradeoff between image quality and framerate. If you just strip all features and say, "Hey, you can't have this without PhysX" you're just lying.
 
I'm not whining. Once again, I have seen approximations of those effects. It really is best to compare approximations to the full PhysX deal. When you compare full PhysX to nothing at all (and ignoring what is out there) you are being disingenuous. I fully support a move to advance physics effects, especially if it is an open standard. Better physics is a good thing.

However, comparing full physics to nothing (which is what your clip did, thanks to the developers) is not really honest. Other developers manage to create something. Why should this developer be exempt from criticism for not doing something for non-PhysX hardware, when the majority of people do not use PhysX?

i see where you're coming from, but i have to disagree. it has nothing to do with honesty in terms of the actual comparison. the game is what it is. none of these additional effects would be in the game if nvidia didn't offer an incentive to rocksteady to integrate gpu physx into the game. it probably went something like nvidia asking the devs to add some special stuff for gpu physx outside of the normal software physx and telling them that they will compensate the devs for their efforts; therefore, we have gpu physx in batman. the comparison is just showing what was added in the game as a result of gpu physx versus without.
 
I fail to see how you get so much more "immersion" simply because of hardware-accelerated physics. It's nice, don't get me wrong, and yes, it does add something to the game. That's not in denial.

But the extent at which you seem to be taking "immersion", we'd almost think you're standing on a holodeck. If you really are getting that much enjoyment out of it though, then more power to you.

I've played Batman: AA on both my PS3 and my PC with hardware PhysX enabled. I can see the extra little niceties of hardware PhysX acceleration. But it's in no way revolutionary. It wouldn't make me rush to get an nVidia GPU simply to have it. The PS3 version was just fine, and to me, just as much fun.

And I hate to break it to you, but within a few years, almost all of the games you will be playing will likely be ports. It's far more wise for a company to first release a game on consoles, and then port it to the PC.

You'll still have a few companies, like Blizzard, who will largely focus on the PC market, and not do ports from console-first games. But it'll be extremely rare.

the point is that gpu physx is merely a bonus for those that can take advantage of it. of course you can still enjoy the game without it, but for some (atech especially lol), it can heighten their experience. just like you can play call of juarez on the consoles or on xp via dx9, but for those that have the hardware and os, they can play it with the dx10 patch, and that would be a bonus for them as well. moreover, i agree that the majority of titles will continue to be multiplatform or console ports. that's why something like gpu physx is trying to help differentiate the pc platform from the consoles. that and dx10/11, 3dvision, eyefinity/ triplehead, etc. it's only a matter of time before amd has gpu physics support themselves. then maybe some of the complaints will go away.
 
i see where you're coming from, but i have to disagree. it has nothing to do with honesty in terms of the actual comparison. the game is what it is. none of these additional effects would be in the game if nvidia didn't offer an incentive to rocksteady to integrate gpu physx into the game. it probably went something like nvidia asking the devs to add some special stuff for gpu physx outside of the normal software physx and telling them that they will compensate the devs for their efforts; therefore, we have gpu physx in batman. the comparison is just showing what was added in the game as a result of gpu physx versus without.

The game is what it is, but other people have done it. Once again, you can't just post a Youtube vid of "here is the potential of PhysX vs. something that doesn't even try." Look at what can be done without PhysX and then make a comparison. I really am not advocating one way or another, I just want a valid comparison. I can see where I sound like I am anti-PhysX because I am saying how the non-PhysX option isn't fair, but if people have played other games they can see how it's not all-or-nothing. But Batman seems to be just that.

I understand that Nvidia probably paid these people to do what they did. Still, in my mind, you have to compare potential vs. potential, even if it is across games. What one developer did may be heavily influenced by money received to implement those measures. I think that the difference here is you are arguing practical vs. practical in the same game, and while I may want to see potential vs. potential, I am content comparing practical vs. practical in different games.
 
Now people whine every time new effects/physics/features are introduced (just look at PhysX).
No, people are not complaining that new effects/features/physics/etc. are being implemented. People are complaining about *HOW* it is implemented. You seem to be the only one who is truly failing to grasp that.

I have nothing against physics implementation into games. What I (and many others) do have a problem with, is one company (nVidia) trying to impose their proprietary API on everyone. When Ageia was its own entity, I thought PhysX was an interesting concept that had a long way to go, but could be promising. I also thought that Havok FX was promising. Unfortunately, both were bought by larger corporations and essentially closed off (PhysX) or canceled (Havok FX).

The only ones who seem to truly think that PhysX should be *THE* standard to use, are nVidia fans.

Now people whine over games that really push the hardware (Crysis anyone?)
In general, I rarely see enthusiasts truly whining about games that push their hardware. Sure, they might be upset at how they have to fork out X amount of dollars to ultimately get better hardware to play said games better, but most generally seem to accept that it's part of the territory when being an enthusiast.

However, you do see many enthusiasts complaining about PhysX. Not because of *what* PhysX does, but because of *how* it's being implemented. That's the major difference, and the concept which you overlook/ignore/pass off.

Seems people want to live in a world where gaming is static and dead and all that matters is FPS e-peen...I have 5904 FPS in "Call of Duty 45"...WOHHHOOOO this game ROCKS.
Well, at the end of the day, you could have all the most wonderful physics in the world, but if the game is running at 20 frames per second, it's not going to be fun.

That's the irony here, you're so in love with physics, you'll gladly ignore whether or not the game is even playable. "HEY LOOK, THE SMOKE SWIRLS AROUND MY LEGS AS I RUN THROUGH IT... at 15 frames per second." That's obviously an exaggeration, well, in general, but it makes the point.

It's actually kinda funny, you're taking the same stance on FPS (that it's not the end-all, be-all of gaming), that many others take with PhysX.

I fear for the further advancement of gaming if the whiners win.
Gaming has been going strong for decades. It will continue to go strong. Consoles already largely are driving it, and will continue to do so in the future. Look at the number of console sales in the world. Conservatively, the PS3/360/Wii are at roughly 100 million systems sold. Yes, there's a lot of redundancy in terms of one owner having multiple systems, so let's say it's at 50 or 60 million unique owners. Do you really think there's that many serious PC gamers in the world? Exactly.

If you look at my sig you will see that I like new hardware.
I always have....I like to toy with it..and even if I only gain 0.0000.....1% with the hardware...it's still a GAIN.
But showing that today will give you some flak.
I like new hardware too, as do many others on this site. Purchasing/using new hardware doesn't get flak. Having blind loyalty in companies, gets flak.

My new system I'm building is using a Radeon 5850 or 5870. Why? Because they're the top cards on the market right now. It has nothing to do with the fact that I dislike the approach that nVidia is taking with PhysX (which I very much do. It bothers me to no end some of the stuff they're doing these days, such as their driver-induced PhysX lockout with AMD cards).

If nVidia had the top-performing card right now, I'd go with it, regardless of what I think of their business practices. But they don't, so I'm going with a Radeon this time around.
 
In case anyone hasn't read every page of this thread, I'll sum it up for you with 3 words.

Batman Arkham Asylum.
 
The game is what it is, but other people have done it. Once again, you can't just post a Youtube vid of "here is the potential of PhysX vs. something that doesn't even try." Look at what can be done without PhysX and then make a comparison. I really am not advocating one way or another, I just want a valid comparison. I can see where I sound like I am anti-PhysX because I am saying how the non-PhysX option isn't fair, but if people have played other games they can see how it's not all-or-nothing. But Batman seems to be just that.

I understand that Nvidia probably paid these people to do what they did. Still, in my mind, you have to compare potential vs. potential, even if it is across games. What one developer did may be heavily influenced by money received to implement those measures. I think that the difference here is you are arguing practical vs. practical in the same game, and while I may want to see potential vs. potential, I am content comparing practical vs. practical in different games.

you're right - other people have done "approximations" like you said of these kinds of effects. but the point of the comparison isn't to compare physics engines and their effects and how they are implemented with other physics engines/effects/implementations. it's merely to showcase the differences between gpu physx on and off as a result of gpu physx being added in the game. just like with some dx10 vs. dx9 comparison videos, where some effects would be "missing" from the dx9 version (even if "approximations" of those effects could be mimicked in dx9). the more appropriate viewpoint would be the effects were added in dx10, not removed from dx9. same case applies here. and if you want to make a comparison video of potential vs potential, feel free to do so.
 
the point is that gpu physx is merely a bonus for those that can take advantage of it. of course you can still enjoy the game without it, but for some (atech especially lol), it can heighten their experience.
And as I've been saying all along, GPU physics, whether PhysX or something else in the future, is a bonus. No one is arguing that.

And I know it can heighten the gameplay experience. The difference is that some try to pass it off as this amazing, can't-game-without-it experience. They make it sound like PhysX is the defining reason for owning an nVidia card. And it's not, I'm sorry. What good is hardware PhysX, if the game is crap, or it's playing like crap?

All I want to see, is a non-proprietary, fully open physics API take hold and be used by everyone. Want to know what would be awesome? nVidia divest themselves of PhysX, and Microsoft picks it up and implements it as part of DirectX. But that'll never* happen.

*I reserve the right to undo this comment, as part of the "never say never" concept :p
 
And as I've been saying all along, GPU physics, whether PhysX or something else in the future, is a bonus. No one is arguing that.

And I know it can heighten the gameplay experience. The difference is that some try to pass it off as this amazing, can't-game-without-it experience. They make it sound like PhysX is the defining reason for owning an nVidia card. And it's not, I'm sorry. What good is hardware PhysX, if the game is crap, or it's playing like crap?

All I want to see, is a non-proprietary, fully open physics API take hold and be used by everyone. Want to know what would be awesome? nVidia divest themselves of PhysX, and Microsoft picks it up and implements it as part of DirectX. But that'll never* happen.

*I reserve the right to undo this comment, as part of the "never say never" concept :p

i agree with most of what you are saying. so if you believe that to be the case, then i would recommend "not feeding the trolls" as i suggested earlier. as for the last part, something like that may happen in the near future. not the ms part, of course. nvidia may or may not eventually port gpu physx to opencl depending on if they are pressured competitively to do so and the implementation is comparable in performance to cuda. it's pretty much just wait and see what happens.
 
I fail to see how you get so much more "immersion" simply because of hardware-accelerated physics. It's nice, don't get me wrong, and yes, it does add something to the game. That's not in denial.

But the extent at which you seem to be taking "immersion", we'd almost think you're standing on a holodeck. If you really are getting that much enjoyment out of it though, then more power to you.

I've played Batman: AA on both my PS3 and my PC with hardware PhysX enabled. I can see the extra little niceties of hardware PhysX acceleration. But it's in no way revolutionary. It wouldn't make me rush to get an nVidia GPU simply to have it. The PS3 version was just fine, and to me, just as much fun.

And I hate to break it to you, but within a few years, almost all of the games you will be playing will likely be ports. It's far more wise for a company to first release a game on consoles, and then port it to the PC.

You'll still have a few companies, like Blizzard, who will largely focus on the PC market, and not do ports from console-first games. But it'll be extremely rare.


I lol'd at the holodeck. nice one. Quoted for more to see. I agree 100%

Surely there have been some great to legendary games before all this PhysX came along and immersion was enough...or maybe I'm wrong? Keep drinking what Nvidia marketing feeds you and tells you what you need to enjoy PC games.
 
as for the last part, something like that may happen in the near future. not the ms part, of course. nvidia may or may not eventually port gpu physx to opencl depending on if they are pressured competitively to do so and the implementation is comparable in performance to cuda. it's pretty much just wait and see what happens.
Oh, I'd be willing to bet that nVidia already has PhysX working on OpenCL. The thing is though, I don't really want to see PhysX ever become "the standard", since companies such as AMD would have to license it from them (and for the record, I'll feel the same way if Havok were to become "the standard" someday, and everyone was forced to license it from Intel).

I'm fully in support of GPU-based physics. I simply want to see a standard that is 100% open and free for everyone.
 
but the point of the comparison isn't to compare physics engines and their effects and how they are implemented with other physics engines/effects/implementations. it's merely to showcase the differences between gpu physx on and off as a result of gpu physx being added in the game.

Not at all. The point is to show what PhysX can do that regular physics cannot. Otherwise, why would you even want PhysX? Come on, be reasonable.

I cut the rest of your post to highlight your poorly-made point. If DX10 can do the same effects as DX9 but at a higher framerate, GREAT. And I think that is the point that is more often made. "We can do XXX in DX9, but it's much more efficient in DX10." It may be more efficient, but show us. If it's 20 fps vs. 40 fps, AND it's a noticeable difference in quality, GREAT! Let's go for it. If it's 35 vs. 40 fps and nobody can tell the difference, buy the DX10 hardware but consider DX9 if you're on a budget. Except this is really neither of those scenarios. It's still full PhysX vs. no PhysX, and then making a gameplay comparison based on that. It's a bad comparison when the developers did not do anything for the non-PhysX crowd.
 
only problem with ageia is that they really weren't going anywhere with ppus. one might complain that nvidia bought them out and are using the tech to turn their gpus into ppus, thus allowing for exclusive content. however, in doing so, it is allowing hardware accelerated physics to be adopted at a much faster rate than would have ever been dreamed possible by ageia, since gpus are a necessity for gaming and now they can do physics as a bonus. otherwise, the tech probably would have stagnated and slowly faded away or been put on the "backburner" in terms of gaming innovation/improvement. faster progress is better progress, imo. the situation might not be ideal right now, but progress has to start somewhere, especially if mainstream adoption is a goal. ultimately, we will just have to wait and see what the future holds for the technology.
 
Yeah, I caught that edit.

I bought a 4870x2 to play DoD:Source better (the only game I play competitively), and try out other games. I am not thrifty with my hardware purchases; in fact, the only reason I have not bought into the 5800 family is that I am waiting for the 5870x2. Not everybody is stupid. Not everybody needs the simplicity of a console. Not everybody needs to be looked down on.

Didn't say that...so....?

ALL that I was saying was that Batman does not seem to be a very good example, but it's what you chose. In practice, perhaps, and I'll give you that, it shows an advantage in certain games. But it seems like the developers coded it for PhysX and then said, "Fuck it." after that. Better developers would have done some approximation rather than leave things like smoke and fog out of the game completely if PhysX was not used. That is not a testament to the power of PhysX, it's a critique of a lazy developer who would not do anything but that alternative. It is a bad comparison. Show me PhysX and then an approximation that can run on other video cards, and you have a basis for a tradeoff between image quality and framerate. If you just strip all features and say, "Hey, you can't have this without PhysX" you're just lying.

Show me any game doing the same with CPU physics.
Havok has been around for years, so it should be easy.


I fail to see how you get so much more "immersion" simply because of hardware-accelerated physics. It's nice, don't get me wrong, and yes, it does add something to the game. That's not in denial.

It adds immersion...if not more immersion, what word would you use?

But the extent at which you seem to be taking "immersion", we'd almost think you're standing on a holodeck. If you really are getting that much enjoyment out of it though, then more power to you.

It's the same as with AA and AF.
It adds more IQ, giving more immersion...try playing without any AA or AF...it's the same game...but is the experience the same? Nope.
It all adds up, and where AA and AF are "static" additions, physics is an interactive addition...a whole different ballgame.

I've played Batman: AA on both my PS3 and my PC with hardware PhysX enabled. I can see the extra little niceties of hardware PhysX acceleration. But it's in no way revolutionary. It wouldn't make me rush to get an nVidia GPU simply to have it. The PS3 version was just fine, and to me, just as much fun.

See above.

And I hate to break it to you, but within a few years, almost all of the games you will be playing will likely be ports. It's far more wise for a company to first release a game on consoles, and then port it to the PC.

Not really.
ARMA 2, WiC, Crysis and Doom 4 are not/will not be console ports.
A console has limitations...most RTS/RPG games simply don't cut it on a console.
FPS games on consoles need to be dumbed down (auto-aim etc.) due to poor controls.
CryEngine 3 will show just how limited the 4-year-old consoles are now.

You'll still have a few companies, like Blizzard, who will largely focus on the PC market, and not do ports from console-first games. But it'll be extremely rare.

Well, DX11, GPU physics (no matter the PhysX naysayers, AMD and Intel are pushing hardware physics on the PC too) and the sheer amount of extra power in PCs (compared to 4-year-old console hardware) will (hopefully) show otherwise.

No, people are not complaining that new effects/features/physics/etc. are being implemented. People are complaining about *HOW* it is implemented. You seem to be the only one who is truly failing to grasp that.

Look at the OP, the OP is whining about the performance hit...

I have nothing against physics implementation into games. What I (and many others) do have a problem with, is one company (nVidia) trying to impose their proprietary API on everyone. When Ageia was its own entity, I thought PhysX was an interesting concept that had a long way to go, but could be promising. I also thought that Havok FX was promising. Unfortunately, both were bought by larger corporations and essentially closed off (PhysX) or canceled (Havok FX).

DirectX is proprietary.
No one besides Microsoft can add features.
You can't get physics in DX...unless Microsoft decides to.
How is DirectX faring?

The only ones who seem to truly think that PhysX should be *THE* standard to use, are nVidia fans.

Cart before horse.
What I want is for AMD and Intel to get off their collective asses and get on the market.
It will level the playing field...and make threads like this funny to read.
Because I am dead certain both Intel's and AMD's solutions will have the same performance impact.

In general, I rarely see enthusiasts truly whining about games that push their hardware. Sure, they might be upset at how they have to fork out X amount of dollars to ultimately get better hardware to play said games better, but most generally seem to accept that it's part of the territory when being an enthusiast.

Excuse me, try searching [H] for threads about Crysis...and see the vast number of people whining that Crysis won't run at +60FPS maxed out on their mainstream rig.
Hell, try looking at Dan_D's signature..."inspired" by people in this very thread, whining that if games don't run maxed out on their year-old rig, why should they buy them?

However, you do see many enthusiasts complaining about PhysX. Not because of *what* PhysX does, but because of *how* it's being implemented. That's the major difference, and the concept which you overlook/ignore/pass off.

To be blunt, most of the people whining about PhysX don't have the first clue about physics or the computations behind the scenes...like the OP, they whine that added IQ costs performance.


Well, at the end of the day, you could have all the most wonderful physics in the world, but if the game is running at 20 frames per second, it's not going to be fun.

Then upgrade your hardware...or lower the settings.
It is really that simple.
A "sour grapes" mentality is not a valid argument.

That's the irony here, you're so in love with physics, you'll gladly ignore whether or not the game is even playable. "HEY LOOK, THE SMOKE SWIRLS AROUND MY LEGS AS I RUN THROUGH IT... at 15 frames per second." That's obviously an exaggeration, well, in general, but it makes the point.

You can always play PONG at +1000 FPS...better?

It's actually kinda funny, you're taking the same stance on FPS (that it's not the end-all, be-all of gaming), that many others take with PhysX.

False.
I tweak both my hardware (overclocking) and my software (games) to get the best experience.
Look at [H]'s real-world gameplay evaluations.
They don't strive for +60FPS, but for the settings that give the BEST IQ (and thus the best immersion)....at playable FPS.


Gaming has been going strong for decades. It will continue to go strong. Consoles already largely are driving it, and will continue to do so in the future. Look at the number of console sales in the world. Conservatively, the PS3/360/Wii are at roughly 100 million systems sold. Yes, there's a lot of redundancy in terms of one owner having multiple systems, so let's say it's at 50 or 60 million unique owners. Do you really think there's that many serious PC gamers in the world? Exactly.

The only thing the 4-year-old consoles are doing right now is slowing progress.
Quantity != quality.
And I don't get your point?

I like new hardware too, as do many others on this site. Purchasing/using new hardware doesn't get flak. Having blind loyalty in companies, gets flak.

Like the anti-PhysX crowd isn't driven by AMD fans? :rolleyes:

My new system I'm building is using a Radeon 5850 or 5870. Why? Because they're the top cards on the market right now. It has nothing to do with the fact that I dislike the approach that nVidia is taking with PhysX (which I very much do. It bothers me to no end some of the stuff they're doing these days, such as their driver-induced PhysX lockout with AMD cards).

If nVidia had the top-performing card right now, I'd go with it, regardless of what I think of their business practices. But they don't, so I'm going with a Radeon this time around.

See, this is where it gets murky.
Some games (like Batman AA) show that even the new AMD cards don't give the best IQ.
Instead of whining about PhysX, punk AMD for dragging their asses.
AMD (then ATI) made a lot of PR FUD about "we can do GPU physics too"...back in 2006.
It's now close to 2010...and they have nothing to show.

That is sad.
 
only problem with ageia is that they really weren't going anywhere with ppus. one might complain that nvidia bought them out and are using the tech to turn their gpus into ppus, thus allowing for exclusive content. however, in doing so, it is allowing hardware accelerated physics to be adopted at a much faster rate than would have ever been dreamed possible by ageia, since gpus are a necessity for gaming and now they can do physics as a bonus. otherwise, the tech probably would have stagnated and slowly faded away or been put on the "backburner" in terms of gaming innovation/improvement. faster progress is better progress, imo. the situation might not be ideal right now, but progress has to start somewhere, especially if mainstream adoption is a goal. ultimately, we will just have to wait and see what the future holds for the technology.

I actually think that the loss of Havok as an independent company was far worse for physics implementation than the loss of Ageia to nVidia.

I'll fully agree that, with nVidia pushing PhysX, its "adoption" has been greater solely for the fact that so many people use nVidia graphics cards. That's completely true and accurate. The counter problem, though, is that it also means it's essentially restricted to one company's GPUs (yeah, yeah, nVidia offered it to AMD/ATI. But who would seriously want to pay nVidia what would likely be a cost per GPU made, to license PhysX?).

Havok was developing Havok FX, which would have provided GPU physics support on both nVidia and ATI graphics cards (and whoever else would later join the market). Sure, a lot of people question how it would have turned out, but we'll never know, since Intel bought them out and canned the project.

However, I'd speculate that you would have seen Havok FX become the predominant GPU physics implementation, and widely used. Much more used than Ageia's implementation. Why support a PhysX API that requires a PPU for maximum performance, when you could utilize Havok FX and have it run on both nVidia and ATI cards?

When I think about a GPU-based physics API that is open for anyone to use, Havok FX is what I generally think of. Too bad it wasn't to be.
 
Well have fun at Console-port-land then.
Stuck at a 1080p port with DX9 IQ.

I will have fun in PC land then, where I get way more immersion.
I tried playing Batman AA again without PhysX...it lasted for 10 mins....then I turned off the "4-year-old console PC port" and started my "PC game" again.

Batman AA is a game where the mental state of Batman and immersion go hand in hand...due to GPU physics.

Little Big Planet, Killzone 2, Resistance 2 and Uncharted 2 are console games which have great physics implementations.
 
Not at all. The point is to show what PhysX can do that regular physics cannot. Otherwise, why would you even want PhysX? Come on, be reasonable.

I cut the rest of your post to highlight your poorly-made point. If DX10 can do the same effects as DX9 but at a higher framerate, GREAT. And I think that is the point that is more often made. "We can do XXX in DX9, but it's much more efficient in DX10." It may be more efficient, but show us. If it's 20 fps vs. 40 fps, AND it's a noticeable difference in quality, GREAT! Let's go for it. If it's 35 vs. 40 fps and nobody can tell the difference, buy the DX10 hardware but consider DX9 if you're on a budget. Except this is really neither of those scenarios. It's still full PhysX vs. no PhysX, and then making a gameplay comparison based on that. It's a bad comparison when the developers did not do anything for the non-PhysX crowd.

What should they do?
Ask the pixie fairy for some magic?
It's not like Batman AA is castrated.
It looks the same on the Xbox, PS3 and non-GPU-physics PC.
What GPU physics did was enable them to ADD more IQ...because of the extra power available.
And that wasn't NVIDIA's doing...that was the doing of the game developers.
Ask them why they did what they did...instead of falsely blaming PhysX.
 
I actually think that the loss of Havok as an independent company was far worse for physics implementation than the loss of Ageia to nVidia.

I'll fully agree that, with nVidia pushing PhysX, its "adoption" has been greater solely for the fact that so many people use nVidia graphics cards. That's completely true and accurate. The counter problem, though, is that it also means it's essentially restricted to one company's GPUs (yeah, yeah, nVidia offered it to AMD/ATI. But who would seriously want to pay nVidia what would likely be a cost per GPU made, to license PhysX?).

Havok was developing Havok FX, which would have provided GPU physics support on both nVidia and ATI graphics cards (and whoever else would later join the market). Sure, a lot of people question how it would have turned out, but we'll never know, since Intel bought them out and canned the project.

However, I'd speculate that you would have seen Havok FX become the predominant GPU physics implementation, and widely used. Much more used than Ageia's implementation. Why support a PhysX API that requires a PPU for maximum performance, when you could utilize Havok FX and have it run on both nVidia and ATI cards?

When I think about a GPU-based physics API that is open for anyone to use, Havok FX is what I generally think of. Too bad it wasn't to be.

AMD pays Intel for x86...where is the difference?
 
Oh, I'd be willing to bet that nVidia already has PhysX working on OpenCL. The thing is though, I don't really want to see PhysX ever become "the standard", since companies such as AMD would have to license it from them (and for the record, I'll feel the same way if Havok were to become "the standard" someday, and everyone was forced to license it from Intel).

I'm fully in support of GPU-based physics. I simply want to see a standard that is 100% open and free for everyone.

well it appears bullet may be your only hope.

Not at all. The point is to show what PhysX can do that regular physics cannot. Otherwise, why would you even want PhysX? Come on, be reasonable.

I cut the rest of your post to highlight your poorly-made point. If DX10 can do the same effects as DX9 but at a higher framerate, GREAT. And I think that is the point that is more often made. "We can do XXX in DX9, but it's much more efficient in DX10." It may be more efficient, but show us. If it's 20 fps vs. 40 fps, AND it's a noticeable difference in quality, GREAT! Let's go for it. If it's 35 vs. 40 fps and nobody can tell the difference, buy the DX10 hardware but consider DX9 if you're on a budget. Except this is really neither of those scenarios. It's still full PhysX vs. no PhysX, and then making a gameplay comparison based on that. It's a bad comparison when the developers did not do anything for the non-PhysX crowd.

um, that's not true, since "regular" physx can do the same thing gpu physx can, it just won't offer the same performance. so in that regard, it is doing something regular physx can't. gpu physx is hardware accelerated physx, which is just a better implementation of regular physx due to accelerating more advanced physics effects. so that's why some comparisons are showing effects on and off. others will just show more physics effects. as for dx9 vs dx10, there are dx10 exclusive effects or graphic options in some games. i wasn't talking about performance at all. for example, call of juarez has a dx10 version with much better graphical effects and that causes it to have much lower performance than the vanilla dx9 version. i recall world in conflict had a few dx10 exclusive effects. and it's not full physx versus no physx in the comparison. it's gpu accelerated physics effects ("on top of" regular physx) versus just regular physx. that is the comparison. and what do you mean didn't do anything for the "non-physx" crowd? physx is the underlying engine, so they still get regular physx. why would they do anything "special" like adding "approximations" of said effects if they have no incentive to do so? tell amd/intel to do that in order to slightly improve cpu physics a bit. i doubt that would matter since intel will probably push havok and amd will push bullet for their gpus.
 
Little Big Planet, Killzone 2, Resistance 2 and Uncharted 2 are console games which have great physics implementations.

Define great?
Does it come close to GPU physics?
Or is it more like the "comparisons" of Ghostbusters' "limited rigid bodies that disappear after 10 seconds" physics to Batman AA's "way beyond that" physics?

And before anyone thinks I am paid by NVIDIA:
I would rather be dragged naked over a field filled with broken glass than use an NVIDIA chipset for my CPUs.
But when no one else can step up to the plate, they get my vote for GPUs.
AMD doesn't get my vote...they have been dragging their feet since 2006...and have nothing to show.
Intel doesn't get my vote...they have no hardware on the market.

It might change in the future...but today is now....not the future.
 