You need Physx to play Batman: confirm/deny?
Confirmed.
Even the console versions use the PhysX middleware... but on the PC (with an NVIDIA GPU) they added way more physics for better image quality and immersion.
Can you play Batman: AA without an nVidia card or a PhysX hardware solution?
Confirmed.
Even the console versions use the PhysX middleware... but on the PC (with an NVIDIA GPU) they added way more physics for better image quality and immersion.
And I've shown you proof that the performance you get with CPU physics performing comparable effects is often unacceptable. PhysX can be run on CPUs, but so far the performance in even the tech demos is pretty awful. That's not even adding in a game engine, audio effects, player input, etc.
You base this on what? Games are mostly GPU bound rather than CPU bound. Given the massive difference in performance between CPU and GPUs in terms of running games, your math doesn't remotely check out. Again you base this on a very vague notion and understanding of how game engines work.
Given that the game recommends a dedicated Geforce GTX 260 for PhysX processing, I'm inclined to believe that the PhysX effects are quite demanding. I also base this statement on the performance differences between GPU and CPUs in regard to PhysX processing.
The lack of CPU usage in Batman Arkham Asylum is NOT necessarily due to PhysX implementation. The game, like most others, is far more GPU bound than CPU bound.
I don't know what could be done on the CPU without taking a massive performance hit and frankly neither do you. All we can do is speculate. You seem to believe more can be done in this regard than I do. At least in regard to Batman Arkham Asylum. I say the performance hit for some of it would be too massive. If you are only getting 30FPS now then I think almost any performance hit would be unacceptable. You can disagree all you like and this is really a matter of opinion. Neither one of us can pretend to present facts here. I showed data that I believe supports my opinion and you did the same.
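For a sense of what "CPU physics for comparable effects" actually means per frame, here is a minimal illustrative sketch (this is not how PhysX itself is implemented, just the simplest form of the workload being argued about): semi-implicit Euler integration of debris particles with a crude ground bounce. A scene might run thousands of these per frame alongside the rest of the game loop.

```python
def step_particles(particles, dt, gravity=(0.0, -9.81, 0.0)):
    """Advance point-mass debris one frame with semi-implicit Euler.

    Each particle is a dict with 'pos' and 'vel' 3-tuples. A debris or
    sparks effect would call this once per frame for every particle.
    """
    gx, gy, gz = gravity
    for p in particles:
        # Update velocity first (semi-implicit Euler), then position.
        vx, vy, vz = p["vel"]
        vx, vy, vz = vx + gx * dt, vy + gy * dt, vz + gz * dt
        x, y, z = p["pos"]
        p["vel"] = (vx, vy, vz)
        p["pos"] = (x + vx * dt, y + vy * dt, z + vz * dt)
        # Crude ground plane at y = 0: clamp and damp the bounce.
        if p["pos"][1] < 0.0:
            x, y, z = p["pos"]
            vx, vy, vz = p["vel"]
            p["pos"] = (x, 0.0, z)
            p["vel"] = (vx * 0.5, -vy * 0.5, vz * 0.5)
    return particles
```

Whether a modern multi-core CPU can afford tens of thousands of these updates per frame on top of everything else is exactly the point of contention here.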
ps. Consoles suck and immersion is for fat mmo nerds.
I honestly don't think that we're going to see a single mainstream implementation of a physics API any time soon (unless someone like Microsoft does step in and finally helps to make a decision on it).
The thing I fear the greatest (and sadly that which I think is most likely to come to fruition), is that we'll actually see three competing types: PhysX with nVidia's support, Havok with Intel's support, and then Bullet (or whatever AMD is pushing at that point) with AMD's support.
It's funny, people complain now about the nightmare that is nVidia controlling PhysX and trying to push it as the top standard (and I'm in this camp, as I don't want any one company to have complete control over whatever the top physics API ends up being). But just imagine what kind of cross-linking Intel can do in regard to pushing Havok as the advantageous physics API to use (combining an Intel GPU, with an Intel-supported physics API, on an Intel processor). A lot of people here (seemingly mostly nVidia fans) scoff at the idea of Larrabee competing against the top GPUs from nVidia or AMD. While I agree it's still a bit of a ways off, I think a lot of people underestimate Intel's determination and ability to compete, lol.
Can you play Batman: AA without an nVidia card or a PhysX hardware solution?
Yes.
Thanks.
Ah, but therein lies the problem. You can't compare GPU PhysX to CPU PhysX because Nvidia wants GPU PhysX to succeed, *not* CPU PhysX. Thus, they have no motivation to make CPU PhysX anything other than "good enough" for devs to use it.
As you said, games are mostly GPU bound. Thus, there isn't a technical reason why the devs couldn't have exposed the physics quality levels for those without GPU PhysX. Those of us with OC'd i7s would certainly be able to handle at least some of the extra effects.
I don't know where you see that the game recommends a dedicated GTX 260 (couldn't find any reference searching google, anyway), but that would be because Nvidia wants people to buy expensive cards, not cheap ones.
A dedicated 9600GSO performs the same as a dedicated 9800GT - a dedicated GTX 260 is definitely *NOT* needed.
http://www.elitebastards.com/index....rticle&id=842&catid=14&Itemid=27&limitstart=3
However, turn up the PhysX effects without an nvidia GPU and FPS drops while CPU usage doesn't spike, THAT is due to horrid CPU scaling in PhysX.
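Whether that flat CPU usage really means the PhysX CPU path doesn't scale across cores is the heart of the dispute. Purely as an illustration of what "CPU scaling" would mean (an assumed structure, nothing taken from Nvidia's code), here is a toy sketch that splits independent particle updates across a thread pool:

```python
from concurrent.futures import ThreadPoolExecutor

def integrate_chunk(chunk, dt):
    """Particles are independent, so each chunk can be advanced on its own core."""
    return [(x + vx * dt, y + vy * dt, vx, vy) for (x, y, vx, vy) in chunk]

def integrate_parallel(particles, dt, workers=4):
    """Split the particle list into one slice per worker thread.

    Note: CPython threads won't actually speed up pure-Python math (the GIL),
    so this only sketches the work partitioning; a real engine would do the
    same splitting with native threads and SIMD. A physics path that scaled
    like this would show CPU usage spiking across cores when effects are
    turned up -- the opposite of what's described above.
    """
    n = max(1, len(particles) // workers)
    chunks = [particles[i:i + n] for i in range(0, len(particles), n)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(integrate_chunk, chunks, [dt] * len(chunks))
    return [p for chunk in results for p in chunk]
```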
We certainly know what CAN be done - just look at what other physics engines and games can do. You call it apples and oranges and say they aren't comparable, but they most certainly are comparable. If engine A can do what engine B can't, it's because of problems with engine B - not the hardware.
The baseline for HIGH PhysX in Batman is a 9800GTX, as per the options menu. The GTX260 was for running the game with one card for graphics and LOW physics (or did they call it normal, I can't remember).
Those 16xAA numbers don't look right at all. No game takes such a low FPS hit at that level of AA, the UE3 engine especially.
Also are you running with or without ambient occlusion?
You still need the PhysX middleware installed or it's a no go
I don't care whether the PhysX *software* middleware is installed on my system or not, in order to run a game. At that point, it's no different than any other physics software middleware implementation.
For all of the games that utilize Havok, you have to let it install as part of the game installation.
For all of the games that use Bullet, you have to let it install as part of the game installation.
And if a game utilizes PhysX, I'll let it install, just as I would any of the other competing implementations.
However, the entire essence of my comment, which you overlooked, is that you do *not* need an nVidia GPU in order to run Batman: AA.
You also do *not* need to be running a hardware implementation of PhysX (and thus *having* to use an nVidia GPU for PhysX) in order to play Batman: AA.
Will the game have less eye candy? Sure. And no one is disputing that. However, Batman: AA is still very much playable without that little extra. You seem to be having a hard time understanding that.
It reminds me of the old infamous console debates, where you'd have two next generation systems released a year apart. The latter system was usually the more powerful, and would end up looking *slightly* better in some regard or another. The supporters would yell "Hey look, games on our system look so much better".
Was it noticeable? Sure. About as noticeable as the difference between 720p and 1080p. At the end of the day, I really just didn't care. lol.
You must really hate Crysis?
Or ARMAII?
These effects are worth losing the +30 FPS I can't see anyways:
http://www.youtube.com/watch?v=6GyKCM-Bpuw
FYI... not all games need 60 FPS.
The first Call of Duty worked fine with 30FPS.
Crysis.
Your premise is flawed...but that is no news
EDIT: they call a Core 2 dual-core HIGH-END? *ROFL*
I wonder how much that CPU limited that DUAL-GPU card...
I haven't read the whole thread, but is that really valid? I mean, I have seen smoke and explosion effects (when a projectile hits a wall) before. While they may not be 100% physically valid, they are close enough approximations. So the smoke may not swirl when you run through it. Batman seems to be either all or nothing. Shouldn't it be appropriate to compare the best non-PhysX effects to those using it? Just because a developer chose to omit effects because you do not have PhysX does not mean that they cannot be closely emulated. That seems more like a failure on the developer's side than anything else.
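The "smoke swirls when you run through it" effect can indeed be crudely approximated without a GPU fluid solve, which is the point being made here. A hypothetical sketch (not taken from any shipping game) of the cheapest possible version: each frame, push smoke particles radially away from the player's position.

```python
import math

def disturb_smoke(smoke, player_pos, radius=1.0, strength=2.0):
    """Push 2D smoke particles away from the player.

    A cheap stand-in for a real fluid simulation -- the kind of
    approximation a developer could ship on the non-PhysX path
    instead of omitting the effect entirely.
    """
    px, py = player_pos
    out = []
    for (x, y) in smoke:
        dx, dy = x - px, y - py
        d = math.hypot(dx, dy)
        if 0.0 < d < radius:
            # Falloff: full push at the player, zero at the radius edge.
            push = strength * (radius - d) / radius
            x += dx / d * push
            y += dy / d * push
        out.append((x, y))
    return out
```

It wouldn't look as good as a proper solver, but it is a long way from "nothing at all", which is the comparison the video makes.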
I fear for the further advancement of gaming if the whiners win.
I fail to see how you get so much more "immersion" simply because of hardware-accelerated physics. It's nice, don't get me wrong, and yes, it does add something to the game. That's not in denial.
Well have fun at Console-port-land then.
Stuck at 1080P port with DX9 I.Q.
I will have fun in PC land then, where I get way more immersion.
I tried playing Batman AA again, without PhysX... it lasted for 10 mins... then I turned off the "4-year console PC port" and started my "PC game" again.
Batman AA is a game where the mental state of Batman and immersion go hand in hand...due to GPU physics..
EDIT:
If you look at my sig you will see that I like new hardware.
I always have....I like to toy with it..and even if I only gain 0.0000.....1% with the hardware...it's still a GAIN.
But showing that today will give you some flak.
"Did you really buy X?"
"Ha, you have Y in your sig... your words have no value, n00b"
"Why did you waste your money on Z?"
On a forum.
For hardware geeks.
In the last couple of years something changed.
I blame consoles.
Because on a console everyone is equally limited.
Even a small kid can figure out to hit the powerbutton.
Creating gamers that really are not into hardware... or technology... just as long as they get spoon-fed and don't have to tinker.
What I hate is to see that mentality spread over to PC's.
But it seems like we are getting there.
Just look at the OP's nick....I rest my case.
I'm not whining. Once again, I have seen approximations of those effects. It really is best to compare approximations to the full PhysX deal. When you compare full PhysX to nothing at all (and ignoring what is out there) you are being disingenuous. I fully support a move to advance physics effects, especially if it is an open standard. Better physics is a good thing.
However, comparing full physics to nothing (which is what your clip did, thanks to the developers) is not really honest. Other developers manage to create something. Why should this developer be exempt from criticism for not doing something for non-PhysX hardware, when the majority of people do not use PhysX?
I fail to see how you get so much more "immersion" simply because of hardware-accelerated physics. It's nice, don't get me wrong, and yes, it does add something to the game. That's not in denial.
But the extent at which you seem to be taking "immersion", we'd almost think you're standing on a holodeck. If you really are getting that much enjoyment out of it though, then more power to you.
I've played Batman: AA on both my PS3 and my PC with hardware PhysX enabled. I can see the extra little niceties of hardware PhysX acceleration. But it's in no way revolutionary. It wouldn't make me rush to get an nVidia GPU simply to have it. The PS3 version was just fine, and to me, just as much fun.
And I hate to break it to you, but within a few years, almost all of the games you will be playing will likely be ports. It's far more wise for a company to first release a game on consoles, and then port it to the PC.
You'll still have a few companies, like Blizzard, who will largely focus on the PC market, and not do ports from console-first games. But it'll be extremely rare.
i see where you're coming from, but i have to disagree. it has nothing to do with honesty in terms of the actual comparison. the game is what it is. none of these additional effects would be in the game if nvidia didn't offer an incentive to rocksteady to integrate gpu physx into the game. it probably went something like nvidia asking the devs to add some special stuff for gpu physx outside of the normal software physx and telling them that they will compensate the devs for their efforts; therefore, we have gpu physx in batman. the comparison is just showing what was added in the game as a result of gpu physx versus without.
Now people whine every time new effects/physics/features are introduced (just look at PhysX).
Now people whine over games that really push the hardware (Crysis anyone?).
Seems people wanna live in a world where gaming is static and dead and all that matters is FPS e-peen... "I have 5904 FPS in Call of Duty 45"... WOHHHOOOO this game ROCKS.
The game is what it is, but other people have done it. Once again, you can't just post a Youtube vid of "here is the potential of PhysX vs. something that doesn't even try." Look at what can be done without PhysX and then make a comparison. I really am not advocating one way or another, I just want a valid comparison. I can see where I sound like I am anti-PhysX because I am saying how the non-PhysX option isn't fair, but if people have played other games they can see how it's not all-or-nothing. But Batman seems to be just that.
I understand that Nvidia probably paid these people to do what they did. Still, in my mind, you have to compare potential vs. potential, even if it is across games. What one developer did may be heavily influenced by money received to implement those measures. I think that the difference here is you are arguing practical vs. practical in the same game, and while I may want to see potential vs. potential, I am content comparing practical vs. practical in different games.
the point is that gpu physx is merely a bonus for those that can take advantage of it. of course you can still enjoy the game without it, but for some (atech especially lol), it can heighten their experience.
And as I've been saying all along, GPU physics, whether PhysX or something else in the future, is a bonus. No one is arguing that.
And I know it can heighten the gameplay experience. The difference is that some try to pass it off as this amazing, can't-game-without-it experience. They make it sound like PhysX is the defining reason for owning an nVidia card. And it's not, I'm sorry. What good is hardware PhysX, if the game is crap, or it's playing like crap?
All I want to see, is a non-proprietary, fully open physics API take hold and be used by everyone. Want to know what would be awesome? nVidia divest themselves of PhysX, and Microsoft picks it up and implements it as part of DirectX. But that'll never* happen.
*I reserve the right to undo this comment, as part of the "never say never" concept
as for the last part, something like that may happen in the near future. not the ms part, of course. nvidia may or may not eventually port gpu physx to opencl depending on if they are pressured competitively to do so and the implementation is comparable in performance to cuda. it's pretty much just wait and see what happens.
but the point of the comparison isn't to compare physics engines and their effects and how they are implemented with other physics engines/effects/implementations. it's merely to showcase the differences between gpu physx on and off as a result of gpu physx being added in the game.
Yeah, I caught that edit.
I bought a 4870x2 to play DoD:Source better (the only game I play competitively), and try out other games. I am not thrifty with my hardware purchases; in fact, the only reason I have not bought into the 5800 family is that I am waiting for the 5870x2. Not everybody is stupid. Not everybody needs the simplicity of a console. Not everybody needs to be looked down on.
ALL that I was saying was that Batman does not seem to be a very good example, but it's what you chose. In practice, perhaps, and I'll give you that, it shows an advantage in certain games. But it seems like the developers coded it for PhysX and then said, "Fuck it." after that. Better developers would have done some approximation rather than leave things like smoke and fog out of the game completely if PhysX was not used. That is not a testament to the power of PhysX, it's a critique of a lazy developer who would not do anything but that alternative. It is a bad comparison. Show me PhysX and then an approximation that can run on other video cards, and you have a basis for a tradeoff between image quality and framerate. If you just strip all features and say, "Hey, you can't have this without PhysX" you're just lying.
No, people are not complaining that new effects/features/physics/etc are implemented. People are complaining about *HOW* they are implemented. You seem to be the only one who is truly failing to grasp that.
I have nothing against physics implementation into games. What I (and many others) do have a problem with, is one company (nVidia) trying to impose their proprietary API on everyone. When Ageia was its own entity, I thought PhysX was an interesting concept that had a long way to go, but could be promising. I also thought that Havok FX was promising. Unfortunately, both were bought by larger corporations and essentially closed off (PhysX) or canceled (Havok FX).
The only ones who seem to truly think that PhysX should be *THE* standard to use, are nVidia fans.
In general, I rarely see enthusiasts truly whining about games that push their hardware. Sure, they might be upset at how they have to fork out X amount of dollars to ultimately get better hardware to play said games better, but most generally seem to accept that it's part of the territory when being an enthusiast.
However, you do see many enthusiasts complaining about PhysX. Not because of *what* PhysX does, but because of *how* it's being implemented. That's the major difference, and the concept which you overlook/ignore/pass off.
Well, at the end of the day, you could have all the most wonderful physics in the world, but if the game is running at 20 frames per second, it's not going to be fun.
That's the irony here: you're so in love with physics, you'll gladly ignore whether or not the game is even playable. "HEY LOOK, THE SMOKE SWIRLS AROUND MY LEGS AS I RUN THROUGH IT... at 15 frames per second." That's obviously an exaggeration, well, in general, but it makes the point.
It's actually kinda funny, you're taking the same stance on FPS (that it's not the end-all, be-all of gaming), that many others take with PhysX.
Gaming has been going strong for decades. It will continue to go strong. Consoles already largely are driving it, and will continue to do so in the future. Look at the number of console sales in the world. Conservatively, the PS3/360/Wii are at roughly 100 million systems sold. Yes, there's a lot of redundancy in terms of one owner having multiple systems, so let's say it's at 50 or 60 million unique owners. Do you really think there's that many serious PC gamers in the world? Exactly.
I like new hardware too, as do many others on this site. Purchasing/using new hardware doesn't get flak. Having blind loyalty in companies, gets flak.
My new system I'm building is using a Radeon 5850 or 5870. Why? Because they're the top cards on the market right now. It has nothing to do with the fact that I dislike the approach that nVidia is taking with PhysX (which I very much do. It bothers me to no end some of the stuff they're doing these days, such as their driver-induced PhysX lockout with AMD cards).
If nVidia had the top-performing card right now, I'd go with it, regardless of what I think of their business practices. But they don't, so I'm going with a Radeon this time around.
only problem with ageia is that they really weren't going anywhere with ppus. one might complain that nvidia bought them out and are using the tech. to use their gpus as ppus, thus allowing for exclusive content. however, in doing so, it is allowing for hardware accelerated physics to be adopted at a much faster rate than would have ever been dreamed possible by ageia. since gpus are a necessity for gaming, and now they can do physics as a bonus right now. otherwise, the tech. probably would have stagnated and slowly faded away or put on the "backburner" in terms of gaming innovation/ improvement. faster progress is better progress, imo. the situation might not be ideal right now, but progress has to start somewhere, especially if mainstream adoption is a goal. ultimately, we will just have to wait and see what the future holds for the technology.
Not at all. The point is to show what PhysX can do that regular physics cannot. Otherwise, why would you even want PhysX? Come on, be reasonable.
I cut the rest of your post to highlight your poorly-made point. If DX10 can do the same effects as DX9 but at a higher framerate, GREAT. And I think that is the point that is more often made: "We can do XXX in DX9, but it's much more efficient in DX10." It may be more efficient, but show us. If it's 20 fps vs. 40 fps, AND it's a noticeable difference in quality, GREAT! Let's go for it. If it's 35 vs. 40 fps and nobody can tell the difference, buy the DX10 hardware but consider DX9 if you're on a budget. Except this is really neither of those scenarios. It's still full PhysX vs. no PhysX, and then making a gameplay comparison based on that. It's a bad comparison when the developers did not do anything for the non-PhysX crowd.
I actually think that the loss of Havok in terms of being an independent company, was far worse for physics implementation than the loss of Ageia to nVidia.
I'll fully agree that with nVidia pushing PhysX, its "adoption" has been greater solely for the fact that so many people use nVidia graphics cards. That's completely true and accurate. The counter problem, though, is that it also means that it's essentially restricted to one company's GPUs (yeah, yeah, nVidia offered it to AMD/ATI. But who would seriously want to pay nVidia what would likely be a cost per GPU made, to license PhysX?).
Havok was developing Havok FX, which would have provided GPU physics support on both nVidia and ATI graphics cards (and whoever else would later join the market). Sure, a lot of people question how it would have turned out, but we'll never know, since Intel bought them out and canned the project.
However, I'd speculate that you would have seen Havok FX become the predominant GPU physics implementation, and widely used. Much more used than Ageia's implementation. Why support a physics API that requires a PPU for maximum performance, when you could utilize Havok FX and have it run on both nVidia and ATI cards?
When I think about a GPU-based physics API that is open for anyone to use, Havok FX is what I generally think of. Too bad it wasn't to be.
Oh, I'd be willing to bet that nVidia already has PhysX working on OpenCL. The thing is though, I don't really want to see PhysX ever become "the standard", since companies such as AMD would have to license it from them (and for the record, I'll feel the same way if Havok were to become "the standard" someday, and everyone was forced to license it from Intel).
I'm fully in support of GPU-based physics. I simply want to see a standard that is 100% open and free for everyone.
Little Big Planet, Killzone 2, Resistance 2 and Uncharted 2 are console games which have great physics implementations.