How PhysX Makes Batman: Arkham Asylum Better

Atech, as ElMoIsEviL showed in his YouTube video, the CPU is perfectly capable of doing decent PhysX effects. At this point I think PhysX is a nice feature, but it will eventually be supplanted by CPU-based physics calculations in games.

It might not be as "good" as a dedicated card, but free is hard to compete with. Just like the Killer NIC and Ageia cards you have: they were decent products, but compared to free they don't stand a chance. It's just as John Carmack predicted when he called Ageia's technology a total waste of time: CPUs would eventually take over that role as the number of cores available for gaming increases.

Physics in games is a bit like anti-aliasing. When AA first came out it was considered a gimmick and made most games nearly unplayable when enabled; it was simply easier to boost the display resolution. Now it's basically a must-have feature. Software physics will probably hit that level when quad-core is 100 percent ubiquitous and we start seeing six- and twelve-core CPUs on the market.
 
I doubt it, considering the CPU hack is handling about a fifth of the GPU's physics processing load just to be considered playable. GPU physics is free too: in the case of GPU PhysX, it comes free with an 8800 series card or higher, and it will also be free on ATI GPUs when AMD offers actual support via OpenCL Bullet. So it's just as free as when you buy a CPU that supports software physics by default. By the time six- and twelve-core CPUs come out, we'll also have massively more powerful GPUs that will still be much better for physics processing. I'll go out on a limb and say that when the next generation of consoles arrives with newer GPUs, hardware-accelerated physics will be set to explode past standard software physics. That's not to say software-based physics won't improve, but it will still pale in comparison to what the GPU is capable of providing.
 
The number of physics objects a high-end GPU can process versus a high-end CPU differs by roughly a factor of 100, and that gap will probably only widen in the future, especially with designs like Fermi. There's absolutely no way CPU physics will regain exclusive control over the physics engine as it had in the past. Future games simply will not be able to do without GPU physics. Just wait until the Xbox 720 and PS4 support GPU-accelerated PhysX (the PS3 is already capable of this) and Bullet. PC gaming will have to adapt or die at that point.
 
PS3 is capable of GPU-accelerated PhysX? That is news to me. From what I can see, PhysX games on the PS3 like Mirror's Edge and Valkyria Chronicles are nothing compared to first-party games like Killzone 2 and Uncharted 2, which use the SPEs for physics.
 
Each SPE in the Cell CPU has its own SIMD engine, making it a respectable vector processor akin to a GPU. Of course, I don't know whether the PhysX runtime on the PS3 actually makes use of this for every game, but it would surprise me if it didn't, since it would free up a lot of resources for AI and other processing threads.
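
For a rough sense of what that SIMD capability buys, here's a minimal sketch in C++ using SSE intrinsics as a stand-in for the SPE's own vector instruction set (the Cell has a different ISA and intrinsics, so this is an analogy, not SPE code). It integrates four particle positions per instruction instead of one:

    #include <xmmintrin.h>  // SSE intrinsics; each SPE has an analogous 128-bit vector ISA

    // Integrate the x-positions of n particles, four floats per instruction.
    // A structure-of-arrays layout (separate x/y/z arrays) is what makes this pay off.
    void integrate_x(float* pos_x, const float* vel_x, int n, float dt)
    {
        __m128 vdt = _mm_set1_ps(dt);               // broadcast dt into all four lanes
        int i = 0;
        for (; i + 4 <= n; i += 4) {
            __m128 p = _mm_loadu_ps(pos_x + i);     // load 4 positions
            __m128 v = _mm_loadu_ps(vel_x + i);     // load 4 velocities
            p = _mm_add_ps(p, _mm_mul_ps(v, vdt));  // p += v * dt, 4 lanes at once
            _mm_storeu_ps(pos_x + i, p);
        }
        for (; i < n; ++i)                          // scalar tail for the leftovers
            pos_x[i] += vel_x[i] * dt;
    }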
 
Nobody wants to comment on the water effect in the game or the lack of it?

Fluid/soft-body interactions are quite complex and would require a lot of processing power. Not having seen the game in question, I can't judge their approach.
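
To give a sense of why, here's a minimal mass-spring soft-body step in C++ (my own illustrative sketch, not anything from the game): every frame touches every spring and every particle, usually several solver iterations deep, and fluids add a neighbour search per particle on top of that.

    #include <cmath>
    #include <vector>

    struct Particle { float x, y, z, vx, vy, vz; };
    struct Spring   { int a, b; float rest, k; };  // joins particles a and b

    // One explicit step: O(springs) force accumulation plus O(particles)
    // integration, repeated every frame and often iterated for stability.
    void step(std::vector<Particle>& p, const std::vector<Spring>& s, float dt)
    {
        for (const Spring& sp : s) {
            Particle& pa = p[sp.a];
            Particle& pb = p[sp.b];
            float dx = pb.x - pa.x, dy = pb.y - pa.y, dz = pb.z - pa.z;
            float len = std::sqrt(dx*dx + dy*dy + dz*dz);
            if (len < 1e-6f) continue;
            float f = sp.k * (len - sp.rest) / len;  // Hooke's law along the spring
            pa.vx += f * dx * dt; pa.vy += f * dy * dt; pa.vz += f * dz * dt;
            pb.vx -= f * dx * dt; pb.vy -= f * dy * dt; pb.vz -= f * dz * dt;
        }
        for (Particle& q : p) {  // integrate positions
            q.x += q.vx * dt;
            q.y += q.vy * dt;
            q.z += q.vz * dt;
        }
    }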
 
I played this game on an X1950 XTX and then a GTX 285 and didn't notice a real big difference in physics. Hmm.

Maybe I'm just oblivious.
 
ATI doesn't have any sort of physics processing right?

Physics processing in general is not an ATI/NVidia thing.
It's down to what the devs implement, normally on the CPU.

This discussion is about performing physics that is beyond a CPU's power.
NVidia have created a physics library (PhysX) that runs the code on their graphics cards, so much more complicated and/or higher-detail physics effects can be used in games.
Some of the effects are certainly beyond a CPU's capability; others arguably could still have been done on a CPU.
There's a bit of shenanigans going on: NVidia have done some effects on the GPU while only allowing PhysX to use a fraction of the CPU's available power, which suggests they have gimped the CPU side to force GPU PhysX use.
However, for effects like the fog in Batman: AA, a CPU at full load can only manage an approximation, so there is still good value in PhysX on the GPU; but it does look like NVidia are strangling CPU PhysX in favour of GPU PhysX (which only runs on NVidia cards).
This puts ATI users at a disadvantage, hence the uproar.
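
For context, the PhysX 2.x SDK of that era let a game request hardware or software simulation per scene, roughly like the sketch below (written from memory of the 2.x API, so treat the names as approximate; error handling omitted):

    #include <NxPhysics.h>  // PhysX 2.x SDK

    // Create the SDK, then ask for hardware simulation with a software fallback.
    NxPhysicsSDK* sdk = NxCreatePhysicsSDK(NX_PHYSICS_SDK_VERSION);

    NxSceneDesc sceneDesc;
    sceneDesc.gravity = NxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.simType = NX_SIMULATION_HW;           // request GPU/PPU simulation

    NxScene* scene = sdk->createScene(sceneDesc);
    if (!scene) {                                   // no capable hardware: fall back
        sceneDesc.simType = NX_SIMULATION_SW;
        scene = sdk->createScene(sceneDesc);
    }

    // Per frame: kick off the simulation and collect the results.
    scene->simulate(1.0f / 60.0f);
    scene->flushStream();
    scene->fetchResults(NX_RIGID_BODY_FINISHED, true);

The point is that the hardware and software paths are separate simulation back-ends, which is where the gimping accusation above lives: the extra effects are only enabled on the hardware path.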
 
ATI doesn't have any sort of physics processing right?

ATi has CPU-based physics using Havok, which they licensed from Intel. Intel doesn't allow them to add GPU acceleration, so ATi is now switching to Bullet with OpenCL-based GPU acceleration. No idea when that's going to be released, though.
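
For reference, Bullet's existing CPU path already works today; a minimal setup looks like this (standard Bullet C++ API; the OpenCL-accelerated path mentioned above is the part that hasn't shipped yet):

    #include <btBulletDynamicsCommon.h>

    int main()
    {
        // The standard Bullet CPU pipeline: collision configuration, dispatcher,
        // broadphase, constraint solver, then the dynamics world itself.
        btDefaultCollisionConfiguration config;
        btCollisionDispatcher dispatcher(&config);
        btDbvtBroadphase broadphase;
        btSequentialImpulseConstraintSolver solver;
        btDiscreteDynamicsWorld world(&dispatcher, &broadphase, &solver, &config);
        world.setGravity(btVector3(0, -9.81f, 0));

        // Step at 60 Hz; Bullet subdivides internally if frame times vary.
        for (int frame = 0; frame < 600; ++frame)
            world.stepSimulation(1.0f / 60.0f);
        return 0;
    }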
 