Effects Physics & Gameplay Physics Explored @ [H]

FrgMstr

Effects Physics & Gameplay Physics Explored - Confused about effects physics vs. gameplay physics? We have asked ATI, NVIDIA, Havok, and Ageia about the two. We received in-depth responses providing great detail regarding effects and gameplay physics, what they mean for gameplay, and current hardware physics abilities.

"After digesting all of these great replies from the players in the physics market, here is what we come up with. There are big definitional differences between effects physics and gameplay physics as you might have expected, but much of the forum banter you have seen people prop up as fact, is just simply bad information at best, and lies at worst."

Please Digg our article if you would like to share.
 
So ATi's solution does both kinds of physics? That is some sweet news.
Nice article, but I am not making another comment in here because it's bound to get ugly.
 
That clears up A LOT of puzzling information about GPUs! It ties into a thread I posted a while back showing benchmarks from the Stanford folding project, which uses GPU shaders to perform its simulations.

I had asked the question: if Stanford can send simulation data to the GPU, have it perform the calculations, and get the data back with such an extreme level of performance, why can't ATI/nVidia do it with physics calculations?

Great to hear that GPUs can indeed do gameplay physics.

GREAT ARTICLE!
 
psychoace, as the article states, ATI's and NV's current GPUs and Ageia's PhysX processor are all hardware capable of both effects and gameplay physics acceleration.

(HardOCP) - GPUs allow effect physics right now, but technically speaking would it be possible to also use them to accelerate gameplay physics now, or in the future?

(Godfrey Cheng) - Absolutely. As I described above, the actual calculation of both types of physics is identical and from a technical point of view, both can be processed on ATI’s GPUs today.

http://enthusiast.hardocp.com/article.html?art=MTA5NywyLCxoZW50aHVzaWFzdA==

(HardOCP) - GPUs allow effect physics right now, but technically speaking would it be possible to also use them to accelerate gameplay physics now, or in the future?

(Chris Seitz) - GPUs accelerate the simulation and calculation of rigid body collisions. Because the crux of the simulation can be cast into a data-parallel problem which maps extremely well into shader code, GPUs are perfectly capable of quickly and efficiently performing this processing. There is no distinction made at the GPU level whether the objects being collided are effects objects or gameplay objects.

http://enthusiast.hardocp.com/article.html?art=MTA5NywzLCxoZW50aHVzaWFzdA==
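To make the "data-parallel" point concrete, here is a minimal sketch (not from the article; all names are illustrative) of a brute-force sphere-overlap test. Every pair is independent of every other pair, which is exactly the property that lets each test run as its own shader thread:

```cpp
#include <cstddef>
#include <utility>
#include <vector>

// Minimal sketch of why rigid-body collision maps to data-parallel code:
// every sphere pair is tested independently of every other pair.
struct Sphere { float x, y, z, r; };

bool overlaps(const Sphere& a, const Sphere& b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    float rs = a.r + b.r;
    return dx * dx + dy * dy + dz * dz <= rs * rs; // squared distance, no sqrt
}

// On the CPU this runs serially; on a GPU each (i, j) pair would be one
// shader thread, which is the mapping the quote describes. Note the loop
// never cares whether a sphere is an "effects" or a "gameplay" object.
std::vector<std::pair<std::size_t, std::size_t>>
findContacts(const std::vector<Sphere>& s) {
    std::vector<std::pair<std::size_t, std::size_t>> contacts;
    for (std::size_t i = 0; i < s.size(); ++i)
        for (std::size_t j = i + 1; j < s.size(); ++j)
            if (overlaps(s[i], s[j])) contacts.push_back({i, j});
    return contacts;
}
```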

There is a common misconception that GPUs can't do gameplay physics acceleration. Now that you have the facts, well, there you go.

I urge you to read the rest though in detail as it is explained why developers are focusing on effects physics only at this particular time. The benefits that effects physics acceleration can bring to the gaming experience are also discussed.
 
Would it be possible for a developer to build a game, then release a patch that would allow those with the necessary hardware to experience gameplay physics, at least in single player? The game would probably have to be built in a way that makes such a patch possible, of course.

In multiplayer, there could be servers with and without the gameplay physics patch.
 
I will have to disagree, as the article clearly refutes the claim that GPUs can't do gameplay physics, which a few members here have been quite adamant about.
 
psychoace said:
So ATi's solution does both kinds of physics? That is some sweet news.
No offense, but that has been flying around this forum since the start, and I always suspected that you didn't read up before posting.

Yes, the GPU can do effects physics. Yes, you will lose a lot of normal GPU performance, because the shader units aren't rendering 3D graphics while they're simulating physics.

ivzk said:
I will have to disagree, as the article clearly refutes the claim that GPUs can't do gameplay physics, which a few members here have been quite adamant about.
GPUs can do gameplay physics, nothing new there. On a similar note: my old 900MHz AMD can also run Oblivion... oh, I guess people like 3D performance?
 
Ageia has hardware that can do both, as do ATI and Nvidia, and since the only thing holding back the implementation of gameplay physics is a developer with deep pockets, I say we all petition John Carmack to "get it done" and have the four physics players bankroll it. :cool:
 
Now I'm really ready to see what the next-gen graphics cards and games will bring to the table. :)
 
Keep this discussion on topic. If you want to discuss something personal with another member, take it to PM or you will be permabanned.
 
Hey Terra...

Told you so..:p *kidding*

I think this article solidifies my stance on the matter. Then again, I had already asked these questions (to Chris from ATi).

GPUs can handle both effects and gameplay physics. And ATi's X1K series, due to its dedicated branching unit, can do so better than nVIDIA's 7x00 series, and in the case of the X1900XT/XTX possibly faster than Ageia's PPU.

BUT!!

G80 will also bring a dedicated branching unit. I'm fairly certain of this, as it would be like shooting oneself in the foot not to incorporate such an IMPORTANT feature.

BTW, ATi's X1K won't suffer from the issues of data being read back; that's why they're pushing their CrossFire setup with a third card. The physics will be calculated (by the VPU and dedicated branching unit) and then the data will be displayed.

Why? Because ATi's implementation will have its own API (which apparently will be compatible with Microsoft's own de facto API). ATi not only supports HavokFX but also their own API for physics.
 
Great article! What I really got out of this is one thing: hardware physics is definitely in its infancy. We have nothing to show off what the hardware can do, what GPU physics is capable of, benchmarks, etc. It simply doesn't exist yet. Much will change before we all have some sort of hardware physics processing, whatever it may be. If something doesn't work, you can bet each company will change it. For example, we all know that GeForce 7 cards cannot do HDR and AA at the same time... but then HL2: Episode One was released. So even IF GPU physics were effects physics only (which it's not), you can bet they'd find a way to change it.

Personally, I feel that in a year or so all physics hardware will be supported through DX. It only makes sense to have some sort of unified API to take care of it. Maybe there will be an alternative, as there is with sound and visuals (DirectX/OpenGL and DirectSound/OpenAL), and Ageia could very well introduce something along those lines. There's nothing else right now or on the horizon that we know of, so it would make sense. Maybe they'll try something along the lines of what 3dfx did with their miniGL driver...

Either way, can't wait to see how this pans out in a year or two!
 
The one thing about gameplay physics that has me excited is that there can be more than one way to enter an area, instead of only what's scripted. That idea in itself would make for a great feature in any of the counter-terrorist games.
 
ElMoIsEviL said:
Hey Terra...

Told you so..:p *kidding*

I think this article solidifies my stance on the matter.
Which stance? The one where you cite your experience with Ageia's physics solution (you claim to have seen it all), or the one where you believe Micro$oft and ATI have a joint venture in the field?

ElMoIsEviL said:
I've seen what Ageia is capable off. Pretty much nothing.


So is it this one, the obvious lie?
ElMoIsEviL said:
Microsoft/ATi joint API to be included in DirectX (All physics)
Or is it this one, the speculative (borderline illegal) stance?
 
I miss one player here: MS, with its DirectX physics plans, up against HavokFX and PhysX.

I also miss a lot on the API side. The article speaks more generally and is aimed more at the hardware vendors. The only API vendor directly asked is Havok, and even then there is more talk about the GPU than about their API.

Interesting point: no gameplay physics now! It might come a bit later, once the market is seeded with lots of physics-capable cards and a large crop of devs goes for it. Although the basis is there, for now they are pushing effects physics only, ATI & nV too. They are running behind Ageia, which has a market presence and has gameplay physics available to devs now. It's a chicken-and-egg problem between devs, hardware, and API vendors, so Ageia takes the easy way and gives devs the choice.

Ageia's PPU is just starting out and offers both gameplay physics and effects physics now. It is waiting for a dev, a few of those 100 supporting titles, to implement this for a still-small PhysX-enabled audience. CellFactor could be one; it would be a full game. That's the plan.

It could be that by the time Havok's GPU solution unfolds, there will already be a gameplay physics game out, when the HavokFX titles start coming.

It could mean the use of accelerated gameplay physics is closer than expected; Ageia is pushing it. We're just waiting for a gameplay physics game. I wonder if such a title will come in reasonable time.

Although effects physics is much wanted too.

GRAW is just a slapped-on PhysX solution. The in-game physics on the CPU is Havok; the extra Ageia PhysX runs on top of that. It isn't native Ageia PhysX but a crude, rushed first attempt; it's a rushed game. I do like the physics Havok does in the game, with some strange behavior! And the little that PhysX adds with the P1 makes it a tad better. I played GRAW first without it and later with it; I like it without it, and with it a bit more. :)

I read a long time ago that HavokFX only supports effects physics, which is currently true. It's bad practice to draw conclusions about GPUs from that, but I think many did. So GPUs can do it, just not with the first HavokFX. Games using HavokFX as their GPU physics solution can do effects physics only.

Ageia is now more dependent on a killer app. But for the first hardware-accelerated games you need a P1, because the HavokFX ones come later.

Any known release dates for HavokFX games?
 
(HardOCP) - Do effects physics put burden on the GPU lowering 3D graphics performance having to now render more objects on the screen?

(Nvidia) - [after thoroughly confusing everyone with technological PR] "yes, the shader unit is not “rendering 3D graphics” while it’s simulating physics."

Did I forget to mention the diminishing returns? The second card will not be running at 100%. Nvidia describes the load as "fairly light." What a waste of clock cycles.
 
cyks said:
(HardOCP) - Do effects physics put burden on the GPU lowering 3D graphics performance having to now render more objects on the screen?

(Nvidia) - [after thoroughly confusing everyone with technological PR] "yes, the shader unit is not “rendering 3D graphics” while it’s simulating physics."

Did I forget to mention the diminishing returns? The second card will not be running at 100%. Nvidia describes the load as "fairly light." What a waste of clock cycles.

So then Nvidia's solution acts exactly like a PPU, in that it's not rendering anything on screen, so it won't provide any speed enhancements to your game. It will technically make your GPU a PPU, since a PPU doesn't do graphics. Also, in a multi-GPU setup you will lose nothing in fps, since that quote was directed at a single-card solution, where the same card handles both GPU and PPU functionality. A multi-GPU setup will cut that bottleneck away entirely and should perform well enough to be comparable to Ageia's solution, or even better. I also like how your "quote" includes your own personal opinion about the information stated after it. Nice PR move there.

Here is the quote from Ageia about performance, without the PR BS before it:

"...if the game either introduces additional physics and graphics elements, or extra AI, etc., when the PhysX card is present, then it is possible that the 3D graphics burden would increase"
 
Interesting article. I found the "1000 objects at 300 FPS" comment interesting (I don't remember who made it), but I think it might be a bit misleading. I wonder what counts as an object: could the entire immovable, 'indestructible' part of a map in an FPS be considered one object, or is each individual component (down to the individual geometric primitives) modelled separately?

It will be interesting to see how game developers handle the flexibility that fully-destructible worlds would give gamers. I know that they test the heck out of their maps to find and eliminate potential exploits, and to keep the game fair. Everything from the firing arcs of gun emplacements to the width of a chokepoint to the lighting of an alcove to the speed of doors opening is painstakingly tweaked to keep the map playable and fun. The Assault maps in UT2004 come to mind in this respect. Now, all of a sudden, all of those excruciatingly-tweaked objects......can simply be blown out of the way. That deathmatch level sprawled across 5 floors of an abandoned warehouse? After about 2 minutes of rocket-spamming, it becomes a 1-story trash heap.

A map with a total physics simulation means infinite player flexibility. That means that in a very short time, a lot of people are going to come up with a lot of very nasty exploits that can ruin the playability of a map. Example: Player blows a foxhole for himself in the ground, and uses the rubble to make a very nice barricade, from behind which he/she has excellent visibility, and can snipe others at will, while being nearly totally protected.

So how can developers handle that? Here are a few ideas:
1) Make the level, test it the best you can, and hope that people don't find exploits that are so bad that it ruins the level
2) Assume that players will find exploits, and try to mitigate them as much as possible, say with piles of rubble that are technically destructible but which would take unreasonable amounts of ammunition to productively move.
3) Limit the full physics simulation to certain parts of the map. Call some walls or pillars or floors or stairs "too thick/strong to destroy" (see the sketch just below)
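As a rough sketch of option 3 (hypothetical names, nothing from a real engine), the game could tag each piece of geometry with a damage budget and a designer-set structural flag:

```cpp
// Hypothetical sketch of option 3: designers mark load-bearing geometry
// as indestructible while ordinary clutter keeps a finite damage budget.
struct MapObject {
    float hitPoints;   // remaining damage the object can absorb
    bool  structural;  // true for walls/pillars declared "too strong"
};

// Returns true when the object should break apart and hand its pieces
// over to the physics simulation.
bool applyDamage(MapObject& obj, float amount) {
    if (obj.structural) return false;  // exploit-proof by designer fiat
    obj.hitPoints -= amount;
    return obj.hitPoints <= 0.0f;
}
```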

I know that Red Faction introduced fully-destructible environments several years ago, but never had a chance to play the game. Was it still playable, or were there artificial limits to what could be destroyed?

All reservations aside, there are a lot of cool things you could do with this. Don't want to take the elevator up to the top of the tower where the enemy's flag is? No problem--just hammer the tower itself until it falls over (taking those campers along with it).
 
So, lemme get this straight:

Havok/GPU physics have the ability to support both types of physics, but the software currently only supports effects.

Ageia/PhysX has the hardware and software to do both types of physics, but it doesn't matter, since current implementations are all effects physics anyway.

Also, typo here (bolded word is missing):

The Bottom Line

There is a difference between effects and gameplay physics and having dedicated hardware acceleration for one or the other or both can lead to an improved gameplay experience. The current GPUs and PhysX processor are capable of both effects and gameplay physics acceleration although gameplay physics are not being taken advantage of.
 
Mohonri said:
I know that Red Faction introduced fully-destructible environments several years ago, but never had a chance to play the game. Was it still playable, or were there artificial limits to what could be destroyed?
Nothing is without limits, but you could blast huge tunnels through rock.

One of the test levels that I enjoyed playing around in was a giant, crappy block of an empty room with a glass building in the center.

I'd blast my way in through the side of the solid rock material, blast a tunnel up the outside of the world and over the top of the level, and come back down to land on top of the glass house :)

Pretty crazy. However, after some thousands of explosions and reusing the give-all-ammo code enough times... you will reach a point where you can no longer blast away at the terrain.
 
InorganicMatter said:
So, lemme get this straight:

Havok/GPU physics do have the ability to support both types of physics, but we currently have implementations of neither.

Ageia/PhysX does have the ability to do both types of physics, but we have spotty implementations of both, and aside from the single game, CellFactor, only effects physics implementations announced for future titles.
Fixed!

nVidia said that Havok doesn't have the software support for gameplay physics, but Havok says they do; they're just not promoting it until devs want to use it:

But – we do provide all simulation data back to the game/CPU, for use, if desired. That is probably a current misconception of about how Havok FX works – it can technically affect game-play physics. And the fact that we are running today on a GPU attached via PCIe means we have an even greater speed potential on read-back than non-PCIe devices. However, for all the reasons I provided above, we do not strongly promote FX for game-play uses
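To picture the read-back path Havok is describing, here is a toy sketch; uploadToGpu, runSimulationPass, and readBackOverPcie are illustrative stand-ins, not real Havok FX calls:

```cpp
#include <vector>

// Illustrative stand-ins only, not the Havok FX API: the point is that
// effects physics can stay GPU-resident, while gameplay physics needs
// its results copied back across PCIe so game code can act on them.
struct BodyState { float pos[3]; float vel[3]; };

static std::vector<BodyState> gpuBuffer; // pretend this lives in video memory

void uploadToGpu(const std::vector<BodyState>& bodies) { gpuBuffer = bodies; }

void runSimulationPass(float dt) {
    for (auto& b : gpuBuffer)                 // each body = one shader thread
        for (int i = 0; i < 3; ++i)
            b.pos[i] += b.vel[i] * dt;        // trivial integration step
}

// The step Havok is talking about: optional for eye candy, mandatory the
// moment the results have to influence hit detection, AI, or scoring.
std::vector<BodyState> readBackOverPcie() { return gpuBuffer; }
```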

It all comes down to hardware-accelerated physics being in its infancy at this point, and it will probably be years before it's worth spending money on hardware for this. Once everything is unified into one OpenPhysics- or DirectPhysics-type standard, things will be a lot clearer.
 
cyks said:
(HardOCP) - Do effects physics put burden on the GPU lowering 3D graphics performance having to now render more objects on the screen?

(Nvidia) - [after thoroughly confusing everyone with technological PR] "yes, the shader unit is not “rendering 3D graphics” while it’s simulating physics."

Did I forget to mention the diminishing returns? The second card will not be running at 100%. Nvidia describes the load as "fairly light." What a waste of clock cycles.

Although I do currently enjoy the Nvidia product, I must say I was pleased with the response ATI gave. It was concise and understandable. The Nvidia letter came across as a "baffle them with bullshit" statement. I must admit I am a bit disappointed.

For the record, I will be going with dedicated hardware in whatever refined form presents itself.
 
Thank you Kyle and [H]ardOCP for putting this together. Hopefully, this will eliminate some of the tiresome ranting back and forth we've been seeing.
 
That is an interesting article to read, but it basically comes down to a collection of PR guys doing their best to give confusing answers when it comes to their products' limitations, and coating their strong points with as much sugar and frosting as they can possibly manage. This would be kind of like trying to decide between an ATi and nVidia card by only looking at the product descriptions on each company's website, or by comparing the benchmarks printed on the side of each box while you're at a retail store. Of course both sides are going to have some jargon write-up explaining why theirs is the best and the other guy's card sucks, whether or not that reflects reality.

I just find it a bit strange that a bunch of PR fluff is somehow supposed to "clear up" the "misconceptions" that ordinary people have formed via factual arguments on the forum. While certainly everyone has bias, I doubt anyone here has as much bias as a PR guy who works for the company. Quite simply put, public relations banter is the antithesis of knowledge and understanding; given that, I'm not sure what exactly I'm supposed to come away from this article with.
 
I thought it was a good article, and it helped everyone understand that physics is coming but really isn't ready for prime time. I thought ATI's and Nvidia's responses were very interesting: one thinks they have the shader horsepower right now, and one shied away from the question with PR BS.

In the end, we still have some time before we really know if $300 on dedicated hardware is worth it for something some GPUs have no problem with today.
 
GotNoRice said:
That is an interesting article to read, but it basically comes down to a collection of PR guys doing their best to give confusing answers when it comes to their products' limitations, and coating their strong points with as much sugar and frosting as they can possibly manage. This would be kind of like trying to decide between an ATi and nVidia card by only looking at the product descriptions on each company's website, or by comparing the benchmarks printed on the side of each box while you're at a retail store. Of course both sides are going to have some jargon write-up explaining why theirs is the best and the other guy's card sucks, whether or not that reflects reality.

I just find it a bit strange that a bunch of PR fluff is somehow supposed to "clear up" the "misconceptions" that ordinary people have formed via factual arguments on the forum. While certainly everyone has bias, I doubt anyone here has as much bias as a PR guy who works for the company. Quite simply put, public relations banter is the antithesis of knowledge and understanding; given that, I'm not sure what exactly I'm supposed to come away from this article with.

Wow! This is the first time I have seen anything but PR fluff from these companies about this technology. Without working games, I am unsure how much more information you can get about it, as it will be tied to the applications.

Sorry you are not satisfied with some well-laid-out facts about the technology. I really have to think you did not read it too well. But hey, to each his own.
 
jacuzz1 said:
Although I do currently enjoy the Nvidia product, I must say I was pleased with the response ATI gave. It was concise and understandable. The Nvidia letter came across as a "baffle them with bullshit" statement. I must admit I am a bit disappointed.

For the record, I will be going with dedicated hardware in whatever refined form presents itself.

NVIDIA's answers were a bit myopic, but as has been pointed out above, he finally got down to the answer that we all wanted to hear.
 
InorganicMatter said:
So, lemme get this straight:

Havok/GPU physics have the ability to support both types of physics, but the software currently only supports effects.

Ageia/PhysX has the hardware and software to do both types of physics, but it doesn't matter, since current implementations are all effects physics anyway.

Also, typo here (bolded word is missing):

No, the software is the games. You don't have to use HavokFX for GPUs; that is just one physics engine package that has been announced. In fact, ATI is working with game developers to be able to write directly at the hardware level, so they would be able to use their own physics engines.

If using HavokFX, though, right now it only supports acceleration of effects physics on GPUs, but if you read on you'll see they do plan to expand when the time is right and developers ask for it. Do not underestimate the potential that effects physics can have on games; as I said in the article, these are the physics effects that you "see".

So the PhysX processor and GPUs are capable of effects and gameplay physics (the hardware doesn't know the difference). It all comes down to the games and how the game developer leverages effects and/or gameplay physics acceleration on said hardware.
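One way to picture that last point (a hypothetical sketch, not vendor code): the solver treats every object identically, and the effects-versus-gameplay split only appears in what the game does with the results afterward.

```cpp
// The solver integrates every object the same way; only the game decides
// whether a result feeds back into game state. Hypothetical names.
enum class Role { Effects, Gameplay };

struct SimObject {
    Role  role;
    float pos[3];
};

struct GameState {
    void notifyMoved(const SimObject&) { /* update paths, damage, score */ }
};

void afterSimulationStep(SimObject& obj, GameState& game) {
    // Effects objects are only drawn; the game never queries them.
    // Gameplay objects push their new state back into game logic.
    if (obj.role == Role::Gameplay)
        game.notifyMoved(obj);
}
```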
 
Thanks for the info! I was hoping to hear something like this...

It doesn't change much (at present)... but it's good to know. I figured that at least ATI's GPU was capable of full physics because of all the scalability and future-proofing they claimed to have done with the X1K series, but all I had heard about was the Havok effects physics. It's still (developmentally) significantly behind Ageia...
 
nhusby said:
Thanks for the info! I was hoping to hear something like this...

It doesn't change much (at present)... but it's good to know. I figured that at least ATI's GPU was capable of full physics because of all the scalability and future-proofing they claimed to have done with the X1K series, but all I had heard about was the Havok effects physics. It's still (developmentally) significantly behind Ageia...

I think the point is that for right now it doesn't matter. The trend will be to start with effects physics. Gameplay physics acceleration will follow later, once they work out all the problems associated with hardware-accelerating it with a dedicated processor in games. I think you can look forward to CPUs handling gameplay physics for now; hopefully games can leverage more out of the new dual-core CPUs in that regard.
 
A really cool follow-up to this article would be how actual game developers feel about this stuff. As a couple of posters mentioned above, the ability to blast everything into rubble would make balancing levels extremely difficult (for both single-player and multiplayer). Maybe that is why devs are only using effects physics.
 
MH Knights said:
A really cool follow-up to this article would be how actual game developers feel about this stuff.

Yes, that would be very interesting: id Software, Valve, Epic, and others.
 
I'd just like to thank [H] for this great and informative piece you've put online. Straight to the point, no hassle :)

Good work
 
GotNoRice said:
That is an interesting article to read, but it basically comes down to a collection of PR guys doing their best to give confusing answers when it comes to their products' limitations, and coating their strong points with as much sugar and frosting as they can possibly manage. This would be kind of like trying to decide between an ATi and nVidia card by only looking at the product descriptions on each company's website, or by comparing the benchmarks printed on the side of each box while you're at a retail store. Of course both sides are going to have some jargon write-up explaining why theirs is the best and the other guy's card sucks, whether or not that reflects reality.

I just find it a bit strange that a bunch of PR fluff is somehow supposed to "clear up" the "misconceptions" that ordinary people have formed via factual arguments on the forum. While certainly everyone has bias, I doubt anyone here has as much bias as a PR guy who works for the company. Quite simply put, public relations banter is the antithesis of knowledge and understanding; given that, I'm not sure what exactly I'm supposed to come away from this article with.

I wholeheartedly disagree. Sure, there is a little PR fluff added in, but that is understandable and not in your face. Overall, most of the information is hard technical fact that clears up and answers many things, such as: 1) what exactly effects and gameplay physics are, 2) what real-world benefits both bring to the gaming experience, and 3) what the hardware and software currently support.
 
SlamDunk said:
I'd just like to thank [H] for this great and informative piece you've put online. Straight to the point, no hassle :)

Good work

Thanks, and you're welcome, everyone. Our goal is to put forth the facts on subjects such as these so that no misinformation is being slung around out there.
 
Though the GPU manufacturers tout and rave that they support physics processing within the GPU, there is an underlying question that hasn't been addressed. As Godfrey Cheng of ATI mentioned, the GPU can determine things like the weight of a wall and whether you can break it down to continue on.

All that being said, here is something to think about: are these effects/gameplay physics calculations being performed because of the user, or is it something scripted in by the devs to make it "LOOK" like the user performed the action?
 
It wouldn't be physics if it were scripted. They could easily animate it beforehand and use less CPU by just playing that animation, instead of having predetermined rubble fall because of a trigger (be it you walking to a certain spot, though I guess that is also the user triggering it). They are talking about full back-and-forth interaction with the CPU, so I can't see them wasting that on scripted junk.
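In code terms, the contrast might look like this (hypothetical names): a scripted collapse plays a canned result, while a simulated one derives the outcome from what the player actually did.

```cpp
// Hypothetical contrast between scripted and simulated destruction.
struct Impact { float point[3]; float force; };
struct Wall   { float integrity; };

// Stand-in for handing fragments over to the physics solver.
void spawnPhysicsDebris(const Wall&, const Impact&) { /* hand to the solver */ }

void onWallHit(Wall& wall, const Impact& hit) {
    wall.integrity -= hit.force;        // outcome depends on the player's hit
    if (wall.integrity <= 0.0f)
        spawnPhysicsDebris(wall, hit);  // debris comes from the simulation,
                                        // not from a pre-baked animation
}
```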
 
Brent_Justice said:
No, the software is the games. You don't have to use HavokFX for GPUs; that is just one physics engine package that has been announced. In fact, ATI is working with game developers to be able to write directly at the hardware level, so they would be able to use their own physics engines.
Not sure about this. HavokFX is built on Havok, a high-level physics API; PhysX is a high-level API built on the NovodeX engine. Both have existed for a long time. ATI mentions its own physics API, but that could be a low-level, hardware-abstraction API: a driver plus low-level API, set up in an open way with no limits on genre and thus no forced direction, but with more work and more in-house optimization left to the game developer on the high-level physics side. Think of a low-level game library like TrueVision3D, which is not so genre-bound, versus the more FPS-aimed Torque engine, or DirectX's immediate mode versus retained mode.
The already-available physics APIs have had a long head start, and ATI would have to build a high-level physics API from scratch: much work to do. So I suspect AAA titles would go for the specialist, direct-to-the-metal approach to stand out from the crowd of games, since more is possible and they have more of that precious time, while clones and projects on a tight schedule, the larger time-restricted part of the dev market, would go for a high-level approach. If ATI is going for a full high-level API, it will take time, because they are running behind. Or have they acquired a physics engine firm, so that their high-level solution is based on a third party's groundwork?

I wonder?
If using HavokFX, though, right now it only supports acceleration of effects physics on GPUs, but if you read on you'll see they do plan to expand when the time is right and developers ask for it. Do not underestimate the potential that effects physics can have on games; as I said in the article, these are the physics effects that you "see".
I think I would appreciate effects physics too. It would be great to slide every effect option to its max with two R600s, one just for the eye-candy physics and one for rendering. I wonder what my X1800XL can do.
So the PhysX processor and GPUs are capable of effects and gameplay physics (the hardware doesn't know the difference). It all comes down to the games and how the game developer leverages effects and/or gameplay physics acceleration on said hardware.
Yes, they mention that they are the same. The problem: effects physics can easily be done client-side, whereas gameplay physics must be synced to all players. Plus, how do you deal with non-accelerated players versus accelerated ones while the accelerated share is low? Split servers? Split the online community?
Although online play is important, gameplay physics can be done without any problem in the single-player part of a game. Many players like a strong story, most story-driven games are single-player, and co-op is a rare feature, so there is a greater chance devs will go for it there.

Hopefully some of those 100+ games due around the end of '06 or the beginning of '07 will be gameplay-physics enabled, and, very importantly, hopefully one of them will put it to good use.

That will push the ATI camp to move a bit faster with their gameplay physics.

Because games compete with each other too.
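A quick sketch of why that sync problem only bites gameplay physics (hypothetical names, assuming a server-authoritative design): effects objects can diverge per client, but anything that can change the outcome has to be replicated from one authoritative source.

```cpp
#include <cstdint>
#include <vector>

// Hypothetical server-authoritative snapshot: gameplay objects must match
// on every machine, so only they are replicated; sparks and debris that
// cannot affect the outcome stay client-side.
struct NetObject {
    std::uint32_t id;
    bool          affectsGameplay;
    float         pos[3];
};

std::vector<NetObject> buildSnapshot(const std::vector<NetObject>& world) {
    std::vector<NetObject> snapshot;
    for (const auto& obj : world)
        if (obj.affectsGameplay)      // e.g. rubble that can block a path
            snapshot.push_back(obj);
    return snapshot;
}
```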
 