Nvidia killed the PPU star

Jason711 said:
Oh, I forgot... FEAR will be the last game ever made... my bad.

Or that game-engines can be rewritten or patched...

Terra - We know that never happens, right? ;)
 
Terra said:
Or that game-engines can be rewritten or patched...

Terra - We know that never happens, right? ;)
That may well be possible, but even if they do this I think the benefit will be marginal at best.

The CPU's are perfectly capable of handling the game physics we have now. The environments are going to have to become A LOT more interactive, with a lot more interaction between objects, before PPUs are effective. But at the same time, the PPU has to exist to make this greater implementation of physics possible, so that game developers have a platform to develop on.

Kind of a chicken-and-egg scenario.
 
Not only that, but putting physics on certain things such as water, or completely destructible objects with massive amounts of shrapnel, gives developers something to work on while the PPU is all but nonexistent. Not to mention that if the PPU never makes it to market, you still have a platform from NV or ATI to take advantage of, so development will not go to waste.
 
I like the idea of the separate physics engine and profiles in games... I would bet that by the time this is a reality, you'll be able to adjust physics effects in the game setup just like you do graphics effects, to suit your GPU's power or lack thereof.

My fear... I have a 7800GTX SLI setup right now, and in some areas of games like COD2 my system chokes a bit. I can't imagine sacrificing one 7800 to do the physics; the graphics would suffer terribly, and my system ain't wimpy.

I'm glad this is happening; it gives SLI some teeth and competition for Ageia.
 
ironforge said:
Why not just utilize dual core CPU's for the same effect?
Thing is... you CAN, but that doesn't make it ideal, nor does it mean it is as effective as a standalone PPU or a GPPU (or whatever you wanna call it). Since we are on that question: why not use dual-core CPUs to render graphics? As with physics, you CAN do it, but it could never perform at the same level as a discrete solution. Just because you can carry a huge rock on your back doesn't mean the wheelbarrow is useless, nah mean?
 
I think the main problem facing the PPU, in general, is that it cannot be incrementally deployed and still function as a core gameplay element. You can't, for example, have any PPU accelerated physics affect obstacles, movement, enemies, weapons, etc, because in multiplayer scenarios, not everyone will have a dedicated PPU. Core gameplay elements that affect health, movement, weapons, etc. MUST be the same across all systems so no one has an unfair advantage. This fact alone will relegate PPU to eye-candy background physics effects that really don't functionally change gameplay mechanics -- It greatly limits the possibilities of "more interactive environments". I think these background effects, while interesting, won't justify the purchase of a $300 dedicated processor.

GPU based physics, on the other hand, can be justified by lumping physics computing power in as additional "eye candy" power that people already pay for, just supplying a better *looking* gameplay experience overall. In this case, you won't run into core gameplay differences across GPUs, AND users will have multiple uses out of their new GPU. They will think, "this NVIDIA card will make the gameplay experience that much better overall, I can run on my big monitor at high framerates with lots of pretty stuff on the screen (both physics and shading)", from ONE product.. bill it as a "general gameplay enhancement product".
 
There are game devs currently developing for and/or accounting for the PhysX PPU. I'm not too worried about it. Of course it's going to take time to mature. How long was it before games and GPUs were coded to properly take advantage of transform and lighting? Just give it a while and it should settle down into something we can all enjoy, one way or another.

me thinks, anyway. :D
 
You can't, for example, have any PPU-accelerated physics affect obstacles, movement, enemies, weapons, etc, because in multiplayer scenarios, not everyone will have a dedicated PPU.

Correct me if I'm wrong, but the servers are the ones that need to calculate the physics. It won't really matter if a client has a PPU or not, because the physics in a multiplayer game is controlled by the server you are playing on. The limitation for complex physics would be the internet bandwidth.
 
Sniper X said:
Correct me if I'm wrong, but the servers are the ones that need to calculate the physics. It won't really matter if a client has a PPU or not, because the physics in a multiplayer game is controlled by the server you are playing on. The limitation for complex physics would be the internet bandwidth.

Actually that is incorrect, there would be an ungodly amount of bandwidth used if servers updated every game object to every player every frame. In fact, the server just synchronizes state to the clients at level load (object locations, random seeds, etc), then the client machines each deterministically calculate all the physics on their own. Games minimize state updates as much as possible, usually only sending deltas for player movement, player actions, periodic sync information, etc.
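A rough back-of-envelope calculation shows why broadcasting full physics state is hopeless. All the figures here (object count, bytes per object, tick rate, player count) are illustrative assumptions, not measurements from any real game:

```python
# Back-of-envelope: server upload cost of streaming full physics state
# to every client each tick, versus sending only a handful of deltas.
# All numbers are illustrative assumptions, not real game measurements.

def full_state_bandwidth(objects, bytes_per_object, tick_rate, clients):
    """Bytes/sec the server must upload if it broadcasts every object each tick."""
    return objects * bytes_per_object * tick_rate * clients

# 5,000 physics objects, ~24 bytes each (quantized position + velocity +
# orientation), 30 ticks/sec, 16 players:
naive = full_state_bandwidth(5_000, 24, 30, 16)
print(f"full state: {naive / 1e6:.1f} MB/s")   # tens of MB/s of server upload

# Delta approach: only ~32 changed entities per tick actually get sent.
delta = full_state_bandwidth(32, 24, 30, 16)
print(f"deltas:     {delta / 1e3:.1f} KB/s")   # a few hundred KB/s for all 16 clients combined
```

That is roughly a 150x difference, which is why games ship seeds and initial state once and then send only deltas.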
 
In fact, the server just synchronizes state to the clients at level load (object locations, random seeds, etc), then the client machines each deterministically calculate all the physics on their own.

I have run a dedicated CSS server with a couple of bots, and when a bomb explodes or a frag goes off, it usually results in stuff flying and the server's CPU being taxed a good deal. It seems to me that the server is calculating the physics.

Edit: http://www.hl2world.com/wiki/index.php/Client_Side_Physics#Server_Side <---- Talks about physics
 
Sniper X said:
I have run a dedicated CSS server with a couple of bots, and when a bomb explodes or a frag goes off, it usually results in stuff flying and the server's CPU being taxed a good deal. It seems to me that the server is calculating the physics.

Let me be a little more specific: 99% of the physics you see in a game is client-side. There are a few exceptions where the server handles the sync-up between multiple physics-enabled objects *from different clients* interacting with each other, but this is extremely limited, and only used when, say, someone throws a barrel at you in HL2 -- the server decides if you hit or miss the person here, because each client has a slightly different version of the game state at any given time.

However, after this hit is decided, the bouncing of the barrel, other non-player objects it hits, non-player interactions, environmental effects, etc. are all simulated client side. It is just not possible to update all the objects in the world to every player at every frame -- most of the time, players can only see a subset of the objects anyway, it would be impossible for a server to keep track of all flying objects in an entire level at once.

Real-time physics is only possible because the calculations are confined to the scope of what the player can see and interact with, your computer doesn't care about some cup falling over on the other side of the level. Server side physics are in limited use *only* when two players need to see exactly the same state for a specific object, which is relatively rare when compared to overall environmental interactions and physics calculations.

Your CSS example taxes the CPU because the bots are clients themselves, and the *clients* are trying to figure out if they have been affected by the movement of objects. I 100% guarantee that as in-game physics become more complex, optimizations will be made so that the server handles even less physics than now (i.e. the bare minimum; they will probably even move to low-accuracy models for the few server-side physics calculations), since the server must keep track of the *whole* level at once, and complex physics are $$$ on resources.

EDIT: Your link just proves my point exactly
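The "synchronize seeds and state at level load, then simulate independently" idea described above can be sketched as a toy 1-D simulation (this is an illustration of the concept, not any real engine's code):

```python
import random

# Toy illustration of deterministic client-side physics: the server hands
# every client the same random seed and initial state at level load; each
# client then steps its own simulation with a fixed timestep. Because the
# integration is deterministic, all clients agree without per-frame updates.

def simulate(seed, steps=100):
    rng = random.Random(seed)          # per-client RNG, seeded by the server
    pos, vel = 0.0, 1.0
    for _ in range(steps):
        vel += rng.uniform(-0.1, 0.1)  # "random" forces are reproducible
        pos += vel * 0.016             # fixed 16 ms timestep
    return pos

client_a = simulate(seed=42)
client_b = simulate(seed=42)
assert client_a == client_b  # identical state with zero in-game network traffic
```

The fixed timestep matters: if each client integrated with its own frame time, the results would drift apart and the trick would fail.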
 
Isn't a PCI slot limited to 33MHz? I don't know about you, but I have a feeling that a physics processor, which must compute the physics in real time while staying in sync with the video card and CPU, is going to have a lot of trouble on a PCI bus.

And chances are, there won't be room for 2 cards in SLI and a PPU on the PCI-e bus...
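For reference, the theoretical peak numbers behind that worry are easy to work out (these are the standard spec figures; real-world throughput is lower):

```python
# Rough theoretical peak bandwidth of the buses being compared.
# Spec numbers only; sustained real-world throughput is lower.

pci_bus_width_bits = 32
pci_clock_hz = 33_000_000          # classic 33 MHz, 32-bit PCI
pci_peak = pci_bus_width_bits // 8 * pci_clock_hz
print(f"PCI:      {pci_peak / 1e6:.0f} MB/s, shared by every card on the bus")

pcie_lane = 250_000_000            # PCIe 1.x: ~250 MB/s per lane, per direction
print(f"PCIe x1:  {pcie_lane / 1e6:.0f} MB/s per direction, per device")
print(f"PCIe x16: {16 * pcie_lane / 1e6:.0f} MB/s per direction")
```

So even a single PCIe lane gives a dedicated card more one-way bandwidth than the entire shared PCI bus.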
 
Talked to a nVidia rep today. He said they are doing this to just give more advantage to SLi, and that this plays an even bigger role in Quad SLi ;).
 
If I dedicate some of my 7900GT to physics, will I hypothetically see much of a drop in performance?
 
Menelmarar said:
The CPU's are perfectly capable of handling the game physics we have now.

That's the problem...
I want WAY more physics...
More than CPU's can handle...
Waaay more...
Just like I want more and more real-life looking graphics and more and more accurate sounds...

Terra - Are you content at Status Quo?
 
I don't have time to read through the whole thread to see if someone has already pointed this out, but here's my $0.02:

A dedicated PPU will likely be faster than a high end videocard at doing physics...simply by design. But probably not much faster. So in the end, I think people with high-end systems will have a dedicated PPU. However, anyone with decent videocard will get some physics eye-candy too.

As long as a dedicated PPU is faster (and probably cheaper as well) than a second video card, there will be a market for it.

It's just cool that videocards are going to support this stuff too; it will make it much easier to get it into new games when there is a much bigger installed user base.
 
Terra said:
That's the problem...
I want WAY more physics...
More than CPU's can handle...
Waaay more...
Just like I want more and more real-life looking graphics and more and more accurate sounds...

Terra - Are you content at Status Quo?
Yes, I completely agree. But you took my quote a bit out of context. My statement there was in reply to your mention of patching support for a PPU into today's games and my point was that it would grant minimal benefit to TODAY's games. It would simply be unutilized power.

The real benefit of having a dedicated PPU is yet to be seen or truly realized...
 
If what eXzite says about multiplayer physics is true, then multiplayer games like UT2007 are going to need different servers for people with PPUs and those without (at least until PPUs become part of standard gaming hardware) in order for advanced physics in multiplayer to work.
 
Jason711 said:
I'm hesitant about having a GPU trying to do both at the same time.

Well, since physics computations are slightly different from shader computations, it *might* take an extra few milliseconds to reset the GPU for physics processing and back every few frames. You'll also need quite a bit of memory to store the extra data (512MB cards are gonna be popular if this is true).

Putting the hardware physics ability on the GPU means programmers suddenly have a large market share already active in the population (even if most video cards can't use it fully).
 
eXzite said:
Actually that is incorrect, there would be an ungodly amount of bandwidth used if servers updated every game object to every player every frame. In fact, the server just synchronizes state to the clients at level load (object locations, random seeds, etc), then the client machines each deterministically calculate all the physics on their own. Games minimize state updates as much as possible, usually only sending deltas for player movement, player actions, periodic sync information, etc.

Indeed. Doom3 has physics built into the netcode. And everyone already knows what happened :rolleyes:
 
Terra said:
That's the problem...
I want WAY more physics...
More than CPU's can handle...
Waaay more...
Just like I want more and more real-life looking graphics and more and more accurate sounds...

Terra - Are you content at Status Quo?

Was anyone complaining about the lack of physics in games before Ageia showed up? No.
Was anyone saying that Half-Life 2's physics suck and they want 10x as much? No.

Ageia is solving a problem that doesn't exist.
 
forcefed said:
Was anyone complaining about the lack of physics in games before Ageia showed up? No.
Was anyone saying that Half-Life 2's physics suck and they want 10x as much? No.

Ageia is solving a problem that doesn't exist.

Your reasoning bewilders me.
When the physics part of HL2 became known, gamers rejoiced.
A more dynamic world...
Hell, I have even played around pushing drums down hills in FarCry...just because I could :D
Did anyone complain about the lack of T&L in GFX cards before they were launched?
I use a GFX card to offload my CPU (and gain performance and better IQ).
I use a soundcard to offload my CPU (and gain performance and better sound).
But you're saying that if I want to use a PPU to offload my CPU (and gain more performance and better physics), that's stupid? :confused:

And I don't see this just for FPS games...
Flight sims would benefit...lift/drag is all about physics...
Car sims would benefit...traction, grip, g-forces, collision are all about physics...

Like The Tech Report writes:
The Tech Report - NVIDIA to demo GPU-accelerated Havok FX physics

However, Havok FX is limited to what's referred to as "effect physics," or physics that don't affect gameplay. Havok prefers to keep gameplay physics on the CPU, leaving Havok FX with the physics calculations necessary for visual effects and other eye candy.

Just as DX9 hardware-rendered graphics are better/faster than software-rendered graphics, so is real-time physics better than scripted "physics"...

Terra - Games have always evolved, getting more and more "real-life"...this is just the next step...
 
forcefed said:
Was anyone complaining about the lack of physics in games before Ageia showed up? No.
Was anyone saying that Half-Life 2's physics suck and they want 10x as much? No.

Ageia is solving a problem that doesn't exist.

I was. The physics currently suck; they always have. Plus, game developers continue to avoid using circles (except Q3)... I hate things that are made to represent circles but are not circles; it looks like shit.

I want all the realism I can get, period. Sorry if you're happy with shit.
 
forcefed said:
Was anyone complaining about the lack of physics in games before Ageia showed up? No.
Was anyone saying that Half-Life 2's physics suck and they want 10x as much? No.

Ageia is solving a problem that doesn't exist.
How wrong you are. Immersion is everything to me, and I've always wanted games to have more physics stuff. I hope the snowball is finally starting to roll because, let's face it, physics in today's games (FPS/action games) is basically at the level of knocking down a few chairs here and there. That's nothing.

One thing that helps in realizing how important hardware-accelerated physics will be for the all-important immersion level in future games is to flip the comparison around: imagine what it would be like if the real world were as static, scripted and "dead" as the world of today's games. Boring, eh?

Anyway... even if AGEIA fails and fades away (which can certainly happen if there are no killer apps that really push the physics envelope and make people go WHOAH!), they have to be granted credit for evangelizing the importance of making the game world "alive". In my honest opinion, today's games are beautiful, but they really need a physics revolution. I don't care if it eventually happens with a PPU, GPU or multi-core CPU systems, but seeing it happen is a huge thing for me. According to the previews, Crysis from Crytek is a great step in the right direction.

I'm sure wonderful things will come out of this whole HW physics thing and physics will become a big selling point for the games of the future. It won't happen overnight though.
 
Yea, I think physics in today's games is pretty crappy. And lowering our graphics power is not the way to improve. We need both graphics AND physics to improve.

A CPU can handle maybe 400-1,000 objects. The initial PPUs were reported to handle 8,000 objects, moving up to 32,000 soon.

A dual-core CPU is not the place to be doing physics. Neither is a graphics card.
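Part of why those object counts bite so hard: a naive broad phase of collision detection grows with the square of the object count. Real engines use spatial partitioning to prune this, but the quadratic trend is the point:

```python
# Why jumping from ~1,000 to ~32,000 objects is so expensive: a naive
# collision broad phase must consider every pair of objects, and the
# pair count grows quadratically with the object count.

def candidate_pairs(n):
    """Number of unordered object pairs a naive broad phase must test."""
    return n * (n - 1) // 2

for n in (1_000, 8_000, 32_000):
    print(f"{n:>6} objects -> {candidate_pairs(n):>12,} pairs per frame")
```

Going from 1,000 to 32,000 objects multiplies the naive pair count by roughly a thousand, not by 32, which is why dedicated hardware (or clever culling) enters the picture.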
 
So funny that nVidia happened to find a way to use the video card's processing to its fullest potential, and all people do is bitch about how it's a bad idea and it's worse than cards that DON'T EVEN EXIST YET.
 
Trimlock said:
give a dog a bone and they bitch it wasn't big enough :(

So because we point out that NVIDIA's solution will be more expensive than Ageia's and will not give real hardcore physics, but only pseudo-fluids and other "soft" physics, we are wrong?

Terra - Is that what you are saying? :)
 
robberbaron said:
So funny that nVidia happened to find a way to use the video card's processing to its fullest potential, and all people do is bitch about how it's a bad idea and it's worse than cards that DON'T EVEN EXIST YET.

Why can't they apply the card's full potential towards graphics?
 
So because we point out that NVIDIA's solution will be more expensive than Ageia's and will not give real hardcore physics, but only pseudo-fluids and other "soft" physics, we are wrong?

Terra - Is that what you are saying?

No, I'm saying you guys are bitching over something that doesn't even affect the price. OMG HDR CUTS RESOURCES WAY DOWN JUST TO GET BETTER LIGHT!!! Screw the 6800's!!!

It's an option. I and many others are welcoming it, as it will provide a testbed for games to develop on for future use of better physics.
 
I haven't read everything, and I'm not going to pretend I understand all of it, but here's my question.

Will this Havok FX work with nVidia cards only? If I buy any ATI card now or in the future, and try to play a game that features Havok FX (say, "Hellgate: London"), will I be screwed?
 
Trimlock said:
No, I'm saying you guys are bitching over something that doesn't even affect the price. OMG HDR CUTS RESOURCES WAY DOWN JUST TO GET BETTER LIGHT!!! Screw the 6800's!!!

I have an AGP 6800GT and I use HDR(7) when I play FarCry 1.33.
I prefer the added realism HDR gives over AA.

It's an option. I and many others are welcoming it, as it will provide a testbed for games to develop on for future use of better physics.

It is a "light" option giving only "pseudo"-physics, compared to the physics acceleration AGEIA's solution provides.

Terra - That's all I am saying...read beyond the marketing-PR...
 
northrop said:
I haven't read everything, and I'm not going to pretend I understand all of it, but here's my question.

Will this Havok FX work with nVidia cards only? If I buy any ATI card now or in the future, and try to play a game that features Havok FX (say, "Hellgate: London"), will I be screwed?

IIRC any SM3-capable card will work with Havok FX.
 
I think a problem that hasn't yet been addressed in this discussion is that effects physics can create a lot more particles on screen, thereby creating much more work for the graphics card. Doing the physics on the graphics card is like the hardware shooting itself in the foot: the card creates more work for itself while at the same time reducing its ability to cope with the added workload.
 