Ageia's PhysX Needs a Killer App @ [H] Enthusiast

i just read this in an interview with gabe newell. not sure if you guys have seen it before or not.

GI: I think you guys have one of the best physics setups in gaming, and with things like Ageia and what Havok is doing with Nvidia, how will you implement that in Half-Life?

Newell: There are a couple of different issues going on there. With our physics, the player can collide with objects. This is a good thing. The stuff that is being shown by NVidia right now, the latencies are too high. Their physics is essentially to make prettier pictures. It’s like you can have a bunch of different things bouncing around so long as they don’t actually touch anything that matters. If you don’t actually have to read the data out – if your AI system ever needed to know about whether or not one of those objects had collided with something else it would run slower by running on the GPU than having it run on the main CPU. So physics that matter is different than physics that makes pretty pictures.

The presentation quality is fine. I think we went through a round of that with graphics acceleration. It wasn't until it enabled different types of games that it mattered: games that involve the game logic being able to ascertain the state of the physics system. I think that's the really important long-term direction for everybody. So having pretty explosions is cool, but you stop seeing it pretty quickly. You only see the stuff that actually matters. There's still a lot of work to figure out how to get an order of magnitude or two orders of magnitude improvement in that kind of physics. That'll be a big challenge for everybody.


full article here.
 
I think the biggest problem is that no developer for a mainstream game title is going to limit their audience by selling a game which REQUIRES a PPU to play.

As Gabe points out in the linked article, making the physics part of the game logic is important, and you cannot turn that off. If you add game logic that relies on physics, you cannot turn it off for people who do not own a PPU. Well, maybe you can, but it would require practically building two games, and you can forget about multiplayer.

All that really leaves is graphical effects which do not affect gameplay, which can then be turned off. Of course this goes completely against what Kyle said in the article: we don't want just new effects, and our video cards are already good enough at displaying pretty effects, which for the most part don't require any physics calculations at all.

The biggest advancement in games recently, for me personally, was definitely the physics. HL2 did it very well by making the world in which you play that much more interactive, but even just the way ragdolls fall more accurately in early physics-enabled games such as Max Payne 2 had me loving every minute of it.

I fear that Nvidia's (and now ATI's) approach is going to be far more widely accepted, provided the overall performance is well balanced.

*edit*

Another thing I forgot to mention is that online play with physics requires far more bandwidth and work on the server's CPU. I remember not being able to play HL2 deathmatch on my 256k broadband connection without lagging out, while all other games could be played online just fine with that amount of bandwidth. It was clear that the contribution to the bandwidth from syncing the physics-enabled objects was quite large.
 
ScotteusMaximus said:
i just read this in an interview with gabe newell. not sure if you guys have seen it before or not.




full article here.

They're shooting themselves in the foot. AFAIK, HavokFX is simply an extension of the existing Havok engine. Even if HavokFX only does it on the visual side, the gameplay physics part is still done by regular Havok, which is exactly what they're using. Saying that theirs is better is kinda stupid.
 
Sly said:
They're shooting themselves in the foot. AFAIK, HavokFX is simply an extension of the existing Havok engine. Even if HavokFX only does it on the visual side, the gameplay physics part is still done by regular Havok, which is exactly what they're using. Saying that theirs is better is kinda stupid.

As Gabe says, their problem is that offloading that work onto a PPU or the video card isn't a great deal of help: that information still has to be sent back to system RAM and the CPU to have calculations done on it so the AI can interact with it, so other dynamic objects can also interact with it, and so game logic can rely on it. Doing so just creates a large latency between the two.

Having a landslide off in the distance, with thousands of rocks rolling down the side of a hill, is fine for a PPU card. But as soon as you add AI monsters which need to attempt to dodge them depending on which way they're going, or make other decisions based on how many rocks are falling, where they are, and at what speed, then all that data has to be processed by the CPU anyhow.

Without this, the falling rocks just become a graphical effect you could do away with if you so wished; it would not affect the game logic.
 
Frosteh said:
As Gabe says, their problem is that offloading that work onto a PPU or the video card isn't a great deal of help: that information still has to be sent back to system RAM and the CPU to have calculations done on it so the AI can interact with it, so other dynamic objects can also interact with it, and so game logic can rely on it. Doing so just creates a large latency between the two.

Having a landslide off in the distance, with thousands of rocks rolling down the side of a hill, is fine for a PPU card. But as soon as you add AI monsters which need to attempt to dodge them depending on which way they're going, or make other decisions based on how many rocks are falling, where they are, and at what speed, then all that data has to be processed by the CPU anyhow.

Without this, the falling rocks just become a graphical effect you could do away with if you so wished; it would not affect the game logic.

Still. How can plain Havok be better than Havok+FX? Coz that's basically what he's saying.
 
This technology is all fluff presently. Look at that CellFactor demo... some guys flying around exploding boxes? It's pretty, but I didn't really see how the flying boxes impacted the game, other than destroying cover. I'm still waiting for a character in a game to be able to pull out a chair for someone to sit down, or grab a book off a shelf and whack someone over the head, without using a "gravity gun". It's just a new concept waiting for an application. I hope it can be used fully in the future because it has some really intriguing possibilities. HL2 definitely provided a great starting point.
 
freddiepm61 said:
The thing with the PPU is that it advertises itself as changing the whole way that we play games, which is very cool... i.e. everything moves with everything else and it all feels more realistic. I mean hell, at the start of Half-Life 2 I spent about half an hour just throwing boxes and stuff at other boxes and marvelling at how realistic it all looked. But this physics had an impact on the way that you played the game.

Now, the Ageia PPU cannot do this; it can merely add eye candy, because no games developer will ever change the way the whole gameplay works banking on people having this PPU.... It shall remain, for a while at least, just that... eye candy, or a way for people to up other settings in the game, as the PPU will be taking pressure off the CPU....

This was basically going to be my point... For current games, especially online games, how can the PhysX card add anything but eye candy? AGEIA lists City of Villains, an online-only game, as one of the supported games. How can one participant in an online game have advanced environmental effects and another not? I'm not even sure added eye candy is desirable in a game like this. Do you really want one player seeing a different explosion, having their vision more or less obscured than another? Are we going to see PhysX-only servers? For that matter, wouldn't the majority of the physics calculations for an online game be done on the server? Trajectory, debris dispersal, collision control... these all seem like things that need to be tracked by the server, don't they? Does that mean dedicated game servers will need to have PhysX cards too?

It is possible that in some of these “supported games” all the PhysX card will do is take the Physics load off of the CPU.

That said, I agree with a previous poster that these days I really only get into 3-4 games a year, and if even one of those is significantly improved by having a PhysX card then I'll probably get one.... Looking forward to Rise of Legends.... What is PhysX going to do for that game? I have no clue.
 
Framerate said:
I would like to know if these cards share onboard memory or use their own memory.

They have their own memory, just like the frame buffer on a video card. I believe right now they come in 128MB and 256MB models.

kaleb_zero said:
I'd buy one - probably the lower end model, if only to try it out and give ageia (and the game developers shouldering the costs/risk of implementing this new technology) my support.

That's the spirit! Without people like you and me they may not get the capital they need for further research to make things really better; such was the rise of AMD.
 
If these new physics cards are going to gain traction quickly enough to survive against Red and Green, they need to get past the chicken-and-egg problem of physics card market penetration versus developer dependency. They need to go after the modding community with a contest. All the physics card folks need a few big games that will use the physics cards for tagged calculation if the cards are available. Offer "free" goodies/cash for the best mod to a game that improves gameplay but nearly requires a physics card.
An easy contest mod to envision would be a persistent glue gun. Say anything that touches the fresh glue for five seconds will bond with it. Also say that the bond can be broken with a certain amount of force. Finally, say the bond won't disappear for a long time, if ever, if you don't break it first. Now let the FPS crowd use the gun. You'll get a build-up of bonds in a level that the physics cards can handle but that will likely overwhelm any system without one. Release the winning mod for free and the community will dedicate gaming servers to it. If the mod is great, people will then buy the cards to go play it for free on top of the game they already own.
And if you think a mod like a glue gun won't seriously change the gameplay, go ask someone who remembers Quake II what any of the grappling hook mods did to gameplay.
-Zen
 
LethalZen said:
....An easy contest mod to envision would be a persistent glue gun. Say anything that touches the fresh glue for five seconds will bond with it. Also say that the bond can be broken with a certain amount of force. Finally, say the bond won't disappear for a long time, if ever, if you don't break it first. Now let the FPS crowd use the gun. You'll get a build-up of bonds in a level that the physics cards can handle but that will likely overwhelm any system without one. Release the winning mod for free and the community will dedicate gaming servers to it. If the mod is great, people will then buy the cards to go play it for free on top of the game they already own.
And if you think a mod like a glue gun won't seriously change the gameplay, go ask someone who remembers Quake II what any of the grappling hook mods did to gameplay.
-Zen

This is something I've been trying to work out.... I'm no programmer and don't really know the answer, but wouldn't calculations like this need to be done on the game's SERVER not the clients? Wouldn't it be the SERVER that needs the PhysX card?
 
When 3D hardware accelerators first came out, how many years did it take before it actually became mainstream? When CounterStrike came out, hardware and software acceleration were still roughly 50/50. And even when the hardware started dominating, there was still zero gameplay difference between the two. At no time were the two significantly alienated. The transition was natural and painless.

Do you really think the PPU will be able to do the same? By simply having the PPU affect gameplay, you are automatically alienating the ones that don't have the PPU by giving them a different set of rules.

Saying, "We just need more people to buy the card" isn't much of a reason. $250.00 isn't something a regular gamer would pay for a card that's basically a paperweight at least 95% of the time.
 
jroyv said:
This is something I've been trying to work out.... I'm no programmer and don't really know the answer, but wouldn't calculations like this need to be done on the game's SERVER not the clients? Wouldn't it be the SERVER that needs the PhysX card?

Yup, and the bandwidth needed to send the physics data is gonna be a big headache too.

But his point regarding a free popular mod worked for CounterStrike. Before then the market was becoming saturated with hardware accelerators, but it wasn't enough to actually win people over; most gaming PCs still ran in software. When CounterStrike came out, it was the killer app that made people (and especially cyber cafes) buy TNT2s and finally tipped the scale for 3D cards.

Who knows, by the time the killer app for the PPU comes out, a lot of the bandwidth and gameplay issues might already be worked out.
 
Sly said:
Saying, "We just need more people to buy the card" isn't much of a reason. $250.00 isn't something a regular gamer would pay for a card that's basically a paperweight at least 95% of the time.

Which is why they need to Port F@H to the PhysX card!

Sly said:
Yup, and the bandwidth needed to send the physics data is gonna be a big headache too.

[...]

Who knows, by the time the killer app for the PPU comes out, a lot of the bandwidth and gameplay issues might already be worked out.


Depends, there are two options:
  • server-side physics, in which case each 'fragment' of an object becomes an object and its velocity and position are transmitted to all clients. This requires a LOT of bandwidth.
  • client-side physics: the server sends the velocities and positions of all objects to the client (this is what I imagine CS:S does anyway). The client then computes any/all physics interactions between the objects. This requires that the server trust the client to perform this computation correctly (i.e. hacking may be easier). Let me elaborate: the clients would keep track of object/object interactions (i.e. player with fragment), and not the server, which is only interested in 'interactive' objects such as bullets and players.
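To put rough numbers on the server-side option, here's a back-of-envelope sketch. All the figures (state size per object, tick rate, player count) are my own guesses for illustration, not from any actual engine:

```python
# Rough per-server upstream cost of replicating physics state to every client.
# bytes_per_object assumes position (3 floats), velocity (3 floats), and a
# compressed orientation (3 floats) at 4 bytes each -- a guess; real engines
# quantize and delta-compress far more aggressively.

def sync_bandwidth_kbps(objects, tick_rate_hz, clients, bytes_per_object=36):
    bits_per_second = objects * bytes_per_object * 8 * tick_rate_hz * clients
    return bits_per_second / 1000.0  # kilobits per second

# 300 debris fragments, 20 ticks per second, 16 players:
print(sync_bandwidth_kbps(300, 20, 16))  # 27648.0 kbps of sustained upstream
```

Even allowing for heavy compression, that's orders of magnitude beyond a typical home connection today, which is why the client-side option (or capping the number of gameplay-relevant objects) looks more practical.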
 
Kyle said, "Make me part of 15 people playing CellFactor deathmatch online".
That's going to be a problem unless the server has a PhysX card. In order to make sure all of the clients see the exact same world state data, none of them can process their own physics for the world.
That is, unless it's a "non-interactive" special effect. However, if it's a mountain of crates falling down, and said individual crates block bullets and player movement, then that will have to be processed on the server and the world state data sent to each client with ZERO client-side prediction for physics movement. This is the reason they have it running on a LAN. This is the reality of a physics-infused future; we really need to get a lot more bandwidth into homes a lot faster than they are currently slinging fiber...
 
mashie said:
And which game should they mod that already support the PPU?
Soldner. That game had deformable terrain, trees that would fall over, and buildings you could blow all the way down to the ground floor.... That game really needed something like this hardware... besides proper marketing....
 
If I were Ageia, I'd offer a big-time developer a stake in the company in return for leveraging the resources to develop the "killer app", or perhaps a stake in the company to the winner of the proposed mod community contest. A stake in the company would be a real incentive.
 
Sly said:
When 3D hardware accelerators first came out, how many years did it take before it actually became mainstream? When CounterStrike came out, hardware and software acceleration were still roughly 50/50. And even when the hardware started dominating, there was still zero gameplay difference between the two. At no time were the two significantly alienated. The transition was natural and painless.

Do you really think the PPU will be able to do the same? By simply having the PPU affect gameplay, you are automatically alienating the ones that don't have the PPU by giving them a different set of rules.

Saying, "We just need more people to buy the card" isn't much of a reason. $250.00 isn't something a regular gamer would pay for a card that's basically a paperweight atleast 95% of the time.


You have hit on a really important point here. Everyone is trying to compare this to the dedicated graphics card situation, yet it's simply not the same. It's kind of a catch-22 for the physics cards, as a true killer app would REQUIRE a physics card, and without it you really wouldn't be able to play at all.

Like Kyle said, I'm not interested in spending $250 for a couple more nifty explosions. That would not be important enough for me to drop the cash. It has to truly affect the way the game is played.
 
My $.03 cents: (My opinion is worth the extra penny.)

The transition to a PPU is going to have many more problems than what the industry went through with accelerated GPUs. Rather than only changing how a game looks, a PPU will fundamentally alter the way the game reacts. Multiplayer games are going to be seriously fragmented between those that have one and those that don't. That being the case, I don't see multiplayer gaming with PPUs gaining any traction in the next three to four years.

The single-player experience is more controlled and is going to take off the fastest. However, designers are going to have to pick up the pace and really get creative with it. Stacking crates and knocking them down, or using a dead body to do a "tootsie roll" dance routine, is already getting old with our current state of game physics. I just know that some developer is sitting there with the toolset making stupid domino effects in their game that serve absolutely no purpose other than to "look cool", even though they're way out of place.

In fact, that's my greatest fear currently. That game design will take one step forward with the PPU and two steps back with level design. I see crates galore, with logs littered everywhere making OSHA inspectors cringe.

As for the Cellfactor demo... yeah, I can see it being fun. But I also see it as a bunch of people running around with a gravity gun, only on a larger scale. I'd be a bit more impressed if the crates shattered and a chunk of wood took out a person, or a bridge collapse took out a car. Right now it's just so... pedestrian. Not a killer app yet.
 
prtzlboy said:
You have hit on a really important point here. Everyone is trying to compare this to the dedicated graphics card situation, yet it's simply not the same. It's kind of a catch-22 for the physics cards, as a true killer app would REQUIRE a physics card, and without it you really wouldn't be able to play at all.


Except you miss the point about the video card manufacturers supporting physics acceleration: that way all games can use it and play the same; the ones with dedicated cards will be able to play at full settings, and those without will have to scale back graphics so the graphics card can do the physics for them.

I don't see any game having to have a dedicated card to use the physics engine... and that's what Ageia is selling: the engine licensing. The hardware is just a stimulant, if you haven't figured it out yet.
 
Rifkin said:
Soldner. That game had deformable terrain, trees that would fall over, and buildings you could blow all the way down to the ground floor.... That game really needed something like this hardware... besides proper marketing....

sounds interesting. if my future PPU can do anything for it (big if), i'll look into it. ;)

Low Roller said:
Possibly, but the same holds true for Havok FX. Devs using Havok FX have to pay for a Havok FX license and invest in developing content that can make use of that feature. Sure, with Ageia's PhysX devs still have to develop content that to some extent can take advantage of the PPU. The PhysX/Novodex license, however, is free.

Anyone know how much a Havok FX license costs?

PhysX license = $0

I agree Red and Green still hold an advantage over a startup like Ageia. Don't forget, though, Ageia has only been around a year. They've made a ton of progress in that short amount of time. When they announced the whole PPU idea last year at GDC they had nothing more than an idea. Now they have 60 devs and over 100 games supporting their card to at least some extent, and their development tools have been tied into both next-gen consoles, especially the PS3.

It's no surprise ATI/Nvidia would rather you buy a second GPU instead of a PPU.

H*ll no, I don't want to pay $700 for a GPU with a PPU that I'd have to get rid of in 8 months. Putting the PPU on the GPU is a stupid idea, and only people who don't realize that the GPU cost would go up proportionally suggest it.

----------------

And where the heck did you get a PPU? ASUS's site says they release in May, and I can't find any info on where it's even possible to buy one by itself. Help, please.
 
BBA said:
Except you miss the point about the video card manufacturers supporting physics acceleration: that way all games can use it and play the same; the ones with dedicated cards will be able to play at full settings, and those without will have to scale back graphics so the graphics card can do the physics for them.

I don't see any game having to have a dedicated card to use the physics engine... and that's what Ageia is selling: the engine licensing. The hardware is just a stimulant, if you haven't figured it out yet.


Um, you're missing the whole point of the argument. My point is that the type of physics acceleration that doesn't REQUIRE a physics card is just more eye candy, and for me not worth the extra dough. More flying crates with no impact on gameplay. I want physics that affects the way the game is PLAYED, not the way it looks, and this is the type of thing you would need the hardware for, which is why it is a catch-22. It would be hard for someone to justify developing a game that would REQUIRE a $250 add-in card.
 
Personally I think all this talk of PPUs and GPUs doing physics is the wrong direction to be going. Now that CPUs are moving in the direction of multiple cores (dual and soon quad cores), developers could be using the other core(s) for physics work instead of needlessly spending money on new hardware or siphoning off GPU cycles. Multicore CPUs already have a large and growing installed user base wondering when these extra cores are going to be used anyway. If Intel or AMD were smart they could add some extensions to their CPUs, similar to SSE, and put Ageia out of business before they even get off the ground. A multicore Intel Extreme Edition or AMD FX that can do physics work would be the way to go, since you have to have a CPU regardless.
 
bildad said:
Personally I think all this talk of PPUs and GPUs doing physics is the wrong direction to be going. Now that CPUs are moving in the direction of multiple cores (dual and soon quad cores), developers could be using the other core(s) for physics work instead of needlessly spending money on new hardware or siphoning off GPU cycles. Multicore CPUs already have a large and growing installed user base wondering when these extra cores are going to be used anyway. If Intel or AMD were smart they could add some extensions to their CPUs, similar to SSE, and put Ageia out of business before they even get off the ground. A multicore Intel Extreme Edition or AMD FX that can do physics work would be the way to go, since you have to have a CPU regardless.

A dual core CPU is still way way slower than a dedicated chip.

For example, one CPU core can do about 10 GFLOPS of data processing, so a dual core would do 20 GFLOPS. A dedicated processor like a GPU or PPU can do about 350 GFLOPS. You'd need a 35-core CPU just to match the raw computational power.

/Edit - Just looked up the ATI documentation for physics acceleration: a dedicated GPU like ATI's X1K can do 375 GFLOPS of data processing. In CrossFire mode that would be 750 GFLOPS. So a CPU would need about 37 cores to match a single GPU and 75 cores to match a dual-GPU solution.
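Just to make the arithmetic behind those core counts explicit (a sketch using the rough vendor-quoted peak figures above, which are marketing rates rather than sustained throughput):

```python
# How many ~10 GFLOPS CPU cores it would take to match one dedicated chip,
# using the peak figures quoted above.

def cpu_cores_needed(dedicated_gflops, per_core_gflops=10.0):
    return dedicated_gflops / per_core_gflops

print(cpu_cores_needed(350))  # 35.0 -- generic GPU/PPU figure
print(cpu_cores_needed(375))  # 37.5 -- single X1K-class GPU
print(cpu_cores_needed(750))  # 75.0 -- CrossFire pair
```

Of course peak FLOPS isn't the whole story; for gameplay physics what matters is whether results can be read back to the CPU fast enough, which is Newell's latency point earlier in the thread.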
 
Here's what troubles me about CellFactor.

Okay, I see all the pretty pretties and stuff like that. But in the screen there it's obviously a deathmatch.

Most deathmatches work with current bandwidth technology because if you play, say, Battlefield 2, every player plays with their own client handling the background, the graphics, etc... including the physics.

Which means that even in a 64 player massive battle, all you download are the positions of 64 players, current weapons fire, and various objects (like grenades and vehicles) that move in the course of the game.

If there's 300 or so flying boxes, shrapnel, pipes, all going on at once, all of a sudden, you're tracking 300 different objects -- or at least giving 300 "initial impact" data points and letting the physics chip do the work. I think that will be laggy as hell on current broadband technology.
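A quick sketch of the gap between those two approaches: streaming full state for 300 objects every tick versus sending 300 one-shot "initial impact" data points and letting each client simulate the rest deterministically. The message sizes here are invented for illustration:

```python
EVENT_BYTES = 24   # one "initial impact" message: object id, impulse, RNG seed (guessed size)
STATE_BYTES = 36   # full per-object state per tick: position + velocity + orientation (guessed)
TICK_RATE = 20     # server updates per second (typical for the era, assumed)

def one_shot_kb(objects):
    # Send only the triggering events once; clients simulate deterministically.
    return objects * EVENT_BYTES / 1000.0

def continuous_kbps(objects):
    # Stream every object's state on every tick.
    return objects * STATE_BYTES * 8 * TICK_RATE / 1000.0

print(one_shot_kb(300))      # 7.2 KB total, sent once
print(continuous_kbps(300))  # 1728.0 kbps, sustained per client
```

The catch with the one-shot approach is that every client's physics must be bit-for-bit deterministic, and the debris still can't affect hit detection unless the server runs the same simulation.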
 
Brian Boyko said:
Here's what troubles me about CellFactor.

Okay, I see all the pretty pretties and stuff like that. But in the screen there it's obviously a deathmatch.

Most deathmatches work with current bandwidth technology because if you play, say, Battlefield 2, every player plays with their own client handling the background, the graphics, etc... including the physics.

Which means that even in a 64 player massive battle, all you download are the positions of 64 players, current weapons fire, and various objects (like grenades and vehicles) that move in the course of the game.

If there's 300 or so flying boxes, shrapnel, pipes, all going on at once, all of a sudden, you're tracking 300 different objects -- or at least giving 300 "initial impact" data points and letting the physics chip do the work. I think that will be laggy as hell on current broadband technology.

After reading through this thread, I would have to say that this is the main concern about physics so far: how it will be implemented in multiplayer. In the Cell Factor game they were showing, I am guessing it can only be played on LAN because the server is sending all the locations, velocities, positions, etc. of the boxes and whatnot to the clients, and that just eats up the bandwidth. In my mind this works out to only the server needing the physics processor, and the clients just better make sure they have some darn good connection speeds.
 
kcthebrewer said:
I think Havok has too much of a developer base for Ageia to take off, especially at that price. $200 is too much for not enough. I could see $300 for a sound card (X-Fi Fat), but that is something that affects every game you play, not just specific ones (even though specific games benefit more). Same with a video card. If the Ageia card did Havok and PhysX this would all be a different story, but it doesn't.

They are shooting for too high a price point without any userbase. There is little to no incentive to add this additional cost to a system build as of now.

What did we all do when the 3DFX Monster II came out? We bought it. I paid $200 for mine, if I remember correctly. I was a poor student when that product came out. I am definitely going to be purchasing one when UT2007 comes out.

This is just like Glide vs. DirectX vs. OpenGL all over again.
 
I already thought that the PhysX had slim chances, but after reading through this thread, it's obvious.

1. Cost. $250? Maybe $50-100 max, if it adds real value (see point 3 below).

2. Competition. Once the big boys find ways to give it to you easier and cheaper, (which they just did), game over. Side-note: It doesn't matter what developers have to pay for the Havok FX license, game prices are already set by market dynamics, not the developer's cost for individual features. The market has settled at $50 for a top-tier game at its debut, what the developer has to pay for a tech license isn't going to change this price. Actually, it's the other way around--what the market will bear determines what Havok can charge the developer.

3. Relevance. Gabe Newell's point clinches it. If you can't get the toys into the AI and the gameplay, it just doesn't matter. Would you pay $250 for HDR or angle-independent AF?
 
BBA said:
He bought it for $299, just like anyone can. It's an Asus card ageia has made available for the last three weeks.

I got the email to buy it myself.

I am combing ASUS's website and I see nothing about it for sale. Can you point to where it can be purchased?
 
I've already voiced my concerns over these cards breaking up MP communities, taking a game where everyone plays one version and splitting it into small groups playing two or more versions.

This also sucks for poor people like myself. I am still in college, struggling to use what little money I make to get by. That said, I don't have $250 lying around to slap down on a card that makes a wall crumble or a leaf wave in the wind. Even if I did have the money I'd still be hard pressed to actually buy this thing. I have a 17" LCD so I'm limited to 1280x1024, so I don't need to go out and buy an X1900 XTX or 7900 GTX; it's a waste. The last time I bought the super high end was the Ti4600; since then I usually buy a cycle late or buy a midrange card. I don't like the thought of this card costing more than my video card.

On top of this I read somewhere, I don't remember the source, that Ageia plans a 6-month product cycle. Are we going to have to upgrade constantly so we can see 500 pieces of debris instead of 300? The whole thing confuses me and actually pisses me off. Imagine if BF2 had this: that would mean 3 games and two variations. In total that would be 6 different options for players to choose from; the community would be broken in half, if not more. This doesn't just apply to BF2; this is every other game with expansions and massive online communities.
 
GotNoRice said:
There will be a PCI Express version in the future; however, current cards won't be held back by the regular PCI bus. The PPU doesn't have to deal with stuff like textures, which are a huge part of why video cards have such high bandwidth demands.
Granted, but my x1 slot is the only one available, so count me out until then or until my new mobo.
 
Major_A said:
I have a 17" LCD so I'm limited to 1280x1024, so I don't need to go out and buy an X1900 XTX or 7900 GTX; it's a waste.
Not in games like F.E.A.R. or Oblivion (or any upcoming 3D game for that matter).

Major_A said:
The whole thing confuses me and actually pisses me off.
Someone needs a time-out!
 
Brent_Justice said:
A dual core CPU is still way way slower than a dedicated chip.

For example, one CPU core can do about 10 GFLOPS of data processing, so a dual core would do 20 GFLOPS. A dedicated processor like a GPU or PPU can do about 350 GFLOPS. You'd need a 35-core CPU just to match the raw computational power.

/Edit - Just looked up the ATI documentation for physics acceleration: a dedicated GPU like ATI's X1K can do 375 GFLOPS of data processing. In CrossFire mode that would be 750 GFLOPS. So a CPU would need about 37 cores to match a single GPU and 75 cores to match a dual-GPU solution.

That may be so, but do we really need 350 GFLOPS of physics processing power? I would hazard to say no, not even close. Developers of late have been able to produce some pretty impressive games using just a single CPU to do physics, AI, sound, etc., not to mention all the little background things like keeping the OS running. A CPU core dedicated to nothing but physics would probably be more than enough to do what is needed.
 
bildad said:
That may be so, but do we really need 350 GFLOPS of physics processing power? I would hazard to say no, not even close. Developers as of late have been able to produce some pretty impressive games using just a single CPU to do physics, AI, sound, etc., not to mention all the little background things like keeping the OS running. A CPU core dedicated to nothing but physics would probably be more than enough to do what is needed.

If we just settled for "what is needed" we wouldn't have the games we have today.

Look what dedicated chips did for 3D graphics, all that power allowed some very impressive graphics in games. If we had that kind of power for physics that would allow very impressive physics in games.

Physics is right now where hardware accelerated 3D was back in the Voodoo 1 days.
 
Brent_Justice said:
If we just settled for "what is needed" we wouldn't have the games we have today.

Look what dedicated chips did for 3D graphics, all that power allowed some very impressive graphics in games. If we had that kind of power for physics that would allow very impressive physics in games.

Physics is right now where hardware accelerated 3D was back in the Voodoo 1 days.

The way I view the issue isn't exactly the same though. Physics isn't something you can see and interact with all on its own; it needs another interface, which on a computer means the graphics card. Something has to display all the results of this extra physics, and considering we can't seem to get past sprites for things like grass and leaves, does it really matter if a PPU can create the effect of 50,000 blades of grass waving in the breeze, or thousands of bits of 'stuff' flying in every direction when a bullet strikes a wall, if our graphics cards can't show us the results?
 
But making games to take advantage of all that graphics processing power is easier than with physics. Hence the name of the thread.
 
Commander Suzdal said:
1. Cost. $250? Maybe $50-100 max, if it adds real value (see point 3 below).

The PhysX processor is a 125 million transistor chip (compared to, for example, 114 million on the FX-57). It will also have 128 or 256 MB of GDDR3. $50-$100 max? heh
 
cyks said:
Not in games like F.E.A.R. or Oblivion (or any upcoming 3D game for that matter).


Someone needs a time-out!
Why twist my words? Nothing better to do?

As for F.E.A.R. and Oblivion: I beat F.E.A.R. on my 6800nu and am currently playing Oblivion. I don't need 4XAA and 16XAF with max settings.
 
rjblanke said:
But making games to take advantage of all that graphics processing power is easier than with physics. Hence the name of the thread.

That may be so, but currently none of the power of our graphics cards is going toward physics; it's all being done by the CPU. A PPU will free up cycles on the CPU but not the GPU. Adding in a PPU isn't going to make your graphics card any faster or able to display more things. The CPU might be able to feed more stuff to the graphics card in CPU-limited games, but in GPU-limited games you're out of luck.
 
A video card or sound card affects nearly everything your computer does, from playing games to watching DVDs. A physics card won't. The cost of a PPU vs. the cost of these other items is something that must be balanced carefully -- especially when Joe Consumer can grab a full computer system from a major retailer for $600-$800 and run almost anything out there (maybe not well, but most stuff will run).

As I read through the last 4 pages I was thinking about multi-core CPUs. I have seen news blurbs about Intel launching 4-core CPUs in the not-so-distant future, and dual-core systems are really starting to grab market share. Core Duo in laptops, the FX-60 and other offerings from AMD and Intel; multi-core is really hitting the mainstream. Could a game be written to use one of these cores or a PPU? Then the system requirements might read "multi-core CPU or a PPU." Framerates may suffer a bit, and detail settings and resolutions may have to be dialed down, but it wouldn't be an all-or-nothing proposition. For online play, folks with only the multi-core CPU instead of the PPU could be restricted to smaller servers.
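The "spare core as a pseudo-PPU" idea can be sketched in a few lines. This is a hypothetical toy (the worker function, the tiny integration step, and all numbers are invented for illustration), not how any real engine or the PhysX SDK works: the physics step runs on its own worker thread while the main loop hands off state and reads the stepped result back each frame.

```python
# Toy sketch: run the physics step on a dedicated worker thread,
# so a second CPU core can stand in for a PPU at lower fidelity.
# The "physics" here is just position += velocity, for illustration only.
import queue
import threading

def physics_worker(jobs: queue.Queue, results: queue.Queue) -> None:
    while True:
        state = jobs.get()
        if state is None:        # shutdown signal from the main loop
            break
        # Stand-in for a real integration step: advance each object.
        results.put([(x + vx, vx) for (x, vx) in state])

jobs, results = queue.Queue(), queue.Queue()
worker = threading.Thread(target=physics_worker, args=(jobs, results))
worker.start()

state = [(0.0, 1.0), (5.0, -2.0)]  # (position, velocity) pairs
for _ in range(3):                 # main loop: hand off, then "render"
    jobs.put(state)
    state = results.get()          # read back the stepped state

jobs.put(None)
worker.join()
print(state)                       # each position advanced by 3 * velocity
```

The catch, as Gabe points out in the quoted interview, is that reading the results back is exactly the part that has to stay fast if the physics is game logic rather than eye candy.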

Near the beginning of the thread the PS3 was mentioned. If the PS3 is using PhysX-type software, porting games back and forth may give the market more drive to adopt PhysX as well. It might save companies $$ when coding ported titles.


Who knows, maybe in 5 years we'll be looking back on this and laughing as the new mobo we unwrap has an onboard PPU standard. :p

Of course with my luck, Microsoft will require the use of PhysX in Vista so they can make all the pretty effects of window scaling and movement they talked about -- rendering all my current hardware obsolete :eek:
 