Ageia's PhysX Needs a Killer App @ [H] Enthusiast

Sharps97 said:
I love my Verizon FIOS.

You, sir, suck.
 
Spazilton said:
Dell is offering them with their systems now.


Not yet they aren't, but 250 Ageia cards were recently sold to "developers", i.e. anyone who could plausibly claim they'd use one to develop games and content.
 
nobody_here said:
Not yet they aren't, ...

Dell is indeed offering the PPU :confused:
http://configure.us.dell.com/dellstore/config.aspx?c=us&cs=19&l=en&oc=DXPS600G1&s=dhs

"Physics Accelerator
The AGEIA ® PhysX ® physics accelerator is a radical breakthrough in the way you can experience the virtual environment. Experience real-time physically-based environments, action and game play that would be impossible without dedicated hardware acceleration. Note: Games must be designed to take advantage of the PhysX accelerator.

(x) None

(o) AGEIA® PhysX® Physics Accelerator [add $249 or $8/month3]

Are you ready for the future of gaming?"

Terra...
 
I feel like physics implementation has been a key missing piece in games for a while. It certainly isn't necessary for enjoying a game, but the added realism and (more importantly) the added gameplay options could bring so much to a game, whatever the genre.

When I watched the Cell Factor video for the first time, I didn't really take to the idea. But on a second viewing I noticed how much the physics changed the style of play, and I figured out (to an extent) what was going on. I definitely think the game has real potential. It seemed reminiscent of the chaotic DM play in the Quake games (especially Q3), which, done right, can definitely be enjoyable.

Sadly, I think many of the first games to sport the Ageia PhysX tech will be gimmicky, there just to make a statement, but even then I think I could enjoy the idea. I'm all for this card and I hope it succeeds. Can't wait to see what comes out in the future.
 
Do you all remember what happened with the release of Quake 2? If you wanted it to run well, you needed a 3D graphics card.

I once made the mistake of buying hardware (a GF4) in anticipation of a software release (UT2K3); six months later, when the game came out, there was faster hardware too. DOH!

It really is simple to see why casual gamers will wait for a killer app. As in real life, we first need to think of new ideas (software) before we put them into practice (hardware).

What form could this take? Multiplayer would require lots of people to front the cash for a card, so that's unlikely. That leaves singleplayer, and I don't believe we'll be seeing a major company constructing a game from the ground up around physics. The mod community? Not talented enough (flamebait, I know, but who is going to risk learning the API for obscure hardware? Only boring geeks who couldn't program a fun game).
That leaves John Carmack, but I think he's preoccupied with becoming an astronaut or something.

Ageia is premature; it will be bought by ATI or nVidia, or the two together will phase it out.
 
I'm not sure if anyone's said this already, but there is one thing the PhysX can help with RIGHT NOW.

Servers.

Most of you have played BF2 or HL2 or whatever and noticed how servers can get really laggy when a lot happens. I believe the PhysX could be used to offload the bullet and vehicle collisions from the CPU.
Let the CPU conduct the battle like a general and let the PhysX figure out the details.

The obvious upside is that there's a chance we could see games that cope with many more players than they do today. I've seen some servers really struggle on some maps with a mere 32 players.

I am certain that offloading the physics, even if only on the server, WILL create a more cohesive online experience.
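To make the "general and details" split concrete, here's a rough C++ sketch. To be clear, this is not Ageia's SDK (I've never seen their API); the worker thread is just a stand-in for wherever the physics runs, PPU or otherwise:

Code:
// Hypothetical sketch only -- none of this is Ageia's actual interface.
// The point is the architecture: the server tick stays the "general"
// and a dedicated worker (PPU stand-in) resolves the collision details.
#include <condition_variable>
#include <iostream>
#include <mutex>
#include <queue>
#include <thread>

struct CollisionJob { int entityA, entityB; };  // e.g. bullet vs. vehicle

std::queue<CollisionJob> jobs;
std::mutex m;
std::condition_variable cv;
bool shuttingDown = false;

// Stand-in for the PPU: with real hardware this would be a job submit,
// here it's a worker thread taking the load off the main core.
void physicsWorker() {
    for (;;) {
        std::unique_lock<std::mutex> lock(m);
        cv.wait(lock, [] { return !jobs.empty() || shuttingDown; });
        if (jobs.empty()) return;  // shutting down and queue drained
        CollisionJob job = jobs.front();
        jobs.pop();
        lock.unlock();
        // ...narrow-phase collision + impulse resolution would go here...
        std::cout << "resolved " << job.entityA << " vs " << job.entityB << '\n';
    }
}

int main() {
    std::thread ppu(physicsWorker);
    // Server tick: gameplay logic queues up physics work and moves on.
    for (int tick = 0; tick < 3; ++tick) {
        { std::lock_guard<std::mutex> lock(m); jobs.push({tick, tick + 1}); }
        cv.notify_one();
    }
    { std::lock_guard<std::mutex> lock(m); shuttingDown = true; }
    cv.notify_one();
    ppu.join();
}

The point of the design is that the server tick never blocks on collision math: whether the consumer is a thread or a PPU, the job queue is the interface.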
 
I don't know if this was posted before, but GDHardware has a counter-article to what HardOCP has.

Edit: never mind, it was posted on the HardOCP front page a while ago... I'd better watch the front page more often.
 
What exactly makes their conclusion the opposite of [H]'s? They seem in agreement to me.
 
Basically, their stance is that Ageia only needs one "killer app" if it is the right one. IIRC, [H]'s article suggests a wealth of them creating a critical mass.

 
Lord of Shadows said:
In ageia's little demo engine they had liquid modeling, I'll try to find the demo. Other things like cloth simulation etc are all possible.

The metal bending was nice, but for this to really take off we need a general interface. Right now the application has to explicitly order the PhysX engine to bend the metal; we need something like DirectX for physics, a "Direct Z" v1.0. It could work.

But for sure, physics cores powerful enough to really get some calculations done, fed over PCI-E 16x bandwidth, would be sweet.
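Back to the "Direct Z" idea: something along these lines is what I mean. Every name below is invented ("Direct Z" doesn't exist), but it shows the shape: one neutral interface that games code against, with Ageia / Havok / pure-software back ends behind it, the same way Direct3D sits over different GPUs.

Code:
// Invented sketch of what a vendor-neutral "Direct Z" could look like.
#include <iostream>
#include <memory>

struct Vec3 { float x, y, z; };

// The neutral interface a game would code against.
class IPhysicsDevice {
public:
    virtual ~IPhysicsDevice() = default;
    virtual int  createRigidBody(Vec3 pos, float mass) = 0;
    virtual void applyForce(int body, Vec3 force) = 0;
    virtual void simulate(float dt) = 0;  // step the whole scene
};

// Fallback back end. An Ageia or Havok back end would implement the
// same interface, and the game wouldn't know or care which one it got.
class SoftwareDevice : public IPhysicsDevice {
public:
    int  createRigidBody(Vec3, float) override { return nextId++; }
    void applyForce(int, Vec3) override { /* integrate on the CPU */ }
    void simulate(float dt) override {
        std::cout << "software step of " << dt << "s\n";
    }
private:
    int nextId = 0;
};

std::unique_ptr<IPhysicsDevice> createDevice(bool ppuPresent) {
    // if (ppuPresent) return std::make_unique<AgeiaDevice>(); // hypothetical
    (void)ppuPresent;
    return std::make_unique<SoftwareDevice>();
}

int main() {
    auto physics = createDevice(false);
    int crate = physics->createRigidBody({0.0f, 10.0f, 0.0f}, 5.0f);
    physics->applyForce(crate, {0.0f, -9.8f, 0.0f});
    physics->simulate(1.0f / 60.0f);
}

That way the bend-the-metal call gets written once against the interface, and whatever hardware is in the box picks it up.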
 
Russ said:
What exactly makes their conclusion the opposite of [H]'s? They seem in agreement to me.
The [H] article basically said that gamers are going to need a few good reasons to buy the PhysX chip. The other one basically compared Ageia to 3DFX. The difference is that [H] argues that for PhysX to be successful there will need to be several awesome games on the market that use the PPU to genuinely benefit gameplay (i.e. adding more to the game than extra debris from explosions), while the other article argues that everyone is going to blindly run out and buy a PPU even if support for it is minimal.

However, what they fail to address is that the level of support for the PPU, and how it is implemented, is the key. If it does nothing more than make explosions look more Hollywood-esque and add a bit more debris to them, it's going to be hard to convince people to buy the card. On the other hand, if the card actually adds something significant to gameplay, people will be more eager to buy it.

If given the choice between killer graphics and killer physics, I think everyone will choose graphics. Without graphics, physics are nothing. Without physics, graphics are... the same. This is where the comparison between Ageia and 3DFX falls apart. Powerful 3D cards enabled a lot of new types of games to be developed. Will a powerful physics card do the same? Yes, but on a much more limited scale.

I think the number of real innovations that can come out of the PPU is fairly limited. It will generate some niche games that some people will enjoy playing, but I don't think games designed around physics as the main attraction will do very well. Sure, the Cell Factor demo looks somewhat interesting... but how many games of "throw the boxes around" can people really handle? Eventually it will get old and people will want more out of games than the ability to throw boxes around.

Another thing to ask is: just how much interaction is really necessary? Is the ability to knock everything over that big of a deal? Would adding better barrel physics to Counter-Strike: Source make it a better game? No, the game does not revolve around it. However, adding the ability to interact with objects in a strategic way may make future games better. Is that something that can't be done with today's technology?

Some people want the ability to fully destroy the environment around them. While it sounds like a great idea on an internet forum, I feel that in practice this would create an unenjoyable gaming experience. Imagine if, in a game like BF2, all the cover was destroyed and every map ended up a barren wasteland with nowhere to hide, where you'd get dominated by air power the second you spawned. Is that fun? No.

At the end of the day, physics will make stuff fall down better. Physics is not the most exciting thing in the world... actually, it's pretty damn boring. The real world is nothing like the movies or some games: stuff does not tumble and bounce around dramatically, it falls to the floor and slides a short distance. Physics will add a little to the immersion factor of a game, if implemented somewhat realistically, but in many cases it won't make the game.

Now, I'm not saying the card is completely useless. I'm just saying I don't think it's going to cause a massive revolution in game design. I fear that for a time developers might pursue shoving raw physics down our throats instead of making games that are actually good.
 
I think the difference here is that there was a stunning difference between software rendering and Glide. I remember showing it to my parents back in 1997 and even they were "wow'd" - and they were not gamers by any means. The difference was absolutely mind-blowing and immediately recognizable.

However, looking at the Cell Factor demo, a non-gamer/enthusiast would be hard pressed to see the difference. Unless you are familiar with technology and the limits of even today's high-end hardware, a novice simply won't "get it". These are people who are lucky to have a 6600 in the OEM system they bought from Best Buy. For all they know, the difference may be down to your $1200 SLI/Crossfire setup and super-expensive CPU.

In other words, Glide needed no explanation; PhysX will.

I, however, am very excited about the possibilities and look forward to fully realized PhysX-enabled games - or even physics acceleration from NVIDIA/ATI; whatever's best, I want in on it.
 
Forgive the naive consumer question, but why isn't this a capability that can be added to videocards or motherboards? Why do we have to keep adding to the list of must-haves to play the latest games? When you lay down 600+ bucks for a top-of-the-line videocard, this should be something you get for the money. If they can't put it on the card, they should ship the peripheral card with the package.

Unrealistic, perhaps, but I'm beginning to feel slightly used. This is an example of putting the cart before the horse.

All this being my humble opinion, etc., and so forth.
 
Captain Rehab said:
Forgive the naive consumer question, but why isn't this a capability that can be added to videocards or motherboards? Why do we have to keep adding to the list of must-haves to play the latest games? When you lay down 600+ bucks for a top-of-the-line videocard, this should be something you get for the money. If they can't put it on the card, they should ship the peripheral card with the package.

Unrealistic, perhaps, but I'm beginning to feel slightly used. This is an example of putting the cart before the horse.

All this being my humble opinion, etc., and so forth.
Agreed! Why not a PPU on the CPU? Or on the GPU? Or, reminiscent of onboard cache, put it on the motherboard?
 
griff30 said:
Agreed! Why not a PPU on the CPU? Or on the GPU? Or, reminiscent of onboard cache, put it on the motherboard?
That's a waste of space on the CPU and it won't ever happen. In the end, VERY few PCs end up playing games. However, I see a possibility of it being put on a video card in the future, and that would be interesting.
 
Captain Rehab said:
Forgive the naive consumer question, but why isn't this a capability that can be added to videocards or motherboards? Why do we have to keep adding to the list of must-haves to play the latest games? When you lay down 600+ bucks for a top-of-the-line videocard, this should be something you get for the money. If they can't put it on the card, they should ship the peripheral card with the package.

The real issue here is that "they" aren't the same people. You have ATi and nVidia making videocards, but the company behind the PhysX PPU is Ageia. Do you expect nVidia or ATi to bundle their videocards with a $200+ physics card from another company?

griff30 said:
Agreed! Why not a PPU on the CPU? Or on the GPU? Or, reminiscent of onboard cache, put it on the motherboard?

The PPU isn't a lightweight when it comes to specs; there is some real hardware onboard. The PhysX PPU chip has 125 million transistors. Compare that to the 114 million in an FX-57 and it's suddenly easy to understand why this is not something that can just be "tacked on" to a GPU or a CPU as if it were no big deal. The card also has 128 MB of dedicated GDDR3, the same type of high-speed RAM already found on most high-end videocards.
 
GotNoRice said:
The PPU isn't a lightweight when it comes to specs; there is some real hardware onboard. The PhysX PPU chip has 125 million transistors. Compare that to the 114 million in an FX-57 and it's suddenly easy to understand why this is not something that can just be "tacked on" to a GPU or a CPU as if it were no big deal. The card also has 128 MB of dedicated GDDR3, the same type of high-speed RAM already found on most high-end videocards.

Plus it has an internal memory bandwidth of 2 Tb/sec... geared for massive real-time, multi-threaded physics.

Terra - PhysX > HavokFX...
 
Well, I don't know how all this is going to pan out or whether it will take off, but I bought a BFG PPU yesterday. The only reason was the upcoming Ghost Recon game; until then it's useless, as nothing I have supports it, but at least it passed all of its built-in tests :)
 
All they need is tangible support in the Unreal engine. If Epic supports it, others will fall in line (since a lot of games use their engine).

Edit:
This could make for a really interesting anime-style game (think DBZ).
 
uzor said:
Where'd you find one?



I pre-ordered mine from an OEM builder, and it came on Friday. It's a BFG make; it had no CD, no nothing, just came in a nameless box inside an anti-static bag, but it has a BFG sticker on the fan...

It comes with a cool little game where you set up a ball to knock down some blocks. (The game is contained within the drivers; it's more of a demo than a game, really. It can also be played without the card in, but you get a warning across the screen that it is running in software mode.)

For £200, that had better not be the only game I get to play...

I had to steal the drivers from the Ageia website, and there is no problem with them; the card has a built-in test which encompasses 30-40 small tests and every one of them passed, so the drivers are looking good...

I just can't wait for Ghost Recon AW to come out now :~)

If you haven't seen what they look like, this is it. Sorry for the quality; the camera has a crappy macro mode.

 
EVIL-SCOTSMAN said:
It can also be played without the card in, but you get a warning across the screen that it is running in software mode.
What kind of difference is there between the software and hardware modes? Speed, number of blocks, etc.?
 
ScotteusMaximus said:
What kind of difference is there between the software and hardware modes? Speed, number of blocks, etc.?


Speed. The demo is exactly the same in both software and hardware modes, except that when your ball hits the blocks there is no slowdown in hardware mode, while in software mode there is a pronounced stutter and slowdown dependent on the number of items that are moving...

i.e. fewer items moving = normal speed; more items moving will slow the demo right down at times...
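That's exactly what you'd expect from a naive CPU solver, where pairwise collision checks grow with roughly the square of the moving-object count. A toy timing loop (my own sketch, nothing to do with Ageia's actual solver) shows the blow-up:

Code:
// Toy benchmark, nothing to do with Ageia's real solver: times a naive
// O(n^2) sphere-overlap pass to show why a software mode chokes as the
// number of moving items grows.
#include <chrono>
#include <cstdio>
#include <random>
#include <vector>

struct Ball { float x, y, z; };

int main() {
    std::mt19937 rng(42);
    std::uniform_real_distribution<float> pos(0.0f, 100.0f);
    for (int n : {100, 1000, 10000}) {
        std::vector<Ball> balls(n);
        for (auto& b : balls) b = {pos(rng), pos(rng), pos(rng)};

        auto t0 = std::chrono::steady_clock::now();
        long hits = 0;
        for (int i = 0; i < n; ++i)           // every pair gets tested,
            for (int j = i + 1; j < n; ++j) { // so work ~ n*(n-1)/2
                float dx = balls[i].x - balls[j].x;
                float dy = balls[i].y - balls[j].y;
                float dz = balls[i].z - balls[j].z;
                if (dx * dx + dy * dy + dz * dz < 1.0f) ++hits;
            }
        auto ms = std::chrono::duration<double, std::milli>(
                      std::chrono::steady_clock::now() - t0).count();
        std::printf("%5d objects: %8.2f ms (%ld contacts)\n", n, ms, hits);
    }
}

Ten times the objects means roughly a hundred times the pair tests, which is why a few hundred tumbling blocks can already stagger a CPU while the dedicated hardware doesn't blink.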
 
EVIL-SCOTSMAN said:
Speed. The demo is exactly the same in both software and hardware modes, except that when your ball hits the blocks there is no slowdown in hardware mode, while in software mode there is a pronounced stutter and slowdown dependent on the number of items that are moving...

i.e. fewer items moving = normal speed; more items moving will slow the demo right down at times...

Could you give us some raw figures?
What's the difference at 100 objects?
At 1,000 objects?
At 10,000 objects?
And your CPU, mobo and RAM specs, so we can see where the CPU starts to get pushed to its knees... and where (if at all) PhysX takes off? :)

Terra...
 
You can download the “driver” from the Ageia site:

http://www.ageia.com/products/drivers.html

But the package contains more than just the driver for the card; it also contains the engine. When you install the drivers you can specify that it install only the engine and not the driver. The boxes demo will be located in "Program Files\AGEIA Technologies\bin". I tested this on my laptop: it installs fine with no card present and lets you run the boxes demo.

As has been said, in hardware mode there is absolutely no slowdown. But what I noticed during my brief testing is that software mode ran considerably faster on my desktop rig than on my laptop. The laptop has a fairly decent Pentium M in it, so I wondered why that would be. Sure enough, having Task Manager open while testing shows that the demo is definitely multi-threaded; in software mode it was almost maxing out both my processors, while in hardware mode all the CPU usage was limited to one processor.

The demo is fairly simplistic, but anyone can run it, so give it a try.
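If you'd rather not eyeball Task Manager, a Toolhelp snapshot will confirm the thread count directly. Quick Win32 sketch; pass the demo's PID (from Task Manager) on the command line:

Code:
// Counts the threads belonging to a given process via a Toolhelp
// snapshot -- a quick way to verify the demo really is multi-threaded.
// Usage: threadcount <pid>
#include <windows.h>
#include <tlhelp32.h>
#include <cstdio>
#include <cstdlib>

int main(int argc, char** argv) {
    if (argc < 2) { std::printf("usage: threadcount <pid>\n"); return 1; }
    DWORD pid = static_cast<DWORD>(std::atoi(argv[1]));

    // Snapshot every thread in the system, then filter by owner PID.
    HANDLE snap = CreateToolhelp32Snapshot(TH32CS_SNAPTHREAD, 0);
    if (snap == INVALID_HANDLE_VALUE) return 1;

    THREADENTRY32 te;
    te.dwSize = sizeof(te);
    int count = 0;
    if (Thread32First(snap, &te)) {
        do {
            if (te.th32OwnerProcessID == pid) ++count;
        } while (Thread32Next(snap, &te));
    }
    CloseHandle(snap);
    std::printf("pid %lu has %d threads\n", pid, count);
    return 0;
}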
 
Terra said:
Could you give us some raw figures?
What's the difference at 100 objects?
At 1,000 objects?
At 10,000 objects?
And your CPU, mobo and RAM specs, so we can see where the CPU starts to get pushed to its knees... and where (if at all) PhysX takes off? :)

Terra...


I wouldn't know what the difference is between 1,000 and 10,000 objects, as the demo only has, I'd say, 500 boxes max, and you cannot increase that...

The only difference I did notice was that software mode runs slower and at times stutters compared to having the card, which didn't slow down whatsoever; also there is SOFTWARE MODE written across the screen...

Apart from that small demo, no real testing can be carried out until a game title specifically coded for it arrives...

I didn't even think about looking at CPU usage and such like GotNoRice did, but I will try that now...

EDIT:

OK, I just did the test, same as yesterday. This time there was no stutter, but it still slows down compared to having the card, and I can also confirm that it nearly maxed out both my cores, 86-90%...

The card isn't even in my PC right now, as I took it out last night: I want to move my soundcard into a different slot so the PPU can use the soundcard's slot, but I don't really want the hassle of messing with Creative's magic drivers. So the test works whether you have the card in or not. I don't know if the drivers will install without the card being present, but when I tried to run the demo I got a "PPU not detected" error, and it still lets you run in software mode...

GotNoRice, I found my demo on the Ageia icon: if you click the systray icon or open the Ageia processor settings link, it's contained in the program GUI, no need to go to the Program Files directory to find it... But I installed everything except Xfire, so maybe installing only the engine and not the driver affects where you can find the demo?...
 
EVIL-SCOTSMAN said:
GotNoRice, I found my demo on the Ageia icon: if you click the systray icon or open the Ageia processor settings link, it's contained in the program GUI, no need to go to the Program Files directory to find it... But I installed everything except Xfire, so maybe installing only the engine and not the driver affects where you can find the demo?...

On my laptop it installed fine; I just unchecked the driver box at the selection screen. On my desktop I was testing by disabling and enabling the PhysX card via Device Manager; when I disabled the card, the Ageia icon in the systray would disappear, and I would instead have to browse into the Program Files folder to launch it manually. No biggie.
 
mashie said:
Cool, it did push both Opteron 180 cores to 85%, smooth as silk though :D

I just noticed this; I guess having Microsoft games support the PhysX will help quite a bit.

I doubt it. Judging from MS's past business tactics, they'll probably make their own API based on Havok and PhysX and market it as part of Vista's DX10. At most, MS will probably buy out one of them to get the technology and revise it.
 
mashie said:
What difference do you see in CPU load between having the PhysX card enabled and disabled running that demo?

With the card, the CPU usage is constant at 25% from the time I start the demo to the time I close it, regardless of activity.

Without the card, the CPU usage is at about 28% at the beginning when the demo is idle and it jumps up to 45-47% when there is lots of activity on the screen.

My box has 4 virtual processors (dual CPUs with Hyperthreading), so 25% basically represents full usage of one processor, while 50% would mean full usage of two.
 
If I manually set the affinity to a single core, I get almost identical results to running with dual affinity. It seems the multithreading is broken.

FYI: X2 3800+ @ stock runs it just fine. Once or twice it maxed out, but that's it.
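For anyone wanting to repeat the affinity test from code rather than from Task Manager, it's a single Win32 call. Minimal sketch, assuming a plain dual-core where mask 0x1 means core 0 only; anything launched afterwards inherits the mask:

Code:
// Pins the current process (and anything it launches) to CPU 0, so you
// can compare the demo's single-core vs. dual-core behaviour. The mask
// is a bitfield of logical CPUs: 0x1 = core 0, 0x3 = both cores.
#include <windows.h>
#include <cstdio>

int main() {
    if (!SetProcessAffinityMask(GetCurrentProcess(), 0x1)) {
        std::printf("SetProcessAffinityMask failed: %lu\n", GetLastError());
        return 1;
    }
    std::printf("restricted to core 0; launch the demo from this shell\n");
    // ...start the workload here and it inherits the one-core mask...
    return 0;
}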
 
Dew said:
If I manually set the affinity to a single core, I get almost identical results to running with dual affinity. It seems the multithreading is broken.

FYI: X2 3800+ @ stock runs it just fine. Once or twice it maxed out, but that's it.

FPS numbers? CPU usage numbers? Just wondering if your conclusion that it is "broken" is entirely subjective or if you actually recorded any relevant data.
 
GotNoRice said:
FPS numbers? CPU usage numbers? Just wondering if your conclusion that it is "broken" is entirely subjective or if you actually recorded any relevant data.


Unfortunately, there is no FPS counter. My observations are based on running Task Manager and watching the demo, so I can only offer a subjective opinion on its smoothness. However, upon further investigation, I would like to retract my assertion that one CPU is enough and that the multithreading is not working. You get the heaviest workload by hitting the tower so that it falls over; the more the blocks separate, the easier they are to compute. I did some more testing and saw a noticeable slowdown when maximizing block interaction on a single CPU. Dual CPU never shot above 80% and averaged 50-70% during block interaction, and 27-28% with no block interaction.
 
Can you try running it with FRAPS going and see what kind of data it records?

 
With a dual-core Opteron running at 2.7GHz, CPU usage never went over 43% and was usually around 30%. I do have two instances of F@H running, but if the program needed more CPU cycles, F@H should drop to 0%. It was very smooth, not choppy at all for me.

I have a 7800GT btw
 
Don't know how scientific this is, but I just ran the demo with FRAPS. Static, the demo was running at about 50 fps, but when I shot the ball and it started crumbling it dropped to 24 fps and looked like it was collapsing in slow-mo. This was on my Dell at work (2GHz, 512MB, integrated video), but I'm anxious to try it on my dual-core rig when I get home.
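For what it's worth, a FRAPS-style counter isn't magic: FRAPS hooks the game's frame presents and counts them over one-second windows. Here's the bare math as a self-contained sketch (the "frames" are faked with a sleep, since we can't hook the demo from here):

Code:
// FRAPS-style FPS math: count frame boundaries over one-second windows.
// The frame here is faked with a 20 ms sleep (~50 fps); FRAPS itself
// gets its frame boundary by hooking the renderer's Present() call.
#include <chrono>
#include <cstdio>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;
    auto windowStart = clock::now();
    int frames = 0;
    for (int i = 0; i < 200; ++i) {
        std::this_thread::sleep_for(std::chrono::milliseconds(20));
        ++frames;  // one "frame" rendered
        auto now = clock::now();
        if (now - windowStart >= std::chrono::seconds(1)) {
            std::printf("%d fps\n", frames);
            frames = 0;
            windowStart = now;
        }
    }
}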
 