Ageia PhysX - Another Kick in the Wallet

AaronP

[H]F Junkie
Joined
Jan 13, 2005
Messages
11,527
So I read up on the new Ageia PhysX card, and while the idea of a card dedicated solely to in-game physics is exciting, it's also a little scary. When I saw the interview and saw that the asking price for these processors would be 250-300 dollars, I nearly fell out of my seat. I spent only $250 on my video card just last week. I'm afraid that one of these days building a gamer PC will be even more expensive, because who knows which games may require this card just to play. This also concerns me as an OEM builder. Some of my top systems already cost nearly $4,000. The last thing I need to be doing is adding another $300 to that. Not to mention that with new hardware comes new issues. I don't know, but to me this whole idea of a Physics Processing Unit gives me a major headache.
 
consider how much it would have cost to get that level of visualization just a decade ago
there are a few hundred thousand dollars in computing power sitting on most of our desktops by that measure

at least gaming is actually employing that computing power unlike most "consumer" applications
 
Ice Czar said:
at least gaming is actually employing that computing power unlike most "consumer" applications

That's part of the problem. What else besides games would use this? In a normal desktop environment, that is, not a high-end workstation. You have to drop $300 and the only thing you get is faster physics in games. At least with a video card I can pull myself into denial and say things like "yeah, the computer NEEDS one to even work," and then justify that the $500 makes everything, not just games, prettier. :p
 
If I could use the physics processor for programs like SolidWorks or other 3D technical design software, it would appeal a lot more to me.
 
The peculiar thing is what happens to a company that wants to sell these once we have 16-core processors? Surely a 16-core processor would be able to handle physics.
 
They better get Vista's 3D desktop to make use of it somehow lol... (to help me justify the cost) lol :D :p

So people can't say "what a waste just for games" haha :cool:
 
Personally, I am just looking forward to another device whose performance we can improve by overclocking!
 
Something that improves game performance is never a waste! (Unless there's something that can do the same thing overclocked for $800 less.)
 
I really can't see this taking off at all in its current form.

People don't wanna spend loads on something that has little or no software support, and the software companies aren't gunna be willing to work on something for such a tiny audience.


It'll only really take off if it gets shoved into a future console. That's what it's gunna take.
 
I think it'll get away from PCs and move into the console world. PS3 is already supposed to be supporting it.
 
Is this really necessary with the advent of faster processors, and particularly dual-core processors? Maybe a better solution would be for video card manufacturers to implement a physics unit alongside their GPU. Surely NVIDIA or ATI has the capacity to come out with such a product, which raises the question: why haven't they?

I would say that both ATI and NV have the R&D resources to come out with a product such as Ageia's (potentially even better and cheaper than what Ageia has shown), and they certainly have much greater market penetration to make it a successful technology.
 
Sorry for the thread necromancy, but I just wanted to cover a few points about this item.

From some of the stuff I've read from various sources, this thing can compute complex formulas in a single clock cycle that would take a CPU many (supposedly a PPU is 400% more efficient at physics calculations than a normal CPU would be... I need to find the links where I read all this). Now, not only does this improve the prospect of adding realistic physics into the mix, but for any similarly computation-intensive task (including but not limited to raytracing, radiosity, DOF, etc.) it could greatly improve the field of high-end 3D graphics, and even medical research (protein folding, anyone?). There are also other scientific tasks that can benefit from this.

So... Recap:
* A GPU is specialized for graphics and is much more efficient than a CPU (even at much higher "speeds").
* A CPU is a "general purpose" processor for various operations. Jack of all trades, master of none...
* A PPU is specially designed to compute complex physics formulas, compared to a CPU (by 40 to 1, if I remember the ratio correctly).

Anyway, I'll update this post once I track down my sources, but I remember reading this when they first started mentioning the existence of a PPU...

Personally, I'll get one if my 3D apps can utilize it to help compute rendering solutions and if the games I want come with support for it (UT2k7 will supposedly support it). Also, I don't plan on getting one till I build my new 939 dual-core system. :)
 
Depending on how the next crop of games runs on my computer (i.e., Call of Duty 2, Oblivion), I will probably do one more upgrade, and then I am done with PC gaming for the most part. It's so damn expensive for such little return, seriously. PCs have a really narrow crop of games that fall into basically three categories, FPS, RTS, and MMORPG, with various twists on each, and the occasional, and I mean occasional, other kind of game or console port.

Also, PCs are really, really inefficient. We drop $1000 to $1500 on a new, mostly top-end system to start out, then every year or two drop $200 to $500 on a graphics card alone to "keep up" with the latest three games that are actually good. Heh, it's a friggin' waste. I've had consoles and PCs coexisting in my house for a while now, fighting for my gaming time, but the PC is losing out.

For $300 to $400 (the price of a current top-end graphics card) I can get an Xbox 360 or PlayStation 3, which will last me four to five years, keeping a nice pace with PCs and scaling nicely to current demands and graphical trends. Both contain either an ATI or NVIDIA graphics chip, which both companies state is just as or more powerful than what they expect from their "next-gen" PC counterparts (which are usually refreshes, sigh...), and considering the efficiency of a console, they are realistically much better. So basically, for that one-time good price, you get a system that will be better than a PC for a while, run at least even with a PC for even longer, and when finally eclipsed... still not be that far behind, really.

So basically my opinion on $200 or $300 for another damn card to put into my PC is: go shove it, computer companies. Same with $200 and $300 sound cards, friggin' ridiculous. Like it or not, PCs are way too expensive, and their price-to-performance ratio is crap. At best you get two, maybe two and a half years out of your $400 graphics card; I can get five out of my $300 to $400 PS3, and late adopters can get it for $250 or $200! I dunno, I could go on all day about this.
 
chameleoneel said:
Depending on how the next crop of games runs on my computer (i.e., Call of Duty 2, Oblivion), I will probably do one more upgrade, and then I am done with PC gaming for the most part. It's so damn expensive for such little return, seriously. [snip]

PCs are used for more than games.

--------

As soon as I see favorable benchmarks later in the year, I'm jumping on this ASAP.
 
I'd like to get it, probably just to brag. And because I like to have lots of stuff connected inside. Weird, I know.
 
Russ said:
PCs are used for more than games.



I understand that, but it's pretty obvious my blurb was focused on the PC as a gaming machine, because gaming is 90% of the reason you would need to spend the huge dollars on graphics cards, etc. If I were using my PC for everything but gaming, I would still be using my Athlon XP system with a Ti4200 and a gig of RAM as my main machine, because that computer works perfectly fine in all situations except gaming. It actually still keeps up, with the 6800NU that I had in it, which I paid $300 for last year when they first came out; that computer runs the F.E.A.R. demo at full settings (medium textures, no SS) at 40 fps average, but that's because it had a $300 graphics card sitting in it.

It's so dumb. Computer review sites are always saying, "Oh yeah, we just got to playtest (insert upcoming game here) and it ran butter smooth on the 7800 GTX-equipped systems we played it on. Good to see the game running so well even before final release." Well, of course it ran great; that graphics card alone costs $450, not to mention all the other expensive hardware that was likely backing it up. My PlayStation 3 can run that game too, for at least that same price, probably $150 less, for the entire system, and it can run any other game for the next four years.
 
It'll depend on the uptake.

The PhysX could fizzle out like so many other revolutionary gimmicks, but it actually seems to be gathering a bit of steam.

Like 64-bit apps and dual-core processors, though, it'll be a long time (years) before you can't live without one, even with high-spec machines/games.
 
I could see this becoming a part of either motherboards or graphics cards in the future as the tech becomes cheaper. Otherwise, it's a tad expensive for the principle.
 
I'd get one if it was $100. Any more than that and I think it'd be ridiculous. If they want people to adopt it early on, they shouldn't charge upwards of $300. I'd buy one in a heartbeat at $100 or less; anything above that and they can stick it until they drop the price. I don't get to play games often enough to justify a higher-than-$100 price tag for something that's solely dedicated to physics.
 
I swear I read that there would be multiple versions, with the low end at a much more decent price. I think it was maybe $150.

One thing I hope to see with physics acceleration is perhaps a tendency toward better physics in games that normally wouldn't have them. Even simple things, like collision detection (surely it can do that, right?), would be nice. It just gets so annoying, some of the crap you get in, for example, RPGs, because they were too lazy. But if a thing like this gets popular, they might reconsider.

Either way, I for one want to wait and find out first-hand what the CPU alone can do. It may not be quite as good, but my old Barton @ 2.5 GHz was plenty for modern games, so I find it hard to believe a modern CPU can't at least squeak by with early PhysX games. I am worried about one thing, though. If they do what's done with video cards and decrease quality when you lack the accelerator, or have the lower-end one (if such a thing will exist), then I'll never get to see what it's like before buying it. IMO, that would really suck.
 
Wait six months after it comes out and it'll either drop in price or not be worth buying because no one is supporting it. I'm really interested in seeing what it does for games, because I like to see new games actually USE high-end hardware. It does make sense to have lots of slower CPUs rather than one really fast CPU, or at least that seems to be the trend lately.
 
J-Mag said:
Personally, I am just looking forward to another device whose performance we can improve by overclocking!
But that's the thing... You can't overclock physics like you can graphics. The game still works if it is ugly, but if the physics don't work, then you need to redesign the whole game. Just take Half-Life 2, for example.
 
Actually, it works more like that than you think. What happens if you leave the same settings on the video card but over/underclock it? Same for the CPU in a CPU-intensive game. Catch my drift? Unless they design it under the specific assumption that you will have one specific physics accelerator running at one specific speed, that won't be an issue. And that would be a bad, bad mistake. If PhysX catches on, there WILL be clones. It's inevitable, only a matter of time. That, or tricks like nVidia or ATI deciding to put a separate physics accelerator chip on one of their cards to make physics acceleration more accessible, officially (unofficially, so the CEO or whatever can get that new Mercedes he's had his eye on). There would be a lot of upset people if designers made it TOO specific. Either way, future games would work under the assumption that you can have different accelerators, even if the original ones didn't. No, I tend to believe all those with foresight will make them scalable, and by the time people like me enter the market, the initial mistakes by the shortsighted programmers will be gone.

Anyway, I tend to think that it should be able to use the CPU to compensate for whatever the accelerator can't manage. While the CPU may not be as efficient at it, CPUs are currently not being properly challenged by games, so there's a little leeway to play with, at least. And that's considering that today's games already use the CPU for physics (just less of it).
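To make that concrete, here's a minimal sketch of the kind of hybrid dispatch I mean. Everything in it is invented for illustration; none of these names come from Ageia's actual SDK:

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

struct RigidBody { float px, py, pz, vx, vy, vz, mass; };

// Hypothetical stand-ins for a capability query and the two simulation
// paths; the real SDK's calls are not shown here.
bool ppuAvailable() { return false; }          // pretend no card installed
void ppuSimulate(RigidBody*, std::size_t) {}   // accelerated path (stub)
void cpuSimulate(RigidBody*, std::size_t) {}   // software path (stub)

// One physics tick: hand as many bodies as the card can hold to the PPU
// and burn the overflow on the CPU, falling back entirely to software
// when no accelerator is present.
void stepPhysics(std::vector<RigidBody>& bodies, std::size_t ppuCapacity) {
    const std::size_t n = bodies.size();
    if (ppuAvailable()) {
        const std::size_t onPpu = std::min(n, ppuCapacity);
        ppuSimulate(bodies.data(), onPpu);
        cpuSimulate(bodies.data() + onPpu, n - onPpu);
    } else {
        cpuSimulate(bodies.data(), n);
    }
}
```

A faster (or overclocked) card just raises ppuCapacity, and less of the work spills onto the CPU.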
 
If your graphics card isn't up to snuff, you can reduce the graphical intensity of the game down to 640x480, turn off shaders, etc., and it doesn't affect the core gameplay; it just reduces your enjoyment.

If the grenade doesn't bounce around the corridor on user A's machine like it does on user B's machine, then the game is broken. You can't lower physics intensity in a game unless you then scale the game's puzzles to compensate.
This would work fine for single-player games, but what about online games...
 
Core gameplay. You said it yourself. If you reduce to 640x480, does it cut a chunk out of your screen, or does it just reduce the resolution? A decrease in physics quality doesn't mean that grenade won't bounce around the corner. It means that when you try to make it bounce but time it wrong and it blows a chunk out of the wall, you only see a few large shards flying through the air instead of a hundred tiny ones. You have seen the demo of what PhysX does, right?
 
Uhm, maybe... o_O

Anyway, mainly, a decrease in quality isn't going to mean a decrease in the laws of physics themselves, so no spontaneous enemies suddenly floating in the air or anything silly like that. It would just mean a decrease in the decreasable effects, which are the ones that would be tough to handle to begin with. It's easy to track a grenade bouncing around a corner, but tracking hundreds of little shards of stone flying through the air is a lot trickier. So we just make fewer, larger shards and presto, we have lower-quality physics. Their exact implementation of scaling may differ, but this is the gist of it.
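A rough sketch of what that kind of scaling might look like; every name and number here is made up for illustration, nothing from the actual PhysX SDK:

```cpp
#include <cstddef>
#include <vector>

struct Fragment { float x, y, z, size; };

// Hypothetical physics-quality knob (assumed tiers: 0 = software-only,
// 1 = low-end card, 2 = full PPU detail). The gameplay event is identical
// at every tier; only the number (and thus size) of debris fragments the
// simulation has to track changes.
std::vector<Fragment> shatterWall(float x, float y, float z, int quality) {
    static const std::size_t counts[] = { 8, 40, 200 };
    const std::size_t n = counts[quality];           // assumes 0 <= quality <= 2
    const float size = 1.0f / static_cast<float>(n); // fewer shards, bigger chunks
    std::vector<Fragment> shards(n);
    for (std::size_t i = 0; i < n; ++i)
        shards[i] = { x, y, z, size };  // real code would scatter and spin them
    return shards;
}
```

Same wall, same hole in it either way; the quality setting only decides how much debris flies.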
 
All I can say is I'm going to have to see a demo of this running before I want to drop a few Benjamins. Otherwise, good ol' ragdoll physics is about all I need right now.
 
Bluenile said:
But that's the thing... You can't overclock physics like you can graphics. The game still works if it is ugly, but if the physics don't work, then you need to redesign the whole game. Just take Half-Life 2, for example.

With this logic, there would be no need for the card at all, as the software SDK can perform physics calculations too... Or hey, why not just clock it at 1 Hz, because it will perform just as fast? WTF?!?!?
This info is wrong. If it can be overclocked, it will be able to perform more physics calculations per second, bottom line.

Let's say the developers add in more objects for the physics to handle than a first-generation Ageia card can manage... Well, I am assuming the overflow would be burned on the CPU, as the SDK can run on Ageia hardware or in plain software. So if an overclocked Ageia card can handle more calculations, then this would not occur (or would just happen less frequently).

Also, you can look at object collisions: are these set arbitrarily? NOPE...
So every time object A collides with object B, both their paths are re-calculated via momentum and friction. Now, what happens if this re-calculated path puts object B into a collision with object C? Well, more calculations are required. This is a very simplistic view; you can imagine how complicated it gets with hundreds or thousands of object collisions.

Essentially, the amount of physics calculation done for this type of interaction IS NOT STATIC. There can and will be instances of the first-generation Ageia card being overloaded.
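A toy one-dimensional version of that cascade, purely my own sketch (not the SDK's actual math): resolving one pair with conservation of momentum can turn a previously separating pair into an approaching one, so the solver has to keep sweeping until the scene settles.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

struct Ball { double pos, vel, mass; };  // unit-radius bodies on a line

// Elastic collision response: momentum is conserved between the pair.
void resolve(Ball& a, Ball& b) {
    const double m = a.mass + b.mass;
    const double av = ((a.mass - b.mass) * a.vel + 2.0 * b.mass * b.vel) / m;
    const double bv = ((b.mass - a.mass) * b.vel + 2.0 * a.mass * a.vel) / m;
    a.vel = av;
    b.vel = bv;
}

// One frame: keep sweeping for overlapping pairs that are still moving
// toward each other. Each resolution can create a new approaching pair,
// so the number of sweeps (the workload) depends entirely on the scene,
// which is exactly why the calculation count is not static.
int settle(std::vector<Ball>& balls) {
    int sweeps = 0;
    bool collided = true;
    while (collided && sweeps < 1000) {  // hard cap as a safety net
        collided = false;
        ++sweeps;
        for (std::size_t i = 0; i < balls.size(); ++i)
            for (std::size_t j = i + 1; j < balls.size(); ++j) {
                const double gap = balls[j].pos - balls[i].pos;
                const bool overlapping = std::fabs(gap) < 2.0;
                const bool approaching = (balls[j].vel - balls[i].vel) * gap < 0.0;
                if (overlapping && approaching) {
                    resolve(balls[i], balls[j]);
                    collided = true;
                }
            }
    }
    return sweeps;  // one pile-up settles in a sweep; another takes dozens
}
```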

Also, why would they plan on releasing faster cards if "overclocking" didn't do anything?

Anyway, believe what you will, but there are a lot of morons pulling crap outta their asses.
 
When playing most newer games on a high-end system, the video card is still the bottleneck, so physics-engine complexity isn't an issue I'm worried about.
 
J-Mag said:
but there are a lot of morons pulling crap outta their asses.

where I come from, that's about the only place to pull crap from....unless you like playing in the septic systems or waste treatment plant :eek:

:p...sry, had to....
 
I don't really understand how more complex physics modelling, requiring an expensive dedicated processor, will proportionately increase my enjoyment of any given game. I mean, is it really going to revolutionize gameplay significantly enough to warrant that sort of additional cost to a gaming rig? I think newer CPUs will comfortably cope with the kinds of physics modelling we're going to have in the foreseeable future.

I can see how it could be employed effectively in, say, 'destructible' environments and the like, but until the tech has a widespread installed userbase, developers aren't really going to bother building that sort of complexity into their games. It's only when someone like Sony, NVIDIA or ATI buys the tech and uses it in all their products that there will be a point to it. As a niche-market add-in card, it will be pretty much pointless as far as I can see...
 
Leon2ky said:
So I read up on the new Ageia PhysX card, and while the idea of a card dedicated solely to in-game physics is exciting, it's also a little scary. [snip] This also concerns me as an OEM builder. Some of my top systems already cost nearly $4,000. The last thing I need to be doing is adding another $300 to that.

If you can afford $4,000 systems, multiple times, you can afford $300 for a PPU.
 
What about SLI?? LOL. Will we need one Ageia per video card???? Two Ageias and two video cards at about $400 a whack = $1,600 JUST ON EYE CANDY :eek:

No thanks.

(I'm sure you wouldn't need two of them... added for effect)
 
I'm sure you don't need more than one.

I think that with a dual-core CPU, one high-end, top-of-the-line video card, and one PPU, we will be pretty much set for the future. I think SLI is a total waste. If it offered a full 100% increase over one card, it would be worth it, but it doesn't. In some cases the difference is less than 20%. No thanks. Not yet, anyway. Until I can play at 2048x1536, I don't need it.
 
fl00d said:
I don't really understand how more complex physics modelling, requiring an expensive dedicated processor, will proportionately increase my enjoyment of any given game. I mean, is it really going to revolutionize gameplay significantly enough to warrant that sort of additional cost to a gaming rig? I think newer CPUs will comfortably cope with the kinds of physics modelling we're going to have in the foreseeable future.
What it means is that programmers will be able to add massively more complex physics, such as walls that break per brick rather than some big triangular chunk breaking out, etc. In other words, it encourages programmers to use more realistic physics because it becomes more accessible. One thing to bear in mind is that the reason a normal CPU CAN'T actually cope with what PhysX can provide is that it's not just a matter of simple physics; it's a matter of keeping track of simple physics for many, many individual objects and their interactions. It's not just destructibles, though; many things could be done better.
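The cost of "many, many objects" is easy to see with a naive pairwise count (my own back-of-the-envelope illustration, not how the actual chip schedules its work): potential interactions grow with the square of the object count, exactly the kind of wide, repetitive arithmetic a dedicated parallel chip is built for.

```cpp
#include <cstdio>
#include <initializer_list>

// Naive broad-phase cost: every object checked against every other.
long long pairTests(long long n) { return n * (n - 1) / 2; }

int main() {
    // One chunky wall piece vs. per-brick rubble vs. a cloud of shards.
    for (long long n : {10LL, 200LL, 5000LL})
        std::printf("%5lld objects -> %9lld pair tests per frame\n",
                    n, pairTests(n));
    // Prints: 10 -> 45, 200 -> 19900, 5000 -> 12497500
}
```

Real engines prune most of those tests with spatial partitioning, but the work that remains still scales with debris count, which is presumably where the card earns its keep.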

fl00d said:
I can see how it could be employed effectively in, say, 'destructible' environments and the like, but until the tech has a widespread installed userbase, developers aren't really going to bother building that sort of complexity into their games. It's only when someone like Sony, NVIDIA or ATI buys the tech and uses it in all their products that there will be a point to it. As a niche-market add-in card, it will be pretty much pointless as far as I can see...
You may be right, and I worry about that. Still, many companies are smart and willing to take advantage of newer technologies pretty quickly. Between the fact that the CPU can at least replace some of the basic functions, making it possible to support the card without requiring it, and the fact that there is money to be gained through this, I think people will adopt the technology. Mind you, I'm among those who will not run out to buy one the moment they come out, and I may well end up waiting for nVidia or someone to create their own (they can't just buy it out; Ageia isn't stupid) anyway.
 
Don't bitch about it being $250. Sometimes in Canada it costs us DOUBLE what you guys have to pay for some computer parts. So don't complain.
 