Ageia PhysX - Another Kick in the Wallet

Nazo, that's it exactly.

I just have to say that I wouldn't be worried about it taking a long time... it is going to take a long time for this to get going. Graphics cards didn't spring up overnight. It took almost 5 years for them to become standard in just hardcore games, and longer for regular games. But it will get there. I myself am not upset by this, though, because it's not going to be no benefit at all and then all of a sudden in 2012 everything uses it. Slowly games are going to use it and get better. That's why I look forward to the progression of getting it going.

I do look forward to our end destination, but for me even the time spent traveling to that destination will be great.
 
Nazo said:
Now that I think on it more, frankly, there is one problem that I see. This thing is useless until it becomes "accessible." What I mean is, the CPUs are being SERIOUSLY underutilized in games. This is not because the programmers can't make better AIs and such, but because they have to make the game playable on pretty low-end hardware. They all insist on the assumption that the majority have crappy systems. Until they can assume that those with the crappy systems have at least a low-end physics accelerator, we'll only see a few brave enough to support it. I don't give a crap if it's in four or five major games, I want to see it in just about all of them, kind of like where 3D acceleration is today.

This is the part that worries me. It may be useless until it's a decent price. But, then again, we are talking about the introductory price here. Even if they are moronic enough to not realize that there is a larger potential market on the lower end (especially if overclocking comes into play), they'll probably lower the price after a while anyway. So I hope, anyway.


The 4 or 5 I listed are just about all the games coming out in the next 6 months.
 
Leon2ky said:
So I read up on the new Ageia PhysX card, and while the idea of a card dedicated solely to in-game physics is exciting, it's also a little scary. When I saw the interview and saw that the asking price for these processors would be 250-300 dollars, I nearly fell out of my seat. I spent only $250 on my video card just last week. I'm afraid that one of these days building a gamer PC will be even more expensive, because who knows what games may require this card to play. This also concerns me as an OEM builder. Some of my top systems already cost nearly $4000.00. The last thing I need to be doing is adding another $300 to that. Not to mention with new hardware comes new issues. I don't know, but to me this whole idea of a Physics Processing Unit gives me a major headache.
People said the same things about GPUs when they first came out. Quit whining. If you don't like the PPU, feel free to choose not to buy one. Meanwhile, the rest of us will enjoy awesome game physics in UT2k7 and other upcoming games. Go Ageia! :cool:

On another note: I'm getting rather tired of all these negative threads concerning Ageia's PPU... :mad:
 
Elios said:
The 4 or 5 I listed are just about all the games coming out in the next 6 months.
I was just making a point with extremes. Yes, there will be more, but so long as it's just the rare few, it's just not worth it. Especially considering how many of that sort of game tend to have no replayability (ok, a few like UT2K should, but overall, games like Doom 3, Half-Life 2, etc., which are supposedly pushing hardware to the limits, have a painful tendency to have no replayability).
 
With rising costs in computers etc., don't forget to factor in inflation... it's simple math... high-end video cards used to cost $300, now they are $400-$500... woopty.
 
Wow, these people are ridiculous... they are speaking badly about something which is completely optional... not only are they speaking badly about it, but they are almost "wishing it would never come out"... 200 bucks is not that much money at all; I don't know what you're whining about.
You're in a computer enthusiasts' forum... I mean come on! Everyone should be happy about this. The worst thing that can happen is games will become so good that today's technology won't be able to support it and you'll be forced to buy something with a PPU... I mean god damn, STOP WHINING... THIS IS A GOOD THING.
 
Entropy1982 said:
Wow, these people are ridiculous... they are speaking badly about something which is completely optional... not only are they speaking badly about it, but they are almost "wishing it would never come out"... 200 bucks is not that much money at all; I don't know what you're whining about.
You're in a computer enthusiasts' forum... I mean come on! Everyone should be happy about this. The worst thing that can happen is games will become so good that today's technology won't be able to support it and you'll be forced to buy something with a PPU... I mean god damn, STOP WHINING... THIS IS A GOOD THING.
Well fucking said! :mad:
 
Jasonx82 said:
They better get Vista's 3D desktop to make use of it somehow lol... (to help me justify the cost) lol :D :p

So people can't say "what a waste just for games" haha :cool:

LOL...yeah, maybe the aeroglass windows can crash into each other and shatter realistically. :p
 
ricetek said:
With rising costs in computers etc., don't forget to factor in inflation... it's simple math... high-end video cards used to cost $300, now they are $400-$500... woopty.

Inflation hasn't risen that much .... not nearly.
 
dagon11985 said:
Inflation hasn't risen that much .... not nearly.


You're right, and people also forget that this stuff gets CHEAPER to manufacture. I just stopped arguing with people about the price of video cards. I think they are way overpriced. Back on topic, the Ageia chip would take off if it's priced at $125-150; at $250-300, I think it will be harder for folks to swallow.
 
Entropy1982 said:
Wow, these people are ridiculous... they are speaking badly about something which is completely optional... not only are they speaking badly about it, but they are almost "wishing it would never come out"... 200 bucks is not that much money at all; I don't know what you're whining about.
You're in a computer enthusiasts' forum... I mean come on! Everyone should be happy about this. The worst thing that can happen is games will become so good that today's technology won't be able to support it and you'll be forced to buy something with a PPU... I mean god damn, STOP WHINING... THIS IS A GOOD THING.

'A fool and his money are easily parted' - and you sir, are just that kind of fool.

Developers are NOT going to bother properly coding for such a device until it achieves a large enough installed user base for there to be a reason to build that kind of complexity into game engines. They're complex enough as is! Look at how long games take to develop as they are! HL2? Doom 3...? etc., with their relatively simple physics modelling... Christ knows how much time would be added to those cycles by adding physics modelling of the calibre that would require Ageia's card. I honestly don't think we'll see mass-produced and utilised PPUs until the next next-gen consoles.

Again, I don't really see how a 'realistically' physics-modelled game is going to significantly impact gameplay anyway - not when physics can be 'faked' well enough on the CPU to provide the kind of visual illusions required for good gaming immersion. There are other areas of gameplay that need vast improvement LONG before I give a shit about realistically modelled fluid dynamics needing an extra $200 card. AI, anyone..?

Prediction: Ageia will fail, or simply be bought by a company like Nvidia, and we'll see their tech surface in game consoles some 3-4 years from now.
 
fl00d said:
'A fool and his money are easily parted' - and you sir, are just that kind of fool.

Developers are NOT going to bother properly coding for such a device until it achieves a large enough installed user base for there to be a reason to build that kind of complexity into game engines. They're complex enough as is! Look at how long games take to develop as they are! HL2? Doom 3...? etc., with their relatively simple physics modelling... Christ knows how much time would be added to those cycles by adding physics modelling of the calibre that would require Ageia's card. I honestly don't think we'll see mass-produced and utilised PPUs until the next next-gen consoles.

Again, I don't really see how a 'realistically' physics-modelled game is going to significantly impact gameplay anyway - not when physics can be 'faked' well enough on the CPU to provide the kind of visual illusions required for good gaming immersion. There are other areas of gameplay that need vast improvement LONG before I give a shit about realistically modelled fluid dynamics needing an extra $200 card. AI, anyone..?

Prediction: Ageia will fail, or simply be bought by a company like Nvidia, and we'll see their tech surface in game consoles some 3-4 years from now.


fail?
Hardly. MS, Sony, Epic, Atari, and Ubisoft are all backing it; I don't see how it can.
Then on top of that, what's to code? It uses a physics engine that can interface with the card, but like any other (i.e. Havok) it works the same way with the game. The dev just has to use that engine and it works with the card. Not that hard, since you need an engine anyway, and the one that works with the card works just fine without it, so there's no more coding than they have now (rough sketch below).
Oh, and you know that air is a fluid, right? Flight sims would LOVE this thing, as would race sims.
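
To put that in code terms, here's a minimal sketch (all names made up by me, not the actual Ageia/NovodeX SDK) of what "the engine works the same with or without the card" looks like from the game programmer's side: the game calls one physics API, and the engine decides underneath whether to run on the PPU or fall back to the CPU.

```cpp
// Hypothetical sketch only: illustrative names, not a real SDK.
#include <iostream>
#include <memory>

// The game code only ever talks to this interface.
struct PhysicsBackend {
    virtual ~PhysicsBackend() = default;
    virtual void simulate(float dt) = 0;  // advance the simulation one step
};

struct SoftwareBackend : PhysicsBackend {
    void simulate(float) override { /* run the rigid-body step on the CPU */ }
};

struct PpuBackend : PhysicsBackend {
    void simulate(float) override { /* hand the same step to the card */ }
};

// Stand-in probe for the accelerator; a real SDK would detect the hardware itself.
bool ppuPresent() { return false; }

std::unique_ptr<PhysicsBackend> createBackend() {
    if (ppuPresent())
        return std::make_unique<PpuBackend>();
    return std::make_unique<SoftwareBackend>();  // no card, same game code
}

int main() {
    auto physics = createBackend();
    for (int frame = 0; frame < 3; ++frame)
        physics->simulate(1.0f / 60.0f);  // the game loop calls one API either way
    std::cout << "simulated 3 frames\n";
}
```

Point being, the dev writes to the engine once; whether the card is in the box only changes which backend gets picked at startup.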
 
fl00d said:
'A fool and his money are easily parted' - and you sir, are just that kind of fool.

Developers are NOT going to bother properly coding for such a device until it achieves a large enough installed user base for there to be a reason to build that kind of complexity into game engines. They're complex enough as is! Look at how long games take to develop as they are! HL2? Doom 3...? etc., with their relatively simple physics modelling... Christ knows how much time would be added to those cycles by adding physics modelling of the calibre that would require Ageia's card. I honestly don't think we'll see mass-produced and utilised PPUs until the next next-gen consoles.

Again, I don't really see how a 'realistically' physics-modelled game is going to significantly impact gameplay anyway - not when physics can be 'faked' well enough on the CPU to provide the kind of visual illusions required for good gaming immersion. There are other areas of gameplay that need vast improvement LONG before I give a shit about realistically modelled fluid dynamics needing an extra $200 card. AI, anyone..?

Prediction: Ageia will fail, or simply be bought by a company like Nvidia, and we'll see their tech surface in game consoles some 3-4 years from now.
..... lol... ok, first of all, I never said I'm going to buy it right away..... and in fact... I plan not to for about half a year after its release. Second of all, money is very relative. I'll be done with dental school in 1 year, and we make $200 in about 30 minutes in my profession. That really is not too much work, even if it will not do much at the beginning. So please, stop being jealous of the people that can afford it and go to another thread.
 
Entropy1982 said:
..... lol... ok, first of all, I never said I'm going to buy it right away..... and in fact... I plan not to for about half a year after its release. Second of all, money is very relative. I'll be done with dental school in 1 year, and we make $200 in about 30 minutes in my profession. That really is not too much work, even if it will not do much at the beginning. So please, stop being jealous of the people that can afford it and go to another thread.
You could try not being full of yourself.
 
Elios said:
fail?
Hardly. MS, Sony, Epic, Atari, and Ubisoft are all backing it; I don't see how it can.


righty right. none of these guys ever backed a loser.

very truly yours,
politenessman
 
ROFL @ both previous posts!

Personally, I'm just worried about the "installed user base" thing. It's not a solution to say they should all just use Havok or whatever. There's a darned good reason so very many games out there have their own custom-built engine, and it's not that the company just enjoys paying their own programmers to spend extra-long periods of time creating the base for the product just so they can get STARTED on the real product, rather than simply forking over a license fee that would be cheaper for them in the long run. There WILL be a lot of games that support this right out of the box, but still not enough.

Well, I don't worry about PhysX failing; I worry about it taking far too long to get to the point that video acceleration is at today. We live in the information age; in a single year immense changes can occur. These days, those five years mentioned earlier are more like two generations than one (well, I guess we could go by Moore's Law and say 3.33 generations if you accept that). In the meantime, those who did shell out the cash only get to enjoy the results in a small percentage of games, and those who can't afford to blow that much cash on an enthusiast's product for a few games here and there -- well, they lose out on those few games, but overall it's not a huge loss. The thing is, both groups lose out if the games themselves still don't get the increase in quality this could make easy. Heck, even if you did use it for physics acceleration but didn't bother to actually add more physics to the game, it does nothing but increase the FPS a little bit more for those people who already obsessively think that the 200 FPS they see in FEAR just isn't good enough.

Well, I'm not complaining about the product itself; more, I'm complaining that the market is so slow to adopt new technologies -- for example, just take a look at how incredibly long it's taken MP3 players to get to where they are today when MP3s were invented some time back in the 90s (according to Wikipedia, 1991, so it's way over a decade old, though I can safely say that you would have needed hardware decoding to manage tolerable MP3 playback back in 1991). There are already better formats like Ogg Vorbis for lossy and FLAC for lossless, and plenty of companies dangling the carrot of "possible future formats" over the heads of their customers, who don't realize that they will never eat that carrot. We'll not see support for them until it's too late, and even then it won't be official (I think companies are scared to death of open source). Back then, storage was a big issue and it was back then that MP3 was truly needed. Today, we need better things like FLAC (dear god I'm so sick of those "swishy" sounding MP3s!). Similarly, PhysX may well be the next thing like 3D acceleration, but the question is, how long will it take before it's TRULY adopted by anyone but the early adopters and enthusiasts?
 
gsboriqua said:
You're right, and people also forget that this stuff gets CHEAPER to manufacture. I just stopped arguing with people about the price of video cards. I think they are way overpriced. Back on topic, the Ageia chip would take off if it's priced at $125-150; at $250-300, I think it will be harder for folks to swallow.

The cost of manufacturing the parts is the least of a graphics card's worries. Do you think it is EASY to design a part with over 300,000,000 transistors? The cost of DESIGN is going up EXPONENTIALLY. Also, while the parts may be cheaper to manufacture in raw materials, you also have to factor in tooling costs. In fact, parts are only cheaper on smaller processes if they are produced in sufficient quantities.
 
I for the life of me cannot understand how anyone could think that having the kind of physics this thing will bring to games is a bad thing. They are saying Hollywood-type effects will now be possible. This is like renting a movie that is never the same every time you watch it. Hell, if only 1 game came out that used this thing to its full potential, I'd go buy one.

Also, people are forgetting that when graphics cards first came out, 3D games were not the norm, and it took a while for them to develop. Now even RTS games are using nothing but 3D; everyone is required to have one now, and that is expected. This will not take nearly as long to catch on, as the gaming market is gigantic; there are so many people playing games now compared to when 3D graphics were still a dream. In a game where everything is interactive, you could never play the same game twice; it would ALWAYS be different. Same scenery, yes, but replayability will be much, much higher. Also, let's not forget the booby factor here, people, I mean COME ON!
 
One of the things that has always annoyed me in games is the pre-animated character model boxes. You see a guy coming towards you and, you know... it's just an animated box moving around, sliding, etc. Looks like crap and totally breaks the immersion.

The 1st-gen PPU is supposed to bring a revolution to that (I hope) by blending physically accurate character interactions (with scenery) and pre-animated behaviours a la Endorphin by Natural Motion. Check the demos.

The first generation of PhysX also brings fluid, fog, mist, smoke, etc. simulation to a whole new level. Water that reacts naturally, smoke and fog that move when people run through them or shoot a rocket through them, for example.

And there's also the huge increase in the number of rigid bodies that can be included in a scene, allowing totally believable-looking destruction of houses etc. on the battlefield. Deformable terrain, tank wrecks, large holes in walls, etc. etc... just use your imagination :) Raising the immersion level is the name of the game here.

- - - - -

Here's my little dedicated PhysX links & info page: http://personal.inet.fi/atk/kjh2348fs/ageia_physx.html
 
There are a lot of anti-technology crybabies on this enthusiasts' board. Someone change their diapers and then direct them to a more suitable website.

It is new technology; if you cannot handle the fact that new shit is coming out, stay on console websites, or better yet stay out of all internet forums. Especially the tech sites.

Edit: If you cannot afford it, then quit sitting on your ass bitching about the price and spend that time getting a better fucking job.
 
Seizure Explosion said:
Here's why this will never work: I won't buy it. Some of you may buy it, but then I'll just sit back and laugh at you. Introducing a new technology that is dedicated to one and only one specific purpose in a handful of specific programs, for more than twice what the programs themselves cost, is unreasonable. You want something to give you physics? Get a high-end graphics card and a decent processor.

Indeed, what is this 3DFX company thinking, making a 3D-only card? I mean, computers only do 2D graphics, for Christ's sake. No one would ever pay $250 for a card that only does one thing. And this idea that in the future it might get integrated into the 2D video card or motherboard is a pipe dream.

Oh wait, it's not 1996 anymore, and the Voodoo1 was a runaway success that did totally change gaming. Yes, 3DFX did eventually get eaten by Nvidia, but the PhysX is the exact same thing. Don't believe me? Download their demo software and run the "big bang" or just play around with the other demos. Watch your poor little CPU (even if it is a dual dual-core Opteron) fall to its knees and cry for mercy. Now imagine that at full speed with any CPU if you have the PhysX card. (BTW, many companies have signed up for their physics engine; it will be used.)

==>Lazn
 
DocFaustus said:
There are a lot of anti-technology crybabies on this enthusiasts' board. Someone change their diapers and then direct them to a more suitable website.

It is new technology; if you cannot handle the fact that new shit is coming out, stay on console websites, or better yet stay out of all internet forums. Especially the tech sites.

Edit: If you cannot afford it, then quit sitting on your ass bitching about the price and spend that time getting a better fucking job.
QFT. Straight and to the point. I like it.
 
Erasmus354 said:
The cost of manufacturing the parts is the least of a graphics card's worries. Do you think it is EASY to design a part with over 300,000,000 transistors? The cost of DESIGN is going up EXPONENTIALLY. Also, while the parts may be cheaper to manufacture in raw materials, you also have to factor in tooling costs. In fact, parts are only cheaper on smaller processes if they are produced in sufficient quantities.


R&D generally doesn't go up exponentially. I'm not going to argue with anyone about it because people really don't understand; they will continue to justify the prices no matter what. Look at the 7800 GTX and how the prices dropped so much so fast. You think those prices dropped because Nvidia was taking a hit on the cards just to outdo ATI? I'm an engineer at a semiconductor fab; it would boggle your mind how much 1 lot (25 wafers) is worth. We had record profits just at our fab alone (not even counting the others in the company). Tooling... lol, tools are paid for within the first two years of the plant running production. Hell, some companies give us tools in hopes of getting a good support contract, or in hopes of us buying their tools for our next fab. Smaller parts = more profit. The wafer goes through the same process as before, except now they are getting about 50%-75% more chips out of it. This is the reason why prices are supposed to go down, but why should they if people continue to believe that Nvidia HAS to charge more because, well, it takes so much R&D to produce such a chip.
 
I don't know if anyone has said, since it's 6 pages of posts lol... I'm just lazy, but does anyone know if the card will fit in the PCIe x1 slots that are between the x16 slots on an SLI board, say the Asus A8N-SLI Dlx, while it has 2 cards in SLI?
 
DangerIsGo said:
I don't know if anyone has said, since it's 6 pages of posts lol... I'm just lazy, but does anyone know if the card will fit in the PCIe x1 slots that are between the x16 slots on an SLI board, say the Asus A8N-SLI Dlx, while it has 2 cards in SLI?

Considering that they will be PCI and not PCIe to start, my guess would have to be no.

==>Lazn
 
My thing is the game actually has to be written to incorporate their PhysX engine, so you will get NO boost in performance in games out there that use another physics engine, or no dedicated physics engine at all. Kind of a narrow scope...

Is it worth it to pick one up if only two or three of the big games of the year actually support it? To some, I guess...

If all you do is surf the [H] and play WoW, and WoW ends up being one of those games that integrates support, then it might be worth it to you. Then again, getting a life might be worth it to you too.
 
Yes, a very narrow scope... For now. Hopefully not forever.
And before everyone starts throwing the 3DFX argument around, keep in mind 3D was an obvious and natural progression for gaming, especially given the rising popularity of the FPS game, so obviously 3D accelerators took off.
CPUs aren't really having a hard time with physics yet... In fact, the problem with game physics right now isn't the lack of a dedicated processor, it's the relatively new engines.
Speaking of 3DFX, this argument has a hint of Glide vs. Direct3D/OpenGL in it. Maybe history won't repeat itself this time.
 
-freon- said:
Yes, a very narrow scope... For now. Hopefully not forever.
And before everyone starts throwing the 3DFX argument around, keep in mind 3D was an obvious and natural progression for gaming, especially given the rising popularity of the FPS game, so obviously 3D accelerators took off.
CPUs aren't really having a hard time with physics yet... In fact, the problem with game physics right now isn't the lack of a dedicated processor, it's the relatively new engines.
Speaking of 3DFX, this argument has a hint of Glide vs. Direct3D/OpenGL in it. Maybe history won't repeat itself this time.

That argument is flawed. Saying that processors don't struggle with physics is more like saying that before 3D accelerators, 2D wasn't a problem.

The whole benefit of these cards is not making current physics faster - it's being able to use massively more objects. And to those who say this isn't as natural a step as 3D, I say it is. We have 3D now, and it's getting to the point of being very, very close to realistic quality. However, without being able to handle more objects on the screen at a time, our graphics will always be limited. And to those who say that in the future processors will be able to cope with more objects: I doubt the processors in 5 years will be able to track as many objects as the current PhysX. Why wait 5+ years for something that could be available within a year? (Quick illustration of the scaling problem below.)
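
Just to put a rough number on why object count is the killer (my own toy example, nothing to do with Ageia's actual implementation): a naive collision broad phase has to consider every pair of objects, so the work per frame grows roughly with the square of the object count. Doubling the objects quadruples the checks, which is exactly where a general-purpose CPU runs out of breath.

```cpp
// Toy illustration only: counts naive pairwise collision checks per frame.
#include <cstdio>
#include <initializer_list>

long long pairChecks(long long objects) {
    return objects * (objects - 1) / 2;  // number of unordered object pairs to test
}

int main() {
    for (long long n : {100LL, 1000LL, 10000LL, 32000LL})
        std::printf("%6lld objects -> %12lld pair checks per frame\n",
                    n, pairChecks(n));
    return 0;
}
```

Real engines prune most of those pairs with spatial partitioning, but the trend is the same: piling on rigid bodies gets expensive fast, and that's the workload a dedicated chip is built for.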

Astaroth
 
-freon- said:
Yes, a very narrow scope... For now. Hopefully not forever.
And before everyone starts throwing the 3DFX argument around, keep in mind 3D was an obvious and natural progression for gaming, especially given the rising popularity of the FPS game, so obviously 3D accelerators took off.
CPUs aren't really having a hard time with physics yet... In fact, the problem with game physics right now isn't the lack of a dedicated processor, it's the relatively new engines.
Speaking of 3DFX, this argument has a hint of Glide vs. Direct3D/OpenGL in it. Maybe history won't repeat itself this time.

CPUs are having a hard time with physics; that is the problem. If you are in a game and you blow up 12 enemies and they fly around with ragdoll physics, your game would slow to a crawl because the CPU can't handle it. More complex physics aren't put into games because CPUs can't handle them. The physics engines available are actually very robust and have been for quite some time now.

Don't believe me? The physics engine that is being used for the PhysX chip has already been around for quite some time and has been used in games already.
 
J-Mag said:
Personally I am just looking forward to another device whose performance we can improve by overclocking!

heh... yup! add one more water-block to the loop! :)
 
Entropy1982 said:

Where in that statement is there any jealousy?

I definitely wouldn't want you working on my teeth if you're seeing things that aren't there on a computer screen.
 
revenant said:
heh... yup! add one more water-block to the loop! :)

FUCK YEAH!

I am always excited when my machine draws MORE POWER, outputs MORE HEAT, or KILLS CATS. Whatever happens, you know it's good.
 
hardwarephreak said:
My thing is the game actually has to be written to incorporate their PhysX engine, so you will get NO boost in performance in games out there that use another physics engine, or no dedicated physics engine at all. Kind of a narrow scope...

If DirectX begins supporting Physics like MS said, a lot of the negative arguments here would be toned down.
 