Questions for Ageia

I will be going to a LAN party in early July in St. Louis where there will be reps from Ageia. (Last time, Ageia's CEO showed up.) So if there are any serious questions you would like me to try to get answers for, please post them here or PM them to me. I will try my best to get them answered, but I can't guarantee that they will be.
 
Here's a fun one.

Do you think PPUs will go the way of CPUs and GPUs in terms of parallel processing, with either multiple cores (a la dual-core and multi-core CPUs) or multiple PPUs working in tandem like SLI or CrossFire?
 
Will the announced die-shrink for upcoming PhysX cards result in:
a) faster clock speeds (for greater performance)
b) cooler chips (passively cooled PhysX cards)
c) a mixture of both
d) neither
 
I'd be happy for a:

"When is the estimated timeframe that PCI-E PhysX cards will be released, if ever?"
 
The only way to get a PhysX card with PCI-E right now is from some system builders. I don't know which ones, or how to tell if it is actually PCI-E or not. I already planned on asking that one ;)
 
I'd be happy for a:

"When is the estimated timeframe that PCI-E PhysX cards will be released, if ever?"

I second this.

This is my main question as well. I would also like to see a die shrink, higher clocks and passive cooling, but that isn't as important to me as going PCIe instead of PCI.
 
This is my main question as well. I would also like to see a die shrink, higher clocks and passive cooling, but that isn't as important to me as going PCIe instead of PCI.

If they remake it at this point, I think some sort of die shrink is inevitable. Isn't the die fairly large? I can't remember the exact size, but it's fairly big by today's standards.

Heck, maybe even ask if they are doing well financially. :p We don't hear much news from Ageia.
 
If they remake it at this point, I think some sort of die shrink is inevitable. Isn't the die fairly large? I can't remember the exact size, but it's fairly big by today's standards.

Heck, maybe even ask if they are doing well financially. :p We don't hear much news from Ageia.

I'm not sure what is going on with them. The lack of news doesn't sound promising. I think consumers and developers alike are waiting to find out which method of physics processing will become the standard before committing to anything right now, much like consumers waiting to find out whether Blu-ray or HD DVD will win the format war.
 
I'm not sure what is going on with them. The lack of news doesn't sound promising. I think consumers and developers alike are waiting to find out which method of physics processing will become the standard before committing to anything right now, much like consumers waiting to find out whether Blu-ray or HD DVD will win the format war.

It is concerning... I want PhysX to work, but there's no compelling reason for me to buy a card yet. Perhaps with UT2K7 and other major game releases, but it's just of very little use to me currently. For a while there was a lot of news about the PPU, but now it seems pretty quiet, which is... odd.

Here's a fun one.

Do you think PPUs will go the way of CPUs and GPUs in terms of parallel processing, with either multiple cores (a la dual-core and multi-core CPUs) or multiple PPUs working in tandem like SLI or CrossFire?

They already have many cores internally for parallel processing, according to early information from around the time of the PPU's introduction.
 
It is concerning... I want PhysX to work, but there's no compelling reason for me to buy a card yet. Perhaps with UT2K7 and other major game releases, but it's just of very little use to me currently. For a while there was a lot of news about the PPU, but now it seems pretty quiet, which is... odd.



They already have many cores internally for parallel processing, according to early information from around the time of the PPU's introduction.

I am not sure I want Ageia's implementation of physics processing to become the standard. I already have four expansion cards in my machine. Because two of them take up two slots each and I have one PCI card, the only slot left open to me is a PCIe x1 slot, and that's only if I use my eVGA board. On my Striker I don't even have that, because the onboard audio/proprietary slot replaces the PCIe slot at the top of the expansion slot area. So for me it would have to be PCIe or nothing. I don't see myself changing motherboards anytime soon, and even if I did, I don't know how the situation would improve to the point where I'd have an open PCI slot. Even then, I am not interested in supporting legacy PCI any more than I have to.

Plus, if the physics processing power of our GPUs can closely match or best that of the Ageia card, I'd rather do that. Additionally, I wouldn't mind it if physics processing could simply be done on all these extra cores we keep getting every year or so. Last year dual core was the rage, now quad, soon octo. Right now at least two of my cores don't usually do anything, and I'd like to use them for something that would improve my gaming experience if possible.
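
Purely to illustrate that last point, here is a minimal sketch of the kind of thing "run physics on the spare cores" could mean: a naive particle integration step split across extra CPU threads. This is not Ageia's, Havok's, or any engine's actual code, and all names in it (Particle, stepChunk, stepParticles) are made up for the example.

```cpp
// Illustrative sketch only: split a simple particle integration step across
// spare CPU cores. Names and structure are invented for this example.
#include <cstddef>
#include <functional>
#include <thread>
#include <vector>

struct Particle {
    float px, py, pz;   // position
    float vx, vy, vz;   // velocity
};

// Integrate one contiguous chunk of particles forward by dt seconds.
static void stepChunk(std::vector<Particle>& p, std::size_t begin,
                      std::size_t end, float dt) {
    const float g = -9.81f;                       // gravity along y
    for (std::size_t i = begin; i < end; ++i) {
        p[i].vy += g * dt;
        p[i].px += p[i].vx * dt;
        p[i].py += p[i].vy * dt;
        p[i].pz += p[i].vz * dt;
    }
}

// Divide the particle array across however many cores are sitting idle.
void stepParticles(std::vector<Particle>& particles, float dt, unsigned spareCores) {
    if (spareCores < 2 || particles.size() < spareCores) {
        stepChunk(particles, 0, particles.size(), dt);
        return;
    }
    std::vector<std::thread> workers;
    const std::size_t chunk = particles.size() / spareCores;
    for (unsigned t = 0; t < spareCores; ++t) {
        const std::size_t begin = t * chunk;
        const std::size_t end = (t + 1 == spareCores) ? particles.size() : begin + chunk;
        workers.emplace_back(stepChunk, std::ref(particles), begin, end, dt);
    }
    for (auto& w : workers) w.join();
}
```

A real engine obviously does far more than this (collision, constraints, syncing with the renderer), but the data-parallel part of a physics step is exactly the kind of work that maps naturally onto otherwise idle cores.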
 
There have to be more supported games. It's pointless to get a card now unless you're playing GRAW or some demos.
I keep looking at them in the store, and even now that they've dropped 30% in price locally... I still can't bring myself to buy something that would never get used.
 
I'd be happy for a:

"When is the estimated timeframe that PCI-E PhysX cards will be released, if ever?"

Yes, like everyone else waiting to buy one, I'm only going to buy one when the PCI-E version is available.

Also possibly a low-profile version, not that it really bothers me :)
 
I am not sure I want Ageia's implementation of physics processing to become the standard. I already have four expansion cards in my machine. Because two of them take up two slots each and I have one PCI card, the only slot left open to me is a PCIe x1 slot, and that's only if I use my eVGA board. On my Striker I don't even have that, because the onboard audio/proprietary slot replaces the PCIe slot at the top of the expansion slot area. So for me it would have to be PCIe or nothing. I don't see myself changing motherboards anytime soon, and even if I did, I don't know how the situation would improve to the point where I'd have an open PCI slot. Even then, I am not interested in supporting legacy PCI any more than I have to.

Plus, if the physics processing power of our GPUs can closely match or best that of the Ageia card, I'd rather do that. Additionally, I wouldn't mind it if physics processing could simply be done on all these extra cores we keep getting every year or so. Last year dual core was the rage, now quad, soon octo. Right now at least two of my cores don't usually do anything, and I'd like to use them for something that would improve my gaming experience if possible.

You know what, you're right. What I *really* want is for PHYSICS to be implemented in games, not necessarily as yet another add-on board.
 
Plus, if the physics processing power of our GPUs can closely match or best that of the Ageia card, I'd rather do that.

Have you read the details of Havok FX :) ? Non-gameplay physics at best, plus their tech demos are crap. PLUS their prices are much higher, PLUS where are the GPU physics titles? Sorry, I just hate GPU physics :-D



WHY would you rather pay more money for less? Are you an ATI fan? (OOOOHHH, that's a sick burn.)
 
Have you read the details of Havok FX :) ? Non-gameplay physics at best, plus their tech demos are crap. PLUS their prices are much higher, PLUS where are the GPU physics titles? Sorry, I just hate GPU physics :-D



WHY would you rather pay more money for less? Are you an ATI fan? (OOOOHHH, that's a sick burn.)

I don't dislike ATI. Besides, we don't know whether GPU physics is worth a damn or not; no one has done anything with it yet. All I care about is the gameplay. I am not interested in who provides what, and I certainly don't care about tech demos. I'd rather not have to screw around with shoving an extra card into my system, which I don't have room for.
 
Well, the masses already favor the GPU, even though that solution hasn't proven anything yet beyond the flashy tech demos nVidia and ATI put together to showcase it, which is effects physics. Very few known titles are showing anything, and when do they even come out, 2008?

Using physics extensively for gameplay has consequences, which is why Havok FX focuses on effects physics. Not that gameplay physics is impossible, it is possible, and the focus on effects is not a bad choice at all. If devs don't jump on the gameplay-physics wagon, GRAW won't be the last to go that route, and a lot of Havok FX games will too, at least until the average gamer's mainstream ISP connection has the kind of bandwidth (ADSL2 or ADSL3) needed for heavy gameplay physics online.

If they all go the effects-physics way like GRAW, it will all get GRAW-ish: the value of hardware physics becomes cosmetic, more of an eye-candy thing. But done right, or at a larger scale, it's also welcome.

For me that's welcome too, but I'd prefer something bold, "where no dev has gone before": extensive gameplay physics like Warmonger and CFR. But of course done and used right, and you'd have to drop online play for that; it doesn't come for free.

As for "I need no extra add-on card": that's a minority who already use every slot. That doesn't make a PCI PPU a bad solution; it just means some people can't get away with it because of their personal situation. For comparable GPU physics you would need a high-end GPU where a midrange one would otherwise do, just to have spare power for hardware-accelerated physics, while everyone else would need an add-on GPU anyway.

I have three PC systems, and all of them can take a PCI PPU card.

SLI and CrossFire users are also a minority, and they sacrifice a lot of slots for SLI/CF, which is their choice. A PPU add-on works for the masses, but not for everyone.

The PhysX SDK is free. For me PhysX wins head-on because I can't afford the Havok + Havok FX SDKs. Dev teams on a budget would go for PhysX, and some will take the effort to support the PPU. Bigger-budget projects can easily go for Havok + Havok FX, and some will use the FX part too.

What I would ask:
What specs will the next-gen PPU2 have, when will it arrive, and how much will it cost?
On specs: will the 130nm -> 80nm move be a 1:1 die shrink, or will it bring some advancement?
For example, being able to use more than 128MB of dedicated physics memory on the PCB, like 512MB, which affordable competing GPU cards can already have.
Or even a scaled-up design with more units.
 
I like hardware-accelerated anything, so I am all for hardware-accelerated physics.

However, I do wonder about Ageia's ability to produce competitive cards in an age of DirectX 10 unified-shader video cards...

Yes, a dedicated PhysX card may be far more efficient than an 8600GT, but it is manufactured on 0.13u whereas an 8600GT is manufactured on 0.08u.
One can therefore presume that an 8600GT chip will clock a lot higher than a PhysX chip (making up for the reduced efficiency), and that it will be a lot cheaper to produce (£80 for an 8600GT vs £120 for a PhysX card).

The fact that millions of 8600 cards are made means it is also much cheaper to outfit that card with more memory (8600GT = 256MB vs PhysX = 128MB), and cheaper to produce it on a PCIe bus.

The argument that you would be daft to reduce the performance of your 8800GTX (SLI) setup to run a few physics effects is moot, because what we are really talking about is adding a cheap mid-range 8600 card as a second (or third) card to your rig, just as you would do with a PhysX card.

I would prefer the dedicated PhysX card to survive and become the standard, but it is up against some tough market-force-induced performance penalties.
 
Ask them if I can hook up a scanner to it, so I can scan barcodes and go to the website of the barcode, just like the CueCat that it replaced. :D

Actually, tell them that the killer GAME or APP they are looking for is AVP3, and that they need to push whatever buttons and wave whatever magic wand to make it happen.
I can only imagine the brand identification AVP3 would do for Ageia's PPU and, more importantly, what that chip could do for the game. Aliens vs. Predator 3...
 
I like hardware-accelerated anything, so I am all for hardware-accelerated physics.

However, I do wonder about Ageia's ability to produce competitive cards in an age of DirectX 10 unified-shader video cards...

Yes, a dedicated PhysX card may be far more efficient than an 8600GT, but it is manufactured on 0.13u whereas an 8600GT is manufactured on 0.08u.
One can therefore presume that an 8600GT chip will clock a lot higher than a PhysX chip (making up for the reduced efficiency), and that it will be a lot cheaper to produce (£80 for an 8600GT vs £120 for a PhysX card).

The fact that millions of 8600 cards are made means it is also much cheaper to outfit that card with more memory (8600GT = 256MB vs PhysX = 128MB), and cheaper to produce it on a PCIe bus.

The argument that you would be daft to reduce the performance of your 8800GTX (SLI) setup to run a few physics effects is moot, because what we are really talking about is adding a cheap mid-range 8600 card as a second (or third) card to your rig, just as you would do with a PhysX card.

I would prefer the dedicated PhysX card to survive and become the standard, but it is up against some tough market-force-induced performance penalties.

I am for hardware-accelerated anything as well, but at this point I don't really want to try to shove another card into my system. Right now that would be very difficult, and if I did, at this point it would have to be PCIe x1.
 
I won't get a PCI PhysX card either.

It must be PCIe; whether it is x1 or x4 I care not, as long as the decision is made for performance and not market-share reasons.

The only PCI device I have time for is an X-Fi, and only because the low latency of PCI is actually much better for audio devices than PCIe. And I won't even buy one of those until I get a Linux driver.
 
I won't get a PCI PhysX card either.

It must be PCIe; whether it is x1 or x4 I care not, as long as the decision is made for performance and not market-share reasons.

The only PCI device I have time for is an X-Fi, and only because the low latency of PCI is actually much better for audio devices than PCIe. And I won't even buy one of those until I get a Linux driver.

Well, currently for me it would have to be PCIe x1. In my opinion it would be a bad move to make it a PCIe x4 card: few motherboards have dedicated PCIe x4 slots, and few have three x16 slots either. In my case, I'm already using a PCIe x4 card in my third PCIe x16 slot, so all I have left is a single PCIe x1 slot.
 
I won't get a PCI PhysX card either.

It must be PCIe; whether it is x1 or x4 I care not, as long as the decision is made for performance and not market-share reasons.

The only PCI device I have time for is an X-Fi, and only because the low latency of PCI is actually much better for audio devices than PCIe. And I won't even buy one of those until I get a Linux driver.
According to Ageia, the PhysX cards currently don't even use all of the PCI bandwidth (rough bus numbers below), which is why the current manufacturers haven't gone to PCI-E; the only reason they would switch is for market-share reasons.
From Ageia's FAQ:
At this point, every AGEIA PhysX Accelerator is configured as a PCI 2.1 add-in board with 128MB GDDR3. The PCI interface is more than adequate for handling complex physics calculations, so changing the design to PCI Express would be simply to address slot availability on the motherboard, not to enhance raw performance.
AGEIA has no plans at this time to create a retail PCI Express product or to introduce additional memory configurations. Some system integrators do offer a PCI Express 1x version of the PhysX Accelerator as a part of a new system configuration.
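
For context on "more than adequate", here is a back-of-envelope comparison of the theoretical peak bandwidth of the two buses people keep mentioning. These are the standard bus figures, not measurements of what the PhysX card actually pushes, which Ageia hasn't published:

```cpp
// Back-of-envelope theoretical peak bandwidth only; not measured PPU traffic.
#include <cstdio>

int main() {
    // Classic 32-bit / 33 MHz PCI: 33.33 MHz * 4 bytes per transfer,
    // and the bus is shared by every PCI device in the machine.
    const double pci_mb_s = 33.33e6 * 4.0 / 1e6;                  // ~133 MB/s, shared
    // PCI Express 1.x, one lane: 2.5 Gbit/s with 8b/10b encoding,
    // per direction, and not shared with other devices.
    const double pcie_x1_mb_s = 2.5e9 * (8.0 / 10.0) / 8.0 / 1e6; // ~250 MB/s each way
    std::printf("PCI 32-bit/33MHz: ~%.0f MB/s (shared)\n", pci_mb_s);
    std::printf("PCIe x1:          ~%.0f MB/s per direction (dedicated)\n", pcie_x1_mb_s);
    return 0;
}
```

So a PCIe x1 slot would roughly double the available bandwidth and stop the card from sharing the bus with other PCI devices, but if the card really isn't saturating PCI today, the FAQ's point stands: the main argument for PCIe is slot availability, not speed.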
 
Well, they need to start doing PCIe x1 cards. The fact is, most dual-video-card machines completely block out almost all of the PCI slots.
 
Well, they need to start doing PCIe x1 cards. The fact is, most dual-video-card machines completely block out almost all of the PCI slots.

Amen. That's stupid. I've got two PCIe x1 slots and zero PCI slots. Even if I really wanted a card, I couldn't use it.
 
Well, they need to start doing PCIe x1 cards. The fact is, most dual-video-card machines completely block out almost all of the PCI slots.



Aye. The irony is that the people who would be the early adopters of this technology are the ones running SLI or a lot of other components, and they are also probably the ones with the most PCIe slots on their boards...

Ageia needs to get on the ball and release a PCIe card to the public.
 
Aye. The irony is that the people who would be the early adopters of this technology are the ones running SLI or a lot of other components, and they are also probably the ones with the most PCIe slots on their boards...

Ageia needs to get on the ball and release a PCIe card to the public.

Yep.
 
Aye. The irony is that the people who would be the early adopters of this technology are the ones running SLI or a lot of other components, and they are also probably the ones with the most PCIe slots on their boards...

Ageia needs to get on the ball and release a PCIe card to the public.

This is the situation I've been in for a while. My current PC sports 8800GTS SLI on an Asus Striker, so I currently have one PCIe x1 slot and one PCI slot left.

I've wanted to purchase a PPU for a while; however, I'm not willing to sacrifice my PCI slot for it. I'll add a sound card first.

For this reason alone I'm prone to just drop an 8600 into the third PCIe slot and be done with it, and I'll probably make the splurge next month.

If Ageia had a PCIe product that would change, but they don't.
 
Are you aware that your product has failed to bring anything that current GPUs and CPUs can't already do?
 
It's not that it can't, it's that it doesn't due to lack of software support.

With all due respect, you cannot know that for a fact... How do you KNOW that it's software-related and not just that the hardware is a turd? Or quite likely both? In the end it doesn't matter anyway; if the product can't deliver, then it can't deliver, for whatever reason.
 
With all due respect, you cannot know that for a fact... How do you KNOW that it's software-related and not just that the hardware is a turd? Or quite likely both? In the end it doesn't matter anyway; if the product can't deliver, then it can't deliver, for whatever reason.

The hardware and the SDK/API are both in place, mature, and delivering what they promise. The problem is that the software vendors aren't using it. I use the SDK on a regular basis and I've got nothing but love for it. As others have pointed out, the PPU is essentially a poor man's vector processor, so what's not to love about that? As to how to get software vendors to use it, I don't have an answer for you. But it's not that the hardware "sucks" -- it most certainly doesn't. However, there's currently no definitive, quantitative, independent way to prove, beyond a shadow of a doubt, that the PPU is a good add-on.
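
For anyone curious what "using the SDK" looks like, here is a minimal sketch, written from memory, of scene setup with the Ageia-era PhysX 2.x SDK. Treat the exact names as assumptions to verify against the SDK headers and docs (in particular the NxSceneDesc simType flag for requesting a hardware/PPU scene) rather than as a definitive reference.

```cpp
// Rough PhysX 2.x-style scene setup, from memory; verify names/signatures
// against the actual SDK headers before relying on any of this.
#include <NxPhysics.h>

void physicsSketch() {
    // Create the SDK object.
    NxPhysicsSDK* sdk = NxCreatePhysicsSDK(NX_PHYSICS_SDK_VERSION);
    if (!sdk) return;

    // Describe a scene. simType is (as I recall) how you request a hardware
    // (PPU) scene instead of the software fallback -- check the docs.
    NxSceneDesc sceneDesc;
    sceneDesc.gravity = NxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.simType = NX_SIMULATION_HW;
    NxScene* scene = sdk->createScene(sceneDesc);
    if (!scene) { NxReleasePhysicsSDK(sdk); return; }

    // One dynamic sphere actor.
    NxSphereShapeDesc sphereDesc;
    sphereDesc.radius = 0.5f;
    NxBodyDesc bodyDesc;
    NxActorDesc actorDesc;
    actorDesc.shapes.pushBack(&sphereDesc);
    actorDesc.body = &bodyDesc;
    actorDesc.density = 10.0f;
    scene->createActor(actorDesc);

    // Step the simulation one frame.
    scene->simulate(1.0f / 60.0f);
    scene->flushStream();
    scene->fetchResults(NX_RIGID_BODY_FINISHED, true);

    sdk->releaseScene(*scene);
    NxReleasePhysicsSDK(sdk);
}
```

As I understand it, the simulate/fetchResults split is the interesting design choice: the step runs asynchronously (on the PPU, or on a software thread), so the game can keep doing other work until it fetches the results.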
 
Well, the masses already favor the GPU, even though that solution hasn't proven anything yet beyond the flashy tech demos nVidia and ATI put together to showcase it, which is effects physics. Very few known titles are showing anything, and when do they even come out, 2008?

Using physics extensively for gameplay has consequences, which is why Havok FX focuses on effects physics. Not that gameplay physics is impossible, it is possible, and the focus on effects is not a bad choice at all. If devs don't jump on the gameplay-physics wagon, GRAW won't be the last to go that route, and a lot of Havok FX games will too, at least until the average gamer's mainstream ISP connection has the kind of bandwidth (ADSL2 or ADSL3) needed for heavy gameplay physics online.

If they all go the effects-physics way like GRAW, it will all get GRAW-ish: the value of hardware physics becomes cosmetic, more of an eye-candy thing. But done right, or at a larger scale, it's also welcome.

For me that's welcome too, but I'd prefer something bold, "where no dev has gone before": extensive gameplay physics like Warmonger and CFR. But of course done and used right, and you'd have to drop online play for that; it doesn't come for free.

As for "I need no extra add-on card": that's a minority who already use every slot. That doesn't make a PCI PPU a bad solution; it just means some people can't get away with it because of their personal situation. For comparable GPU physics you would need a high-end GPU where a midrange one would otherwise do, just to have spare power for hardware-accelerated physics, while everyone else would need an add-on GPU anyway.

I have three PC systems, and all of them can take a PCI PPU card.

SLI and CrossFire users are also a minority, and they sacrifice a lot of slots for SLI/CF, which is their choice. A PPU add-on works for the masses, but not for everyone.

The PhysX SDK is free. For me PhysX wins head-on because I can't afford the Havok + Havok FX SDKs. Dev teams on a budget would go for PhysX, and some will take the effort to support the PPU. Bigger-budget projects can easily go for Havok + Havok FX, and some will use the FX part too.

What I would ask:
What specs will the next-gen PPU2 have, when will it arrive, and how much will it cost?
On specs: will the 130nm -> 80nm move be a 1:1 die shrink, or will it bring some advancement?
For example, being able to use more than 128MB of dedicated physics memory on the PCB, like 512MB, which affordable competing GPU cards can already have.
Or even a scaled-up design with more units.

Uh... a GPU does more than effects physics, and Havok does more than effects physics as well; this was announced a long time ago. Also, anyone who owns an 8800-series video card has a physics card. Now, just like PhysX, we just need more games. Sadly, the games that are using the PhysX SDK are using it for software physics and only using the add-on card to offload what the software would do anyway, so your experience will be exactly the same, just with 1-2 extra fps. Only Unreal Tournament 2007 has announced it will have added content when used with the add-on card. Mind you, it won't be online play that is affected, because that would be impossible with the small amount of bandwidth cable/DSL provides, and lag would also be a terrible problem.
 
If you read up on Havok, only Havok FX runs on the GPU. Any interactive physics through the normal Havok SDK is still handled on the CPU.
 
The Ageia PPU is a failed invention. Current and future GPUs and CPUs can do physics calculations just fine. There is no need to buy an extra card for that.

Maybe a PPU could help, but they've had a long time to work with it and it's still a paperweight.
 