Ageia card in a newer 7970 CrossFire setup

fuzchunk

I picked up one of these: http://www.amazon.com/Ageia-PhysX-1...F8&qid=1366657568&sr=1-1&keywords=ageia+physx

I have no open x16 slot for an Nvidia card, as only a PCIe x1 slot is free in my CrossFire 7970 Eyefinity setup with an ASRock Extreme4 and a 3570K. I installed the old Ageia drivers, rolled back my PhysX system software to an AMD-compatible version, and I can run the Ageia demos fine. But as soon as I launch a PhysX-enabled game on Steam, it auto-updates the PhysX driver and thus disables the Ageia card. Anyone have a workaround? Or perhaps a 3570K @ 4.2 GHz is superior to this older Ageia PPU and it's not worth the effort. :confused:
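Not a fix, but it helps to confirm exactly when Steam has clobbered the rollback. Here's a minimal sketch (Python 3 on Windows, reading the standard Uninstall registry keys; the exact PhysX package names are whatever Ageia/Nvidia registered on your system) that lists the PhysX software currently installed and its version, so you can check it before and after a game's first-time setup runs:

[code]
# Minimal sketch: list installed PhysX packages and versions from the
# standard Windows Uninstall registry keys (64-bit and 32-bit views).
import winreg

UNINSTALL_PATHS = [
    r"SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall",
    r"SOFTWARE\Wow6432Node\Microsoft\Windows\CurrentVersion\Uninstall",
]

def list_physx_installs():
    found = []
    for path in UNINSTALL_PATHS:
        try:
            root = winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, path)
        except OSError:
            continue  # key not present (e.g. no Wow6432Node on 32-bit Windows)
        for i in range(winreg.QueryInfoKey(root)[0]):
            try:
                sub = winreg.OpenKey(root, winreg.EnumKey(root, i))
                name, _ = winreg.QueryValueEx(sub, "DisplayName")
            except OSError:
                continue  # subkey without a DisplayName; skip it
            if "physx" in name.lower():
                try:
                    version, _ = winreg.QueryValueEx(sub, "DisplayVersion")
                except OSError:
                    version = "unknown"
                found.append((name, version))
    return found

if __name__ == "__main__":
    for name, version in list_physx_installs():
        print(f"{name}: {version}")
[/code]

If the version jumps after launching a game, you know it was the game's install script (not the game itself) that pulled in the newer PhysX System Software.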
 
I doubt a workaround even exists. Nvidia ported the Ageia libraries to run on their own cards, so while your Ageia PPU can process the old Ageia demos, it won't be able to run Nvidia's PhysX code.
 
I don't know how to stop Steam from overwriting your PhysX install every time you play a game... but this thread may help you get the card working. There's some good info in the following pages too.

http://www..com/graphic-cards/16223-nvidia-disables-physx-when-ati-card-is-present-78.html#post86252

It works great for Mirror's Edge... which is the only game I have that uses hardware-accelerated PhysX. I like the game so much that I actually threw down $20 for an Ageia card a couple of years back, just to get the full experience. I know it works with Arkham Asylum as well.
 
Seems this forum blocks ngohq.

http://www.ngo h q.com/graphic-cards/16223-nvidia-disables-physx-when-ati-card-is-present-78.html#post86252

Remove the spaces.
 
GPU-based physics is orders of magnitude more efficient and faster than an old Ageia card. I read a benchmark a few years ago that compared an Ageia card to an 8800 GTS (G92 core) running PhysX, and the 8800 GTS crushed the dedicated PPU; there wasn't any competition.

You can hack around with drivers and get the Ageia card to work, but it will be so slow with modern games that it isn't worth the effort.
 
Yeah, I'm pretty sure an Ageia card wouldn't be able to run Borderlands 2 even if you could get it working. But if you find one for cheap and only want to play Mirror's Edge and Arkham Asylum... then go for it.
 
If you have the space, you can put an x16 GPU into an x1 slot and it will work. I don't know if it impacts PhysX performance the same way it impacts the GPU's general performance (the x1 slot is a bottleneck), but it will work...

All you have to do is remove the small bit of plastic at the back of the x1 slot that stops an x16 card from fitting (or have a motherboard with that end already open; I saw a few a while back, but I don't know if any newer ones have it).

I've run a 7900 GT and a GTX 285 this way in the past, but as GPUs, not for PhysX. There was definitely a performance loss from using the x1 slot, but it works. It may not be enough of a hit to notice when the card is only doing PhysX.
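For a rough sense of the bandwidth hit, here's a back-of-the-envelope sketch (assuming the x1 slot runs at PCIe 2.0 off the Z77 chipset, which is typical for that board, with 8b/10b encoding for gen 1/2 and 128b/130b for gen 3):

[code]
# Back-of-the-envelope PCIe bandwidth per direction.
def pcie_bandwidth_gb_s(gen, lanes):
    # line rate in GT/s and encoding efficiency for each PCIe generation
    rates = {1: (2.5, 8 / 10), 2: (5.0, 8 / 10), 3: (8.0, 128 / 130)}
    gt_per_s, efficiency = rates[gen]
    # one transfer moves one bit per lane; divide by 8 for bytes
    return gt_per_s * efficiency * lanes / 8

print(f"PCIe 2.0 x1 : {pcie_bandwidth_gb_s(2, 1):.2f} GB/s")   # ~0.50 GB/s
print(f"PCIe 2.0 x16: {pcie_bandwidth_gb_s(2, 16):.2f} GB/s")  # ~8.00 GB/s
print(f"PCIe 3.0 x16: {pcie_bandwidth_gb_s(3, 16):.2f} GB/s")  # ~15.75 GB/s
[/code]

So a card in the x1 slot sees roughly 1/16th of its normal bandwidth. A dedicated PhysX card mostly shuffles relatively small physics state rather than streaming textures, which is presumably why the hit is less noticeable there than when using it as a display GPU.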
 
I also have a GTX 260 I may try, but at x1 it may do no better than a 64-bit GT 430. :D
 
The Ageia-branded cards are garbage by today's standards anyway. A $20 GT 410 beats the shit out of one at PhysX, and even that still isn't really THAT adequate for the job.
 
This card was the biggest waste of $299 I ever spent on a computer product.

Seriously. You had a few demos and Unreal 3 to use it with. By the time Mirror's Edge came out, there were already better options.

To be honest, though, at the time I thought physics cards were the wave of the future. They still could be, but I think most developers are looking to the CPU to do the heavy lifting.
 
So at PCIe x1, will a GT 430 hold me back? I see conflicting results. A lot of GTX 650 Ti cards have been on sale lately, but at x1 it may be a waste.
 