How much to expect from these two GPU2 boxes?

natermeister

Limp Gawd
Joined
Dec 26, 2005
Messages
463
Box 1:

AMD Opteron 165 @ 2.0GHz
ASUS A8N32-SLI
2GB DDR-500
XFX 9800GTX XXX 740MHz core / 2280MHz memory (not sure of the shader clock)
Windows Vista Home Premium 32-bit

Box 2:

AMD Opteron 170 @ 2.2GHz
DFI Ultra D
2GB DDR-400
XFX 9800GTX XXX
Windows Vista Home Premium 32-bit

The big question I have is how much of a hit I'm going to take by using such old CPUs. I've heard that GPU2 is fairly CPU independent, especially under Vista. Can anyone give me a good idea as to how well these setups will perform?

Thanks,

-Nate
 
You should be getting about 5500 ppd each.

 
The GPU2 client is only CPU-dependent with ATI cards; with Nvidia cards it doesn't matter much at all.
 
With my Athlon64 4000+ (2.4GHz) under Win XP, my 8800GT gets between 3,900 and 4,300 ppd depending on the WU.

 
I got about 5800 ppd out of my 9800GTX; you might get 6k if you abuse the shaders ;)

 
Isn't that what RMAs are for? *starts laughing at "if"*

 
What a pussy... "If" is an unknown word in the land of the [H]orde!

/me chokes while laughing on the floor...

 
Ok..... I can't take this.

There are three levels as far as I'm concerned:

OC - bumping up the clocks a bit
Abuse - even the highest factory OC doesn't come close (and you probably got the base model)
Torture - the card is only stable because of some [H]ardcore improvised cooling, and if you took the clocks just 5MHz higher you'd be unstable (and to hell with the temps, RMA imminent)

:p

 
So I should expect at least 5000 ppd per GPU and then maybe another 100-150 from a single console. Nice: 10,000 ppd, plus two quads (one Linux SMP, one Windows SMP), and then maybe my E6700. My production should finally start going up again instead of its downward slide over the last eight months.
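For anyone who wants to sanity-check that tally, here's the rough math as a quick Python sketch. Every number in it is an estimate quoted earlier in this thread, not a measurement:

```python
# Rough PPD tally for the two GPU2 boxes, using the estimates quoted
# above (not measurements).
gpu_ppd = 5000      # conservative per-9800GTX estimate
console_ppd = 125   # midpoint of the 100-150 ppd per-console estimate
boxes = 2

total = boxes * (gpu_ppd + console_ppd)
print(total)  # 10250 -> "10,000 ppd plus" before the quads and the E6700
```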
 
I think you have something mixed up a little bit. nVidia cards are a bit CPU-dependent (not as much as ATI currently) under XP, but not so much under Vista. I needed almost a full core under XP for my 8800GT to get 5k+ PPD with the shaders running 1800+ on a [email protected]. I'm not sure about other shader speeds since I didn't run the shaders slower than that for long. Under Vista64 I need about 4% of one core with everything else the same. I'm guessing the reason for the big difference is the newer driver model that Vista uses.
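If you want to check that kind of per-core usage yourself, here's a minimal sketch using Python's psutil (assumptions: psutil installed via pip, and the client executable name below is a hypothetical placeholder, so match it to your actual client):

```python
# Sample how much of one core the folding client is using.
# Process.cpu_percent() reports percent of a single core, so ~100 means
# one full core (the XP case above) and ~4 matches the Vista64 case.
import time
import psutil

TARGET = "fah_gpu.exe"  # hypothetical name -- use your client's exe name

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == TARGET:
        proc.cpu_percent(None)  # prime the counter; first call is meaningless
        time.sleep(5)           # sample over five seconds
        print(f"{TARGET}: {proc.cpu_percent(None):.1f}% of one core")
```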

 
Am I? I figured I'd set the affinity of the GPU client to core 0, then install a console client and set its affinity to core 1. From what I've heard this works quite well in Vista, but not so much in XP. I'll be running Vista (Home Premium 32-bit) since it's actually a bit cheaper than XP.
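For reference, here's one way to script that pinning with psutil instead of setting it by hand in Task Manager each boot. This is a sketch under assumptions: psutil installed, and both executable names are hypothetical placeholders for the actual client binaries:

```python
# Pin the GPU2 client to core 0 and the console client to core 1.
import psutil

AFFINITY = {
    "fah_gpu.exe": [0],      # hypothetical GPU2 client name -> core 0
    "fah_console.exe": [1],  # hypothetical console client name -> core 1
}

for proc in psutil.process_iter(["name"]):
    cores = AFFINITY.get(proc.info["name"])
    if cores:
        proc.cpu_affinity(cores)  # set this process's affinity mask
        print(f"Pinned {proc.info['name']} (pid {proc.pid}) to {cores}")
```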
 