8800GTS 320 performance in ASRock 939Dual-SATA2

markintosh13

Hey,

I have an ASRock 939Dual-SATA2 mobo. I have an X2 4200+ inbound (it was cheap cheap) and plan to OC it to 2.6-2.8GHz.

I was wondering if anyone has any experience with an 8800GTS 320MB in this type of config?

Benchmarks?
 
Why would it be slower?

You're on AMD; the board normally means nothing. I have the board and run an X1900 All-In-Wonder card. My numbers are normally above the test numbers for similar configurations. I have the same chip actually, a 4200+. Haven't OC'd it though. I might as well, right?
 
I've got an ASRock 775Dual-VSTA with a 6400 C2D. I recently bought a BFG 8800GTS 320MB OC. It works great, although I do get a few slowdowns in the odd game (some outdoor scenes in FEAR, for example; still have to try STALKER).

If I recall correctly, the PCI-Express slot on my board only works at 1x instead of 16x, although I'm still trying to find where I read this.

I can't remember any benchmark numbers, but suffice to say they are a stack better than my previous 6600GT. And I can turn most settings in games up to max, with AA and AF.
 
Mine works fine. I have the same setup as you and I get 7,850 in 3DMark06.
 
Pretty sure it's either 4x or 8x electrical on that 16x slot, metallibloke -- at 1x you'd be in pain.
 
Yeah, I was thinking along similar lines, silent_circuit. I can't find where I read what it actually runs at; the manual is no help!
 
I've got an ASRock 775Dual-VSTA with a 6400 C2D. I recently bought a BFG 8800GTS 320MB OC. It works great, although I do get a few slowdowns in the odd game (some outdoor scenes in FEAR, for example; still have to try STALKER).

If I recall correctly, the PCI-Express slot on my board only works at 1x instead of 16x, although I'm still trying to find where I read this.
It's 4x, not 1x. At 4x we are talking around a 5-7% loss of performance at worst.

The 939Dual-SATA2 doesn't have that limitation; it uses a different chipset (ULi), and the Intel version of the board uses a VIA chipset. The 939 has a full 16x slot, so any limitation would be from the processor not pushing enough polygons to the gfx card. I had a 939Dual-SATA2 + a dual-core Opteron 165 and it was as fast as my other Socket 939 friends' machines. When I upgraded to Core 2, the 8800GTX really started to fly, hehe.
 
Thanks Cyant. 4x sounds a lot more reasonable than 1x!! I don't know too much about PCI-Express, but I'm a little (pleasantly) surprised that the performance drop is only around 5%.
 
I don't know that the 5-7% is entirely correct, but I'm not going to belabor things. I'll just say that I've seen similar numbers quoted for dropping from a 16x to an 8x electrical slot, so there may be more of a hit (it would follow, after all) dropping from a 16x to a 4x electrical slot. Only way to find out is to test, though. ;)
 
Actually it can be more than that, but most of the time the limitation will be elsewhere, like the CPU. There will be a big hit in apps like Maya, which push a very heavy number of polygons, but in most games it would not matter too much.
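For a rough sense of the numbers (a back-of-envelope sketch, assuming these boards run PCIe 1.x at roughly 250 MB/s per lane, per direction; how much it hurts in practice depends on how much traffic actually crosses the bus):

# quick PCIe 1.x bandwidth comparison (Python)
per_lane_mb_s = 250  # PCIe 1.x: ~250 MB/s per lane, each direction
for lanes in (1, 4, 8, 16):
    print(f"x{lanes}: ~{per_lane_mb_s * lanes / 1000:.2f} GB/s per direction")
# prints roughly: x1 0.25, x4 1.00, x8 2.00, x16 4.00 GB/s

So a 4x slot still leaves about 1 GB/s each way, which is why the hit in most games stays small; at 1x the card really would be starved.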
 
Ok, cool.

That would be a significant improvement over my single-core / 7900GS 3DMark06 score of 5K. I think if I can get the CPU OC'd higher and the card OC'd a bit, I'd aim for 8K in 06.
 
Well, I picked up a second-hand BFG 8800GTS 320 OC (two months old - saved a fair bit of cash and taxes).

I couldn't wait for my X2 4200+ to arrive (Juice set me up with a guy who sold me one for $55 shipped), so I installed the 8800GTS yesterday.

With my 3700+ at 2.82GHz, and lots of case fans, I increased the core clock from 550MHz to 625MHz (multiple reviews of this card seemed to get to 650-655MHz) with ATITool and kept an eye on the temps.

I ran some benchmarks - I now know the meaning of the term "CPU bottlenecked". There was a noticeable improvement in image quality, and things are definitely smoother - no doubt. But the benchmark scores were slightly underwhelming.

Now, it's very true that my previous GPU - a 7900GS - was voltmodded and very overclocked (from its stock 450MHz core, I increased it to 681MHz), so perhaps this contributes to my feeling of being slightly underwhelmed.

My previous 7900GS benchmarks:
Aquamark3: 100,900 trimark
3DMark06: 5003
Counter Strike Source Stress Test (1280x1024, 4AA, 16AF, high settings): 185fps average

My new 8800GTS 320 (625MHz core, 3700+ @ 2.82GHz) benchmarks:
Aquamark3: 104,500 trimark
3DMark06: 7233
CSS Stress Test (as above): 205fps average
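Putting rough percentages on the two directly comparable runs (a quick sketch from just the numbers above, nothing scientific):

# rough % gain over the old 7900GS numbers (Python)
runs = {"Aquamark3": (100_900, 104_500), "CSS stress test": (185, 205)}
for test, (old, new) in runs.items():
    print(f"{test}: +{(new - old) / old * 100:.0f}%")
# Aquamark3: +4%, CSS stress test: +11%

Single digits to low teens from a much faster card - that's the CPU limit talking, not the GPU.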

I expect that when I get my dual-core X2, my Aquamark3 score may actually decrease (unless I can overclock it to 2.82GHz - AM3 is a single-threaded app), I expect the 3DMark06 score will probably go up 1K, and the CSS stress test will probably stay about the same. Aquamark is very CPU dependent - my previous scores with this CPU were a max of 83K with a 6600GT, 91K with a 7600GT, and the above 100.9K with a 7900GS.

Now, the 8800GTS is an improvement, don't get me wrong. Counter-Strike: Source looks better. I can turn the AA and AF settings up a lot higher. The image quality is better. The game plays noticeably smoother (the extra video RAM seems to help).

But there's no doubt that I should continue with my long-term plan to convert my AMD box to a TV personal video recorder / streaming media server and acquire a Core 2 Duo for gaming as soon as funds allow. I needed the 8800GTS anyway (because a C2D with a 7900GS wouldn't be living up to its potential either), but eventually I'll be returning the 7900GS to this box (or even going back to my AGP 6600GT) and relegating it to a dark corner.

All I need now is a cheap C2D mobo (the 650i Ultra is looking good), a cheap C2D, a cheap CPU cooler, a good case with 120mm fans ('cause I'm tired of loud 80mm fans), a good new PSU, a fast SATA2 hard drive, and a SATA optical drive. Bleh.
 
Uh, hm.

You are testing the limits of the CPU in your tests. You aren't testing how it works with modern games at high resolutions. The CPU means nothing if you max out the vid card with AA/AF and high (1920x1200) resolutions.

Single core = ghetto.
 
Uh, hm.

You are testing the limits of the CPU in your tests. You aren't testing how it works with modern games at high resolutions. The CPU means nothing if you max out the vid card with AA/AF and high (1920x1200) resolutions.

Single core = ghetto.

Well, thank you, Captain Obvious.

One works with what one has.

As I said, I have an X2 4200+ inbound, and I am still planning to go C2D when I can afford to.

If I could afford a 1920x1200 monitor, I'd probably also have had the cash to make the jump to C2D already.
 
Playing a game without lag, with the image looking nice, is a frame of reference.

No, it sux and has crap frames.

Yes, it rox and is decently smooth.

2 frames of reference down.

8800GTS/GTX series: best out for gaming.

There's a third.

There you go.
 
Yeah, well, I'll report my observations the way I see them, but thanks for the feedback.

And the next time I'm writing for the ADD crowd, I'll be sure to do it like you suggest:

Subject: My h4rdwar3 review, lol

It like rox, srsly - lol.

'cept for the crap that suxd.

teh end.

;)
 