8800 GTS Overclock

diablo111

Well, with one day left for step-up, I decided to trade in my eVGA 7900 GTO for an 8800 GTS. I'm really impressed so far...this card overclocks like a monster!! I have it at 660/2000 (stock is 500/1600) with no sweat! Along with a very modest overclock on my CPU, this thing is cruisin'!! I'm only a few points off from my brother's FX-57 @ 3GHz w/ 2 x 7950 GTs in SLI! I think soon I'll upgrade to either an X2 CPU or go Core 2 Duo...
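For a sense of scale, here's a quick back-of-the-envelope calculation of that overclock in percentage terms (just arithmetic on the clock figures quoted above; nothing here is specific to this card):

```python
# Back-of-the-envelope: how big is the overclock quoted above, percentage-wise?
stock_core, oc_core = 500, 660    # MHz, core clock
stock_mem, oc_mem = 1600, 2000    # MHz, effective memory clock

core_gain = (oc_core - stock_core) / stock_core * 100
mem_gain = (oc_mem - stock_mem) / stock_mem * 100

print(f"Core:   {stock_core} -> {oc_core} MHz  (+{core_gain:.0f}%)")
print(f"Memory: {stock_mem} -> {oc_mem} MHz (+{mem_gain:.0f}%)")
# Core:   500 -> 660 MHz  (+32%)
# Memory: 1600 -> 2000 MHz (+25%)
```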

[screenshot: 8800gts.jpg]


Image quality is awesome, and it came with a free copy of Dark Messiah...the only thing that sucks? No driver for my Vista RC2 install :(

Still, it's a great card. I've bought eVGA for my last 4 cards (6800nu, 7800GT, 7900GTO, 8800GTS); they're keeping my business for sure.
 
It's just luck of the draw. It doesn't necessarily mean eVGA overclocks better than the others.
 
Nice clocks. I'm at 650/1940 stable right now, but I'm still doing more OC'ing. I've been running ATITool overnight and bumping it up a little every night.
 
It's just luck of the draw. It doesn't necessarily mean eVGA overclocks better than the others.

Maybe not, but most of the eVGAs I've seen can do ~650/2GHz. Other brands top out much earlier on average.
 
Some of you guys might want to check whether your memory is actually being overclocked. There is a verified bug in the newer drivers that does not change the memory clocks to the rated speeds. ATITool and RivaTuner on default settings indicate the memory clock is changing...only it is not.


Read here, plenty of info... http://www.hardforum.com/showthread.php?t=1144790

quote:

Originally Posted by Unwinder
All

I've just finished investigating the problem with the constant 400MHz memory clock frequency reading on G80 boards, which doesn't reflect memory overclocking.

Status: not a RivaTuner issue; it's an NVIDIA driver bug. Two of the 3 memory clock frequency generators really are running at 400MHz.

G80 boards use 3 independent memory clock frequency generators (1 generator for each 2 memory channels). When the PC is booting, the core/memory clocks are set much lower than the clocks set in the Windows GUI, and at that time the BIOS programs all 3 memory generators to 400MHz. When the OS finishes loading, the driver must switch all 3 generators to the target memory clock. The 97.02 driver does this properly; 97.28 doesn't, and leaves 2 of the 3 memory clock generators running at the BIOS-defined 400MHz clock.
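To get a rough feel for why that bug matters, here's a small sketch based on the 3-generator layout described in the quote. Treating the three generators as equally weighted, and working in real clocks (half the DDR figure people usually quote), is an assumption for illustration only:

```python
# Sketch: with the 97.28 bug, 2 of the 3 G80 memory clock generators stay at
# the 400MHz BIOS boot clock, so raising the memory clock only affects 1 of 3.
# Weighting the generators equally is an illustrative assumption.

def average_mem_clock(target_mhz, bugged=True):
    """Average real memory clock across the 3 generators."""
    clocks = [target_mhz, 400, 400] if bugged else [target_mhz] * 3
    return sum(clocks) / len(clocks)

# 950/1050 MHz real = 1900/2100 MHz effective (DDR)
for target in (950, 1050):
    print(f"target {target} MHz real: "
          f"bugged avg {average_mem_clock(target):.0f} MHz, "
          f"correct avg {average_mem_clock(target, bugged=False):.0f} MHz")
# A 100 MHz bump at the slider moves the bugged average by only ~33 MHz,
# which is why memory "overclocks" barely move benchmark scores.
```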
 
I'm hit with the memory bug too...

It's why going from 1900 to 2100 only gives me about 45 more 3DMark06 marks.
 
nTune, from NVIDIA.

You are correct sir!!

Here is another screenshot...the Half-Life 2: Lost Coast video stress test. Everything is cranked up as high as it will go at 1680x1050 (looks awesome on my 20.1" BenQ widescreen) :D

[screenshot: 8800-4.jpg]
 
I don't think 6x AA works with NVIDIA; I thought that was an ATI thing. I don't know what NVIDIA reverts to with that. I'm sure someone will correct me if I am wrong...
 
Hmm...I would think your 3DMark05 scores would be better. At first I thought it said 3DMark06 and I was like, damn. But I get 13,900 in 3DMark05 with my X2 4400 and two 7900 GTOs, so I guess getting a single 8800 GTS wouldn't really be a good move for me. That's too bad..
 
Hmm...I would think your 3DMark05 scores would be better. At first I thought it said 3DMark06 and I was like, damn. But I get 13,900 in 3DMark05 with my X2 4400 and two 7900 GTOs, so I guess getting a single 8800 GTS wouldn't really be a good move for me. That's too bad..


Basing video card purchases on 3DMark is...not very smart.

In real games, the 8800 GTS is going to beat your SLI setup soundly, even more so if you run anything close to a high resolution with lots of eye candy.
 
3DMark06 is very CPU-dependent.

For me, adding a second 8800 GTS for SLI only increased my score by about 1,000 marks. But when playing games, they run so much smoother with the SLI setup.

Basically, a single-core CPU with 8800 SLI will score around 8 to 9k marks, a Core 2 Duo with 8800 SLI will score 11 to 12k marks, and a Core 2 Quad with 8800 SLI will score about 15 to 16k marks.

Those are rough estimates that depend on a number of factors, but you get the point.
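To illustrate why the total tracks the CPU so strongly: 3DMark06 combines graphics and CPU sub-scores in a weighted harmonic-mean fashion, so the weaker sub-score drags the total down. The weights and sub-scores below are invented for illustration, not Futuremark's published constants:

```python
# Illustration of why 3DMark06 totals climb with the CPU even when the GPU
# sub-score is fixed. Weights and sub-scores are invented for this sketch,
# NOT Futuremark's actual formula constants.

def combined_score(gpu_score, cpu_score, w_gpu=0.85, w_cpu=0.15):
    """Weighted harmonic mean: the smaller sub-score dominates the total."""
    return 1.0 / (w_gpu / gpu_score + w_cpu / cpu_score)

gpu = 9000  # hypothetical graphics sub-score for an 8800 SLI rig
for label, cpu in [("single core", 1000), ("dual core", 2200), ("quad core", 4000)]:
    print(f"{label:11s} CPU sub-score {cpu}: total ~{combined_score(gpu, cpu):.0f}")
# The graphics sub-score never changes, yet the total rises with every CPU
# upgrade, which is the effect described in the post above.
```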
 
Basing video card purchases on 3DMark is...not very smart.

In real games, the 8800 GTS is going to beat your SLI setup soundly, even more so if you run anything close to a high resolution with lots of eye candy.

I also read through some reviews with in-game benchmarks across a lot of "vs" cards, and the 7900 GTX SLI setup destroyed a single 8800 GTS and even beat the 8800 GTX in most tests. I have two 7900 GTOs, which are just slightly cut-down 7900 GTXs, so I'm basing it on that. If I could find a good price on a single 8800 GTX and sell off both of my 7900 GTOs, I would still do it just to get more space, plus the DX10 factor and the ability to go SLI with 8800 GTXs later on.
 
I also read through some reviews with in-game benchmarks across a lot of "vs" cards, and the 7900 GTX SLI setup destroyed a single 8800 GTS.


At lower resolutions, mostly. At, say, 1920x1200, and usually 1600x1200, the 8800 GTS is going to win in most real games provided you use high IQ settings, and the GTX will "destroy" your SLI setup. Look around. The 3DMark scores are CPU-dependent and do not give a true representation of the real world.

If you are gaming at 1280x1024, fine, go with another card for SLI, but if you play at really high resolutions and IQ settings, the GTS and GTX beat your SLI setup.

...and not that it matters much at this point, but DX10 games ARE coming soon, and an 8800 GTS will "destroy" your SLI setup in that regard, so buying another fairly expensive DX9 card this late in the game does not make much sense if you plan on taking advantage of DX10 and Vista. DX10 is not coming to XP...
 
Even setting your speed to 660MHz, it's only actually running at 648MHz if you check RivaTuner. It only overclocks in increments of 27MHz. Same with the RAM.
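A quick sketch of that granularity, assuming the requested clock simply snaps down to the nearest multiple of 27MHz (consistent with the 660 -> 648 example above, though the hardware's exact rounding rule may differ):

```python
# Sketch of the ~27MHz clock granularity mentioned above.
# Assumption: the requested clock snaps down to the nearest 27MHz multiple
# (consistent with 660 -> 648; the hardware's exact rounding rule may differ).
STEP_MHZ = 27

def actual_clock(requested_mhz, step=STEP_MHZ):
    return (requested_mhz // step) * step

for req in (648, 660, 674, 675):
    print(f"requested {req} MHz -> actual {actual_clock(req)} MHz")
# requested 648 MHz -> actual 648 MHz
# requested 660 MHz -> actual 648 MHz
# requested 674 MHz -> actual 648 MHz
# requested 675 MHz -> actual 675 MHz
```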
 
I had mine @ 600/1000 (didn't try going higher yet because it gets pretty damn warm).

At idle I sit at about 60°C, full load about 85°C.
I do get some corruption in Oblivion, and only Oblivion; Supreme Commander runs perfectly with those settings =p

Are these temps normal? Or should I re-apply the AS5 and try again?
 
I had mine @ 600/1000 (didn't try going higher yet because it gets pretty damn warm).

At idle I sit at about 60°C, full load about 85°C.
I do get some corruption in Oblivion, and only Oblivion; Supreme Commander runs perfectly with those settings =p

Are these temps normal? Or should I re-apply the AS5 and try again?

Those are pretty much exactly the same temperatures I get with my BFG 8800 GTS OC in my sig.

I think the BFGs seem to run higher temperatures than the other brands. At 100% fan, the load temp is only about 75°C though... on automatic it's pretty much the same temps you are reporting.
 
At lower resolutions, mostly. At, say, 1920x1200, and usually 1600x1200, the 8800 GTS is going to win in most real games provided you use high IQ settings, and the GTX will "destroy" your SLI setup. Look around. The 3DMark scores are CPU-dependent and do not give a true representation of the real world.

If you are gaming at 1280x1024, fine, go with another card for SLI, but if you play at really high resolutions and IQ settings, the GTS and GTX beat your SLI setup.

...and not that it matters much at this point, but DX10 games ARE coming soon, and an 8800 GTS will "destroy" your SLI setup in that regard, so buying another fairly expensive DX9 card this late in the game does not make much sense if you plan on taking advantage of DX10 and Vista. DX10 is not coming to XP...

OK, so quoting "destroy" every time you say it gets annoying after a while.
I'm also not sure where this "if, then" scenario came from. I'm not thinking about adding another card for an SLI setup; I already have an SLI setup, and I've had it for a little while now.

The tests I was talking about were IN-GAME tests, as I already said in my previous posts, and they were done at different resolutions. The higher resolutions did yield less of an advantage to the 7900 GTX SLI setup, but it was still quite a lot better than the 8800 GTS. The single 8800 GTX did beat out the 7900 GTX SLI setup at higher resolutions, 1600x1200 and above.
However, when looking at pricing and things like that, I guess it then comes down to what your display can provide. No use having a card capable of better than 1600x1200 if the display can't handle it :p

They did tests on FEAR and a few other games...

And I WILL be taking advantage of Vista and DX10 for games in the future. I was just saying that I don't think the 8800 GTS would be the best way to go for ME. For other folks I'm sure the 8800 GTS would be a great upgrade compared to what they are currently using. I won't even think about getting Vista to play any DX10 games until I've got something at least comparable to the 8800 GTX.

Like you said, and I agree, it is COMING, but it's not here yet. As I see things at the moment, there are no DX10 games that I feel I just have to play. In the future there may be, at which point I will probably upgrade my junk. Hopefully by then, any Vista, DX10, and video card driver bugs will be worked out and the overall gaming experience will be better. Personally, I prefer to spend more time gaming than troubleshooting and figuring out workarounds to problems. ;)
 