Does overclocking an 8800 GTS 512 help in Crysis?

neubspeed

Gawd
Joined
Feb 19, 2005
Messages
518
I just returned my 3870 X2 to the store last night, good riddance - I never got a stable framerate with it.

But to temporarily replace it, I picked up a BFG 8800 GTS 512 OC. I've been playing Crysis at 1920x1200 with everything medium and 8x AF. Everything else maxes out at my resolution without a problem. I've never overclocked a video card before because I heard it makes no difference. But will pushing this card to as high a core and memory as possible have any effect on being able to turn more things up in Crysis?

Any insight would be appreciated. Also, what should you be using for overclocking Nvidia cards?
 
Wow, I did the same exact thing. I dumped my 3870 X2 and grabbed the EVGA 8800 GTS from Newegg. I used RivaTuner to overclock mine to an 800MHz core and 1000MHz memory and left the shader clock stock. It gets around 13100 in 3DMark06. It seems to get unstable at higher clocks; maybe it's my motherboard or something else, not sure. I've heard ATITool works the best for overclocking it.
 
I think voltage is keeping the core from passing 800, because mine can't get past 799.

Overclocking does help! I took mine from a 670MHz core to 790-799MHz.

Beautiful, gained FPS.
 
No, it won't make much of a difference. I've overclocked an 8800GT from 600/1800 to 740/1000 including a 300MHz overclock on shaders from 1500 to 1800MHz.

I did the Crysis GPU benchmark several times. The framerate increase was negligible for the noise/temperature tradeoff. I got a 1 to 3 FPS increase averaged over several benchmarks. But I guess with Crysis, that could mean a lot. Just run the Crysis GPU benchmark right now with your current setup and then overclock and run it again.
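If it helps, here's a minimal sketch (in Python) of that before/after comparison, assuming you note down the average FPS the benchmark reports for each run; the numbers below are placeholders, not real measurements:

Code:
# Compare average FPS from Crysis GPU benchmark runs, stock vs. overclocked.
# Replace the placeholder values with the averages your own runs report.

def percent_gain(stock_fps: float, oc_fps: float) -> float:
    """Percentage FPS gain of the overclocked result over the stock result."""
    return (oc_fps / stock_fps - 1.0) * 100.0

stock_runs = [24.1, 23.8, 24.3]   # hypothetical stock averages over several runs
oc_runs = [25.6, 25.2, 25.9]      # hypothetical overclocked averages

stock_avg = sum(stock_runs) / len(stock_runs)
oc_avg = sum(oc_runs) / len(oc_runs)

print(f"Stock: {stock_avg:.1f} FPS, OC: {oc_avg:.1f} FPS, "
      f"gain: {percent_gain(stock_avg, oc_avg):.1f}%")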
 
People assume ATITool is only for ATI cards; that's why.

Getting back to the OC issue, I have problems with the GTS 512MB as well. It scales very poorly when overclocking: the clocks will go high, but performance hardly improves, unlike my 640MB GTS, which gets a big performance boost from overclocking. I'm seeing others having the same problem.
 
Are you SURE ATITool works with the NEW G92-based GTS? I have tried it... nada. It does not report back the clock speeds, stock or otherwise. I tried several different drivers for fun... no luck. This may be a Vista issue as well... are you guys using XP + ATITool to overclock?

That article he linked is using it on the G80 GTS... not the same thing. I quote:

************************************

"Test Subject: FOXCONN 640MB GeForce 8800 GTS
By default, the FOXCONN 8800 GTS operates with a 575MHz G80 GPU core clock speed, a 1188MHz shader, and a 900MHz (1800MHz) GDDR3 RAM speed. I will utilize the free overclocking utility ATITool v0.26 to search out the best clock speeds and simulate heavy graphic loading to establish stability."
 
Like others have said, it doesn't help very much. I ran the default timedemo and found the following...

Core: 675 → 792MHz: 17.3% increase
Shaders: 1675 → 1890MHz: 12.8% increase
Memory: 972 → 1065MHz: 9.56% increase

Performance increase: 6.03%
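Here's a quick sketch of how those percentages work out (the clocks and the 6.03% figure are the ones quoted above; the scaling comparison is just my own way of reading them):

Code:
# Clock increases quoted above, as (stock, overclocked) pairs in MHz.
clocks = {
    "core":    (675, 792),
    "shaders": (1675, 1890),
    "memory":  (972, 1065),
}
perf_gain = 6.03  # measured timedemo performance increase, in %

for name, (stock, oc) in clocks.items():
    clock_gain = (oc / stock - 1.0) * 100.0
    print(f"{name:8s}: {clock_gain:5.2f}% clock increase")

# Roughly a 17% core overclock only bought ~6% more performance,
# i.e. well under 1:1 scaling.
print(f"performance: {perf_gain:.2f}% increase")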

Temperature increase is negligible so I do it anyway.
 
I know this is about the 8800 GTS G92, but my 8800GT hardly shows any difference in performance when overclocked over its stock SC speeds. Things do tend to run smoother, though, until it artifacts and locks up. I haven't tested this extensively. The limiting factor for these G92s seems to be the ROPs and memory bandwidth. It seems nVidia has a real killer GPU on their hands but castrated it a bit to give people a reason to upgrade to the 9xxx series. Also, about ATITool, I haven't been able to get it to report the clock speeds at all with my GT.
 
ATITool does not currently work on the G92 GTS.

I also don't think memory bandwidth is the limiting factor, at least in Crysis. At 1680, with a GPU core of 800MHz, I get about a 3 FPS increase (about 10%).

Keeping the GPU at 670MHz and increasing the memory clock to 1.1GHz results in a 1 FPS average gain. That's a very negligible difference and could even just be within the margin of error...

That would indicate to me that increasing GPU clocks has a more significant impact than increasing memory, and that bandwidth is not the limiting factor, at least with Crysis. I haven't tested anything else.
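Roughly the same point in numbers; a small sketch where the ~30 FPS baseline is an assumption on my part, since only the deltas were given above:

Code:
# Core-only vs. memory-only overclock, using the deltas reported above.
# The 30 FPS baseline at 1680 is an assumption, not a measurement.
baseline_fps = 30.0

core_gain_fps = 3.0   # core 670 -> 800 MHz: ~3 FPS as reported
mem_gain_fps = 1.0    # memory to 1.1 GHz at stock core: ~1 FPS average

core_clock_gain = (800 / 670 - 1.0) * 100.0   # ~19% higher core clock

print(f"Core +{core_clock_gain:.0f}% clock -> ~{core_gain_fps / baseline_fps * 100:.0f}% FPS")
print(f"Memory to 1.1GHz        -> ~{mem_gain_fps / baseline_fps * 100:.0f}% FPS (within margin of error)")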
 
I get about a 10-15% boost, most of it coming from the memory. Core has almost no impact; shader has a moderate impact.

Also, don't pay attention to people who say they get "only a 1-3 FPS increase" when they don't bother to mention what their overall framerates are. If they're running Crysis in the 20s, that's a 10% boost, which is pretty good. And I know I, at least, can perceive a difference between, say, 22 FPS and 25 FPS.
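To put that in perspective, here's a tiny sketch; the 40 and 60 FPS baselines are just hypothetical comparison points:

Code:
# The same absolute FPS gain is a very different relative gain depending on
# the baseline framerate - which is the point being made above.
gain_fps = 3.0                        # an "only 3 FPS" increase
for baseline in (22.0, 40.0, 60.0):   # 22 is from the post; 40/60 are hypothetical
    print(f"{baseline:.0f} -> {baseline + gain_fps:.0f} FPS: "
          f"{gain_fps / baseline * 100:.1f}% faster")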
 
Before I reformatted, I could overclock my 8800GT to 700/1750/1950 from vanilla and noticed a little boost in performance... maybe 3-6 FPS in Crysis, nothing too great. However, after my reformat, for some reason I can't overclock at all, but I haven't noticed any major performance dips, and temps are a bit better.
 
No, it won't make much of a difference. I've overclocked an 8800GT from 600/1800 to 740/1000 including a 300MHz overclock on shaders from 1500 to 1800MHz.

I did the Crysis GPU benchmark several times. The framerate increase was negligible for the noise/temperature tradeoff. I got a 1 to 3 FPS increase averaged over several benchmarks. But I guess with Crysis, that could mean a lot. Just run the Crysis GPU benchmark right now with your current setup and then overclock and run it again.

3800+ X2 @ 2.6 GHz
The problem is your CPU isn't fast enough to use the extra power.

With my old 4200+ at 2.8GHz, I found that Crysis didn't perform any better when the gfx card was overclocked.
Moving to a fast C2D, I see a fair performance jump now.
I hit my max performance by around 80% of the default gfx clocks; higher clocks hardly made a difference.

Here is some core speed testing I did a few days ago with my C2D @ 4GHz and 8800GT.
Note these are just core clock improvements; with a CPU boost as well, the increase is even greater.
(At max gfx clock, I saw an approx 50% performance boost in Crysis at 1280 res with a very high config, moving from the X2 @ 2.8GHz to the C2D @ 4GHz.)

% of default clock, actual clocks used, fps, performance %.

80% 500/1252/900 37.36fps 84.6%
100% 625/1566/900 44.14fps 100%
120% 750/1890/900 49.57fps 112.3%

To calculate the performance difference from the min 80% to the max 120% clocks, use this formula...
(fps_max / fps_min) - 1 = (49.57 / 37.36) - 1 ≈ 32.7% (the performance percentages give the same result: 112.3 / 84.6 - 1 ≈ 32.7%)

Try doing something similar to see how high a gfx clock gets you to your max; then you can see how much more you will get with a faster CPU and the highest stable clocks your card will do.
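If you want to crunch your own runs the same way, here's a small sketch of that calculation using the numbers from the table above:

Code:
# Reproduce the table above: FPS at 80%, 100%, and 120% of the default core
# clock, normalised against the stock (100%) result.
results = {
    80:  37.36,   # 500/1252/900
    100: 44.14,   # 625/1566/900 (stock)
    120: 49.57,   # 750/1890/900
}

stock_fps = results[100]
for pct in sorted(results):
    fps = results[pct]
    print(f"{pct:3d}% clock: {fps:.2f} FPS = {fps / stock_fps * 100:.1f}% performance")

# Gain from the lowest to the highest clock, same formula as above:
gain = (results[120] / results[80] - 1.0) * 100.0
print(f"80% -> 120% clocks: {gain:.1f}% faster")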
 
Lol, I doubt it. I've noticed with this forum there are quite a few who act like it's the WoW forums.

As you can see, no, I wasn't kidding, and no, I'm not clueless on the matter either. It seems I was a little ahead of the info curve on this one (most likely because I've been looking at the 8800 GTSs), and my response could have meant more than one thing, so sorry I didn't explain myself better. ATITool doesn't seem to work on the G92s, though, so that's why I mentioned it. I wasn't shocked about ATITool on nVidia, but you will note that I listed the 8800 GTS 512 in particular.

I just picked up two BFG 8800 GTS OC 512s from Best Buy for $503 online, though. Considering the price, it seems simply hooking up two in SLI would be the best increase in performance... the same if not better than the newfangled GX2 (I just won't be able to run four G92s like dual GX2s in SLI would). Overclocking these buggers gives such limited results. I think it's because of the memory, myself.
 
Overclocking does help, but what we need is a next-generation video card (especially at 1680x1050 and higher).
 
I don't know if there's any really good basis for this belief, but I've always felt nVidia provided better drivers than ATI. But I must admit I haven't owned an ATI card in many years.
 
My experience with an ATI X1800 card and its drivers was exemplary.
The only downer was that I couldn't make decent use of my 3D stereo glasses, but NVidia have now dropped the ball with their stereo support.
G92 cards have zero support; if you try very hard you might get some 3D games to work in stereo, but it's not pleasant.

ATI are much, much better than when they had a bad reputation for drivers, but neither company is perfect.
 
Are you SURE ATITool works with the NEW G92-based GTS? I have tried it... nada. It does not report back the clock speeds, stock or otherwise. I tried several different drivers for fun... no luck. This may be a Vista issue as well... are you guys using XP + ATITool to overclock?

That article he linked is using it on the G80 GTS... not the same thing. I quote:

************************************

"Test Subject: FOXCONN 640MB GeForce 8800 GTS
By default, the FOXCONN 8800 GTS operates with a 575MHz G80 GPU core clock speed, a 1188MHz shader, and a 900MHz (1800MHz) GDDR3 RAM speed. I will utilize the free overclocking utility ATITool v0.26 to search out the best clock speeds and simulate heavy graphic loading to establish stability."
It works for the GT (G92 8800GT 512MB), so there is a very good chance it works for the GTS.
I'm using XP though, so it may be Vista-related.
 