Official GTX 680 overclocking Thread!

Probably just by flashing the BIOS. Why would you need to though? Mine never runs at the base clock.

You can't edit clocks via the BIOS anymore, kind of like with recent AMD cards. You could flash a digitally signed EVGA SC BIOS or something like that, though.
 
On my 4-way GTX 680 setup using EVGA reference cards with EK water blocks, my stable overclock is 1202 MHz core and 3534 MHz memory. I pretty much spent all day testing those clocks using Heaven 3.0, BF3, Crysis 2, Metro 2033, and Skyrim; Skyrim with some mods and the HD texture pack would crash at the lowest frequency, so modded Skyrim set my ceiling. I found, though, that performance scales linearly with increased memory clocks and there is no "sweet spot". Get that memory frequency as high as it can go!
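For reference, the raw bandwidth gain from a memory overclock is simple arithmetic. A quick sketch (assuming the reference 256-bit bus, and assuming the tool reports the DDR-doubled clock, as Precision does):

```python
# Rough GTX 680 memory-bandwidth arithmetic. Assumptions: reference
# 256-bit bus; the reported clock is the DDR-doubled GDDR5 clock
# (stock is 3004 MHz reported, 6008 MT/s effective).
BUS_WIDTH_BITS = 256

def bandwidth_gbs(mem_clock_mhz, ddr_multiplier=2):
    """Peak bandwidth in GB/s for a given reported memory clock."""
    transfers_per_sec = mem_clock_mhz * 1e6 * ddr_multiplier
    return transfers_per_sec * BUS_WIDTH_BITS / 8 / 1e9

print(bandwidth_gbs(3004))  # stock: ~192.3 GB/s
print(bandwidth_gbs(3534))  # the overclock above: ~226.2 GB/s
```

That works out to roughly an 18% bandwidth increase, which lines up with memory-bound performance scaling linearly with the clock.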

Those numbers are still pretty good considering 4-way SLI is usually a bit harder to overclock highly. 4-way 680 just destroys Skyrim in Surround:

http://www.youtube.com/watch?v=FvSh...DvjVQa1PpcFOpwHlB70YlOALsTk3pZxTxt4tbGt5H9Yk=

I am just astounded by how smoothly the GTX 680 handles Surround using PCI-E 3.0 slots. There is no micro-stuttering, and no pauses or jitters. I don't think I could tell the difference between a single GPU running a single monitor and this 4-way SLI Surround setup. Combine that with the CRTs and this is by far the smoothest and best Surround/Eyefinity setup of mine to date. ;)

It will be interesting to see if any of these new EVGA non-reference cards will allow the voltage to rise above the software limit of 1.175 V (I think more voltage, not more power phases, is really the only way to get more clocks out of these).
 
jesus christ @ that video

I think I will be upgrading to a PCI-E 3.0 setup; it seems to make a difference with these from what people are saying.
 
I'm sure someone will break the digital signature.

It's never been done with AMD, and that's been the case since the 6xxx series. I'm not sure the benefits are really worth the time it would take. You would have to hack the drivers somehow.
 

Wow. Your setup is sick bro, very jealous.
 
I Dremeled my ASUS GTX 680 and I think I broke it, but the company says it was working 100% when they tested it. They are sending it back and I'm getting a replacement; the MSI 680 should be here next week anyway.

In the interim I bought a Gigabyte GTX 680 off the forums and have been able to overclock the GPU higher than the ASUS.

This is the highest I could get it to go without the driver crashing in BF3:

GPU offset: +200
Mem offset: +715

Idle temps are 22 C
Load: 49 C

NVIDIA 301.10 drivers

I'm using the Arctic Cooling Twin Turbo II and it's pretty awesome and silent; the fan never ramps up higher than 45%. I'm using EVGA Precision to overclock the card, which seems to work well with any card, unlike the ASUS GPU Tweak app, which only works with ASUS cards.

I dunno if I should run SLI or sell the other GTX 680 when it comes in?
 

What were you doing that you cut it with a Dremel???
 
Anyone else find that on their 680 anything over 60% on the fan is unbearably noisy?

I'm using Precision's fan curve with my EVGA 680 SC+ and I won't let it go over 60% unless the temps reach 80 degrees or more, which they never do anyway.
 
This Gigabyte 680 seems to be better made at the factory than the ASUS, because all the T6 screws came out really easily; they were not torqued down so hard.

I had to clock my GPU offset down to +190 to make it 100% stable.
 
My temps are under 50 C, but it feels like it's the luck of the draw on these cards as far as overclocking goes. It's really up to the individual chips, and it seems like the binning really matters, because no matter what I do my chip crashes at anything above a +193 core offset, regardless of the temps.
That's with the memory set to default, too.
 
Guys, remember to talk about the core frequency and not the offset value. The offset value will be different for everyone and is quite meaningless. I have one card that uses a +91 offset to get to 1240 MHz and one that uses a +153 offset to get to 1240 MHz.
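A quick sketch of why offsets aren't comparable between cards (the per-card boost headroom numbers here are made up for illustration; in practice each chip's GPU Boost bins differ):

```python
# Observed clock = base clock + per-card GPU Boost headroom + user offset.
# The headroom varies chip to chip, so the same offset lands on
# different final clocks on different cards.
BASE_CLOCK = 1006  # GTX 680 reference base clock, MHz

def boosted_clock(offset_mhz, card_boost_headroom_mhz):
    """Final clock for a given user offset on a given card (toy model)."""
    return BASE_CLOCK + card_boost_headroom_mhz + offset_mhz

# Two cards reaching the same 1240 MHz with different offsets:
print(boosted_clock(91, 143))   # card A: 1240
print(boosted_clock(153, 81))   # card B: 1240
```

So comparing offsets across cards tells you nothing; only the final frequency does.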
 

Yeah, same boat here.

I have my SC+ idling at 40% instead of the default 30%, but when the fan hits 60% it starts getting annoying.

Since you have the SC+ :D how is your OC experience so far?

Mine is set under Precision X like this:

Power: 115%
GPU Clock: +25
Memory clock: +500

I'm hitting 1200 MHz, which is pretty good on the stock air cooler.

Here are my settings along with the fan curve:

gtx680ocfan.jpg
 

I haven't increased any of the offsets...yet. I simply put the power target @ 132%. From what I have found, this makes sure you always boost the highest you can.

My SC+ hits 1201 with just my power target @ 132%.

For my fan curve, I have it set up so the fan maxes out at 60% unless the card hits 80 degrees or more, and so it keeps the card under 70 degrees at max load, because once you reach 70 degrees the card throttles your clocks down.
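A fan curve like Precision's is just linear interpolation between temperature/fan-speed points. A minimal sketch (these exact curve points are my assumption, picked to match the "60% cap until 80 degrees" setup described above):

```python
# Minimal fan-curve interpolation: (temperature C, fan %) points,
# linearly interpolated between them. Points are illustrative.
CURVE = [(30, 30), (60, 50), (70, 60), (80, 100)]

def fan_percent(temp_c):
    """Fan speed for a given temperature, interpolating the curve."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            # linear interpolation between neighboring curve points
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]  # past the last point: pin to max

print(fan_percent(65))  # 55.0, halfway between the 60 C and 70 C points
print(fan_percent(90))  # 100, past the last point
```

The steep jump after 80 degrees mirrors the "only go loud past 80" idea; below that, the fan stays at or under 60%.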
 
I need some help understanding why my card keeps fluctuating.

I'm playing the free Tribes, and my card goes all over the place: 700 to 900 to 1249, back to 1100. It's stable at those speeds in BF3. Is there a setting that will keep the clock more stable? I'm confused.
 

It'll fluctuate with the load on the card; when the load is low (like the desktop or loading screens) it'll drop the clock speeds down. That sounds like normal behavior for what might not be a very demanding game. How does it behave in BF3? Does it keep the same clock speed there all the time?
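The behavior described above can be sketched as a toy model: the driver picks the lowest clock state that still covers what the game is asking for, so an undemanding game bounces between states (the bin values and selection logic here are illustrative assumptions, not NVIDIA's actual GPU Boost algorithm):

```python
# Toy model of load-based clock selection: choose the lowest
# clock state that covers the current demand. Bins are illustrative.
BINS = [324, 705, 900, 1100, 1249]  # idle through top boost, MHz

def clock_for_demand(required_mhz):
    """Lowest clock state that covers the work the game needs."""
    for clk in BINS:
        if clk >= required_mhz:
            return clk
    return BINS[-1]  # fully loaded: top boost state

print(clock_for_demand(650))    # light scene: drops to 705
print(clock_for_demand(1240))   # BF3-like load: stays at 1249
```

Under a constant heavy load like BF3 the demand always exceeds the lower bins, so the clock sits pinned at the top; a light game hops between bins as the scene changes.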
 
A few weeks out and the GTX 680 is still a great card, though every driver except 300.83 creates stuttering in BF3 and Batman: AC.

The Batman: AC stutter problems are the game, not the card or the drivers. Drop to DX9 and it's much smoother. Just another confirmation that DX11 is crap: whatever extra it adds to the game, it takes away from a smooth experience.
 

DX11 isn't crap. Batman was just a crappy DX11 port.
 

How many DX11 games run good? I think other than Battlefield 3... I've always forced DX11 games to DX9. I haven't been impressed by a single thing DX11 has ever brought us. IMO DX11 has been a huge disappointment.
 

Off the top of my head:

Aliens vs Predator
Battlefield 3
Civilization V
Dirt 2/3
Dragon Age 2
Homefront
Lost Planet 2
LA Noire
Need for Speed The Run
Red Faction Armageddon
HAWX 2
 
Metro 2033... still one of the best-looking games of all time, IMO. Runs great on High with FXAA in the CP. I still can't use DOF and the Ultra setting yet; maybe with one more 680.
 

Dirt 2 and 3 randomly crash in DX11. That's a known game-engine issue and was never fixed. The Run uses FB2 like BF3, and yes, BF3 was designed for the PC first and DX11 works great (great game once the FPS cap was removed). HAWX 2 I also ran in DX9 because it was smoother. Homefront ran better in DX9 too.

Notice I asked "How many DX11 games run good?", not how many games you have played in DX11. I stand by my statement, and I will compare and evaluate the worth of DX11 before I play through a game using it.
 
I haven't had a single problem playing a game in DX11. Saying the game "runs better" or "runs smoother" in DX9 means nothing; you're turning on additional features most of the time, so it's not an apples-to-apples comparison. Maybe you just don't have a powerful enough GPU? Compare your actual frame rates.

And I never had random crashes in Dirt 2/3 with either NVIDIA or AMD GPUs in DX11.
 
Got mine today. Running +105 / +450, which puts me at 1200/6900.

Crashes if I go to +115 core. Any higher on the memory yields negative performance.

3DMark 11: P10200. Not too shabby; beats anything I could get with my 7970 and beat my dual-6970 score.
 
I didn't have "issues". DX9 just runs better. Like it does for everyone that still forces DX9 over DX11.
DX9 runs better in most games because you turn off the DX11-specific features, as noted a few posts earlier. Without actual frame-rate numbers at the same settings, that argument is moot.

Regarding the Dirt 2 crash issue, I dunno; I never experienced any system crashes that I remember. Maybe I didn't play the game enough, or maybe I just don't remember. I just don't recall having any issues, and I always play games in DX11 when available.
 
Is there a way to lock the clock speeds? I heard there was a 'performance' mode setting but I can't seem to find it.
I want to lock in my card at the highest possible settings.
 

There's an option in the NVIDIA control panel's default 3D settings, but I haven't messed with it, since I bought the 680 specifically because it downclocks when not needed.
 

There doesn't appear to be any way - but just like with Intel's turbo, why bother? The card clocks up when it can, so I'm not sure I see the reason to want to lock it? When fully loaded (playing BF3 for instance) mine runs at the top clock speed all the time, so it may as well be locked.

Edit: I tried playing with the settings in the control panel before and it didn't seem to make any difference. But I didn't try too hard as I like it to downclock.
 