GTX 680 SLI - Unequal Boost Clock Speeds

Milena

Hi

I was running a single GTX 680 for a couple of days and got another one today. Performance is outstanding, no doubt, but I'm curious why the two cards run at different boost speeds.

Why is one capping out at 1163 MHz and the other at 1084 MHz?
Both are Asus GTX680-2GD5 cards, and according to GPU-Z 0.6.0 they both have BIOS version 80.04.09.00.0F. Neither card is overclocked yet; the monitoring is from MSI Afterburner 2.2.0 Beta 15.

[Screenshot: Heaven benchmark with Afterburner clock monitoring (heaven_2012_04_11_14_10_28_080.jpg)]


Any suggestions?
 
As far as I know, GTX 680s in SLI can each clock independently with GPU Boost, so each card only boosts as high as it thinks is needed.
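You can actually watch this from the command line. Below is a minimal sketch that polls each card's SM clock, temperature, and power via nvidia-smi — this assumes nvidia-smi is installed and on your PATH, and the field names are from its --query-gpu interface:

```python
import subprocess

def parse_smi_csv(text):
    """Parse 'index, clocks.sm, temperature.gpu, power.draw' CSV rows
    as emitted by nvidia-smi with --format=csv,noheader,nounits."""
    rows = []
    for line in text.strip().splitlines():
        idx, clk, temp, power = [f.strip() for f in line.split(",")]
        rows.append({"gpu": int(idx), "sm_mhz": float(clk),
                     "temp_c": float(temp), "watts": float(power)})
    return rows

def gpu_boost_snapshot():
    """One snapshot of per-GPU clocks; call in a loop while gaming to
    see each card boost on its own."""
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=index,clocks.sm,temperature.gpu,power.draw",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True).stdout
    return parse_smi_csv(out)
```

Run gpu_boost_snapshot() a few times under load and you should see the two cards report different SM clocks independently.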
 
Same thing happened to me, dude. ASIC quality affects boost speed, although in my case the difference was within 10-20 MHz.
 
I've always found GPU1 to run hotter than GPU2 in SLI setups. Given that GPU Boost is driven by things like temperature and TDP, it doesn't surprise me that the two GPUs would settle at different boost clocks in a 680 SLI setup.
 
Thanks for the replies.

I tried both cards alone and they do seem to be different (completely removed from my PC, not just with SLI disabled). The one I bought last week goes up to 1163, and the one I got today only to 1084. Maybe it really is ASIC quality, or just a card that's not defective but simply not as good as the other.
 
Have you tried GPU-Z to read the ASIC quality on the cards? In the main window, right-click the upper-left corner and choose ASIC Quality.
 
Kepler ASIC quality can't be read yet as far as I know; both cards show this result:
[Screenshot: GPU-Z ASIC quality dialog (gpuz060gtx680asic.jpg)]
 
That sux. I knew the results on the 5xx series were overshooting; I thought they would have updated it by now!
 
Tested the cards for a few more days now; something's wrong with the higher-clocking 680 (not sure what, but I have a bad feeling). The card boosting to 1163 MHz without any OC runs 10-12°C hotter than the other one after long gaming or benchmark sessions. This is in a Silverstone FT-02 with vertical mounting; I switched PCIe slots a couple of times and tested both cards alone as well. I don't like this whole GPU Boost thing...

Is anyone else running 2-way SLI with 680s and seeing boost clocks differ that much?
 
You could overclock the cards independently in Precision or Afterburner with sync disabled and even up the boost clocks, if it really bothers you. That said, the gap you're seeing is a little bigger than I would feel comfortable with.
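As a back-of-the-envelope check on what that unsynced offset would need to be: assuming the offset applies 1:1 on top of the observed boost and that boost moves in ~13 MHz bins (the bin size here is an assumption, not something confirmed in this thread), the slower card would need roughly:

```python
def matching_offset(low_boost, high_boost, bin_mhz=13):
    """+offset (MHz) needed on the slower card to at least match the
    faster one, rounded up to whole boost bins. Assumes the offset
    shifts the boost ceiling 1:1 -- an illustrative simplification."""
    gap = high_boost - low_boost
    bins = -(-gap // bin_mhz)   # ceiling division to whole bins
    return bins * bin_mhz

# The thread's numbers: 1084 MHz card chasing the 1163 MHz card.
offset = matching_offset(1084, 1163)
```

In practice the power limit may cap the slower card before it gets there, so treat this as a starting point for the slider, not a guarantee.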
 
Yeah, that works well; you can force the cards to run more in sync with AB. I wonder if Nvidia will offer an option to disable the boost in an upcoming driver update.
 
Why is one capping out at 1163 MHz and the other at 1084 MHz?

Set aside the marketing slides from NV's press deck on the definition of "boost" and analyze it; you'll see that it's geared at maximizing bins. The board components and the chip all come at variable performance levels. The chips have different leakage levels, default VID states, and efficiency metrics (all within the binned range to qualify as GK104-400 samples). The final product of a chip package paired with a sample board has a goal of drawing power through the 6+6-pin connectors plus the PCIe slot (excluding the memory ASIC draw and memory phases), and for that power to fit within an acceptable range.

When the board hits its stock-defined clocks of 1006 and 1058 MHz to meet the "GTX 680 qualification requirements," it has some headroom left, which varies with the sampled combination of board + PWM + ASIC. That leftover headroom is wisely capitalized on in the variable amount of boosting the final product does, maximizing the performance of the 680 within the TDP of the board. ASIC sample quality, efficiency, and leakage variations are why some 680s boost to ~1060 MHz and some to ~1270 MHz. Not everyone's GK104 boosts the same.

Your statement is exactly what I had in mind, thinking "it's only going to boost when the GPU is not stressed hard." But since all samples run a nominal voltage of 1175 mV in their loaded clock states, the boost maximizes the performance of the various samples. Look at it this way: you're guaranteed 1058 MHz; higher clocks represent the lottery factor of higher-quality bins.

The +offset and +power controls work against the designed 170 W and 195 W qualification limits, but the principle still applies: the "boost clock" state keeps increasing core frequency to the maximum that fits within the desired TDP.
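The bin-climbing behavior described above can be caricatured in a few lines. This is a toy model, not NVIDIA's actual algorithm: it assumes fixed ~13 MHz bins, a flat power target, and a single "watts per 100 MHz" number standing in for a sample's leakage and efficiency:

```python
# Toy model of bin-based boosting -- illustrative only.
BASE_MHZ = 1006        # GTX 680 base clock
BIN_MHZ = 13           # assumed boost-bin granularity
POWER_TARGET_W = 170   # stock power target mentioned above

def boost_clock(watts_per_100mhz):
    """Highest bin whose estimated board power fits the power target.

    watts_per_100mhz stands in for sample quality: a leakier chip
    burns more watts per MHz, so it runs out of headroom at a lower
    clock than a well-binned one.
    """
    clock = BASE_MHZ
    while True:
        candidate = clock + BIN_MHZ
        est_power = candidate / 100.0 * watts_per_100mhz
        if est_power > POWER_TARGET_W:
            return clock    # next bin would exceed the target
        clock = candidate

# A better-binned sample (fewer watts per 100 MHz) boosts higher:
good = boost_clock(14.0)
leaky = boost_clock(15.0)
```

Two chips that both qualify as GTX 680s end up at different ceilings purely because of per-sample power draw — which is the whole point of the explanation above.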
 
Thanks a lot for the very good explanation :)
I'm probably making the 1163 one the primary card to get more out of it in situations where SLI is disabled.
 