GTX 680: VRM modification

bloodypulp

Gawd
Joined
May 17, 2010
Messages
903
Looking at the PCB, there's empty space in the VRM section.
One can see where additional components were planned to be mounted. Were more phases planned by the engineers?
Would adding an additional inductor and mosfets improve the voltage quality of the board for higher OC potential?

morePower.jpg
 
Nope. Adding them won't do anything. The little chip they use is programmed for whatever power configuration it has, and it can only be programmed once.
 
There's also a third 6-pin connector spot up there. I think it is just to give the AIBs some flexibility in how they want to lay out the card (side-by-side connectors for aftermarket coolers, for example). However, EVGA, for one, has already announced a 6+8 card. Wonder how that will impact GPU Boost.
 
ya, I can't wait to see what the hell that is supposed to do for a user.
My guess is that you will be able to increase the power target much higher than 132%, as they also use a 14-phase VRM setup on the Classified, versus 4 phases on the reference.
 
ya, I can't wait to see what the hell that is supposed to do for a user.

That's for people benching with liquid nitrogen. It's useless for 24/7 normal usage.

There's a pretty extensive 680 modding article over on Overclockers. It was done by one of the engineers at EVGA, so he obviously knows his stuff. That said, there's no way I'd try this on mine, but it's a fun read nonetheless.

http://www.overclockers.com/guide-to-nvidia-gtx680-modifications/

Correct me if I'm wrong but is this all that you really need to do to override the voltage limit? It almost looks too easy.

vidmod3-640x495.jpg
 
You can try writing your own software. This is how Afterburner and Precision work; they're all RivaTuner-based. Same with TriXX, NiBiTor, ASUS GPU Tweak, the EVGA tool, etc. Although you will likely need a VID hardmod like the one posted here in post 8 with this new PCB.

http://www.xtremesystems.org/forums...-you-want-without-hardmods!-(1.3-1.6v-or-more)
1. Find the I2C bus of your card by running the following CLI commands:
RivaTuner.exe /ri0,70,1A
RivaTuner.exe /ri1,70,1A
RivaTuner.exe /ri2,70,1A
RivaTuner.exe /ri3,70,1A
Three of these four commands will return "invalid"; take note of the one that doesn't (for me it was /ri3,70,1A).
This finds the I2C bus (highlighted in red in my example).

2. Get your voltage register values
Using the I2C bus number found above, (0-3, highlighted in red) run the following CLI commands, replacing "#" with the I2C bus number.
RivaTuner.exe /ri#,70,15
RivaTuner.exe /ri#,70,16
RivaTuner.exe /ri#,70,17
RivaTuner.exe /ri#,70,18
Take note of the return value for each.

3. Convert voltage register values to actual voltage
For each of the values returned in step 2 do the following:
A. Convert the value to decimal format (the returned values are in hexadecimal)
B. Calculate the actual voltage with the formula: voltage =
(VID * 0.0125) + 0.45
C. Compare the 4 resulting actual voltages to the voltage reported in 3D mode in Rivatuner hardware monitoring.
D. The closest value should be your 3D voltage (ex: for me RivaTuner showed 1.13v, and the closest computed value was 1.125v)
E. Take note of the register that is associated with that value. (highlighted in red in step 2)

4. Calculating the voltage to use
A. Decide what voltage you want to set.
B. Find the VID for that voltage using the formula VID = (voltage - 0.450) / 0.0125
C. Convert the VID to hexadecimal

5. Setting a new voltage
You can set the voltage by writing the new VID in hexadecimal form to the register.
A. Run the CLI command: (replace # with the I2C bus number, and VID with the VID in hexadecimal form)
RivaTuner.exe /wi#,70,17,VID

The new voltage should now be set!



Example: GTX 260, desired voltage = 1.2v

1. All the commands return "invalid" except RivaTuner.exe /ri3,70,1A which returns "0A"

2. I get the following values:
RivaTuner.exe /ri3,70,15 returns 3B
RivaTuner.exe /ri3,70,16 returns 31
RivaTuner.exe /ri3,70,17 returns 36
RivaTuner.exe /ri3,70,18 returns 2F

3. Calculating the voltages of each:
Hex: Decimal: Voltage:
3B......59......1.1875v
31......49......1.0625v
36......54......1.1250v
2F......47......1.0375v
Rivatuner was reporting 1.13v in 3D mode so the third one is my 3D voltage register.

4. I wanted 1.2v so:
VID = (1.2 - 0.45) / 0.0125 = 60
60 = 3C in hexadecimal

5. I set the new voltage by running:
RivaTuner.exe /wi3,70,17,3C
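The register math in the guide can be sanity-checked with a short script. This is a hypothetical helper, not part of the original guide; the actual register reads and writes still have to go through the RivaTuner CLI, this only does the arithmetic:

```python
# Formulas from the guide: voltage = VID * 0.0125 + 0.45, and the inverse
# VID = (voltage - 0.45) / 0.0125. Register values come back as hex strings.

def vid_to_voltage(vid_hex: str) -> float:
    """Convert a register value (hex string, e.g. '36') to volts."""
    return int(vid_hex, 16) * 0.0125 + 0.45

def voltage_to_vid(volts: float) -> str:
    """Convert a target voltage to the hex VID to write back."""
    vid = round((volts - 0.45) / 0.0125)
    return format(vid, '02X')

# The four registers from the GTX 260 example above:
for reg, val in [('15', '3B'), ('16', '31'), ('17', '36'), ('18', '2F')]:
    print(f"register {reg}: {val} -> {vid_to_voltage(val):.4f}v")
# register 17 (36 -> 1.1250v) is the closest to the ~1.13v RivaTuner reported

print(voltage_to_vid(1.2))  # -> 3C, i.e. RivaTuner.exe /wi3,70,17,3C
```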

Originally Posted by Unwinder
A few tips and tricks:

1) Once you've determined the index of the I2C bus containing the VRM on some display adapter (e.g. I2C bus 3 on the GTX 200 series), the same index can be safely used on the same display adapter model by others. Display adapters have a few I2C buses assigned to different functions (e.g. for connecting DDC and for external devices like VRMs, thermal sensors, fan PWM controllers and so on). The VRM's I2C bus is defined by PCB design, so it is fixed for the same display adapter families.
2) Don't try to scan more I2C buses than the GPU actually has (there was some posting with an attempt to scan buses 0-99 in hope of finding the VRM on G92). Each GPU architecture supports a fixed number of I2C buses, e.g. G80 and newer GPUs have only 4 I2C buses, pre-G80 supports 3 buses, pre-GF4 supports just 2 buses and so on.
3) I see that many users have started to enable the VT1103 plugin now. Please pay attention to the following info from RivaTuner's release notes and always remember it when using this plugin:

"Please take a note that Volterra voltage regulators are rather sensitive to frequent polling and may return false data under heavy load, so it is not recommended to use VRM monitoring in daily monitoring sessions"

4) There were some questions about finalizing these new VRM settings in the NVIDIA VGA BIOS. You cannot use NiBiTor for that because the tool knows nothing about VRMs and works with BIOS voltage tables only; it only allows you to change the associations between performance levels (i.e. 2D/3D modes) and the 4 fixed voltages stored in VRM registers 15-18 by default. However, you can easily edit your BIOS with any hex editor to reconfigure the initialization scripts that write these 4 fixed voltages to the VRM during POST. It is a rather simple task; taking my 65nm EVGA GeForce GTX 260 as an example, the following script command in the VGA BIOS configures the VT1165:

4D 80 E0 06 15 3B 16 31 17 36 18 2F 1D 55 19 01

The command uniquely identifies an I2C serial byte write operation, encodes the target I2C device address (E0 is the 8-bit encoding of the VT1165's 7-bit address 70, including the read/write flag in the first bit), tells the script processor how many bytes have to be written (06) and finally defines the register addresses and the data to be written to each register (register 15 -> 3B, register 16 -> 31 and so on).
The voltages can be different in different VGA BIOS images, so the easiest way to locate this command in any GTX 200 BIOS image is to search for the 4D 80 E0 byte chain.
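Unwinder's byte layout can be sketched as a small decoder. This is my own illustration, not from the post, and note one assumption: I'm reading the 06 count as the number of register/data pairs, since six pairs follow it, even though the post words it as "bytes":

```python
# Decode the VT1165 init command quoted above.
# Layout per Unwinder: 4D 80 = I2C serial byte write opcode, E0 = 8-bit
# device address (7-bit 0x70 shifted left, R/W flag in bit 0), 06 = count
# (interpreted here as register/data pairs), then the pairs themselves.
cmd = bytes.fromhex("4D 80 E0 06 15 3B 16 31 17 36 18 2F 1D 55 19 01")

assert cmd[0:2] == b"\x4D\x80"        # I2C byte-write opcode
dev_addr = cmd[2] >> 1                # 0xE0 >> 1 == 0x70, the VT1165
count = cmd[3]                        # assumed: number of reg/data pairs
pairs = cmd[4:4 + 2 * count]
regs = {pairs[i]: pairs[i + 1] for i in range(0, len(pairs), 2)}

print(f"device 0x{dev_addr:02X}: " +
      ", ".join(f"reg 0x{r:02X} <- 0x{v:02X}" for r, v in regs.items()))
```

Note that register 0x17 gets 0x36 here, which matches the 3D voltage register found through the RivaTuner CLI earlier in the thread.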

Please read my previous comment about I2C bus indices; altering I2C bus indices is not the correct way. You should try probing different I2C device addresses instead. Each I2C device address is defined by the developers (e.g. the ADT7473 fan controller uses fixed address 2E, VT11xx VRMs use fixed address 70 by default, but it can be strapped to 7x AFAIR, etc).
You must have the Primarion PX3544 datasheet to know where it normally resides. I've peeked inside the reference 9800 GTX BIOS to see if it initializes any I2C devices, and it does seem to write something to I2C device 6A (it writes a single byte 86 to register 80 of this device). It could be the Primarion PX3544, but I have no firm info about it because I don't have anything on this VRM's I2C interface.
 
I think the big limitation right now is the 2x 6 pin power connectors. with 2x 8 pin and a beefy VRM, oc potential should go up nicely.
 
That is a limitation. The VID states are purposefully limited to a voltage range that corresponds with the target clocks binned for this SKU. It keeps the BOM low for the board: since you aren't going above ~1350mhz, you aren't going to need more mosfets, capacitors, chokes and PWM phases, and you don't have to use higher-specced master buck controllers. They're saving those goodies for all the aftermarket (Toxic/Classified/PCS+/Matrix/Lightning/CU2) hooplah. Plus this boost feature requires extra pieces of hardware onboard the card (like that daughterboard). Maybe their die costs are fixed and the PCB design is a variable cost, so they keep it low while still giving some limited overclocking capability. NV likely had to do more work just to make overclocking possible at all with the boost/offset stuff and the new software, so thanks at least for that. It wouldn't make sense for a high-end SKU to not have OCing.

The GK104 core is capable of sampling 1900mhz, and King and Monstru both did ~1500 on air; unfortunately, they both used that EVGA add-on board, technically giving the 680 PCB more VRM phases. The retail board, with only 4 phases and no low-profile caps, actually has one less phase and five fewer caps than the pre-production boards from early pictures. That controller limiting VID state profiles to 1179-1219mv can be circumvented with a cut trace, from the looks of the Kingpincooling pictures. Then you just have to use RT to poll the I2C to that controller to find hidden VID states 6 or 7 or whatever is greater than 1220mv, and use that formula to set the state in 3D. I thought W1zzard said on TPU that you cannot use I2C (I'm probably wrong), but something is controlling voltage, since you have granular voltage control in EVGA Precision. Looking at the ASUS 680 BIOS is probably a good start.

I'm all for modding for more voltage, since most of the complaints in the 680 overclocking thread have to do with someone's particular board not even approaching the ~132% power profile with max stable clocks while watercooled at 45C or similar. That's an ideal situation to increase the voltage and go for higher clocks. There is an ASUS BIOS you can use together with ASUS' GPU Tweak that increases the clock limit to a 1336mhz base clock; I saw VR-Zone do a review of this board. That BIOS might be signed for higher voltage states, so it might be worth flashing to take a look. Maybe you can do a little nvflash/NiBiTor work to get some of the higher VIDs. The main thing is that the 680 is designed with limited voltage on purpose. They gotta keep it cheap and still make room for faster models. They're smarter than AMD in that regard, building a low-cost PCB and keeping everyone from getting 1400-1500 out of their reference part. Savvy enthusiasts might not like that, but the majority will think they're overclocking to the moon.

If you get a higher voltage, you run the risk of overstressing those 4 rinky-dink phases. I wouldn't expect more than ~1380ish mhz anyway, so if your boost clock is already at 1250+, is 120 extra mhz really worth it? Just flash the ASUS BIOS and go for 1335. This is only really worth it for the shitty samples that don't OC well even within the limits. And it's also good for the people not using anywhere near the 132% power profile who want more power pushed through to attempt higher speeds. Nvidia won't be happy if an easy mod is discovered and adopted at large scale. It disrupts the strategy of selling us aftermarket cards with more relaxed limits and higher cost. Everyone already knows this tho :)

So yeah, go for it. Someone cut the trace, flash their card and poll the I2C, use a higher vid, get an extra 100 or 200mhz and risk blowin a mosfet! :^p
 
Yeah, crazy. These pro clockers get parts by the tray load and don't care if they fry things. I'm not soldering anything and risking a perfectly good piece of HW I worked hard for.
 
I think the big limitation right now is the 2x 6 pin power connectors. with 2x 8 pin and a beefy VRM, oc potential should go up nicely.

I kind of doubt that it's that big of a limitation. I mean, the GTX 260 and GTX 570 both draw a lot more than the GTX 680, especially once you start overvolting. I kind of doubt that overvolting the GPU an extra 10-15% would be that big a deal for the VRMs.
 
I kind of doubt that it's that big of a limitation. I mean, the GTX 260 and GTX 570 both draw a lot more than the GTX 680, especially once you start overvolting. I kind of doubt that overvolting the GPU an extra 10-15% would be that big a deal for the VRMs.
Yeah, and 570s are notorious for blowing VRMs.
 
Yeah, and 570s are notorious for blowing VRMs.

Yeah, with an early batch, with people running FurMark at 1.2v core voltage. That's 20% more voltage on a chip that uses more juice to begin with. I haven't heard of someone popping a GTX 570 in a really long time.
 
I kind of doubt that its that big of a limitation. I mean GTX260 and GTX570 both draw a lot more than GTX680 especially once you start overvolting. I kind of doubt that overvolting the gpu an extra 10-15% would be that big a deal for the vrms.

I don't know. The one review I saw of a non-reference card that had 6+8 power also had an option for 150% power level in Precision, instead of just 132%. That card also went to 1354 core.

http://www.hardwareheaven.com/reviews/1461/pg20/palit-geforce-gtx-680-jetstream-edition-graphics-card-review-maximum-overclock.html
 
Yeah, with an early batch, with people running FurMark at 1.2v core voltage. That's 20% more voltage on a chip that uses more juice to begin with. I haven't heard of someone popping a GTX 570 in a really long time.
The 680 runs at 1.175v, and people have already pushed them to 1.2-1.4v with mods. They seem to like voltage to run at high clocks, and the stock VRM system is definitely not up to those levels. Not sure if the 6+6 pin requirements are a bottleneck at those levels or not, but I think an 8+6 combo along with more VRMs on a custom PCB would go a long way to making these more OC friendly.
 
The 680 runs at 1.175v, and people have already pushed them to 1.2-1.4v with mods. They seem to like voltage to run at high clocks, and the stock VRM system is definitely not up to those levels. Not sure if the 6+6 pin requirements are a bottleneck at those levels or not, but I think an 8+6 combo along with more VRMs on a custom PCB would go a long way to making these more OC friendly.

Why not? I've seen a guy on OCN pushing 1.3v on the GTX 680 with a volt mod on LN2. I can't imagine that even 1.25v would be a big deal for the VRMs. I highly doubt that a couple of extra ground wires is any bottleneck at all.
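For what it's worth, the spec-level numbers behind this connector debate work out like so (a rough sketch using the PCI-SIG nominal limits of 75W from the slot, 75W per 6-pin and 150W per 8-pin; real boards can and do pull past these through the connectors, so treat them as nominal budgets, not hard caps):

```python
def board_budget_w(six_pin: int, eight_pin: int, slot_w: int = 75) -> int:
    """Nominal spec power budget in watts for a given connector mix."""
    return slot_w + six_pin * 75 + eight_pin * 150

print(board_budget_w(six_pin=2, eight_pin=0))  # reference 6+6 GTX 680: 225
print(board_budget_w(six_pin=1, eight_pin=1))  # custom 8+6 board: 300
```

Against the 680's 195W TDP, the 6+6 reference layout leaves fairly little spec headroom once you raise the power target, which is why the 8+6 customs look attractive.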
 
Love that brand name on the parts! "If you want to make your card radiate brightly enough to give you a tan, buy Suntan for your overvolting needs!"
 