Up GPU Volts to 1.5 on 7800GT (no hardware mod)

Budwise

[H]ard|Gawd
Joined: Dec 7, 2004
Messages: 1,833
This is not my discovery, but it sure is sweet.

You will need:
NiBiTor BIOS Editor
http://www.mvktech.net/index.php?option=com_remository&func=fileinfo&filecatid=1421

NvFlash
http://www.mvktech.net/component/op...func,fileinfo/filecatid,1354/parent,category/

and the courage to do it of course ;)


The directions are listed in a few places; it's VERY simple, and well worth it!

Here's where I originally stumbled across it:
http://www.xtremesystems.org/forums/showthread.php?t=81441

Here's another guide that has pictures, etc. It uses an older version of NiBiTor, but it's very easy to follow anyway.
http://www.planetamd64.com/lofiversion/index.php/t15247.html
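In outline, the guides above boil down to three steps: dump the card's current BIOS, raise the 3D voltage entry in NiBiTor, and flash the edited image back from a DOS boot disk. As a rough sketch of the DOS side (the exact nvflash flags vary between versions, so treat these command lines as assumptions and double-check them against the linked guides):

```
REM From a DOS boot floppy containing nvflash:
nvflash -b backup.rom    REM dump the card's current BIOS to backup.rom

REM ...over in Windows: open backup.rom in NiBiTor, set the 3D voltage
REM to 1.5v, save as modded.rom, and copy it onto the boot disk...

nvflash modded.rom       REM flash the edited BIOS
REM Keep backup.rom so you can always flash back to stock.
```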


I successfully flashed mine to 1.5v, and the results are great! My max GPU OC went from around 485 to 505MHz. Most people are reporting a 20-30MHz increase on the GPU, and a small increase on the memory as well for some reason (maybe the voltage increases a tad on the vmem too?). My load temps did not change at all; however, I am using an NV Silencer, not stock cooling. I am extremely happy that I took the time to do this, as the results are amazing.

Here's my new 3DMark05 score: 8551
http://service.futuremark.com/compare?3dm05=1678013

Compared to my old score of 8119
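For context, those two scores work out to roughly a 5% improvement; a quick back-of-the-envelope check:

```python
# Percent improvement from the volt mod, using the 3DMark05 scores above
old_score = 8119
new_score = 8551

gain_pct = (new_score - old_score) / old_score * 100
print(f"3DMark05 gain: {gain_pct:.1f}%")  # about 5.3%
```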

As usual, do this at your own risk! For those of us with eVGA cards, dig in, as BIOS flashing does NOT void the warranty as far as I've heard!

Not sure if this is a repost, sorry if it is...
 
No, not at all. It's actually very minor compared to hardware volt mods. As I said, I'm seeing NO changes in temps at all. If you don't like it, you can always flash back...

Stock voltage for 3D load is 1.4v, so it's only a 0.1v change.
 
Well I remember taking a 6800 GT from 1.3 to 1.4 volts damn near made the thing melt down! I'll try this as soon as I get a real video card...
 
I saw a small 1 or 2 degree bump, but that could very well be due to ambient inconsistencies.
 
Wow, I had always believed that it wasn't possible to alter the core/mem voltage on 7800 cards. I'll definitely be trying this on my VF700Cu-equipped eVGA 7800GT. :)

I'll let you guys know the results...
 
you will be pleasantly surprised at the gains such a simple mod will give you...
 
Do you know if it works for the eVGA 7800GT CO? I would assume so, but I thought I might as well ask.
 
Budwise said:
Yes, I have done it to two 7800GT COs already, both successfully.
what kinds of gains did you get on them? I just bought a 7800 GT CO today and I'm looking forward to trying this mod.
 
Did you use a floppy? I'm still looking to up my 6600GT's voltage, as I had reached the core's limit of 575MHz and it wouldn't go higher without artifacting, even though my temps were still great (VF700Cu). I haven't taken a ride out to Fry's since I made the thread asking about the NVIDIA BIOS editor. :( (need a floppy drive)
 
Thanks for the links above. I'll try this tomorrow and post some before and after speeds/temps.
 
Isn't the stock score for a 7800GTX in the upper 8k range? If so, this voltmod could put the 7800GT within striking distance of a GTX.
 
Uh... so you get more frames per second when you enable antialiasing and anisotropic filtering?
 
I flashed my two 7800GTs with this mod and it detected 462/1222 as the optimal speed. Just to be safe I set them to 450/1200 and will see how they work this way with F.E.A.R.
:p :D

Update: I just tried playing F.E.A.R. for a while and the computer crashed, so I flashed the cards back to default and set them to 450/1100, which has worked fine for the last month. Oh well, at least I tried. LOL!!! :( :eek:
 
With my GT CO I've been playing all kinds of games with no issues. Still one of my favorite mods :)
 
I'll try this when I get my Zalman on, but for now I'll keep to my OC with stock cooling.
 
Better-than-stock cooling is probably a good idea with this mod... still, I have not seen any temp increases, though...
 
Any way to do this without a floppy? I don't have one of those old things in my comp.
 
I guess you could try to do it on CD. I know you can make bootable DOS CDs; I've just never done it. I keep a floppy drive in my closet and only break it out for such occasions.
 
I'm not sure. Read through a few of the pages of the other threads I've posted. They give a lot more info and results.
 
I finally did the volt mod.

before @ 465/1.18GHz:
idle 41C
load 62C (rthdribl 1022x766 window until temp stabilized)
CS:S 1600x1200, everything high, reflect all
0xAA, 2xAF 139fps
4xAA, 16xAF 118fps

after @ 490/1.2GHz:
idle 42C
load 66C (rthdribl 1022x766 window until temp stabilized)
CS:S 1600x1200, everything high, reflect all
0xAA, 2xAF 142fps
4xAA, 16xAF 121fps

96redformula said:
I was able to go from 505 to 530 but it would artifact... so the old BIOS went back on!
I also gained 25MHz on the core and almost 200MHz on the memory, but it's not worth it. The gain is pretty insignificant overall. I flashed back to the original BIOS. Maybe if I upgrade from the reference cooler in the future I'll try again.
 
OK so I did the volt mod right after my post in this thread the other night.

The card is an eVGA 7800GT (p/n ends in 516) that I got as part of the infamous Newegg eVGA nForce 4 SLI + 7800GT combo. The stock cooler has been removed and replaced with a Zalman VF700 Cu (all copper). The thermal interface material is Arctic Silver 5. The BGA memory modules have anodized aluminum ramsinks on them, courtesy of the VF700 retail kit.

It should be noted that this card ships at stock speeds of 445/1070, already higher than the reference 7800GT speeds.

The BIOS mod went smoothly and without issue, aside from the fact that I had to make a bootable DVD-RW with MS-DOS on it so that I could use nvflash. I used the new version of NiBiTor to edit the BIOS; the new version doesn't require you to swap the card ID around in order to get the volt mod.

Before the BIOS volt mod the card would reach a max safe overclock of 492/1.165 (plus or minus a few MHz) using the Coolbits 'Detect optimal frequencies' button. After the volt mod the button returned around 508/1.195 (again plus or minus a few MHz). I decided to really put this to the test and started rthdribl.exe yesterday morning before I left for the day... risky, yes. I'm happy to say that after ~20 hours of running there was not one sign of artifacting or instability. My computer room was hot as hell though. :D

Why use rthdribl.exe to burn in and test for artifacts? Well, I find it to be a great stress on the GPU, even after 3 years. Leaving it running @ max AA will really torture even the newest, most badass GPU. Because of the effects used in the demo, it also has this amazing tendency to highlight even the smallest amount of artifacting. For instance, when you are pushing the core too far you will start to notice bright white 'sparks' appearing in the 3D environment. And let's face it, the demo is friggin cool as hell to look at. :)

I also ran 3DMark 03 and 05, and Half-Life 2 at highest detail to look for artifacting.


Anyway, now for something a bit more interesting. Apparently the 'burn in' effect is in full force, because after the 20-hour stress period the 'Detect optimal frequencies' button now consistently returns a value around 513/1.210. This sticks through reboots, returning to stock speeds, etc. Pretty friggin cool.

Yes, I realize that the automated overclocking features of Coolbits aren't really to be relied on, and that in fact the card is probably capable of a few more MHz. I plan to push it more over the next few days. :)
 
What is this GPU stress test you talk of Blue Falcon? Link?

I have the same card as you but still only stock cooling; should I go for it anyway?

I have it OC'ed to 475/1.12 right now and temps are fine, like 40 idle and 60 load I believe, maybe a little less.
 
iRoNeTiK said:
What is this GPU stress test you talk of Blue Falcon? Link?
http://www.daionet.gr.jp/~masa/rthdribl/
The same one I used in the post above his. :p

Make the window as large as possible if you're monitoring the temperature in the nvcpl at the same time. Small windows running rthdribl aren't as stressful as running it larger. It also supports full screen mode higher than 640x480 if you use the configure display options. It won't run at 1600x1200 with default options on 256MB video cards.
 
The 0.1v increase means a ~15% increase in power draw, plus a smaller increase from the clock rate boost. Which makes me wonder how accurate the figures are from those reporting only a 1-2 degree rise in load temps.
 
masher said:
The 0.1v increase means a ~15% increase in power draw, plus a smaller increase from the clock rate boost. Which makes me wonder how accurate the figures are from those reporting only a 1-2 degree rise in load temps.
I had a 4C increase in load temperature with reference cooling after the mod, about a 15% increase. Along with the ~5% increase in power from the extra 25MHz overclock, I'm going to guess that the "1.5v mod" is giving slightly less than 1.5v on 7800GT cards.

Mine makes sense. ;)
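The 15% figure follows from the usual rule of thumb that dynamic power scales with voltage squared and linearly with clock. A quick sanity check (the 500/525MHz clocks are example numbers for the 25MHz overclock mentioned above, and this simple model ignores leakage and vmem power):

```python
# Rough dynamic-power model: P is proportional to V^2 * f
# (ignores static leakage and memory power, so treat it as an estimate)
v_stock, v_mod = 1.4, 1.5          # 3D load voltages discussed above
f_stock, f_mod = 500.0, 525.0      # example clocks: a 25MHz overclock

volt_factor = (v_mod / v_stock) ** 2   # contribution from voltage
clock_factor = f_mod / f_stock         # contribution from clock speed
total_factor = volt_factor * clock_factor

print(f"from voltage alone: +{(volt_factor - 1) * 100:.0f}%")   # ~15%
print(f"from clock alone:   +{(clock_factor - 1) * 100:.0f}%")  # ~5%
print(f"combined:           +{(total_factor - 1) * 100:.0f}%")
```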
 
Blue Falcon said:
OK so I did the volt mod right after my post in this thread the other night. *snip*


NICE results!
 
Budwise said:
NICE results!

Thanks...

Looking at your sig shows that you've got quite a nice memory overclock going on your 7800GT. Pretty impressive for sure. :)
 
I did this (FINALLY) and got a full 7MHz out of it! :rolleyes:

Man, my card sucks... At least it's rock solid (I think) at 484 instead of 475. Still sucks, seeing as my stock speed was 470 AND I have it on water.
 
Let it sit for a little bit with its newfound voltage, then try again. Mine started out autodetecting at 493, and now it does 500...
 