HOWTO: Edit and flash your video card's BIOS in Windows (nVidia/G80 Specific)

So as long as I have a PCI video card I'm good? Wicked.
BTW, what's better about having the Ultra BIOS than a GTX OC?

Absolutely nothing. You're better off setting new clocks with your original BIOS -- less chance of failure, fewer things to go wrong.
 
And in case anyone was wondering, this works fine with SLI too. I just flashed both my cards from within Vista64 and did not have to remove them.

When you issue the command, it searches for hardware matching the BIOS and asks you to proceed, one card at a time. Just hit 'y' at the first request, then 'y' again at the second (you only have to issue the nvflash command once).

I then saved the newly flashed ROM from both cards and verified that they both indeed took the new clock speeds.
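For anyone who wants the actual commands, the sequence looks roughly like this -- this is the 5.x-era nvflash syntax from memory, so check your version's built-in help, and the filenames are just examples:

nvflash -b backup.rom (back up the current BIOS to a file first; newer builds use --save instead of -b)
nvflash modified.rom (flash the new BIOS; answer 'y' at each card's prompt)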
 
I highly recommend that too.

Save the original BIOS from your card, then mod that BIOS to the clocks you want. Works great. I've flashed my card many times that way with no problems.
 
Why not just use RivaTuner? Find a stable OC, then just save it to be applied at startup and you're good to go.
 
Because:
does it work with SLI?
you have to have it run at startup?
if you reinstall Windows, you have to clock again?
There are lots of things :)
 
Because:
does it work with SLI?
you have to have it run at startup?
if you reinstall Windows, you have to clock again?
There are lots of things :)

Good points, so what works for some might not work for others. Got it!!
 
So I am about to overclock my 8600GTS from its stock 710 core to 730, and from 2000 mem to 2200. I tried it using RivaTuner and nTune, and it seems stable. Heat has not increased (maybe 1 degree). I am tired, though, of Vista warning about RivaTuner loading at startup and of it shutting down nTune too...

do you think this is a safe overclock?

thanks,
 
Sure did!! I did it but BE SURE to lower the memory speed to 1025 to keep it stable.

Mine is running 650/1650/1025 with no issue.

I have a question (or a few) for you guys. I just got my EVGA 8800GTX and I was using RivaTuner. I put it up to Ultra settings and ran 3DMark06, and noticed artifacting. I was recording the temps of the card while it was doing the test, and during the lower FPS drops the card ran at about 80-82 degrees Celsius.

So I just dropped it down to 600 core and 1003 memory, and now the card temp is reading 66 degrees, but the most it goes up to now is maybe 77 degrees. What would be a dangerous temperature for the card, what are average temps for the GTX, and is it any different just dialing in the clocks with RivaTuner versus flashing the BIOS?

P.S. Also, the huge temperature difference (11 degrees) between idle and load kind of worried me. Is that normal?
 
80 degrees max, I would say, at whatever the highest clocks you can get are :)
Idle and load will be a big difference. Mine's 50/80 degrees idle/load!
 
thanks for the guide!

Now, I have 2 G80-Ultra heatsink/fans from work, and so I want to OC my GTXs to Ultra settings, but I was worried about cooling. I assume the G80-Ultra heatsinks would be adequate on my GTXs?
 
thanks for the guide!

Now, I have 2 G80-Ultra heatsink/fans from work, and so I want to OC my GTXs to Ultra settings, but I was worried about cooling. I assume the G80-Ultra heatsinks would be adequate on my GTXs?

There's nothing special about the Ultra cooler. You can probably reach default Ultra clocks on the core and shaders, but perhaps not the memory.
 
Well I edited my stock bios to 576/1400/900 and I have it OCed to 655/1566/1080 for 24/7 use. For benching I can get it up to 675/1620/1125. I might vmod the gpu so I can get core over 700. I'm watercooled and my temps are 40c idle and 45c load.

E6600 @ 3.896GHz (I can't wait for either Q6600 or E6850) w/ 8800GTX @ 675/1125

3DMark06=13,730

Really nice guide BTW. ;)
 
So the fan settings in the bios editor won't work?

I'd like to flash my card, but if the fan won't go to 100% while under use I don't want to bother and will just stick with rivatuner for both.
 
I have a question (or a few) for you guys. I just got my EVGA 8800GTX and I was using RivaTuner. I put it up to Ultra settings and ran 3DMark06, and noticed artifacting. I was recording the temps of the card while it was doing the test, and during the lower FPS drops the card ran at about 80-82 degrees Celsius.

So I just dropped it down to 600 core and 1003 memory, and now the card temp is reading 66 degrees, but the most it goes up to now is maybe 77 degrees. What would be a dangerous temperature for the card, what are average temps for the GTX, and is it any different just dialing in the clocks with RivaTuner versus flashing the BIOS?

P.S. Also, the huge temperature difference (11 degrees) between idle and load kind of worried me. Is that normal?

Thing is, I'm on water, not on air. ;)
 
So the fan settings in the bios editor won't work?

I'd like to flash my card, but if the fan won't go to 100% while under use I don't want to bother and will just stick with rivatuner for both.

I've never gotten them to work, and I've heard from techs at EVGA that the fan speed is set in the drivers once Windows loads anyway. Now they could have been blowing (no pun intended) smoke up my ass but I tend to believe them based on my own results.
 
Really nice guide BTW. ;)


Thank you sir, thank you much. :)

I've still got a few updates I want to make to the original post but I've been really busy with some medical issues of my own, plus dealing with Dad's cancer. Both of which are very serious business. :(

But I'll get to it, promise!
 
Hi there, I have posted a few times in this thread, but I have just been inspired.
After seeing an 8800 Ultra retail for just £20 more than a GTX, I want one!!!
But my GTX won't do SLI, so would flashing it with the Ultra BIOS sort this?
Cheers
 
Hi there, I have posted a few times in this thread, but I have just been inspired.
After seeing an 8800 Ultra retail for just £20 more than a GTX, I want one!!!
But my GTX won't do SLI, so would flashing it with the Ultra BIOS sort this?
Cheers

I don't advise doing it, since Ultras are hand-picked to meet certain criteria. What do you mean your GTX won't do SLI? You bought an Ultra already and want to run SLI with a GTX flashed to Ultra?
 
I got a GTX and want to buy an Ultra. Can I flash my card with the Ultra BIOS so I can do SLI?
 
Hmmm, do some quick research to see if Ultras run a higher GPU voltage. I know you could do this when the 7900 GTOs came out and people were flashing them to 7900 GTXs, so it can work. So make sure that your GTX can do the same OC numbers as the Ultra you're getting, and good luck!!
 
Turns out the 9800GTX will own all, so I'll get that.
I know the GTX can run the Ultra BIOS, but would SLI like that work?
 
Hi there, I have posted a few times in this thread, but I have just been inspired.
After seeing an 8800 Ultra retail for just £20 more than a GTX, I want one!!!
But my GTX won't do SLI, so would flashing it with the Ultra BIOS sort this?
Cheers


Speaking strictly technically, the cards are the same, except I think the Ultra might have reduced-latency memory on it. You have two choices for getting this to work:

1. Flash the Ultra with a normal GTX BIOS. That way both cards would work in SLI.
2. Flash the GTX with the Ultra BIOS - if it can't support Ultra clocks then use NiBitor to lower the clocks in the Ultra BIOS before flashing it.

When operating in SLI mode both cards will run at the slower card's speed anyway.

This is all theoretical and assumes that there is nothing in the Ultra BIOS to prevent it from working properly on normal GTX cards.
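One practical note if anyone tries the cross-flash: nvflash compares the IDs embedded in the BIOS image against the card, and flashing an Ultra image onto a GTX may trip that check. If memory serves, the 5.x-era flasher had override switches for exactly this situation:

nvflash -4 -5 -6 ultra_mod.rom (-4/-5/-6 skip the built-in ID/version mismatch checks; the filename is just an example)

Confirm the switches against your own version's help output before relying on them.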
 
People have flashed GTXs with Ultra BIOSes before -- it doesn't break anything. Just make sure the GTX runs stable at Ultra stock clocks beforehand via RivaTuner and a nice long testing session, or edit the Ultra BIOS before flashing, as Blue Falcon said.

I'd also suggest running the GTX in the top slot on your motherboard, as the secondary card tends to run hotter. Be careful even after you've tested and both cards are running in SLI, though: both cards tend to run hotter in SLI than one card alone, but the secondary card runs hottest. Hence putting the Ultra in the bottom slot, since it should take the added heat better, having run cooler to begin with.

Also, they /may/ play nice out of the box -- the Ultra may just downclock itself to match your GTX... but I kind of doubt it.
 
Why would you flash your GTX with an Ultra BIOS? You're just lowering your max overclock due to the high memory speed of an ultra.
 
Why would you flash your GTX with an Ultra BIOS? You're just lowering your max overclock due to the high memory speed of an ultra.

For the express purpose of making it play nice with your new /real/ Ultra in SLI? Can't think of any other good reason, really.
 
I'm using Vista Home Premium 64-bit and I get this error message when trying to back up my old BIOS: "ERROR: No NVIDIA display adapters found"
I've disabled driver signature enforcement and run the command prompt as administrator.

When trying to read the bios from the card in Nibitor I get the error message "Can't start driver 1275".

ATI Tool sees my card in current configuration so I know that driver signature enforcement is disabled.

This is my setup:
CPU: Opteron 175
Mobo: DFI Lanparty UT NF-4 SLI-DR
Video: EVGA 8800 GTS 640

I'm all out of ideas. Can anyone figure out what I'm doing wrong?
 
Bump. This ought to be sticky...

I haven't put together my new system yet, but I'm curious whether there is any way to change the voltage on the GF8 series via BIOS. I asked over at the host site for the BIOS editor, and the developer said he thinks there are just two VIDs, and the 8x00 cards use the higher one (here). In theory that would mean that one could undervolt the card with a BIOS mod. This could be very useful for an HTPC/server setup that needs some graphics capability for gaming but usually sits around doing DVR stuff.

So, has anyone tried modding the voltage via BIOS yet?
 
Bump. This ought to be sticky...

I haven't put together my new system yet, but I'm curious whether there is any way to change the voltage on the GF8 series via BIOS. I asked over at the host site for the BIOS editor, and the developer said he thinks there are just two VIDs, and the 8x00 cards use the higher one (here). In theory that would mean that one could undervolt the card with a BIOS mod. This could be very useful for an HTPC/server setup that needs some graphics capability for gaming but usually sits around doing DVR stuff.

So, has anyone tried modding the voltage via BIOS yet?

I don't believe the VID values in the G80 BIOS actually do anything - the only way to make core or memory voltage adjustments is via a hardware modification.
 
First of all, nice manual.

Second, I've heard that my Gainward 8800GTX has the same hardware components as the 8800 Ultra. Is that correct? The only real difference is the cooler and the voltage settings?

Third, does anyone have the 8800 Ultra BIOS, or know where I can find it?
 
First of all, nice manual.

Second, I've heard that my Gainward 8800GTX has the same hardware components as the 8800 Ultra. Is that correct? The only real difference is the cooler and the voltage settings?

Third, does anyone have the 8800 Ultra BIOS, or know where I can find it?

Don't bother using an Ultra BIOS, since the Ultra BIOS is mostly just an OCed GTX BIOS. Take your original BIOS, change the clocks (to stable ones), and flash that modified GTX BIOS that you've created.
 
I want to flash my 7900 GS OC from BFG to GT level to unlock the pipelines that are turned off! Everywhere I check states that they're not laser-locked and that they're only disabled via the BIOS; RivaTuner doesn't seem to like doing much with them!

I tried flashing the BIOS to a BFG 7900 GT-OC and I was able to boot, but it artifacted about 95% from the initial VGA BIOS screen onward!

It's water-cooled and ramsinked, so I know it was not overheating! It was just going nuts!

I would like to use my stock BIOS and just flip the bits to open those pipelines, but does anyone know which bits, and what to set them to?!

My card makes 1600MHz memory easily without even a care for temps and no artifacts! Core clocks I have pushed from 540 to 570 without issue; I don't know exactly how far it will go, however!

HOW DO I ACTIVATE THOSE PIPES IN THE BIOS!
 
Backing up the BIOS can be a lot easier with NiBiTor. Simply run NiBiTor (in Vista, make sure you run it as administrator).

Tools --> Read BIOS --> Select Device... (select your video card then hit OK)

once that's done, go to:

Tools --> Read BIOS --> Read into NiBiTor

then from there you can save your bios to a .rom file.

NiBiTor version 3.6 just came out yesterday :)
 
I also have a question that I cannot find an answer to. Which do you think is better:

A) modifying my 8800GT Superclocked BIOS to change the speeds to the Super Superclocked (SSC) values, then nvflashing the new BIOS.

B) nvflashing an 8800GT Super Superclocked (SSC) BIOS directly onto my SC card.

In general these two methods should be identical, but I just realized that the SSC card may have a slightly higher voltage than the SC. Maybe that extra voltage is the main factor in the card's stability. In other words, method B would be more effective because it also changes the voltage of the card.

So can somebody explain whether my theory is true or not? Thanks :)
 
I have an SC 8800GT running at almost SSC speeds. Isn't that the same as flashing the BIOS?
 
Backing up the BIOS can be a lot easier with NiBiTor. Simply run NiBiTor (in Vista, make sure you run it as administrator).

At the time I originally wrote the howto, NiBiTor (v3.4a IIRC) wouldn't fetch the BIOS from any card in Vista, at least not in the 64-bit version of the OS. I haven't tried it again in Vista since I switched back to XP for the time being.

I haven't had time to update and/or revise the howto lately as I've been busy in Real Life. :eek:
 
The latest NiBiTor won't read an 8800GT BIOS from Vista 64.

You can save it with nvflash, though.
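Something along these lines (the exact switch varies by nvflash version, so check its help output; the filename is just an example):

nvflash -b backup.rom

On newer builds the equivalent is nvflash --save backup.rom.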
 
I also have a question that I cannot find an answer to. Which do you think is better:

A) modifying my 8800GT Superclocked BIOS to change the speeds to the Super Superclocked (SSC) values, then nvflashing the new BIOS.

B) nvflashing an 8800GT Super Superclocked (SSC) BIOS directly onto my SC card.

In general these two methods should be identical, but I just realized that the SSC card may have a slightly higher voltage than the SC. Maybe that extra voltage is the main factor in the card's stability. In other words, method B would be more effective because it also changes the voltage of the card.

So can somebody explain whether my theory is true or not? Thanks :)

The GPU voltage on the 8800 cards isn't controlled by the BIOS or via any software method - it's hardwired. Yes, NiBiTor will appear to let you alter the GPU core voltage in the G80 BIOS, but it actually doesn't do a thing. As far as I know this also holds true for the G92-based 8800GT.

With that being said, either option A or B should net you exactly the same results, so just do whichever is easiest for you. :)
 