AMD Radeon HD 6950 to HD 6970 Mod

Godmachine and Mancannon, are you running 6970 bios or unlocked 6950?

What's scary about the ram being permanently damaged is that it's essentially unprecedented for higher memory speeds, without higher voltage or heat, to cause physical damage. Typically you see some artifacts, clock it back down a bit, and everything is golden.

6970 BIOS. No issues to report at all.

Well, maybe the Nvidia and AMD board vendors are sick of gamers overclocking the piss out of their cards, increasing RMAs, and also being able to buy a lesser card and gain 90-95 percent of the speed of its far more expensive cousin.

Of course, that's just wild speculation.
 
Just got my 6950 installed. The shaders unlocked successfully; I'm keeping it at 6950 speeds. Not gonna do the 6970 bios, not needed for my setup.
 
I flashed to an unlocked Asus 6950 bios, currently doing 925/1300 at the stock 6970 voltage of 1.17 V. I will wait for additional reports on the memory before I go whole hog on it.

On a side note, stress testing your overclock is a PITA because the ATI driver doesn't crash gracefully like Nvidia's does. If I pushed my 470 too far in OCCT, some white artifacts appear in the stress test screen and I turn the OC down or voltage up. If I push the 6950 too far, BAM, fucking crash.
 
Just got my 6950 installed. It successfully unlocked the core at 6950 speeds. Not gonna do the 6970 bios, not needed for my setup.

When I flashed mine, the clocks remained the same as the 6950 but the shaders were unlocked. I increased the clocks to 880/1375 in Afterburner and everything ran fine until, a few loops into Heaven, I got a GSOD, and then a freeze in 3DMark11. Turned the ram down to 1350 and everything is fine, but I'll go down to 1312 since that's about a 5% OC over 6950 ram clocks (1375 is a 10% OC). People with weaker 6950s that can't run those higher vram clocks could damage their cards. I recommend everyone lower their ram clocks before benchmarking the card after flashing, until MSI enables voltage tweak. I bought the MSI version with the better caps, and the card seems more stable than the other stories I've heard.
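For reference, the percentages above work out from the 6950's stock 1250 MHz memory clock; a quick sanity check:

```python
# Sanity check of the memory OC percentages quoted above.
# The 6950's stock memory clock is 1250 MHz.
stock_mem = 1250

def oc_percent(clock, stock=stock_mem):
    """Return the overclock over stock as a percentage."""
    return (clock / stock - 1) * 100

print(f"1312 MHz -> +{oc_percent(1312):.1f}%")  # ~5% over stock
print(f"1350 MHz -> +{oc_percent(1350):.1f}%")  # ~8% over stock
print(f"1375 MHz -> +{oc_percent(1375):.1f}%")  # 10% over stock
```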
 
I'm running a Sapphire 6950 with unlocked shaders (not 6970) @ 865/1320 and it seems fine. My concerns are:

1) Does anyone know the spec on the Hynix H5GQ2H24MFR-T2C memory used on these? Not the rated speed, as the 5 Gbps figure is well reported, but what does the industry build in as a safety margin? Is it rated at 5 but expected to run 10% faster, or whatever, without failure? It seems odd that the 6970 mem is rated at 6 and running at 5.5, but the 6950 mem is rated at 5 and running at 5.

It's to give OC headroom on the memory I suppose.

2) Should I back the mem off to closer to 1250, since it is not "rated" to do more than 5 Gbps, to give the card a chance to age gracefully instead of flaming out?

I'd go very easy on the memory. I killed my 6950 by clocking the memory too high (1350+).

3) I have a second card due to arrive today and will x-fire and unlock it as well. I've read so many reviews the past few days that I'm afraid I'm not retaining all of the information... but my PSU is 650 W; will this be sufficient? I'm guessing yes, but not by a whole lot.

2 unlocked 6950s will pull as much power as 2 6970s. Say 500 W at max load for both GPUs, then a quad-core CPU at 140 W, and you're pretty close to your PSU's limit. That's before even counting the RAM, HDDs, mobo, or other components.

I'd pick up a nice 850w PSU if I were you.
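The rough budget above can be tallied like so (the wattages are the poster's ballpark figures plus a guessed allowance for the rest of the system, not measured draws):

```python
# Rough PSU budget using the ballpark figures from the post above.
# These are illustrative estimates, not measured numbers.
loads_w = {
    "two unlocked 6950s (max load)": 500,
    "quad-core CPU": 140,
    "RAM, HDDs, mobo, fans (guess)": 60,
}

psu_w = 650
total = sum(loads_w.values())
headroom = psu_w - total

# A negative headroom here is exactly why an 850 W unit is the safer buy.
print(f"estimated draw: {total} W on a {psu_w} W PSU ({headroom} W headroom)")
```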
 
Is this a decent enough upgrade from a 5770 to warrant upgrading to a 6950 (stock)? I wouldn't particularly mind if I couldn't unlock it, just another form of overclocking for me (gravy).
 
Is this a decent enough upgrade from a 5770 to warrant upgrading to a 6950 (stock)? I wouldn't particularly mind if I couldn't unlock it, just another form of overclocking for me (gravy).

What resolution do you play at?
 
What resolution do you play at?

1280x1024, which is usually fine unless I crank up all the settings to Ultra/Quality/SuperDuperUltimateVideoSettings, then I can see dips into the 15-20 FPS range.

I might upgrade my monitor too, though...it's not even DVI
 
Is this a decent enough upgrade from a 5770 to warrant upgrading to a 6950 (stock)? I wouldn't particularly mind if I couldn't unlock it, just another form of overclocking for me (gravy).

I would imagine at that low a resolution you're able to play all games maxed out with a 5770. If you can't, a 6950 would definitely be up to the task, though I would think a 5870 could do it and be cheaper at the same time.
 
Well, I was flashed to the 6970 bios, but just flashed back to an unlocked 6950 via the Asus bios. Now I am stuck at 840/1325 max overclocks in every program (Smart Doctor, MSI, and CCC). How are you all getting higher OCs? I was stable at 900/1410 with the 6970 bios, so I was going to try to get close to that again.
 
1280x1024, which is usually fine unless I crank up all the settings to Ultra/Quality/SuperDuperUltimateVideoSettings, then I can see dips into the 15-20 FPS range.

I might upgrade my monitor too, though...it's not even DVI

You need to upgrade your monitor first, definitely.

No point in spending so much money on video cards just to watch a mediocre or ugly quality image on a small screen.
 
Well, I was flashed to the 6970 bios, but just flashed back to an unlocked 6950 via the Asus bios. Now I am stuck at 840/1325 max overclocks in every program (Smart Doctor, MSI, and CCC). How are you all getting higher OCs? I was stable at 900/1410 with the 6970 bios, so I was going to try to get close to that again.

You're only unlocking the shaders, not the clocks. AMD limits the core clock on the 6950 to 840.
Just use the 6970 bios and downclock the vram.
 
Using the Sapphire bios and stock voltage I've been able to get up to a gaming-stable 915/1450 on my card. I can't wait to get legitimate voltage control. I feel these cards can hit 950 on the core easily, as my temps are in the low 80s at 40% fan (which is exactly the same noise level as my room).
 
So re-flash to the 6970 bios, but use Smart Doctor (AFAIK the only program I've seen so far with voltage control) to bring the voltage back down to 1.1 V?
 
1280x1024, which is usually fine unless I crank up all the settings to Ultra/Quality/SuperDuperUltimateVideoSettings, then I can see dips into the 15-20 FPS range.

I might upgrade my monitor too, though...it's not even DVI

The 6950 will certainly max out anything at that res... The 6870 will also do the same for much less.

Please at least get a 20" widescreen 1680x1050 monitor. How can you subject yourself to 1280x1024!
 
So re-flash to the 6970 bios, but use Smart Doctor (AFAIK the only program I've seen so far with voltage control) to bring the voltage back down to 1.1 V?

Trixx and Smart Doctor have voltage control, so you will have to use the Asus or Sapphire 6970 bios.
I'm waiting for MSI to update AB before I raise my clocks.
 
Well, the card is a Sapphire card to begin with; I only flashed the Asus bios because of the voltage control. I wasn't aware that Trixx was available with voltage control for the 6 series yet.
 
Trixx 3.0.2b with 6900 support is on their website. The Sapphire 6970 bioses are working for me on both cards, so I'm happy. Not looking to do more than 880/1375.
 
Nice, yeah, I just tried out TriXX. It doesn't allow any overvolting, but it does let you clock higher than the limited values in CCC. Using an unlocked Asus bios (might flash back to Sapphire eventually) I am running the stock 6970 clocks of 880/1375. Furmark causes the drivers to crash going much higher on either clock, so I am pretty content with those. Not too shabby, I suppose, for a little peace of mind about the ram not being overvolted.
 
Trixx 3.0.2b with 6900 support is on their website. The Sapphire 6970 bioses are working for me on both cards, so I'm happy. Not looking to do more than 880/1375.

I am running at this speed too on the same bios. Like you, I am not looking to push any further; judging from 3DMark scores and Heaven benchmarks, there doesn't seem to be much improvement at higher clocks.

I am still hoping the first driver set of this year will bring a performance increase, and once the drivers are better optimized for the 69xx series cards we might see more improvement at higher clocks.
 
I am running at this speed too on the same bios. Like you, I am not looking to push any further; judging from 3DMark scores and Heaven benchmarks, there doesn't seem to be much improvement at higher clocks.

I am still hoping the first driver set of this year will bring a performance increase, and once the drivers are better optimized for the 69xx series cards we might see more improvement at higher clocks.

I get about 4 fps more out of Heaven compared to the 6950 at 840/1325. Metro 2033 is NOTICEABLY smoother at 5996x1200, High/AAA/tessellation, as well. Unsure whether it's due to clock speed, more shaders, or both ;)
 
At 840 core, I noticed in MSI Afterburner that my GPU was actually bouncing around in frequency a lot and not really staying at 840 MHz. It went as low as 700 MHz at times.

First time with an ATi card since my 9800pro 256mb (that is still in the closet). I was fairly confident it was being throttled by drivers so I upped the power setting 1% at a time until it stopped bouncing. Needed to go to 3% extra power.

I thought other people might need to know this, so now it's out there. Check your clocks!
 
First time with an ATi card since my 9800pro 256mb (that is still in the closet). I was fairly confident it was being throttled by drivers so I upped the power setting 1% at a time until it stopped bouncing. Needed to go to 3% extra power.
Be a man. Drag it to 20%.
 
Using 2 XFX 6950s in crossfire, FYI.

Successfully flashed to the Asus 6970 bios on my XFX 6950, but it was definitely running marginally hotter at the stock speeds. I left it at the 880/1350 speeds with 0% in powertune and then tried 10% and 20% bumps - almost no difference whatsoever in Unigine Heaven 2.1. I left it on 0% because of this.

After a few days of gaming with zero issues, I decided I didn't like having to run the fan a little louder in order to maintain my temps, and I was a little concerned with the 1350 MHz RAM speeds, so I reflashed with the XFX 6950 bios with unlocked shaders. Kept the stock speeds (800/1250) and benched again in Unigine Heaven and got results only about 4-5 fps below where I was with the stock 6970 settings (78.X avg vs 82.X avg). I then bumped the speeds to the max that powertune and MSI afterburner would allow (840/1325) and got 81 fps.

So here are my findings:

1. I couldn't see any significant result from playing with the powertune, whether it was with the stock (locked) 6950, the 6970, or the unlocked 6950 bios.

2. The increase in voltage from the 6970 bios did raise my temps by about 5-10 degrees under heavy load, making louder fan speeds necessary. This, of course, went back to stock temps with the unlocked 6950 bios.

3. The difference between 840/1325 and 880/1350 with unlocked shaders wasn't significant at 1920x1200 with Tessellation/DX11/4xAA/16xAF. It might have been just outside of statistical variance.

4. I saw no artifacts at all in either bios condition after 4-5 days of light-moderate gaming in the following: Mass Effect 1, Crysis Warhead, Neverwinter Nights 2, Metro 2033, and Unigine Heaven bench. The only thing that I did see was a _slight_ bit of stutter when games or new areas loaded. I can't be sure whether it was hitching or stutter (as defined by Tweakguides)

5. Initially I followed Wizzard's suggestion of flashing each bios in the crossfire setup with only one card in the system at a time, but this didn't work AT ALL. The quick, painless way to do it that worked perfectly was to just do them both at one time in the DOS prompt, using -0 for one card and -1 for the other.
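The both-cards-at-once flash described in point 5 would look something like this at the DOS prompt. This is only a sketch: the `.rom` filenames are placeholders, the flash utility is assumed to be the usual atiflash DOS tool, and the `-0`/`-1` adapter selectors are the ones mentioned in the post above; check your flasher's own help output for its exact syntax before running anything.

```
REM Flash both crossfire cards in one DOS session (filenames are placeholders).
REM -0 / -1 select the first and second adapter, per the post above.
atiflash -0 6970bios.rom
atiflash -1 6970bios.rom
```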


As an aside, I see a lot of people talking about overclocking the 6950 bios to stock 6970 levels (880/1350). How is this possible? My powertune and MSI afterburner apps only let me go to 840/1325 as a maximum.
 
Using 2 XFX 6950s in crossfire, FYI.

Successfully flashed to the Asus 6970 bios on my XFX 6950, but it was definitely running marginally hotter at the stock speeds. I left it at the 880/1350 speeds with 0% in powertune and then tried 10% and 20% bumps - almost no difference whatsoever in Unigine Heaven 2.1. I left it on 0% because of this.

After a few days of gaming with zero issues, I decided I didn't like having to run the fan a little louder in order to maintain my temps, and I was a little concerned with the 1350 MHz RAM speeds, so I reflashed with the XFX 6950 bios with unlocked shaders. Kept the stock speeds (800/1250) and benched again in Unigine Heaven and got results only about 4-5 fps below where I was with the stock 6970 settings (78.X avg vs 82.X avg). I then bumped the speeds to the max that powertune and MSI afterburner would allow (840/1325) and got 81 fps.

So here are my findings:

1. I couldn't see any significant result from playing with the powertune, whether it was with the stock (locked) 6950, the 6970, or the unlocked 6950 bios.

2. The increase in voltage from the 6970 bios did raise my temps by about 5-10 degrees under heavy load, making louder fan speeds necessary. This, of course, went back to stock temps with the unlocked 6950 bios.

3. The difference between 840/1325 and 880/1350 with unlocked shaders wasn't significant at 1920x1200 with Tessellation/DX11/4xAA/16xAF. It might have been just outside of statistical variance.

4. I saw no artifacts at all in either bios condition after 4-5 days of light-moderate gaming in the following: Mass Effect 1, Crysis Warhead, Neverwinter Nights 2, Metro 2033, and Unigine Heaven bench. The only thing that I did see was a _slight_ bit of stutter when games or new areas loaded. I can't be sure whether it was hitching or stutter (as defined by Tweakguides)

5. Initially I followed Wizzard's suggestion of flashing each bios in the crossfire setup with only one card in the system at a time, but this didn't work AT ALL. The quick, painless way to do it that worked perfectly was to just do them both at one time in the DOS prompt, using -0 for one card and -1 for the other.


As an aside, I see a lot of people talking about overclocking the 6950 bios to stock 6970 levels (880/1350). How is this possible? My powertune and MSI afterburner apps only let me go to 840/1325 as a maximum.

You can enable the higher overclocking range in the Afterburner config file by changing the value for unofficial overclocking to 1. Interestingly, with my Asus bios and Smart Doctor, as soon as I enabled the added range in Smart Doctor I was able to scale to those speeds with CCC Overdrive as well.
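For anyone hunting for it, the edit lives in `MSIAfterburner.cfg`; on Afterburner builds of that era the relevant keys sit in the ATI hardware section, roughly like the fragment below (exact key names and EULA wording can vary by version, so treat this as a sketch and check your own file):

```
[ATIADLHAL]
UnofficialOverclockingEULA = I confirm that I am aware of unofficial overclocking limitations and fully understand that MSI will not provide me any support on it
UnofficialOverclockingMode = 1
```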

Since you have a similar display setup to mine and I have only been running Eyefinity for 3 days, could you tell me if you ever see what looks like slight vertical screen tearing in games? It happens at stock clocks as well as OC'd, and didn't go away when forcing vsync and triple buffering with D3DOverrider. I see it about 1/3 of the way in from the left side of my center monitor, and it's pretty subtle. I haven't run anything but Black Ops and Heaven 2.1, but I'm pretty sure I saw it in both. This is with 10.12a, if that makes any difference.
 
rUmX, thanks for the reply. I've been following several forums the past few weeks trying to keep my cards just short of danger, and the ram question continues to elude me. Some mention voltage as the problem and others the frequency. I assume it is actually a bit of both, in that the 6950 ram isn't "rated" above 5 Gbps (yet it is running at 5, so is that the real top end of the spec?) while the 6970 ram is rated at 6 but running at 5.5. So the headroom is built in there, but not on the 6950, knowing full well we all push these cards at least a little.

Throw in the dual bios switch (which is friggin awesome in its own right) and it is either an outright invite to have our way with it or a come hither look and a whisper to our own ruination.

I have no industry knowledge or anything, but I'd hazard a guess that there is a built-in margin of around 10%, in the same way that if my car engine will absolutely explode at 10,000 rpm and the redline is set at 8,500 rpm, they're telling me what is optimal and safe while showing me where the absolute limit is; approach or exceed that number at your own risk. I was clocking the 6950 with shaders at 860/1370. Then I started hearing horror stories (including yours) about ram just frizzing out. I'm waiting patiently for AB to get updated so I can slowly up the voltages and such, but I'm afraid to get too heavy on the ram clock slider.
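The Gbps ratings being discussed map directly onto the memory clocks people are setting: GDDR5's effective per-pin data rate is four times the command clock that the overclocking tools display. A quick conversion using the stock clocks reported in this thread:

```python
# GDDR5 effective data rate is 4x the clock shown by tools
# like Afterburner/CCC (the chips are quad-pumped).
def gbps(mem_clock_mhz):
    """Effective per-pin data rate in Gbps for a GDDR5 clock in MHz."""
    return mem_clock_mhz * 4 / 1000

print(gbps(1250))  # 6950 stock: 5.0 Gbps, the chips' rated speed
print(gbps(1375))  # 6970 stock: 5.5 Gbps on chips rated for 6.0
print(gbps(1500))  # a full 6.0 Gbps would correspond to 1500 MHz
```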

On a side note, my cards are idling at 32C, which is about 12C below my 5770s even after I upgraded the TIM on those. With the fan curve in AB keeping them cooled down, I don't notice much noise either.
 
Throw in the dual bios switch (which is friggin awesome in its own right) and it is either an outright invite to have our way with it or a come hither look and a whisper to our own ruination.

I lol'd at this.

I have also been awaiting clarification on what is causing the ram issues. Everything I know about overclocking PC ram and video card ram over the last 10 years suggests that frequency alone should not kill memory. Furthermore, GDDR5 has its own error-correction routines built in that kick in when it's pushed past its maximum speed. What I believe is happening here is that the 6 Gbps ram on the 6970 requires more voltage to hit its rated speed, similar to how most DDR2-1066 needed 2.1 V versus the 1.8 V limit of the JEDEC spec. When you flash the 6970 bios, then, you are overvolting the 6950 ram, and either it can't handle the voltage or it is not adequately cooled to handle the added heat load. If it simply can't take the voltage, then we should all avoid the 6970 bios like the plague, as it is shortening the ram's life like we saw with BH-5 DDR. I personally think it's likely that the overvolted ram simply wasn't being cooled adequately and burned out. I haven't read the threads all that thoroughly, but has anyone seen a correlation between the people who burned out their cards and not having applied custom fan profiles?
 
Asus 6950 flashed to the unlocked Asus shaders bios. My Q6600 is holding me back in the RE5 benchmark, as I went from 4xAA to 8xAA with no change in frames.
 
Asus 6950 flashed to the unlocked Asus shaders bios. My Q6600 is holding me back in the RE5 benchmark, as I went from 4xAA to 8xAA with no change in frames.

Pics for piccy goodness:

img0445pm.jpg


img0450mr.jpg
 
Nice pictures man.

I was unprepared for the size of this beast, it dwarfs my reference 470.
 
Yes, it's the same length as my GTX 280; perhaps I worded it wrong.

That should read "I wasn't expecting the size of this card due to my experience with Nvidia's comparatively petite Fermi cards and the GTX 280 having been the last full length high end card I had purchased" ;)
 
There was plenty of room in my HAF-X for these beasts.

It's keeping them cool too. The hottest I've gotten was 79C on the top card (that was in Furmark, so it could have been throttling back).

The hottest I've seen the top card in gaming is 71C so far; the bottom card averages probably 8 degrees cooler.
 
I'm also looking forward to a 6950 BIOS that is shader-unlocked, core voltage unlocked, and core clocks unlocked. I'm not sure what good it will do, though, if Powertune is going to place an artificial TDP ceiling, which you will presumably hit earlier with overvolting.

Just installed a pair of shader-unlocked 6950s with EK blocks, and temps maxed out at around 40C in Heaven benchmark. That's 15C lower than my 470s when I had them heavily OC'd and overvolted.
 
You can enable the higher overclocking range in the Afterburner config file by changing the value for unofficial overclocking to 1. Interestingly with my Asus bios and smartdoctor, as soon as I enabled the added range in smartdoctor I was able to scale to those speeds with CCC Overdrive as well.

Since you have a similar display setup to me and I have only been running eyefinity for 3 days, could you tell me if you ever see what seems like slight vertical screen tearing in games? This happens at stock clocks as well as OC, and didn't go away when forcing vsync and triple buffering with D3D Overider. I see it about 1/3 of the way in from the left side of my center monitor and its pretty suble. I haven't run anything but Black Ops and Heaven 2.1 but I'm pretty sure I saw it in both. This is with 10.12a if that makes any difference.

Thanks for posting this. Is the script for the unofficial overclocking in the config file obvious? I didn't get a chance to take a look last night, but I plan to look at it after work today.

Actually, I'm in the same boat - I've only been running this setup for a week or so (I upgraded the monitors with the video cards). That said, I haven't noticed what you're talking about. But I thought I read somewhere that Vsync doesn't work with eyefinity, so it wouldn't surprise me if this were possible and even likely.

Oddly enough, I noticed the phenomenon that you're talking about with Firefox yesterday. I have the Hydra grid set up on my three monitor eyefinity display such that I can maximize a window to each of the displays (though it's still only taking up a part of one "virtual display" or group). When I quickly scroll up and down in a lengthy webpage, I get a very clear vertical line that progresses from left to right across the page and repeats from the left again when it reaches the right. I haven't been concerned about this though as I've seen it before with other video cards, so I don't think it's unique to crossfire or eyefinity. I'm sure it's completely unrelated to what you're referring to with games though. If I see anything else in Crysis warhead or one of the others, I'll let you know.
 
I'm looking into unlocking the shaders with the 6950 bios but I wanted to do some tests first.

My card was at stock, and when I ran the OCCT GPU test I noticed my GPU clock was downclocking from 800 MHz all the way down to 550 MHz. I had to turn Power Control up to 15% to get it to stay at 800 MHz while being stressed. I tried 5% and 10% and it still downclocked, just not as low.

Is this normal for a stock card?
 
I'm looking into unlocking the shaders with the 6950 bios but I wanted to do some tests first.

My card was at stock, and when I ran the OCCT GPU test I noticed my GPU clock was downclocking from 800 MHz all the way down to 550 MHz. I had to turn Power Control up to 15% to get it to stay at 800 MHz while being stressed. I tried 5% and 10% and it still downclocked, just not as low.

Is this normal for a stock card?
In GPU stress tests like those, yes, it might clock down from PowerTune (see AnandTech's review).
You should be able to leave it at 0%, and it won't do that in actual games. If it does (I have seen it before), then try running it up to a higher value and see what happens.

As far as I know, the card should never downclock during actual use.
 
Thanks for posting this. Is the script for the unofficial overclocking in the config file obvious? I didn't get a chance to take a look last night, but I plan to look at it after work today.

Actually, I'm in the same boat - I've only been running this setup for a week or so (I upgraded the monitors with the video cards). That said, I haven't noticed what you're talking about. But I thought I read somewhere that Vsync doesn't work with eyefinity, so it wouldn't surprise me if this were possible and even likely.

Oddly enough, I noticed the phenomenon that you're talking about with Firefox yesterday. I have the Hydra grid set up on my three monitor eyefinity display such that I can maximize a window to each of the displays (though it's still only taking up a part of one "virtual display" or group). When I quickly scroll up and down in a lengthy webpage, I get a very clear vertical line that progresses from left to right across the page and repeats from the left again when it reaches the right. I haven't been concerned about this though as I've seen it before with other video cards, so I don't think it's unique to crossfire or eyefinity. I'm sure it's completely unrelated to what you're referring to with games though. If I see anything else in Crysis warhead or one of the others, I'll let you know.

Hydra Grid sounds like exactly what I've been looking for since grouping my displays. Is this included in CCC?
 