6800/6900 Overclock Results

Have you tried adjusting the voltage for the VRAM, in isolation, if that's still possible on the Radeon 6000 series? Also, how are the temperatures of your VRAM and your VRMs?
No, I haven't tried that; not sure if it's even possible, but I'd assume it should be. When I checked the VRAM while mining Ethereum (something that really stresses the Ampere cards), it was rather cool, less than 70C if I remember right. As for the VRMs, no clue; I've never checked them under load with HWiNFO.
 
Same FC5 settings and system as before:
  • Default fan curve: max temperature on the second run after the first, GPU current temp/junction temp 64C/79C. Max fan speed 1257 RPM.
  • Custom fan curve: max 55C/71C. Max fan speed 2971 RPM.
Love how hitting Ctrl + Shift + O gives a monitoring overlay in-game. Plus, Alt + R brings up Radeon Settings right in the game, where you can fool around with Performance Tuning (OCing). Something I miss being able to do on my RTX cards.

Yeah, I'm definitely having weird results in FC5. Chances are it's because my copy is not official and is likely running an old patch from around launch time (the settings screen says v1.011).

I installed Metro Exodus Enhanced edition (which I own on Steam), and I am getting pretty damn great results in it.

(and by great I mean relative to other hardware out there)

There is no built in repeatable benchmark though.

I started out by setting everything to 4k and ultra and VRS off. That resulted in ~35fps, below what I would play at.

I'm not really sure what VRS is (Variable Rate Shading, which reduces shading detail in parts of the frame), but it did nothing for me. Tried 2x and 4x, but nothing about the framerate changed.

Then I lowered RT to Normal, leaving everything else where it was. That resulted in a range of 59 to 70 FPS depending on the scene. Completely playable if you ask me, and I don't know if I can even tell much difference between Ultra and Normal RT.

GPU utilization is pinned, and the GPU is REALLY consuming a lot of power for a game: 350 to 385W. Not as much as in Time Spy, but still quite a lot.


I also tried running Deus Ex: Mankind Divided. I remember playing this recently and it bringing my system to its knees at 4k. Apparently that was 5 years ago (feels like at most 1-2, but that's neither here nor there). In order to hit 50fps vsynced on my old JS9000 (which actually had a 50Hz display mode) using my Pascal Titan X, I had to create a custom letterboxed 21:9 (3840x1646) resolution.

With the 6900xt and its overclock, at 4k with everything maxed to Ultra (except motion blur, which is turned off), using Temporal AA and sharpening, it averages about 120fps. Turn off Temporal AA and sharpening and instead turn on 2x MSAA and it takes a huge hit, going down to 65-70fps. 4x MSAA is in the 40s or below.

The 6900xt is chugging along at ~2670MHz, with utilization pinned and power use at about 300W, with occasional spikes up to 350.

Crazy how large of a difference a few years can make.

So it really sounds to me like I just have shitty FC5 results, presumably due to the build being really old and unofficial, and for that I can blame no one but myself. There is a cost to my launcher/store bloat and spyware protest after all.

Still need to figure out if there is something wrong with the cooler mount. If I get in there and find protective plastic between the water block and the GPU, I am going to laugh my ass off and share pictures.

I'm waiting for Cyberpunk to go on sale. (Still $60, bleh) There was a sale a few weeks ago, but that was before I had a GPU that could run it at 4k, so I decided to wait. If too much time passes, I may cave and buy it at full price. When I am able to get it, I'll share some numbers from that too.
 
Metro Exodus Enhanced does have a benchmark: open the game's install folder and you will see a separate benchmark exe you can run. It has a UI where you can set the settings, resolution, etc.
 

Ah, thank you. I didn't think to actually look for a separate executable. I was poking around the GUI. I'll look for that now.
 

Looks like my early game sample was not reflective of the rigor the benchmark subjects the system to.

Sheesh.

I accidentally ran a huge set of benchmarks, 3 replicates of each, and they finally finished. I'm not sure if this is good or not, but one thing is certainly true: this game is heavy.

1080p High:
Average Framerate (99th percentile): 88.08
Max. Framerate (99th percentile): 150.08
Min. Framerate (99th percentile): 58.44

1080p Ultra:
Average Framerate (99th percentile): 80.72
Max. Framerate (99th percentile): 144.01
Min. Framerate (99th percentile): 51.60

1080p Extreme:
Average Framerate (99th percentile): 67.96
Max. Framerate (99th percentile): 106.37
Min. Framerate (99th percentile): 45.52

4k High:
Average Framerate (99th percentile): 60.05
Max. Framerate (99th percentile): 78.51
Min. Framerate (99th percentile): 44.28

4k Ultra:
Average Framerate (99th percentile): 46.60
Max. Framerate (99th percentile): 63.82
Min. Framerate (99th percentile): 35.68

4k Extreme:
Average Framerate (99th percentile): 39.53
Max. Framerate (99th percentile): 52.98
Min. Framerate (99th percentile): 28.90

4k Custom: (Extreme, but with Raytracing Normal)
Average Framerate (99th percentile): 45.54
Max. Framerate (99th percentile): 59.42
Min. Framerate (99th percentile): 32.05

Looks like I'd need FSR support to make anything but "High" playable at 4k...
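Out of curiosity, the averages above can be dropped into a few lines of Python to see how each preset scales from 1080p to 4k. The dict is just my numbers re-typed; the interpretation is my own guess:

```python
# The average framerates quoted above, re-typed by hand.
averages = {
    ("1080p", "High"): 88.08,
    ("1080p", "Ultra"): 80.72,
    ("1080p", "Extreme"): 67.96,
    ("4k", "High"): 60.05,
    ("4k", "Ultra"): 46.60,
    ("4k", "Extreme"): 39.53,
}

for preset in ("High", "Ultra", "Extreme"):
    ratio = averages[("4k", preset)] / averages[("1080p", preset)]
    # 4k pushes 4x the pixels of 1080p, so a purely GPU-bound run would
    # land near 25% here; anything much higher hints at a CPU/engine
    # limit holding back the 1080p numbers.
    print(f"{preset}: 4k runs at {ratio:.0%} of the 1080p average")
```

High comes out around 68% and Ultra/Extreme around 58%, which suggests the 1080p runs are at least partly CPU limited rather than the GPU being 4x faster at the lower resolution.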
 
Looks like I'd need FSR support to make anything but "High" playable at 4k...
At 1440p it plays well enough with the eye candy on with the 6900XT. I don't have the game on the 6900XT system right now; it's on the 3090 system, where it really shines. DLSS is about the best upscaling I've seen. Want me to post the 3090 numbers?
 
If you get to FC 6, here is some data for 4K

Max settings with motion blur off, RT Reflections and Shadows on, AMD CAS on
  • FSR off
    • Far Cry 6 2021-10-15-2-41-35.jpg
  • FSR Ultra Quality (please note the GPU was not fully loaded at times due to CPU-limited conditions; the game just does not effectively use the full capability of the CPU)
    • Far Cry 6 2021-10-15-2-35-17.jpg
  • FSR Quality (even further limited by the CPU; GPU clocks at times 400MHz lower than without FSR)
    • Far Cry 6 2021-10-15-2-39-14.jpg
VRAM usage at times is over 15GB during the benchmark, which I find unusual. Not sure if an update changed this; before, the max I saw was just over 13GB. Maybe that explains why the 3080Ti performed poorly. Since my 4K monitor is 60Hz with a 40-60Hz FreeSync range, I just use it without FSR (looks better) and use Radeon Chill to keep it in the FreeSync range. Anyways, on the 6900XT this is a very well-playing game and looks great, at least for me. On the 3080Ti I was getting hitches, and RT shadows would cause crashes within seconds at 4K.
 
If you get to FC 6, here is some data for 4K

Good data, thanks.

It's crazy how CPU limited the FC series is.
 
To get back to gaming performance, I ran some tests and played around a little in Far Cry 5 last night, and I am more confused than not.

I'm getting very inconsistent results, but here is what I got with all settings maxed in the benchmark.

View attachment 403084 View attachment 403083 View attachment 403082

View attachment 403081

These results are a bit worse than what I have seen published in reviews (though they don't mention their settings), but that may just be CPU limits. This game is notoriously CPU dependent.

But sometimes when I run the benchmark I get way worse results. Having a window (like RivaTuner stats) open on a second screen seems to have a much larger impact on performance than I am used to. That may just be a peculiarity of this title, but I do not remember that problem from when I played it the first time around.

At my overclock, performance also seems highly temperature dependent. Performance goes way down if I change the fan settings to allow the coolant temperature to rise even a few degrees. I may have to rework the overclock to keep things cooler. Maybe I'll even take the block off and re-apply the paste, as I have little to no faith that the manufacturer did an adequate job there.

I will have to test in some more titles as well. This was just the first one I tried.

Side note: Do you guys enable any of these?

View attachment 403095
Anti-Lag should be on all the time, unless you have a game which shows micro-stutters; then you can try turning it off to see if that is the culprit. Anti-Lag works to keep input lag as low as possible. It's similar to Nvidia's "low latency mode".

Enhanced Sync is how you activate FreeSync/VRR support.

Morphological anti-aliasing is post-process MLAA, which should work with just about any game. It's useful for titles which have no AA built in (Nioh 2 would be a recent example. Also, RE: Village has AA, but it's less good on PC compared to the console versions).

There's also a setting somewhere where you can set a "minimum" clock speed for the GPU. This is a way to help keep the GPU performing its best in games where it might be clocking down or not pulling as much power due to lower utilization, etc.
 
anti-lag should be on all the time, unless you have a game which shows micro-stutters. Then you can try turning it off, to see if that is the culprit.

Interesting info. I never used Low Latency Mode on my Nvidia cards either. I always figured it was something for esports-type stuff and would sacrifice quality to achieve lower lag. I always prioritize quality.

I'll give it a try and see how it looks.
 
Low Latency Mode and Anti-Lag do not affect visual quality; they organize/optimize frame delivery. It's a driver-side tweak. Nvidia's Reflex tech is game and driver optimization done in coordination with the game developer to optimize frame delivery and reduce input latency even further.
 
enhanced sync is how you activate free-sync/VRR support
Enhanced Sync is separate from FreeSync, though it can work along with it: if the framerate is above the FreeSync range, it tries to time the frame output to the monitor refresh rate.

FreeSync has to be enabled on the Display tab. If this is not on (as well as the monitor having it on), you won't have FreeSync:
  • Gaming tab -> Global Graphics button -> Display tab
FreeSync.png
 
Enhanced sync is separate from FreeSync, it can work along with it, if framerate is above the FreeSync range it tries to time the frame output to monitor refresh rate.


I think I'd rather just have double-buffered V-Sync kick in at that point to prevent runaway heat and power. No need for rendering over my monitor's 120Hz, IMHO.
 
I think I'd rather just have double-buffered V-Sync kick in at that point to prevent runaway heat and power. No need for rendering over my monitor's 120Hz, IMHO.
Use Chill; it works magically to keep the framerate in the FreeSync range. Just adjust the lower number up closer to the max refresh rate, and set the higher number at the max refresh rate. V-Sync does not work the same on AMD as on Nvidia: you will lose FreeSync/VRR. On Nvidia, V-Sync works together with G-Sync.

Another feature AMD has is Frame Rate Target Control; my past experience with it was pretty bad, so I recommend you experiment.
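For anyone setting Chill's min/max FPS values, the FreeSync window translates directly into a frame-time window. A quick sketch (my own helper, nothing from AMD's tooling) using the 40-60Hz panel mentioned earlier in the thread:

```python
def freesync_frame_time_ms(min_hz: float, max_hz: float) -> tuple[float, float]:
    """Return the (shortest, longest) frame time in ms that stays
    inside a variable-refresh window of [min_hz, max_hz]."""
    return 1000.0 / max_hz, 1000.0 / min_hz

# The 40-60Hz FreeSync range mentioned earlier in the thread:
fast, slow = freesync_frame_time_ms(40, 60)
print(f"frames must arrive between {fast:.1f} ms and {slow:.1f} ms")
# Chill's min/max FPS caps effectively clamp frame times to this window.
```

For a 40-60Hz range that works out to roughly 16.7-25.0ms per frame, which is why raising Chill's lower cap toward the max refresh keeps you comfortably inside VRR territory.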
 
Use Chill, it works magically to keep frame rate in the FreeSync range. VSync does not work the same as Nvidia with AMD, you will lose FreeSync/VRR. Nvidia works with GSync.

Ahh, that might actually explain a thing or two. I was definitely trying to use vsync like on Nvidia, where it kicks in at the top of the G-Sync range.

Going to have to re-try a few things now.
 
Good data, thanks.

It's crazy how CPU limited the FC series is.
This is on Windows 11, which currently has an issue with Ryzen: about 3x more latency on the L3 cache, and it's not choosing the fastest core. Once that is fixed I expect better results from the FSR performance.

Got the 3080Ti working on the Samsung TV (5800X system). Looks like user error (me): I was not in Game Mode, which caused flashing with VRR, plus maybe the newest driver, xxx.13. In Game Mode it is working in HDR with VRR. Anyways, it has about the same performance without FSR at 4K with the same settings, and much higher than the 3900X/6900XT with FSR (as would be expected, given this game's heavy dependence on one core while lightly using the others), and it too is suffering from Windows 11. The game is very fun, especially on the big screen.
 
Looks like I'd need FSR support to make anything but "High" playable at 4k...

So, I was looking for comparison benchmarks to see how I was doing, and surprisingly they were rather difficult to find for Metro Exodus: Enhanced Edition.

Finally I found an article on ComputerBase.de.

My German is somewhat rusty (it's been almost 25 years), but if my reading is accurate, they ran the same custom bench I did (4k, Extreme preset, except RT = Normal).

They got an average of 34.3FPS on their 6900xt. I got 45.54. So that's 33% better for me. Not too shabby.

So I'd say yes, these overclocks DO translate into better gaming performance. In this case - however - it's just not quite enough. :p

Though it might be better with a custom letterboxed 21:9 res, like 3840x1646. I've had great success using this method to squeeze some extra performance out of challenging games in the past. It can give you a ~30% boost in framerate in many cases, which would put me right at a 60fps average. I prefer to have minimums at 60 or above, but in a single-player game with FreeSync, that isn't strictly necessary.
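The letterbox math checks out as a rough estimate, assuming a GPU-bound game scales framerate inversely with pixels rendered (a simplification, but it has matched my past results fairly well):

```python
# Pixel counts for full 4k UHD vs the custom letterboxed 21:9 resolution.
full = 3840 * 2160
letterboxed = 3840 * 1646

reduction = 1 - letterboxed / full          # fewer pixels to shade
estimated_fps = 45.54 * full / letterboxed  # 45.54 = the 4k Custom average above

print(f"{reduction:.0%} fewer pixels -> roughly {estimated_fps:.0f} fps")
```

About 24% fewer pixels, which lands the estimate right around 60fps, consistent with the ~30% boost I've seen from this trick before.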
 
I'm pretty sure we are cooler and power limited. That's about all I can get.
Yeah, not surprising to me, but still pretty good overall. I'll likely replace the stock cooler with a water block at some point, so I might get a bit more out of it, but I expect the power limit won't allow much more.
 
Yeah not surprising to me, but still pretty good overall. I likely will replace the stock cooler with a water block at some point, so I might get a bit more out of it but I expect the power limit to not allow much more.

You can MPT it when you get a water block, but based on what I was seeing on my core, I was almost temp limited on the hotspot. It doesn't really matter though, since I don't run any overclocks while gaming. lol.
 
Sapphire Nitro+ 6800xt. Pushing clocks much higher resulted in successful passes of 3DMark; however, I could not loop it for more than 5-10 minutes. These settings were much more stable.
 

Attachments

  • 6800xt_2.JPG
  • 6800xt_1.JPG
Sapphire Nitro+ 6800xt. Pushing clocks much higher resulted in successful passes of 3DMark; however, I could not loop it for more than 5-10 minutes. These settings were much more stable.

I'm pretty sure I'm stuck at a power delivery wall. Not quite sure I want to mess with MPT though. Either way, getting 3080Ti numbers on a card I paid $659 for is a good, good feeling.
 
I'm pretty sure I'm stuck at a power delivery wall. Not quite sure I want to mess with MPT though. Either way, getting 3080Ti numbers on a card I paid $659 for is a good, good feeling.

Probably power and possibly cooling. I definitely wouldn't complain.
 
Probably power and possibly cooling. I definitely wouldn't complain.

Definitely was power; a little tweaking there:

1636256665987.png


Temps are still acceptable: 68C max GPU temp and 91C hot spot after a few Time Spy runs.
 
What's the average clock on that run? Pretty high for sure.

Ran again because I didn't take a snip of the graph... and got a higher score. Average clock: 2,542MHz. Not sure if I just hit the silicon lottery here or not; this is a 100% stock reference card with the reference cooler. The only mod is removing the backplate.


1636262893410.png

1636262932934.png
 
Ran again because I didn't take a snip of the graph.....and got a higher score....average clock 2,542 MHz.

Nice. I need to reapply my heatsink; my temps are pretty high at +15% power (the junction temperature runs hot at max). 21500 is about the max I can get on my 6900xt. At stock settings, though, it doesn't go past 95C on the junction.

(Reference card)
 
Been tweaking for a couple days; finally got it stable with a good score!

XFX MERC 319, no MPT adjustment, and on air
 

Attachments

  • 20211107_113937.png
  • 20211107_113909.png
For those with the reference 6900xt, what is the default boost clock set to?

(max, min clock rate)
 
Just realized I forgot to update.

After a little bit more tweaking, this is the best I've been able to achieve.

I may be able to walk up the core clock by single digits, but I don't think I'll be able to do much better, at least not without extreme cooling.

23411.PNG


link
 
For those of you hitting 21-23,XXX TS scores, are y'all running XTXH-spec cards or cranking up the power limit to 400ish watts (so presumably under water)? Or have driver updates helped since early in the year when we were all getting 20,XXX scores?

My 20.3k score from a few pages back seemed about mid pack at the time (Feb or so).
 
For those of you hitting 21-23,XXX TS scores, are y'all running XTXH-spec cards or cranking up the power limit to 400ish watts (so presumably under water)?

No, my 21400/21500 is just with the standard power limit maxed. Also a reference card.

Edit: I don't know about the drivers, since I haven't had my 6900xt very long.
 
No, my 21400/21500 is just with standard power limit max. Also reference card.
Interesting. When I get home next week I'll update the drivers and load the previous profile and see if the score improves.
 
For those of you hitting 21-23,XXX TS scores, are y'all running XTXH-spec cards or cranking up the power limit to 400ish watts (so presumably under water)?

I upped the power just a bit on my 6800XT to get the 22k score: 298W max draw. It's an AMD reference black edition card, and while the fans are... not quiet... at those settings, the temps are within reason. I don't keep my card pegged at that extreme OC, but I'll game all day with a pretty high OC on the card with zero issues.
 