6800/6900 Overclock Results

So, as I posted in its own thread, I got the new badass XFX Speedster Zero WB Radeon RX 6900 XT, which ships with an EK waterblock preinstalled.

I haven't started overclocking yet, but I figured I'd post my baseline runs here.

I flashed my BIOS to allow for Resizable BAR / SAM support, and decided to run Time Spy with everything at stock settings, and well, it was BAD.



View attachment 402377

Then I remembered what some people said above about how Time Spy REALLY doesn't like SMT on the Zen 2 Threadrippers, so I disabled SMT and tried again.

View attachment 402378

Now, keep in mind: these are out-of-box numbers. No overclocking, no tweaking, no lowering voltage, no increasing power limit.

I feel like an over 22k graphics score out of the box without overclocking is almost too good to be true?


So, a few questions.

1.) Is this too good to be true? Could something be off? It is suspicious that my out-of-box results are above some of your max overclock scores, especially since the core clock seems to be running at about the same ~2500MHz. I appreciate any thoughts.

2.) This thing runs hot. Not "air cooler" hot, but much hotter than my old Pascal Titan X under water. For these runs I set the fan profiles to keep the coolant at ~33C. On my Pascal Titan X this resulted in in-game core temps of about 38C. This beast is hovering at about 50C to 51C. Part of this is not unexpected: the Pascal Titan was a 250W TDP card, and the 6900 XT is 300W stock, but who knows how high XFX set it at stock? The specs don't say. I'll have to upgrade the monitoring software and measure during a run to see what it registers.

I'm wondering if they did a shitty mounting/pasting job at the XFX factory, and if I should take off the block, and give it some Kryonaut goodness. (Really not looking forward to that...) Appreciate thoughts.

3.) What do you guys recommend I do next? I had a pretty good idea which steps to take with a reference card (set 2600MHz, find the minimum stable voltage, max out the power limit), but now that I've already hit a 22k graphics score, maybe I should shoot for more? I'd appreciate any suggestions. I don't have enough experience with AMD GPU overclocking to really know where to start.

4.) Is this Threadripper SMT problem mostly a 3DMark thing, or is it better to leave it off in games as well?
Your card is specially binned "Navi 21" XTXH silicon from AMD, mated to a much more capable power section.

Overclocking Navi 2 is ineffective if there isn't also stable, sufficient power to back it up. Your card has a more capable VRM phase config than most regular 6900 XTs. Also, it's water-cooled on top of that. And its "stock" clock is an overclock that regular 6900 XTs "hope" to get. With that power section, it's going to get every bit of benefit from those extra clocks.
 
Updated score with the 5900X, 19,799

Screenshot 2021-10-11 231336.png
 
So, I did another run while looking at the metrics in the Radeon software.

Based on the Radeon software's measured wattage during a Time Spy run, the out-of-box power limit seems to be set to 335W. That's the max power draw I saw throughout the benchmark run, and it was fairly consistent, rarely dipping below 330W.

I moved the power slider all the way to the right to add my 15% and re-ran the benchmark. As you'd expect from the math, this resulted in a max power draw of 385W, though it was not as consistent; it bounced up and down more, which suggests that at the 335W limit it was hitting the cap constantly, while at 385W it only hits it some of the time.
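
Just to sanity-check that slider math, here's a rough sketch in Python (the 335W base and +15% slider are the values observed above; the linear scaling is my assumption):

Code:
# Quick check of the power slider math (assumes the slider simply scales
# the observed 335W base limit linearly).
base_limit_w = 335
slider_pct = 15
new_limit_w = base_limit_w * (1 + slider_pct / 100)
print(round(new_limit_w))  # 385 -> matches the ~385W max draw seen in the run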

Increasing the power (but touching nothing else) yielded slightly higher numbers:

timespy_no_SMT_max_power.png


It also resulted in a 3C increase in core temps to a max of 54C.

I do want to continue tweaking, but I'm not 100% sure what my next best bet is. I'd appreciate suggestions on what to try.

My gut is telling me to try to go for a higher max clock, as this one seems to be flatlining at the stock max.

Maybe go for a higher core clock, while also trying to reduce voltage a little? And then getting MorePowerTool to override the power limit maybe?

Here are the defaults for reference:

Defaults.png
 
Decided to run my system and see how it compared, not too far off from your 5900X run.
 

Attachment: Screenshot_2021-10-11 I scored 19 542 in Time Spy.png

Did some more playing around with settings, but admittedly I don't really know what I'm doing. Here is the best I came up with:

Min Freq: 2450
Max Freq: 2650
Voltage: 1130 (anything lower would crash the driver or 3DMark or both)
RAM: 2125
Power Limit: +15% (which results in 385W)

Unlike other boards, the RAM slider on this one goes up to 3000. Doesn't help much, though. At 2200 I get stuttering, some mild artifacts, and a SIGNIFICANTLY reduced score. At 2175 the stutter and artifacts are gone, but the score is still very bad. 2150 scores very slightly below 2100. The sweet spot for me seems to be 2125 on the RAM.

Not sure if the score losses are due to the RAM not performing well when pushed too hard, or if it is taking power away from the core due to the power limit.

Here is my best score thus far:

clock_2650_Volt_1130_Mem_2125.png


I'm obviously happy with this, but it would be fun to push the graphics score into the 23000's

That's all I have time for tonight. I'm considering playing around with MorePowerTool, but since I'm already at 385w, I'm not sure how much higher it is wise to go. I still have plenty of thermal headroom though. Core temp is 52C in my latest run, with Junction Temp at around 67C.

Again, I'd appreciate any suggestions.
 

2120-2130 is pretty common sweet spot on memory.

The rest will have to be answered by someone with more cooling than a reference card.
 
What actual in-game improvements are those of you with a 6900 XT seeing from your overclocking? Timespy scores are nice, but don't necessarily track to in-game performance.

I have the opportunity to trade my 3080 Ti for a 6900 XT netting a bit of cash back. With performance pretty much at parity at stock on these cards, I'm wondering if the 6900 XT nets better improvements from overclocking (whereas the 3080 Ti gets maybe 3-5%, if you're lucky).
 

I have not started playing games yet, but I can update you once I do.

As a long time [H]er I agree that canned benchmarks are maybe not the most representative of real world use. I started down the Time Spy track just because there is so much data out there to compare against, to figure out if I have a card that is working the way it is supposed to and how I am doing on squeezing every drop out of it, but for me too, ultimately it is the game performance that matters. I have yet to buy Cyberpunk, but I understand I am going to need every little drop of performance I can squeeze out of this thing to make 4K and RT playable (I'll probably even need to use FSR), so to me it is important to squeeze the absolute most out of this thing.

It's difficult to post stock vs overclocked performance figures though, because there is such a difference between the different models. I mean, this has been the case for a long time, but especially in this gen there are a million different AIB partner versions with different clocks and default settings, and the way RDNA/RDNA2 works it's not easy to predict based solely on achieved clocks, like it has been in the past.

This is evidenced by my 6900 XT at stock settings outperforming others' best overclocks. I mean, I could post you my stock game framerates and then compare them to my overclocked framerates, but that may not be of a huge amount of value as they may not be representative.
 
Does anyone know if there is a safety concern in upping the power limit too much with MPT?

I have a good amount of thermal headroom, and I think I could benefit from MORE POWAH, but I don't want to overdo it and wreck anything...

That said, my question may not be able to be answered, since the power stage on my board is so different from reference...

I've heard of people bringing 6900 XT Red Devils up to 400W, so maybe if they can, I can too, as long as I keep an eye on temps?

I mean, I have to hope that the three 8-pin power connectors are not just there for show? At the PCIe spec (which is a conservative estimate these days, as most cables are beefier), these can each provide 150W, plus the 75W from the PCIe slot; that puts me at a theoretical max of 525W. I'm not sure I'd test that, but it IS a data point.
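
For what it's worth, here's that connector budget written out (a rough sketch only; 150W per 8-pin and 75W from the slot are just the PCIe spec figures mentioned above, not what beefier cables can actually deliver):

Code:
# Theoretical board power budget from the PCIe spec figures.
eight_pin_connectors = 3
watts_per_8pin = 150   # PCIe spec rating per 8-pin connector
pcie_slot_watts = 75   # power available from the slot itself
total_watts = eight_pin_connectors * watts_per_8pin + pcie_slot_watts
print(total_watts)     # 525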
 
I run my blocked 6900XT reference at 325W in MPT which gives ~330-360W in games with the power slider maxed and have had no issues. Your card is way overbuilt relative to the reference design...
 

Having thought about this over lunch.

It seems like TimeSpy is a lot heavier of a load than your typical game.

Do you guys find that the same OC settings that maximize your TimeSpy results also do a fairly good job of maximizing in game performance, or do you maintain separate profiles?
 
Please do report back with results (and anyone else) as it would be great to compare actual in-game results.

And yes, some of the posts in this thread lean towards the sacrilegious for [H] :)
 
FWIW, settings that net 370W in Time Spy are like 330-340W in Cyberpunk. If you're stable in the benchmark, you should be golden in game.
 


Your stock settings use more power than a default reference RX 6900 XT with the power slider maxed. That's not to say they will hit your clocks, but just saying.

In my case, the stock cooler isn't going to cool well enough to really go past the slightly-under-300 watts on the core at max slider. So 21,300-21,400 is about all I'm getting on GPU score (from what I can tell).

I game at stock, so this score doesn't even matter

https://www.3dmark.com/spy/23505375
 
Alright, I think I've reached the limit for this one, at least without going to extreme measures.

Once I loaded up MorePowerTool I found that the stock power limit was 332W (odd choice), and since I had already tested +15%, that meant I had already tested up to 382W, so I just set 382 as the new base power limit (giving me the option to go +15% over that in the drivers).
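
In case the rebasing math is hard to follow, here's the rough arithmetic (a sketch only; it assumes the driver slider keeps applying its +15% on top of whatever base limit MPT writes):

Code:
# MorePowerTool rebasing math (assumes the driver's +15% slider applies
# on top of whatever base limit MPT sets).
stock_limit_w = 332
tested_max_w = round(stock_limit_w * 1.15)  # ~382W, already proven stable
new_base_w = tested_max_w                   # set 382W as the new MPT base
new_ceiling_w = round(new_base_w * 1.15)    # ~439W available with the slider maxed
print(tested_max_w, new_ceiling_w)          # 382 439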

Best results I've been able to get thus far are at:

MinFreq: 2575 MHz
MaxFreq: 2725 MHz
Volt: 1140mV
VRAM: 2126MHz
Power Limit: +15% (439W)

clock_2725_Volt_1140_Mem_2126.png
link


Not too shabby if I may say so myself.

I kept an eye on the power draw and it actually did draw all the way up to 437W at one point! No wonder this room is getting warm.

I tried going up to a MaxFreq of 2750 but that resulted in crashes. I stepped the voltage all the way up to 1200mv but that didn't help, and I didn't really want to go above that.

So I think I've had my fill of canned benchmarks for a while. Now I need to find myself a game to play :p
 
FarCry 6 plays exceptionally well on the 6900XT :)
 
I like the Far Cry series. I'd buy it if it were on Steam and didn't require me to create an account in a store I never asked for to play it. :p
Well, if you have FarCry 5 and earlier on Steam, you already have Ubisoft's launcher installed; you'd just be using Ubisoft directly instead of via Steam. I am trying their $14.95/month deal for 100+ games available to play. Now, I don't recommend buying from Epic in this case, just directly from the publisher. I too prefer Steam, but at least Ubisoft has game forums. For $14.95 you get to play FarCry 6 with all the DLCs available and everything else.

It would be a great test case for your 6900XT since it has a built in benchmark with graph.
 

Yep, when Far Cry 3 wanted me to install Ubisoft and create an account, I immediately requested a refund.

I want the developer to get paid, but I also don't want to reward shitty behavior like that with my money.

I don't necessarily feel awesome about it, but this has been an area for me to protest-pirate. I have toyed with the idea of buying them anyway and just not installing them so the dev gets paid, but at the same time, that kind of defeats the purpose of the protest. :p

Anyway, way off topic...
 

Didn't you get farcry 6 and re village for free with your 6900xt?
 
Man, finally got a chance to try out the reference 6900 XT I picked up, and it's a fast card, but dang is it toasty under serious load; I'm seeing about 104C on the hot spot. But on the bright side, that's why I got the reference design: it makes it easy to slap a waterblock on it and add it to my cooling loop.
 
When overclocked, I hit max hot spot in a warm/semi hot room (furmark). During normal gaming, it's exactly as suspected. Low 90's on the hotspot. It's like the thermal material doesn't keep up at max power.

However, you are not wrong about the reference card waterblock options. Easy to find.
 
Didn't you get farcry 6 and re village for free with your 6900xt?

Good point. XFX said it was included on their webpage, but they also said it depended on retailer participation.

Newegg did not list it on their page for this video card. I'm going to have to contact them and ask. It still doesn't solve the Ubisoft launcher problem, but at least I'll feel like less of an ass when they finally break Denuvo and I torrent it. :p
 

So every 6900 XT in stock at Newegg except the one you bought, awesome. lol.
 
Edit: This is probably a Threadripper/3dMark thing, turned off SMT and got over 3000pts increase in both Graphics and in the total scores. While not a 6900 XT, these new high end cards push the system and maybe more can be gained from tweaking the system vice just the graphics card. Now I have to see if SMT affects games as well. Below is stock EVGA 3090 XC3 Ultra settings
3733 mem Sys, no SMT, stock 3090 settings
3200 mem Sys, stock 3090 settings
This is the comparison of the two, a 31% increase in graphics score by only changing system memory speeds and turning off SMT, no change in 3090 settings, all stock

Did you ever learn if it makes sense to keep SMT off in games as well with the Threadripper? or is this primarily a 3DMark thing?
 
I would say it was primarily a 3DMark issue back then. Could have been a Windows issue with how it used the CPU cores. I just leave SMT on; I have not had any issues keeping 120fps on the Vive Pro 2 in Alyx with the 3090, all maxed out settings. Been playing FarCry 6 on the 6900XT at 4K, maxed out, and it runs beautifully, but that is on a 3900X system.

I did have the 6900XT on the Threadripper system, same issue with TimeSpy and Threadripper with SMT on.
 

Mine came with it automatically; it popped into the cart when I purchased it. Newegg sent me an email with the codes shortly after I purchased the card.
 

Well, I reached out to Newegg and showed them the link on the XFX website, which even specifically mentions Newegg by name. They are pointing to this line:

"If the game bundle is not advertised at checkout or on the retailer product page, your purchase may not qualify!" saying I should contact XFX.

I have not yet reached out to XFX, but I bet they will hang their hat on this line:
"Codes are not distributed by XFX. Codes are distributed by the retailer or place of purchase. Please confirm product eligibility and participation upon purchase."

This promotion has enough loopholes in it that they can get out of it pretty much any way they want. And sure, I should maybe have checked the page when I bought it, but honestly, whether or not some games were bundled was not really my priority. I was just happy to be able to get the card.

It is kind of weird though. This must be the only 6900 XT on the market that DOES NOT qualify for the promotion for some reason :p
 

To get back to gaming performance, I ran some tests and played around a little in Far Cry 5 last night, and I am more confused than not.

I'm getting very inconsistent results, but here is what I got with all settings maxed in the benchmark.

settings1.jpgsettings2.jpgsettings3.jpg

results01.jpg

These results are a bit worse than what I have seen published in reviews (though they don't mention their settings), but that may just be CPU limits. This game is notoriously CPU dependent.

But sometimes when I run the benchmark I get way worse results. Having a window (like RivaTuner stats) open on a second screen seems to have a much larger impact on performance than I am used to. That may just be a peculiarity in this title, but I do not remember that problem from when I played it the first time around.

At my overclock, performance also seems highly temperature dependent. Performance goes way down if I change the fan settings to allow the coolant to rise even a few degrees. I may have to rework the overclock to keep things cooler. Maybe I'll even take the block off and re-apply the paste, as I have little to no faith that the manufacturer has done an adequate job there.

I will have to test in some more titles as well. This was just the first one I tried.

Side note: Do you guys enable any of these?

Driver Settings.png
 
Windows 10 or 11? SMT on or off? I assume this is with the 3960X. FC5 and the other Far Cry games push a single thread heavily while loading the others lightly, so if Windows, particularly Win 11, picks a slower core or uses threads on fewer cores than are available, etc., it can degrade performance. I will test the 6900 XT on the 3900X in Win 11, which has issues with picking the best Ryzen core.
 

This is Windows 10. I haven't bothered with Windows 11 yet. I'll give it 6 months to a year to mellow before I try it. IMHO, operating systems are definitely one area where there is no good reason to be first. Let them solve the inevitable issues with a new release first.

I did try both with and without SMT, and didn't notice much if any difference in this title. Windows 10 seems to be doing a good job of clearing the best physical core for Far Cry's heavy thread, but I think that best physical core may just not be good enough to not be the bottleneck. I remember being CPU limited in Far Cry 4 on my old build (Sandy Bridge-E i7-3930K at 4.8GHz with a Pascal Titan X) at under 60fps in some scenes! These games definitely load the CPU up to an unusual degree.

It might be that my 3960x is holding me back here. As you say there is one VERY highly loaded core. Most of these other benchmarks online were done on 9900k's or 5900x's which have quite a bit higher per core performance, so it may just be the CPU holding me back.

Also, this is where I should note, that I am not exactly using a legit copy of Far Cry 5, (part of my protest against the forced install and account in the Ubisoft store) so who knows how up to date with patches it is.

This is why I need to test in other titles too. It was just easy to test with Far Cry 5 because it has a built in reproducible benchmark for easy comparison. Most games I own don't have this. For instance, Deus EX Mankind Divided would be a great test, but I don't think it had a benchmark built in.
 
Here is the 3900X + 6900XT, FC5 benchmark at 4K, Ultra but no motion blur; the driver has Automatic Tuning set to OC GPU (basically a 2619MHz max clock, with all other tuning options left at the defaults it picks):

3900x6900XT.jpg

The 6900XT is an XFX Speedster MERC319 AMD Radeon RX 6900 XT ULTRA. Windows 11, 32GB of RAM at 3600MHz.
 

Thank you for that.

Yeah, you are doing about 10fps better than me.

It may just be that the settings that maximize my TimeSpy do not give me the best Far Cry 5 performance. May have to revisit that :/

I'm seeing 2660-2670 MHz in game, but I'm also seeing pretty bad GPU utilization.

I know it isn't hitting the power limit as I have that set at an insane 440W, but something is definitely kicking in and causing it to drop out of max boost. I was leaning towards either temps or the CPU, but with the same CPU you do better, so it must be temps I guess?

Maybe I'll try dropping the max clock down from 2700 to 2650, and easing off on the voltage a little bit and see if that helps.

I really don't want to, but I am starting to think I am going to have to take the block off and reapply paste. With my Titan I never saw more than about a 5C delta between loop temp and GPU core temp, but with this thing it is way higher, even when power use is closer to my old Titan levels. We are talking like a 16-18C delta, and that's just not right.

I have a flow meter, so I know I am getting adequate flow through the block (~1.1GPM), so the only way I can explain the high delta Ts is that maybe it is not properly installed. You never know with these OEM boards...
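
Putting rough numbers on that delta (just a back-of-the-envelope sketch using the figures above; it assumes roughly comparable power draw in both cases, as noted, and treats delta-T over power as an effective coolant-to-core thermal resistance):

Code:
# Crude effective thermal resistance, coolant to GPU core.
# Assumes roughly comparable power draw for both cards, per the post above.
titan_delta_c = 5     # Pascal Titan X: ~5C over loop temp
xfx_delta_c = 17      # this card: ~16-18C over loop temp
power_w = 250         # rough common power level for the comparison
print(titan_delta_c / power_w)  # ~0.02 C/W
print(xfx_delta_c / power_w)    # ~0.07 C/W, roughly 3x worse -> consistent
                                # with a mediocre factory mount/paste job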

Because I am curious, what kind of core temps on the GPU are you seeing in this run?

Also, it's been quite some time. Maybe I am just ready for a clean install of Windows. I did use DDU to get rid of the old Nvidia drivers, but you never know what old shit is hiding in the registry...
 
Same FC5 settings and system as before:
  • Default fan curve, max temperature on second run after first, GPU Cur Temp/Junc temp 64c/79c. Max fan speed 1257rpm
  • Custom fan curve, max 55c/71c. Max fan speed 2971rpm.
Love how hitting Ctrl + Shift + O gives the monitoring overlay in a game. Plus, Alt + R brings up Radeon Settings right in the game, where you can fool around with Performance Tuning (OCing). Something I miss being able to do on my RTX cards.
 

Is this through the AMD Driver overlay? I disabled that as I thought it was bloat, but maybe that was a mistake.
 
Have you tried adjusting the voltage for the VRAM, in isolation, if that's still possible on the Radeon 6000 series? Also, how are the temperatures of your VRAM and your VRMs?
 

I have only seen two temps reported, core and Junction. Junction is about 10-15C higher than core.

I have done some thinking, and I am definitely repasting this thing, but I think it will have to wait until tomorrow after work, as I want to have plenty of time to get everything back up and running again, if anything goes wrong. I need this machine for work Monday morning.

Meanwhile I am going to reset MorePowerTool back to stock and do some other testing.
 
A 10-15C junction difference is pretty much perfect for the GPU loads you had.
 
Is this through the AMD Driver overlay? I disabled that as I thought it was bloat, but maybe that was a mistake.
Yes, very useful, and I have not seen any conflicts using it. You can open up a web page as well right in the game, like if you are stuck and using a walkthrough to get through a section of a game. Useful for adjusting sharpening and other settings in real time, getting feedback directly instead of jumping in and out of a game or going windowed. Also very useful for in-game OCing.
 