Looking for feedback: 6900xt vs 3080

I’ll give that a try. His examples were mostly MMOs - I don’t play those. Can’t judge New World or Warframe or Fallout money-grab edition.

It's not like I blame anyone who doesn't come across the issue. It's just that my play style is prone to less optimized and older games. 'Cause it's just how I roll :cool: lol
 
So I was able to at least get more performance in this game, but it still feels stuttery.
 
He gave an example, as did I.

However, I just tested this on

Crysis 2 Maximum Edition shows good GPU usage... till you look at the frequency: 500-700MHz. *Note: I haven't played around with settings, but this is what I got at defaults.*


What framerate are you getting when this is happening?

Clock definitely dropped down to 800MHz when I was playing Borderlands, but that's only because that is all that was needed to hit 120fps at max settings. It's not going to sit there cranking out the clocks if lower clocks are all that is needed at the time.

Have you tried setting the minimum clock to something higher in the overclocking settings?

I also wonder if there might be something going on with power management settings. I've had mine on "Prefer Maximum Performance" or whatever it's called forever, and have never seen it. Maybe that is a factor.
 
What framerate are you getting when this is happening?

Clock definitely dropped down to 800MHz when I was playing Borderlands, but that's only because that is all that was needed to hit 120fps at max settings.

Have you tried setting the minimum clock to something higher in the overclocking settings?

This was left at default for that run. I was able to get a solid 140fps, except, you know, the stuttering. Then I turned off every fps limiter (still stuttered); when I looked at the graph during that run I saw dips to 1400MHz. I'll play around with it some more tomorrow and install the original Crysis.
 
It might be defaulting to 24Hz, thus the stutter. Have you tried creating a custom resolution, such as 3840x2159 (1 pixel off 2160)?
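If you want to rule that out before building a custom resolution, check what mode Windows is actually handing out. A quick Python sketch using pywin32 (an assumed dependency; any tool that reports the current display mode works just as well):

# Sketch: print the primary display's current mode (requires pywin32).
import win32api
import win32con

# ENUM_CURRENT_SETTINGS asks for the mode in use right now, not a saved one.
mode = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)
print(f"{mode.PelsWidth}x{mode.PelsHeight} @ {mode.DisplayFrequency}Hz")

If that reports 24Hz while the game is up, the custom resolution trick is worth a shot.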
 
It might be defaulting to 24Hz, thus the stutter. Have you tried creating a custom resolution, such as 3840x2159 (1 pixel off 2160)?

I play at 1440p. I did try windowed mode, but that was infinitely worse.
 
This was left at default for that run. I was able to get a solid 140fps, except, you know, the stuttering. Then I turned off every fps limiter (still stuttered); when I looked at the graph during that run I saw dips to 1400MHz. I'll play around with it some more tomorrow and install the original Crysis.
I play at 1440p. I did try windowed mode, but that was infinitely worse.

Hmm.

Do you have Vsync on while also trying to use a FreeSync screen?

Vsync works differently on AMD when Freesync/VRR is enabled than it does on Nvidia when Gsync is enabled. On Nvidia many leave it on to cap the framerate at the max the screen can handle. This is not what it does on AMD, and it is highly recommended to keep vsync off while using FreeSync/VRR.

If framerate is running too high without Vsync there are other ways to address that.
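(For anyone curious what an FPS cap actually does: it's just frame pacing, sleeping off whatever is left of each frame's time budget. A toy Python sketch of the concept, not how Afterburner/RTSS really implements it:)

# Toy frame limiter: pace a loop to a target FPS by sleeping off the
# unused remainder of each frame's time budget (illustration only).
import time

TARGET_FPS = 141                # a bit under a 144Hz panel, to stay in VRR range
FRAME_BUDGET = 1.0 / TARGET_FPS

deadline = time.perf_counter()
for frame in range(1000):
    # ... render/simulate work would happen here ...
    deadline += FRAME_BUDGET
    remaining = deadline - time.perf_counter()
    if remaining > 0:
        time.sleep(remaining)   # real limiters spin-wait for tighter pacing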
 
Hmm.

Do you have Vsync on while also trying to use a FreeSync screen?

Vsync works differently on AMD when Freesync/VRR is enabled than it does on Nvidia when Gsync is enabled. On Nvidia many leave it on to cap the framerate at the max the screen can handle. This is not what it does on AMD, and it is highly recommended to keep vsync off while using FreeSync/VRR.

If framerate is running too high without Vsync there are other ways to address that.

Nah, I let Afterburner limit my fps, for the exact reason you said.

In this case I turned everything off:
tried Chill
tried unlocked
tried windowed
turned off Enhanced Sync & FreeSync
 
Nah, I let Afterburner limit my fps, for the exact reason you said.

In this case I turned everything off:
tried Chill
tried unlocked
tried windowed
turned off Enhanced Sync & FreeSync

Strange.

Try going into the manual overclocking settings. Overclock or not, your choice, but go to advanced settings, set the minimum clock to about 100MHz lower than where the max clock is set, and apply that.

Maybe that will help.
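On Windows that's all done through the Adrenalin GUI as far as I know. For anyone on Linux, the equivalent knob is the amdgpu OverDrive sysfs file; a sketch, assuming card0 and OverDrive enabled via amdgpu.ppfeaturemask (needs root, and the "s 0/1" syntax is the RDNA form of the interface):

# Sketch: raise the minimum GPU core clock on Linux/amdgpu (RDNA cards).
# Assumes card0 and OverDrive enabled (amdgpu.ppfeaturemask); run as root.
OD_PATH = "/sys/class/drm/card0/device/pp_od_clk_voltage"

def set_min_sclk(mhz: int) -> None:
    with open(OD_PATH, "w") as f:
        f.write(f"s 0 {mhz}\n")  # "s 0" sets the minimum sclk on RDNA
    with open(OD_PATH, "w") as f:
        f.write("c\n")           # "c" commits the modified table

set_min_sclk(2000)  # e.g. ~100MHz under a 2100MHz max, per the advice above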
 
In Afterburner you can also set force 3D clock in 2D. I know this isn't ideal typically, but it could be helpful for troubleshooting in your case.
 
3080 all the way. I gave up my 3070 when I picked up a 6900XT from the forum here and it was a bad choice. The card is great at high-end games, but play anything a little older, like revisiting titles you played in the past, and it turns itself inside out trying to downclock to save power. Some old games I like to play are completely unplayable; the card will drop down to like 80MHz and you get 9fps in a game from 2011 that runs fine on Intel integrated.

Also, the AMD drivers are just plain weird. If I let my monitor go to sleep and then wake it up, all the HDMI audio devices reinstall themselves and cause all sorts of havoc. All of these things worked perfectly fine on the 3070, and in some games its performance was leagues above, simply because AMD seems incapable of determining a game is actually running. Don't even try to come at me with the DDU crap; I reinstalled Windows from scratch when I switched cards.

TL;DR - unless you play benchmarks, take the 3080
You do realize that there is a setting in the Adrenalin drivers that fixes this? I have a 6900XT and I've never experienced what you're describing; MechWarrior Online, StarCraft 2, War Thunder, Rising Storm 2, etc. all play flawlessly.
 
You do realize that there is a setting in the Adrenalin drivers that fixes this? I have a 6900XT and I've never experienced what you're describing; MechWarrior Online, StarCraft 2, War Thunder, Rising Storm 2, etc. all play flawlessly.

Do you want to be specific about the setting you're talking about? That way we can either try it, let you know we tried it, or let you know that it didn't help.

Any help is appreciated.
 
Do you want to be specific about the setting you're talking about? That way we can either try it, let you know we tried it, or let you know that it didn't help.

Any help is appreciated.
Oh, sorry about that. I'm on a business trip, so I'm away from my desktop, and it's been forever since I've been in the Adrenalin settings. If I remember correctly it's called "Radeon Anti-Lag"; it basically forces the GPU to put out as many frames as possible while keeping output at the monitor's refresh rate. There might be another setting, but I won't be able to check until I get home in a week.
 
Well, I just loaded up the oldest game I have installed.

Counter-Strike: Source.

Buttery smooth at 120fps. Core downclocked to ~250MHz due to not requiring any more to get the job done.

No stutter or choppiness at all.

Don't know what to say.
 
There was a driver bug a while back with high-refresh monitors of 165Hz and above causing odd core clocking behavior. The temporary workaround was to set the monitor's refresh to 120Hz in Windows. Don't know if that helps, but it certainly wouldn't be the first regression ever to occur in a vid driver.
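If anyone wants to test that workaround quickly, here's a hedged pywin32 sketch (pywin32 is an assumed dependency; the change is applied dynamically, not saved, and re-running it with your native rate switches back):

# Sketch: switch the primary display to 120Hz to test the high-refresh
# clocking bug workaround (requires pywin32).
import win32api
import win32con

mode = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)
mode.DisplayFrequency = 120                 # the workaround's suggested rate
mode.Fields = win32con.DM_DISPLAYFREQUENCY  # only touch the refresh rate
result = win32api.ChangeDisplaySettings(mode, 0)  # 0 = apply now, don't persist
print("applied" if result == win32con.DISP_CHANGE_SUCCESSFUL else f"failed: {result}")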
 
It's not like I blame anyone who doesn't come across the issue. It's just that my play style is prone to less optimized and older games. 'Cause it's just how I roll :cool: lol
120 FPS locked with FreeSync. No issues. No performance problems. Crysis 2 Maximum runs perfectly on my 6800XT. Didn't bother following the clock speeds, but had ZERO performance issues. 1440p, 120Hz, 3960X + 6800XT, 128GB.

So - what do you have in your system? Monitors/refresh? How many? Last time I had micro-stuttering, it was because I had a low-refresh monitor hooked up at the same time.
 
120 FPS locked with FreeSync. No issues. No performance problems. Crysis 2 Maximum runs perfectly on my 6800XT. Didn't bother following the clock speeds, but had ZERO performance issues. 1440p, 120Hz, 3960X + 6800XT, 128GB.

So - what do you have in your system? Monitors/refresh? How many? Last time I had micro-stuttering, it was because I had a low-refresh monitor hooked up at the same time.

Did you get to the actual game or just the beginning sequence?

1440p/144Hz single monitor, 6900XT / 5900X / 64GB 3600MHz

I'm off today, so I can play with this a bit more.
 
Did you get to the actual game or just the beginning sequence?

1440p/144Hz single monitor, 6900XT / 5900X / 64GB 3600MHz

I'm off today, so I can play with this a bit more.
Full game. I was wandering around in a warehouse after getting the suit. Didn't go too far.
 
He gave an example, as did I.

However, I just tested this on

Crysis 2 Maximum Edition shows good GPU usage... till you look at the frequency: 500-700MHz. *Note: I haven't played around with settings, but this is what I got at defaults.*

I get the same issue running GTA 5 on my 3080 Ti. GPU utilization is good... but the darn thing is usually running at 700-800MHz! It keeps the 60fps I need at Ultra settings, unless I turn the grass up to the moon and the frame rate drops... GPU still running in the hundreds instead of 2GHz or more.
 
I get the same issue running GTA 5 on my 3080 Ti. GPU utilization is good... but the darn thing is usually running at 700-800MHz! It keeps the 60fps I need at Ultra settings, unless I turn the grass up to the moon and the frame rate drops... GPU still running in the hundreds instead of 2GHz or more.

Have you tried setting minimum clock in overclocking settings?
 
I get the same issue running GTA 5 on my 3080 Ti. GPU utilization is good... but the darn thing is usually running at 700-800MHz! It keeps the 60fps I need at Ultra settings, unless I turn the grass up to the moon and the frame rate drops... GPU still running in the hundreds instead of 2GHz or more.

I can honestly say I don't play GTA 5; that would drive me insane.

Back to the 6900XT:

So even though I'm getting 140fps, I'm getting some terrible frame times in Crysis 2. I figured I'd play it through, because I haven't beaten it yet. :D
 
I can honestly say I don't play GTA 5; that would drive me insane.

Back to the 6900XT:

So even though I'm getting 140fps, I'm getting some terrible frame times in Crysis 2. I figured I'd play it through, because I haven't beaten it yet. :D
How are you tracking frame time? I’ll fiddle some over this weekend. Also, which card are you using (brand)?
 
How are you tracking frame time? I’ll fiddle some over this weekend. Also, which card are you using (brand)?

Reference. In this case I can just feel it; for example, in certain parts of the map you can scroll left to right and watch it stutter, even though the counter shows 140fps.

Maybe I'll set up something for real, instead of going by "feel".

I've tried all the obvious settings, though.
 
Reference. In this case I can just feel it; for example, in certain parts of the map you can scroll left to right and watch it stutter, even though the counter shows 140fps.

Maybe I'll set up something for real, instead of going by "feel".

I've tried all the obvious settings, though.
Afterburner can measure frame time. Just let it run in the background on another monitor while playing, or Alt+Tab if you use one display.
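If you log the frame times to a file, the numbers are easy to crunch afterwards. A sketch along those lines (the CSV layout and the "frametime_ms" column are assumptions, not Afterburner's actual log format, so adjust the parsing to whatever your capture tool writes):

# Sketch: summarize a frame-time log to put numbers on stutter.
# Assumes a CSV with one frame time in milliseconds per row under a
# "frametime_ms" header (hypothetical; adapt to your tool's format).
import csv
import statistics

def summarize(path: str) -> None:
    with open(path, newline="") as f:
        times_ms = sorted(float(row["frametime_ms"]) for row in csv.DictReader(f))
    avg = statistics.fmean(times_ms)
    p99 = times_ms[int(len(times_ms) * 0.99)]  # 99th-percentile frame time
    print(f"avg fps:    {1000 / avg:.1f}")
    print(f"1% low fps: {1000 / p99:.1f}")     # far below the average = stutter
    # Hitches: frames that took more than twice the average frame time.
    print(f"hitches:    {sum(t > 2 * avg for t in times_ms)}")

summarize("frametimes.csv")  # hypothetical log file name

A 140fps average with 1% lows way down in the double digits would match what you're feeling.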
 
I can honestly say I don't play GTA 5; that would drive me insane.

Back to the 6900XT:

So even though I'm getting 140fps, I'm getting some terrible frame times in Crysis 2. I figured I'd play it through, because I haven't beaten it yet. :D
Use the Radeon Anti-Lag feature. That should help the frame-time issue.

Radeon Anti-Lag Adrenalin software description: "Dynamically adjusts frame timing to reduce the lag between user inputs and visual responses."
 
I just set adaptive sync for MS Flight Sim and it really smooths out the stuttery mess that it currently is, so give that a go as well. My monitor is FreeSync, but it also helped in VR.
 
I've observed the substantial downclocking in older titles but haven't had stuttering. I have had the audio issues that LigTasm mentioned, though.
 
My Plex/4K HTPC box does. It rarely games, but the 2080 Ti in it works fine for audio.

If it rarely games, you could get some surprising money for that 2080 Ti...

I used to have 720's and 1030's in my Kodi boxes.

Last time around I swapped them for Hardkernel ODROID ARM boxes. They have worked very well!
 
If it rarely games, you could get some surprising money for that 2080 Ti...

I used to have 720's and 1030's in my Kodi boxes.

Last time around I swapped them for Hardkernel ODROID ARM boxes. They have worked very well!
I would, but it does occasionally game, and it’s a 4K60 screen … so I need the horsepower for the times it does. Mostly console-style games. I’ve thought about it, though, especially if I swapped in something simpler that can still transcode on the fly, given the current insane pricing.
 
3080 all the way. I gave up my 3070 when I picked up a 6900XT from the forum here and it was a bad choice. The card is great at high-end games, but play anything a little older, like revisiting titles you played in the past, and it turns itself inside out trying to downclock to save power. Some old games I like to play are completely unplayable; the card will drop down to like 80MHz and you get 9fps in a game from 2011 that runs fine on Intel integrated.

Also, the AMD drivers are just plain weird. If I let my monitor go to sleep and then wake it up, all the HDMI audio devices reinstall themselves and cause all sorts of havoc. All of these things worked perfectly fine on the 3070, and in some games its performance was leagues above, simply because AMD seems incapable of determining a game is actually running. Don't even try to come at me with the DDU crap; I reinstalled Windows from scratch when I switched cards.

TL;DR - unless you play benchmarks, take the 3080
There are settings in the driver.

Sounds like you had Radeon Chill on with some really low min settings or something.

I play tons of old games on my now-aging 5700XT and use Radeon Chill... 'cause yeah, if I only need 60-75fps in an older title, why bother spinning the fans if you don't have to? Chill is my all-time favorite GPU feature... I was very sick of listening to my GPUs whine away running old copies of Civ and the like.

I could be way off, but what you're describing doesn't sound right... considering my 5700 can play most older titles without spinning the fans (or flipping them on so low I don't notice) and without ever having any noticeable frame drops.
 
There are settings in the driver.

Sounds like you had Radeon Chill on with some really low min settings or something.

I play tons of old games on my now-aging 5700XT and use Radeon Chill... 'cause yeah, if I only need 60-75fps in an older title, why bother spinning the fans if you don't have to? Chill is my all-time favorite GPU feature... I was very sick of listening to my GPUs whine away running old copies of Civ and the like.

I could be way off, but what you're describing doesn't sound right... considering my 5700 can play most older titles without spinning the fans (or flipping them on so low I don't notice) and without ever having any noticeable frame drops.

I don't use Chill. I don't actually care about saving power while I'm gaming. I doubt he does either.
 
Use the Radeon Anti-Lag feature. That should help the frame-time issue.

Radeon Anti-Lag Adrenalin software description: "Dynamically adjusts frame timing to reduce the lag between user inputs and visual responses."

This can also cause stuttering in some games, for those who haven't used it.
 
People actually use GPU HDMI audio?

Only machines I've ever used that on have been my Kodi boxes.

I do, and it's annoying in Windows 11. With G-Sync on, HDMI audio turns into a stutter fest unless I turn on Vsync. This was never an issue on Windows 10. Bleh
 
This is a difficult one. The RTX 3080 only has 10GB of VRAM, and it pains me to say this, but there are starting to be edge cases where that isn't enough at 4K. The 6900 XT has plenty of memory, but it's slower memory, and the card seems to struggle at 4K at some points because of it. Also, ray tracing is lackluster on AMD's offerings this generation. And as much as people want to believe that AMD Super Resolution is the same as DLSS, it isn't. DLSS in some cases appears to improve image quality, while Super Resolution never improves image quality.

It all comes down to what games you play. If your games use DLSS and/or ray tracing, the RTX 3080 will be the clear winner. If not, then the 6900 XT will "probably" be the better choice.
The 6900 XT "struggles" at higher resolutions as much due to the smaller/cheaper bus width as the memory speed.

100% concur, however, with your assessment. Cost being equal, if you don't want or play games with RT, the 6900 XT is the best overall choice (even at higher resolutions). If you want ray tracing (or DLSS, which in my view is a degradation to IQ), get the 3080. Also, that Aorus is definitely a step above the stock Zotac cooling solution and build, so that's another advantage.

In any case, I've owned both cards, and the experience is quite close. The only reason I moved to a 3080 Ti instead of sticking with a 6900 XT (or 3080) is that I do play some games where RT shows some value, albeit small, e.g. CP2077, SoTR.

Edit: I never saw any issues with mins or low-power states on my Red Devil 6900 XT that some are claiming. FPS was rock solid for me at all times and maxed (across the 9-10 games I was playing on it).
 
Power? Whatever.

Noise? It can certainly be nice to game in a quiet room.

If I really want quiet, I have a reference card; I'd just buy a water block and keep it at stock. While I lose 5-10 percent performance by not overclocking it, 300 watts is fairly easy to cool if you're not trying to maximize for the lowest temp.
 
I have a 6900XT and was experiencing this so-called "old game" or "light load" stutter. It drove me nuts until I spent some time in the software and shut off all the power-saving/adaptive crap and just let the card crank. Voila, it was fixed.
 