Is the RX 6900 XT at $1500 a good buy now? Or wait?

David-Duc

[H]ard|Gawd
Joined
Dec 22, 2010
Messages
1,281
I haven't upgraded my GPU in 4 years (still on the trusty ole 1080 Ti)... I was going to get the 3080 when it debuted, but because of the miners and scalpers... no dice.

Just got some extra cash, and I wonder if I should splurge on this RX 6900 XT at MC? Or is the RX 6900 XT such a bad deal that no one wants one? It seems like there are plenty of RX 6900/6800s around, whereas 3080s and 3090s are nowhere to be seen.
 
As someone with a 6900 XT, I wouldn't even buy one at MSRP now. The card is largely useless: old games don't run on it, and in new games it's not any faster than something half the price unless you're maxed out at 4K.
 
As someone with a 6900 XT, I wouldn't even buy one at MSRP now. The card is largely useless: old games don't run on it, and in new games it's not any faster than something half the price unless you're maxed out at 4K.
Can you elaborate? Old games don't run on it? Driver issues? My last AMD card was an HD 6870, so I'm pretty out of date on the current state of AMD's drivers.
 
Can you elaborate? Old games don't run on it? Driver issues? My last AMD card was an HD 6870, so I'm pretty out of date on the current state of AMD's drivers.

The power management is so aggressive that in old games it ends up trying to idle; it's been a huge pain in my ass. I deeply regret selling my 3070 after I got this.
 
The power management is so aggressive that in old games it ends up trying to idle; it's been a huge pain in my ass. I deeply regret selling my 3070 after I got this.

Can you modify the power management like on an Nvidia card, something like "prefer max performance"?
 
Can you modify the power management like on an Nvidia card, something like "prefer max performance"?

Not really. There are like 10 different "workarounds": using Radeon Chill to set the minimum frame rate the same as the max, using "custom" gaming clock setups which raise the clock speed but not the usage, etc. It's extremely frustrating. And then if I play something like Cyberpunk 2077, a demanding AAA title, the fps is no better than my 3070 was at 1440p, and if you use ray tracing it's actually slower. I just don't see much point in it. I was super excited to try AMD's top-of-the-line offering, but after dealing with it for a while it wasn't worth it at all to me.
 
Not really. There are like 10 different "workarounds": using Radeon Chill to set the minimum frame rate the same as the max, using "custom" gaming clock setups which raise the clock speed but not the usage, etc. It's extremely frustrating. And then if I play something like Cyberpunk 2077, a demanding AAA title, the fps is no better than my 3070 was at 1440p, and if you use ray tracing it's actually slower. I just don't see much point in it. I was super excited to try AMD's top-of-the-line offering, but after dealing with it for a while it wasn't worth it at all to me.
I'd rather have a 6900 XT than my 3070. $1500 still seems way up there, though.
 
I guess I'll be sticking with the 1080 Ti then... It's getting really long in the tooth, lol.
 
Not really. There are like 10 different "workarounds": using Radeon Chill to set the minimum frame rate the same as the max, using "custom" gaming clock setups which raise the clock speed but not the usage, etc. It's extremely frustrating. And then if I play something like Cyberpunk 2077, a demanding AAA title, the fps is no better than my 3070 was at 1440p, and if you use ray tracing it's actually slower. I just don't see much point in it. I was super excited to try AMD's top-of-the-line offering, but after dealing with it for a while it wasn't worth it at all to me.

Are you on a 60 Hz display? I find higher-refresh-rate displays have the opposite problem; at least with Nvidia cards, it keeps them permanently in an elevated power state even at idle.
 
I'd just wait at this point. These things all launched in 2020. It's 2022, so we're getting at the very least some refreshes, and toward the fall probably the 40-series and RDNA3.
 
Are you on a 60 Hz display? I find higher-refresh-rate displays have the opposite problem; at least with Nvidia cards, it keeps them permanently in an elevated power state even at idle.

I'm on a 1440p 165 Hz screen, although I did use it with a 240 Hz 1080p monitor and a 120 Hz 3440x1440 ultrawide. I mean, the card is powerful and all, but comparing a $1500 price to, say, a 3060 Ti that is 80% as good unless you're running 4K, I just can't recommend it. I'd say if you really wanted AMD, you can find 6700 XTs for ~$850 new right now, which is way more reasonable.
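Rough value math on that comparison, as a sketch: it takes the poster's "80% as good" estimate at face value, and the 3060 Ti price below is a hypothetical street price, not a quoted listing.

```python
# Perf-per-dollar comparison using the relative-performance estimate from
# the post above. The $700 for the 3060 Ti is an assumed street price;
# plug in whatever you actually see in stock.
def perf_per_dollar(relative_perf, price_usd):
    return relative_perf / price_usd

rx_6900_xt = perf_per_dollar(1.00, 1500)   # baseline card at the MC price
rtx_3060_ti = perf_per_dollar(0.80, 700)   # "80% as good" per the post

print(round(rtx_3060_ti / rx_6900_xt, 2))  # -> 1.71
```

At those numbers, the cheaper card delivers roughly 1.7x the performance per dollar, which is the gist of the argument above.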
 
I was in your shoes on a 1080 Ti for over 4 years. I finally got tired of waiting for prices to come down... and decided they probably wouldn't any time soon. I ended up buying my XFX MERC 319 6900 XT in November for $1700; it was the model I wanted, and I would finally be able to take advantage of the 240 Hz Samsung I got a year ago. The card has been absolutely problem-free and runs all my games without an issue.
 
I guess I'll be sticking with the 1080 Ti then... It's getting really long in the tooth, lol.
Hah! You think your 1080 Ti is long in the tooth? I'm still stuck on my 1070 and can't for the life of me score a new card at a decent price.
 
Not really. There are like 10 different "workarounds": using Radeon Chill to set the minimum frame rate the same as the max, using "custom" gaming clock setups which raise the clock speed but not the usage, etc. It's extremely frustrating. And then if I play something like Cyberpunk 2077, a demanding AAA title, the fps is no better than my 3070 was at 1440p, and if you use ray tracing it's actually slower. I just don't see much point in it. I was super excited to try AMD's top-of-the-line offering, but after dealing with it for a while it wasn't worth it at all to me.
I'm not entirely in this boat, but I did give up and go back to an RTX 2080 Ti. I currently still have both cards, so...
Are you on a 60 Hz display? I find higher-refresh-rate displays have the opposite problem; at least with Nvidia cards, it keeps them permanently in an elevated power state even at idle.

I use a 144 Hz 1440p monitor. I honestly don't care how much power I use as long as I can cool it and my games feel good. I have not personally experienced this. *Not saying it doesn't exist; it's just not something I experienced personally. I also use only one monitor.*

I would not buy an RX 6900 XT at anything other than $1,000-1,200.
 
Hah! You think your 1080 Ti is long in the tooth? I'm still stuck on my 1070 and can't for the life of me score a new card at a decent price.

This.

I'm still on a 1080 and have been wanting a 3070 for over a year now...
 
If I could buy it online from them for that price, I think I would get it and a 4K display.
 
There's a reason the AMD cards are largely available at this point. The decision is going to be up to you. I personally wouldn't want one because there are enough games I play with raytracing features, and the AMD cards just aren't going to give you a good experience if you want raytracing. If you spend your time playing very specific games that don't use raytracing (maybe you spend 99% of your time in Elite Dangerous), then the AMD cards make sense. Otherwise, I just don't see it being worth it. The card is going to have a very short lifespan as more and more games with raytracing come out. Or maybe you just don't care about raytracing, which is fine too. However, I believe Dying Light 2 is a pretty solid example of what we can expect from more AAA titles moving forward, and of how big a difference raytracing makes.
 
There's a reason the AMD cards are largely available at this point. The decision is going to be up to you. I personally wouldn't want one because there are enough games I play with raytracing features, and the AMD cards just aren't going to give you a good experience if you want raytracing. If you spend your time playing very specific games that don't use raytracing (maybe you spend 99% of your time in Elite Dangerous), then the AMD cards make sense. Otherwise, I just don't see it being worth it. The card is going to have a very short lifespan as more and more games with raytracing come out.
This, and DLSS is a killer app.
 
This, and DLSS is a killer app.
I think some might debate that, but at the end of the day, DLSS/FSR/whatever: even if you do use a scaling method, the AMD cards are at such a huge deficit in raytracing performance that even an upscaler isn't going to give you acceptable raytracing performance at higher resolutions. At least on the Nvidia cards, you can use DLSS Quality or FSR Quality modes for a very minimal IQ impact with the raytracing features maxed out.
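For concreteness, those quality modes map to internal render resolutions you can compute from AMD's published FSR 1.0 per-axis scale factors (DLSS "Quality" uses a similar ~1.5x factor); a small sketch:

```python
# Internal render resolution for a given FSR 1.0 quality preset.
# Scale factors are AMD's published per-axis divisors for FSR 1.0.
FSR_SCALE = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def render_resolution(out_w, out_h, mode):
    """Return the internal (pre-upscale) resolution for an output size."""
    f = FSR_SCALE[mode]
    return round(out_w / f), round(out_h / f)

# 4K output with the Quality preset renders internally at 1440p:
print(render_resolution(3840, 2160, "Quality"))  # (2560, 1440)
```

So "Quality" at 4K is really a ~1440p workload, which is why the quality modes cost so little image quality relative to the fps they recover.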
 
That is something you should not need to do. Besides, most of it doesn't actually work.

I'll try a few things that I missed in the video, but... we will see.
 
I've moved back and forth between a 3080/Ti and a 6900 XT. In general, it's really a toss-up on base rendering performance, with the edge going to the 6900 XT at lower resolutions and to the 3080 Ti at 4K.

I've experienced none of the downclocking issues others have reported here, but I don't really play older games except for a few point-and-click adventures. Also, it is worth noting the 6900 XT overclocks a bit higher (3-4%) than the 3080 Ti in my experience.

The real disadvantage of the 6900 XT is its raytracing performance, as others have mentioned, and I feel we're just at the cusp where raytracing matters.

In summary, you need to ask yourself if raytracing matters to you. If yes, Nvidia; if no, 6900 XT all the way, since you'll save a few hundred. And the 6900 XT has been dropping even further in price the past few weeks, so that rationale is even more valid.
 
I've moved back and forth between a 3080/Ti and a 6900 XT. In general, it's really a toss-up on base rendering performance, with the edge going to the 6900 XT at lower resolutions and to the 3080 Ti at 4K.

I've experienced none of the downclocking issues others have reported here, but I don't really play older games except for a few point-and-click adventures. Also, it is worth noting the 6900 XT overclocks a bit higher (3-4%) than the 3080 Ti in my experience.

The real disadvantage of the 6900 XT is its raytracing performance, as others have mentioned, and I feel we're just at the cusp where raytracing matters.

In summary, you need to ask yourself if raytracing matters to you. If yes, Nvidia; if no, 6900 XT all the way, since you'll save a few hundred. And the 6900 XT has been dropping even further in price the past few weeks, so that rationale is even more valid.

I don't use raytracing; maybe I'm blind, but I notice zero difference with it on or off. But the downclocking absolutely infuriates me.

Just one example: I play Armored Warfare occasionally, which is a CryEngine game, and the card refuses to go above 200 MHz, which gives it single-digit fps numbers. I see the same clock speed watching HD videos on YouTube. For a while, in order to play, I would launch a Unigine benchmark in the background to get the card to boost itself up a bit and run the damn game.

If I played Battlefield and Far Cry all the time it would be a non-issue; the card performs fantastically in those games. It's just that many of the games I like to play simply do not agree with the card.
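For what it's worth, that background-load trick can be scripted so you don't have to juggle windows by hand. A minimal sketch in Python; both command lines at the bottom are placeholders, not real paths:

```python
import subprocess

def run_with_background_load(game_cmd, load_cmd):
    """Start a light GPU load so the card holds a higher clock, run the
    game to completion, then stop the load. Both commands are placeholders
    for whatever benchmark binary and game launcher you actually use."""
    load = subprocess.Popen(load_cmd)
    try:
        return subprocess.run(game_cmd).returncode
    finally:
        load.terminate()  # kill the background load once the game exits
        load.wait()

# Hypothetical usage (neither path is real):
# run_with_background_load(["./ArmoredWarfare.exe"], ["./unigine_benchmark"])
```

It is still a workaround for behavior the driver should handle itself, but it beats remembering to start and stop the benchmark manually.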
 
I don't use raytracing; maybe I'm blind, but I notice zero difference with it on or off. But the downclocking absolutely infuriates me.

Just one example: I play Armored Warfare occasionally, which is a CryEngine game, and the card refuses to go above 200 MHz, which gives it single-digit fps numbers. I see the same clock speed watching HD videos on YouTube. For a while, in order to play, I would launch a Unigine benchmark in the background to get the card to boost itself up a bit and run the damn game.

If I played Battlefield and Far Cry all the time it would be a non-issue; the card performs fantastically in those games. It's just that many of the games I like to play simply do not agree with the card.
I have run into certain hit-and-miss games that the AMD drivers just don't function correctly with. They are usually a few years old and not super popular; in my case it was one of the Far Cry releases I bought for next to nothing, and it was the same thing you are describing. It only happened to me the one time with that one game, so it didn't bother me too much. But yeah, aside from trying to make an ancient driver work with a newer card, it just wasn't gonna happen.
 
The power management is so aggressive that in old games it ends up trying to idle; it's been a huge pain in my ass. I deeply regret selling my 3070 after I got this.
Not everyone is having this issue. Neither of the 6800 XTs I've had has had a single issue (other than when one died from hardware failure, but that's not a driver issue, just bad luck). Both perform superbly, as do my RTX 3000-series cards. Sucks that you're still running into it :(
 
I've moved back and forth between a 3080/Ti and a 6900 XT. In general, it's really a toss-up on base rendering performance, with the edge going to the 6900 XT at lower resolutions and to the 3080 Ti at 4K.

I've experienced none of the downclocking issues others have reported here, but I don't really play older games except for a few point-and-click adventures. Also, it is worth noting the 6900 XT overclocks a bit higher (3-4%) than the 3080 Ti in my experience.

The real disadvantage of the 6900 XT is its raytracing performance, as others have mentioned, and I feel we're just at the cusp where raytracing matters.

In summary, you need to ask yourself if raytracing matters to you. If yes, Nvidia; if no, 6900 XT all the way, since you'll save a few hundred. And the 6900 XT has been dropping even further in price the past few weeks, so that rationale is even more valid.
The problem is there's no RTX 3080 / 3080 Ti available… and I have been watching Micro Center every day for the last 2 months…

The best Nvidia cards I can see available are the 3070 and 3070 Ti… which I'm not sure are quite the upgrade I'd like to get.
 
The problem is there's no RTX 3080 / 3080 Ti available… and I have been watching Micro Center every day for the last 2 months…

The best Nvidia cards I can see available are the 3070 and 3070 Ti… which I'm not sure are quite the upgrade I'd like to get.

If there's a Micro Center near your neck of the woods, you need to be on one of the Discord channels that cover delivery times/inventory (some kind folks will usually post photos of the stuff available that particular day, once they "win" their spot in the GPU lottery and are in-store getting their video cards).
The Micro Center website inventory takes time to update, so stock will have shifted around by the time it's posted on the website.
Speaking for myself, the local Micro Center has had a good number of RTX 3080/Tis available every week or two.

Naturally, with that kind of situation, being able to drive quickly to your local Micro Center to get into the GPU lotteries is crucial.
 
I'm enjoying my 6800 XT. Price aside, as some have said, there is a refresh coming soon from AMD. So if your card's not dead, just wait.
 
The problem is there's no RTX 3080 / 3080 Ti available… and I have been watching Micro Center every day for the last 2 months…

The best Nvidia cards I can see available are the 3070 and 3070 Ti… which I'm not sure are quite the upgrade I'd like to get.
Then get the 6900 XT. It will be a significant upgrade, you'll save money, and like I explained, any difference in performance between it and a 3080 Ti is negligible. Even if you care about raytracing, it arguably doesn't matter just yet. As for the downclocking issue, do you play 10+ year-old games regularly? In any case, neither raytracing nor a potential issue with old games should stop you from upgrading.
 
The 6900 XT is a great card, and that price is not horrible, but not great either. I have had no issues with mine, but I got a reference version so I could keep it water-cooled.
 
I guess I'll be sticking with the 1080 Ti then... It's getting really long in the tooth, lol.
If you have a 1080 Ti, I really doubt you need to upgrade; I'd wait as well. I bought this same 6900 XT, and while I have been very impressed with it, nothing about it suggests it's worth $1,499 + tax. I have it dual-mining ETH/TON until it pays back half of its value. The only game I tried on it briefly was DCS at max settings at 2560x1440, and it was flawless. Now that I have it mining, I went back to my AMD R9 Fury, and I have to turn a lot of settings down; it's still not as smooth, even at 1080p. Outside of DCS, most of my games play fine on an AMD Fury, and a 1080 Ti is probably at least 2x as powerful as that?
 
Picked up one a few weeks ago and it's a beast; I really like it. I was surprised how much I actually liked the AMD driver interface. It runs everything I play (Destiny 2, Warzone, FFXIV, etc.) really well on my LG CX 48. It replaced a GTX 1080 (non-Ti).
 
Honestly, at 1440p I don't think I'd bother upgrading off a 1080 Ti yet, given that the more-available AMD cards won't give you any real level of raytracing performance, and the only real suitable replacement for a 1080 Ti on the Nvidia side is a 3080 12GB or 3080 Ti, both of which are impossible to get at a decent price.
 