Best value current gen GPU at MSRP?

mda

2[H]4U
Joined
Mar 23, 2011
Messages
2,207
Hi all,

Would like to leave current retail prices out of the discussion -- just a good conversation about value @ MSRP.

I was looking at reviews recently, and it looks like the 3060TI at $400 is the best value of the bunch. The difference between the 3060TI and the 3070 doesn't seem like much?

I'm only playing at 1080/144 and maybe will move up to 1440/144 maximum within a few years when a monitor dies.

I've sold all my 'backup' GPUs and am living on a 1070TI which is good enough for my needs with the goal of repopulating my GPUs in my other machines when the price comes down.

Open to AMD cards as well but for some reason, MSRPs of AMD cards in my country are just a tad bit higher than usual. Looking at the 6700XT but that only seems to trade blows with the 3060TI/3070.

So currently, best value out of the bunch seems to be like the 3060TI > 6700XT > 6800/3070? Any reason to target the higher priced cards for my use case?

Thanks :]
 
If you can find a 3060TI for $400 buy it. But good luck finding one at MSRP.
 
Yeah, willing to wait here ;]

Standard 3060s are still going for the equivalent of $700-900 around here

Just looking at the performance / value of a 3060TI vs the others.
 
AMD is a good option too. Performance has improved by around 9% on the 6800XT cards since launch; it now beats the 3080 in a lot of benchmarks, and FidelityFX is something to seriously consider as well, especially at 1080P like you have. I have checked AMD's website every day since they launched and have never seen them in stock. We're gonna be waiting for a long time, probably until the next generation of graphics cards is announced.

I've also heard that AMD has cut wafer allocation for APUs and GPUs to have more for their EPYC server chips.

In summary, at $649, the 6800XT is the best value among all the GPU lineups, IF you can get it at MSRP.
 
For maybe $250 more over the 3060TI, the 6800XT is actually very tempting.
 
I concur that the RX 6800 XT is a very impressive and underrated card. It sits between the 3080 and 3090 in some benchmarks. The only downside is raytracing performance isn't great and I hear AMD's equivalent to DLSS, FSR, has a way to go.
 
I concur that the RX 6800 XT is a very impressive and underrated card. It sits between the 3080 and 3090 in some benchmarks. The only downside is raytracing performance isn't great and I hear AMD's equivalent to DLSS, FSR, has a way to go.
AMD's Fidelity FX looks good. Everyone seems to love it from all the reviews I've seen.

I think AMD's next generation is going to really shine, they'll get RT nailed down. RDNA3 is gonna be the generation to buy.
 
AMD's Fidelity FX looks good. Everyone seems to love it from all the reviews I've seen.

I think AMD's next generation is going to really shine, they'll get RT nailed down. RDNA3 is gonna be the generation to buy.
I mean more in terms of supported games. I'm sure they will get there but nvidia has a leg up since DLSS has been out longer.
 
AMD's Fidelity FX looks good. Everyone seems to love it from all the reviews I've seen.

I think AMD's next generation is going to really shine, they'll get RT nailed down. RDNA3 is gonna be the generation to buy.

Well, this generation has already been out for a year, and most GPU architectures are superseded by the second year. I wonder if they'll stretch the design life, considering it has largely been unavailable to the mass market...
 
AMD is a good option too. Performance has improved by around 9% on the 6800XT cards since launch; it now beats the 3080 in a lot of benchmarks, and FidelityFX is something to seriously consider as well, especially at 1080P like you have. I have checked AMD's website every day since they launched and have never seen them in stock. We're gonna be waiting for a long time, probably until the next generation of graphics cards is announced.

I've also heard that AMD has cut wafer allocation for APUs and GPUs to have more for their EPYC server chips.

In summary, at $649, the 6800XT is the best value among all the GPU lineups, IF you can get it at MSRP.
Exactly how I feel about this.
I too, have never seen any listed as available on AMD's site.
I'd take a peek every time I turned on the computer (made a bookmark).
After a while, I stopped.
MSRP was probably a ploy by AMD and Nvidia this generation to try to undercut interest in each other's cards.
I am now focused on Intel's Xe-HPG DG2.
That may be the 'spoiler'... if they can get it out at a decent price.
 
Dropped an EVGA RTX 3060 Ti FTW3 into my wife's rig for $449, does 4K 60fps no problem (Battlefront 2, BFV, MechWarrior 5, Overwatch, Sims 4). In any other year, it might be considered overpriced for a *60 SKU, but amazing bang for buck in the current landscape.
 
Dropped an EVGA RTX 3060 Ti FTW3 into my wife's rig for $449, does 4K 60fps no problem. In any other year, it might be considered overpriced for a *60 SKU, but amazing bang for buck in the current landscape.
For 450... not bad!
 
The 3060ti is killer for $400. The 6700xt is decent for the $479 it costs. 6800xt is monstrous for $649.
 
The 3060ti is killer for $400. The 6700xt is decent for the $479 it costs. 6800xt is monstrous for $649.

+1 to all this, also the AMD 6800 ain't bad. I almost wanted to say the 3070, but it's in a weird place where the 3060ti performs almost as well and also has just as much vram. I own a 3070, but if I didn't already have one I'd probably be gunning for a 3060ti or one of the AMD options.
 
As a streamer I find the 3060ti and 3080 amazing values for the extras they give. Noise reduction, camera blur, and background cleanup on top of good card speeds is great. The NVENC encoder is worlds better than AMD's solution right now, and DLSS will probably always be a little better, but fingers crossed that AMD's FSR is more than adequate.
 
The 3060ti looks like great value at $400, but I think the 6700xt is going to be available at MSRP first. Just checked eBay: they are selling for $750 shipped, while 3060ti cards are mostly around $1100.
 
I concur that the RX 6800 XT is a very impressive and underrated card. It sits between the 3080 and 3090 in some benchmarks. The only downside is raytracing performance isn't great and I hear AMD's equivalent to DLSS, FSR, has a way to go.
Well, it just released. Give it time. DLSS 1.0 was nothing to look at either.
 
Yeah, but you are not finding those AMD cards at those prices.

And yet the first post says the whole point is to discuss value if the cards were MSRP.

Hi all,

Would like to leave current retail prices out of the discussion -- just a good conversation about value @ MSRP.

For some of us, the pricing made us stop following cards altogether. As a result, I have no clue what is actually a good bang-for-the-buck card if and when you are able to find them at MSRP. I don't care if it never happens, because I just won't build a new PC. On the off chance it does, it would be nice to see some discussion based on what value would look like if pricing weren't affected.
 
I concur that the RX 6800 XT is a very impressive and underrated card. It sits between the 3080 and 3090 in some benchmarks. The only downside is raytracing performance isn't great and I hear AMD's equivalent to DLSS, FSR, has a way to go.
I don’t know about that: in Port Royal my OC'd 6900xt gets close to a 12,000 score, which is very close to my friend's OC'd 3090 score of a little over 13,000. That’s a lot better than “isn’t great”.
 
The 3060ti is killer for $400. The 6700xt is decent for the $479 it costs. 6800xt is monstrous for $649.
Agreed. Moore’s Law Is Dead just released their 3080 “TIE” video (can’t believe I’ve been saying "ti" wrong all these years lol). The 6800xt was faster in a lot of the games he benched; I was very surprised until I remembered AMD cards age like fine wine with persistent driver tuning from AMD. If you can find an RDNA2 card for MSRP (could happen in a couple months, hopefully), then RDNA2 wipes the deck with Ampere.

Yeah, ray tracing is slightly better on Ampere, but in most games it’s an unneeded gimmick that does nothing for immersion (of course there are exceptions, don’t get mad at me for saying this), and DLSS is a blurry and artifacty mess in most of the games I’ve tried it with (again, there are exceptions). And the 6000 series actually has a proper amount of VRAM and uses less energy. I honestly don’t see the point of Ampere. But that’s Nvidia for you; they haven’t gotten a GPU release right since the 10 series.
 
I still feel lucky to have been able to purchase a 3080 at MSRP of about $800 six months ago. What a crazy world.

Quite happy with the card, but $800 still feels like way too much for a "value". I'd agree with the 3060 Ti as the best value.

Supply and demand constraints aside, I think the push toward 4K resolution has driven up the price of a mid-range card. One or two generations from now, when 4K/60 becomes more attainable on low and mid-tier cards, I'm hopeful we might see prices recede a bit. Nobody wants to play at less than 60fps, but once you can lock in 60fps you start to see diminishing interest from casual gamers. People will pay more to go from 40fps to 60fps than they will from 60 to 80.
 
I don’t know about that: in Port Royal my OC'd 6900xt gets close to a 12,000 score, which is very close to my friend's OC'd 3090 score of a little over 13,000. That’s a lot better than “isn’t great”.
Well, JasonPC said the 6800 XT's RT performance isn't great. Your 6900 XT is not a 6800 XT. The 6800 XT only matches the ray tracing performance of a competitor's product priced 30% (or $180) below it, which is objectively "not great."

Control
(benchmark chart)

Metro Exodus
(benchmark chart)

Battlefield V
(benchmark chart)

Source: https://www.eurogamer.net/articles/digitalfoundry-2021-nvidia-geforce-rtx-3060-review?page=5

Cyberpunk 2077 1.2
(benchmark charts)

Source: https://www.thefpsreview.com/2021/03/31/radeon-rx-6800-xt-cyberpunk-2077-ray-tracing-performance/2/
 
Well, JasonPC said the 6800 XT's RT performance isn't great. Your 6900 XT is not a 6800 XT. The 6800 XT only matches the ray tracing performance of a competitor's product priced 30% (or $180) below it, which is objectively "not great."

Control
(benchmark chart)

Metro Exodus
(benchmark chart)

Battlefield V
(benchmark chart)
Source: https://www.eurogamer.net/articles/digitalfoundry-2021-nvidia-geforce-rtx-3060-review?page=5

Cyberpunk 2077 1.2
(benchmark charts)
Source: https://www.thefpsreview.com/2021/03/31/radeon-rx-6800-xt-cyberpunk-2077-ray-tracing-performance/2/
Well, there is a very good answer here. These are all games that implemented ray tracing before AMD brought ray tracing to their GPUs, so they were specifically tuned for Nvidia's ray tracing architecture, which positively influences Nvidia's RT performance and negatively influences AMD's, since AMD uses a different architecture. There are very good articles written about the "Nvidia ray tracing good, AMD bad" fallacy.
 
Well, there is a very good answer here. These are all games that implemented ray tracing before AMD brought ray tracing to their GPUs, so they were specifically tuned for Nvidia's ray tracing architecture, which positively influences Nvidia's RT performance and negatively influences AMD's, since AMD uses a different architecture.
How about this: are there any games where AMD more closely matches their price competitors in ray tracing performance? I provided real-world examples - not excuses - and it's "not great" for the RX 6000 series' RT performance. All 4 of the above games use DXR (DirectX Raytracing) implementations of ray tracing, so I'm not sure how the games themselves are "tuned" for Nvidia RT hardware when that's on the API. Or are you saying the DXR feature in the DirectX API is specifically tuned for Nvidia architectures while disadvantaging AMD?

I have time for some reading. Hit us with some very good articles.

And note: I may have an RTX 3070 in my signature, but I also have an RTX 3060 Ti XC3, RX 6800 Reference, and RX 6700 XT Pulse in my family's builds. Competition is great, ain't it?
 
How about this: are there any games where AMD more closely matches their price competitors in ray tracing performance? I provided real-world examples - not excuses - and it's "not great" for the RX 6000 series' RT performance. All 4 of the above games use DXR (DirectX Raytracing) implementations of ray tracing, so I'm not sure how the games themselves are "tuned" for Nvidia RT hardware when that's on the API. Or are you saying the DXR feature in the DirectX API is specifically tuned for Nvidia architectures while disadvantaging AMD?

I have time for some reading. Hit us with some very good articles.

And note: I may have an RTX 3070 in my signature, but I also have an RTX 3060 Ti XC3, RX 6800 Reference, and RX 6700 XT Pulse in my family's builds. Competition is great, ain't it?
First, okay, I will try to find those articles I read. But a very good parallel to what I'm saying about ray tracing: the reason Assassin's Creed Valhalla and several other games perform better on RDNA2 is that they were optimized for AMD's rasterization architecture. Both companies pay developers to spend more time optimizing for their respective architectures; it's still generic DirectX 12, but tuned for AMD's or Nvidia's specific hardware. The same thing happens on the ray tracing side.

Now that AMD is in the ray tracing game, new games will be much more neutral in their ray tracing tuning (unless developers get paid by one side or the other), since they have to make it usable on both brands. But before AMD came out with RDNA2, there was only one ray tracing architecture out there, and it would have been stupid not to optimize game engines' ray tracing for Nvidia's architecture to produce the best results. No one knew ray tracing would be on RDNA2 until shortly before its release, so the games you listed didn't have enough time to include AMD's ray tracing architecture in their optimization regimen before launch, and it's doubtful developers would optimize after the majority of sales have already happened.

I know this isn’t technically an article, but a guy went into the Vulkan code, changed a few lines, and increased ray tracing performance by 20% on RDNA2. Just imagine what someone who actually knows what they are doing could achieve optimizing their game for RDNA2.

https://www.neogaf.com/threads/prog...ve-ray-tracing-performance-on-rdna-2.1595270/
 
How about this: are there any games where AMD more closely matches their price competitors in ray tracing performance? I provided real-world examples - not excuses - and it's "not great" for the RX 6000 series' RT performance. All 4 of the above games use DXR (DirectX Raytracing) implementations of ray tracing, so I'm not sure how the games themselves are "tuned" for Nvidia RT hardware when that's on the API. Or are you saying the DXR feature in the DirectX API is specifically tuned for Nvidia architectures while disadvantaging AMD?

I have time for some reading. Hit us with some very good articles.

And note: I may have an RTX 3070 in my signature, but I also have an RTX 3060 Ti XC3, RX 6800 Reference, and RX 6700 XT Pulse in my family's builds. Competition is great, ain't it?

https://www.tomshardware.com/features/amd-vs-nvidia-best-gpu-for-ray-tracing

Okay, here you go: the two games where AMD comes out on top are Godfall and Dirt 5, which are AMD-sponsored games, so they were optimized for AMD's RT architecture. So basically, if the game is Nvidia-optimized then Nvidia comes out on top, and if it's AMD-optimized then AMD comes out on top. Now, yes, there are many variables I'm not taking into account, but at the very least I doubt developers would artificially decrease the performance of the non-sponsored architecture; it's simply optimized for one or the other. To be honest, I'm waiting for Unreal Engine 5. Its optimization should cover both architectures, since AMD's architecture has been out long enough for the Unreal team to have thoroughly experimented with it, so RT performance in Unreal 5 will be the best comparison of the power of both architectures.
 
https://www.tomshardware.com/features/amd-vs-nvidia-best-gpu-for-ray-tracing

Okay, here you go: the two games where AMD comes out on top are Godfall and Dirt 5, which are AMD-sponsored games, so they were optimized for AMD's RT architecture. So basically, if the game is Nvidia-optimized then Nvidia comes out on top, and if it's AMD-optimized then AMD comes out on top. Now, yes, there are many variables I'm not taking into account, but at the very least I doubt developers would artificially decrease the performance of the non-sponsored architecture; it's simply optimized for one or the other. To be honest, I'm waiting for Unreal Engine 5. Its optimization should cover both architectures, since AMD's architecture has been out long enough for the Unreal team to have thoroughly experimented with it, so RT performance in Unreal 5 will be the best comparison of the power of both architectures.
I see what you mean. One camp's tuning isn't necessarily doing the other camp any favors. Well, as we get more DXR/RT releases in the coming months, we'll get more data points to see where the chips really fall. Nvidia RTX obviously had an unopposed 2-year lead in that regard, so the data skews heavily in Nvidia's favor.
 
I see what you mean. One camp's tuning isn't necessarily doing the other camp any favors. Well, as we get more DXR/RT releases in the coming months, we'll get more data points to see where the chips really fall. Nvidia RTX obviously had an unopposed 2-year lead in that regard, so the data skews heavily in Nvidia's favor.
I totally agree, I look forward to what future releases bring!
 
I see what you mean. One camp's tuning isn't necessarily doing the other camp any favors. Well, as we get more DXR/RT releases in the coming months, we'll get more data points to see where the chips really fall. Nvidia RTX obviously had an unopposed 2-year lead in that regard, so the data skews heavily in Nvidia's favor.

Or, from a different perspective: 2 years of waiting for Cyberpunk to be released, with claims it would be the ultimate RTX game. RTX support was nearly non-existent outside of a handful of games for the 2 years between Turing and Ampere, and it's still not anywhere close to being available in the majority of AAA titles at launch. Jensen lied to all of the potential Turing owners during his August 2018 launch presentation (meanwhile raising the MSRP to reflect new technologies that no games were able to use). Don't get me started again :p.

But to the OP's question, I'd say either the $650 6800XT or the $699 3080. I play a lot of Ubisoft games (Assassin's Creed, Far Cry, Ghost Recon, etc.), and for whatever reason they do significantly better on AMD GPUs in all the reviews. None of the games I play are competitive shooters, so as long as the FPS stays in the 50-100 range, with my FreeSync monitor, I don't notice. I played through AC: Odyssey jumping between a 1080Ti, 2060 Super, and RX5700 (flashed to XT), and it wasn't noticeably different between any of the cards. Likewise, I played through AC: Valhalla using a mixture of 3080, 6800 (non-XT), 3070, 3090, and 6900XT, and I don't know if I could point out any big differences on my 1440p monitor.

The 6900XT and the 3090 are priced too high for too little difference. If the RX6800 non-XT were $499, or maybe even $529, it would probably be the de facto champ in bang for the buck.
 
Depends on your needs, but the 6800 XT at MSRP would be pretty nice. Better than a 2080 Ti and just about trading blows with the RTX 3080 (outside of ray tracing).
 
I was able to test the 3060ti, 3070, 3070ti, 3080, 3080ti, and 3090.

For the MSRP, the 3060ti is a beast.

My vote is the 3060ti.
 