If you could have a 3080 or a 6800XT which would you rather and why?

Status
Not open for further replies.
If you were building a system for 4K gaming at high settings, which of these cards would you choose? I know the AMD cards are even less available right now, but let's pretend both were easily found.

I know AMD doesn't have DLSS and that its ray tracing performance isn't as good as NVIDIA's, though from what I read on these forums, folks have differing opinions about the worth of each of these features at this point.

Next spring or summer I'll be building a system (assuming the world doesn't collapse and cards can actually be found, ha) and will have to choose between AMD or Intel for mobo and NVIDIA or AMD for GPU. Thanks.
 

equinox654

Limp Gawd
My 3080 just came in. Nvidia was my preference this time solely for DLSS. I think it adds a lot in supported titles, and the performance benefit is worth it. 4K with DLSS looks WAY better to me than dropping to 1440p.
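For context on the "4K with DLSS vs. dropping to 1440p" trade-off above, here's a rough sketch of the internal render resolutions DLSS is commonly reported to use per quality mode. The per-axis scale factors are assumptions based on widely cited figures, not official numbers:

```python
# Commonly reported per-axis DLSS scale factors (assumed, not official):
# Quality ~2/3, Balanced ~0.58, Performance ~0.50.
DLSS_SCALES = {"quality": 2 / 3, "balanced": 0.58, "performance": 0.50}

def internal_resolution(out_w, out_h, mode):
    """Approximate resolution DLSS renders internally before upscaling."""
    s = DLSS_SCALES[mode]
    return round(out_w * s), round(out_h * s)

for mode in DLSS_SCALES:
    w, h = internal_resolution(3840, 2160, mode)
    print(f"4K {mode}: renders internally at {w}x{h}")
```

Notably, 4K Quality mode works out to an internal 2560x1440 — so the comparison in the post is literally "1440p reconstructed to 4K" versus "1440p displayed as-is."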
 

GoldenTiger

Fully [H]
Budget constraints weren't mentioned to help narrow the field, but with no budget constraints the 3090 hands down wins by a good margin in most games.
He said 3080 or 6800xt already.

I went 3080 myself. Ray tracing is in many new AAA games and will only expand thanks to the consoles. DLSS makes them playable, and 10GB is plenty for 4K for the next few years that the 3080 will be strong enough for anyway.
 

LukeTbk

Gawd
Because Blender and other workloads create a good bias toward Ampere, I would go that route. In Canada the 3080 seems to be cheaper (the lowest-priced 6800 XT runs between $1,070 and $1,136 CAD depending on retailer, with some of the cheapest options above $1,250, while the cheapest 3080s are $975-$999) and it's faster at 4K anyway.
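The price gap above can be turned into a rough dollars-per-performance sketch. The CAD prices come from the post; the relative 4K performance figure for the 6800 XT is a placeholder assumption (~5% behind the 3080), not a benchmark result:

```python
# Price-per-performance sketch. Prices in CAD are from the post above;
# the relative 4K performance numbers are assumed placeholders.
def cost_per_perf(price_cad, relative_perf):
    """CAD dollars per unit of relative 4K performance (3080 = 1.00)."""
    return price_cad / relative_perf

for name, price, perf in [("RTX 3080", 975, 1.00), ("RX 6800 XT", 1070, 0.95)]:
    print(f"{name}: ${cost_per_perf(price, perf):.0f} CAD per perf unit")
```

With these assumed numbers the 3080 wins on both absolute price and price per frame, which is the poster's point.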
 

KATEKATEKATE

Limp Gawd
this is a tough decision, I actually had a dream about this recently lol. my first reaction is to say 6800XT because I'm lowkey a Lisa Su simp, but the reality is RTX 3080 is faster for 4K because of the wider architecture (and maybe bandwidth stuff, debatable).

Basically
4K = 3080 wins
1080P high refresh rate = 6800XT wins
1440P = tie
RT = 3080 wins
overclocking fun = 6800XT wins
efficiency = basically a tie, slight lead to AMD

If I were buying a new card right now I'd probably go with a 6800XT and stick with 1080P for longevity but for 4K Big Navi just isn't it.

the wildcard is the DirectML upscaling tech that AMD & Microsoft have teased, we have no idea how that'll compare to DLSS in terms of performance/IQ/compatibility and could make a big difference with regards to RT and 4K gaming.

edit to add- another advantage of Ampere (that will matter not at all to some and a lot to others) is the compute performance lead over Navi. A large part of why I went with Vega 64 instead of GTX 1080 is the extra compute power for non-gaming tasks. Now the situation is reversed...
 

griff30

Supreme [H]ardness
Both.
I would prefer AMD, but I'll take the first I find at MSRP.
Not enough AAA ray tracing titles for me to give a toss over which is better at it.
I'll wait 4 years after this upgrade to jump again.
 

zandor

2[H]4U
By the time the 6800XT came out it was already whatever I can actually get my hands on. That ended up being a 3090. I'd rather have a 6900XT, 3080 or 6800XT just due to cost, but that's what I actually managed to buy.

Now that I'm at 4k I'd take a 3080 over a 6800XT for the 4k performance and DLSS. After trying out DLSS I'd say it's better than running at a non-native res.

If I were still at 1440p I think I'd rather have the 6800XT, mostly because of the spat NV is having with the Linux community over some sort of API they refuse to implement in their proprietary drivers, but also because the 3080 loses the slight performance advantage it has at 4k. The net result of the spat is that it makes using the NV proprietary drivers a PITA, since there are a bunch of desktops and distributions that won't work. Basically I'm kind of stuck with Gnome.

AMD's open source drivers are much better than the NV open source drivers, so given this annoyance I'd rather have a Radeon. If I actually gamed in Linux it would be NV hands down due to the performance of their proprietary drivers, but I do all my gaming in Windows.
 

erball

n00b
It’s the ecosystem I can’t justify a switch from. If AMD had a card that was readily available at a decent price I’d be much more tempted, but between DLSS, RT, RTX voice, broadcast, Gamestream, and cuda accelerated stuff, I’m not sure how I could justify it for a similarly priced product. When they’re both basically impossible to find, I know the direction I’d lean.
 

defaultluser

[H]F Junkie
3080. If I buy a card right now, it's for the RTX features, and the 6800XT is slower than the 3070 at ray tracing.

Also DLSS gives you an additional boost in future games.
 

Epos7

Gawd
3080. The extra VRAM on the 6800XT doesn't seem to be an advantage in games, at least current ones.

I appreciate the 6800 XT though, because it's close enough that it gives Nvidia competition.
 

exlink

Supreme [H]ardness
Personally, I’d go with a 3080. I play at 3840x1600, which is just about in between 1440p and 4K, where Nvidia typically starts to pull away in pure rasterization. I also enjoy RT and DLSS in some of the titles I play, where AMD currently either lacks the technology or the performance. Since the MSRP of the reference cards is only a $50 difference, and the difference between AIB cards is even lower, it makes little sense for me to go with a 6800 XT over the 3080.
 

mvmiller12

[H]ard|Gawd
I put my money where my mouth is, and I don't buy nVidia products with my money. Period. It's not a specific preference for AMD, but nVidia takes corporate dirty tricks to a whole new (and shameless) level, with the recent GPP and Hardware Unboxed debacles just being the icing on their shit cake. I stopped buying nVidia for myself after the nForce 2.

If I am getting parts for someone else, and they still want nVidia despite this, then that is their money to spend.
 

jhatfie

[H]ard|Gawd
I ended up going with a 6900 XT. I was sort of on the fence, but Nvidia proved once again they are a piece of shit company with this latest Hardware Unboxed mischief, so my choice was easy. Plus I game at 4K and the 3080's 10GB is not enough, so I'd get the 6800 XT over the 3080. Ray tracing is not a must-have for me yet, as it only looks decent in two games.
 

Snowdensjacket

Limp Gawd
I wanted a 3080, but months later I still can't get one. I was going to buy whatever I could get from AMD, and so I did, which turned out to be nothing. Now I've lost interest.
 

noko

Supreme [H]ardness
6800 XT: 16GB of VRAM for longevity, on a 3-5 year usage scenario. The next version of ProRender is supposed to support hardware RT; we will see. Frankly I like the AMD drivers/UI better. The newest Nvidia WHQL driver made FS2020 on my 1080 Ti run at 15fps at 1440p (Ultra, High, or Medium settings, does not matter), with dithered-looking clouds. WTF. It also broke HDR, which looks worse than ever. It is like my 1080 Ti gets slower and slower compared to my 5700 XT, which does not have these issues. To run Serious Sam 4 in HDR I have to tweak the Nvidia driver display settings to max contrast to get it right. Too bad AMD has not released a significant number of cards; that would make it easier to get one.
 

Decko87

[H]ard|Gawd
I just went from an RX 5700 to a 3060 Ti, but I'd rather have a 3080. Ray tracing and DLSS in Control have been an absolute blast.
 

KickAssCop

Supreme [H]ardness
3080, of course. I don’t want 2015 DX12 tech. Raster can eat a dick when you can’t max out visual fidelity and frames in the latest titles. Also, you gain back the difference with a decent aftermarket 3080; OC to OC the difference is even slimmer, and at 4K the 3080 is a no-brainer.
 

Archaea

[H]F Junkie
RTX
DLSS
Stencil
CUDA
PhysX
HairWorks
FaceWorks
RTX Voice
Shadowplay/Share
Ansel
TXAA
HBAO
G-Sync
SLI
Etc...

Proprietary features that Nvidia creates get a lot of flak, but frankly only Nvidia is regularly innovating new features that improve our gameplay experience.
Really only one team is pushing the ball forward time after time.

Where is AMD’s new feature innovation?
The only two I can think of off the top of my head are AMD Mantle, which turned into Vulkan after it was dropped by AMD and picked up by open source devs, and AMD being the first to allow mixed monitor rotation in triple-screen gaming. (Meaning you could run different-resolution monitors in mixed orientation, allowing for the kingpin monitor setup of the time: Portrait, Landscape, Portrait. 20” 30” 20” — 900x1600, 2560x1600, 900x1600.) This was a very niche use case, but I loved it, and it was the reason I switched to AMD. I don’t think nvidia has ever implemented PLP support to this day.
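As a quick sanity check on the P-L-P arithmetic above, here's a tiny sketch of the combined desktop that setup produces (assuming the rotated side panels share the center panel's 1600-pixel height):

```python
# Combined desktop of a portrait-landscape-portrait (P-L-P) setup:
# two rotated 900x1600 side panels flanking a 2560x1600 center panel.
def plp_desktop(side_w, side_h, center_w, center_h):
    """Total desktop (width, height) for a P-L-P triple-monitor layout."""
    width = 2 * side_w + center_w
    height = max(side_h, center_h)
    return width, height

print(plp_desktop(900, 1600, 2560, 1600))  # (4360, 1600)
```

A 4360x1600 desktop is roughly 7.0 megapixels — comfortably more than 1440p but well under 4K, which is part of why the setup was attractive on the GPUs of that era.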

I’m probably missing a few proprietary features from both sides. But it does seem lopsided that Nvidia innovates nearly all of the new features gamers enjoy and AMD just creates, generally, slightly slower cards without the feature sets that cost a bit less.

Also, from my personal ownership of AMD cards from ~2015 (specifically the 285, 380, Fury X, RX 480, RX 580, Vega 56, and Vega 64), I will without a doubt tell you Nvidia’s drivers are better, from my personal experience with gaming, crypto mining, and just general computer use. If Nvidia drivers are the standard at 100% (they aren't perfect, but they serve as the reference), then AMD drivers (on a mature card) are 85% to 90% of the Nvidia drivers. At launch AMD drivers are trash, and that's being too nice. It takes them about 3-4 months to sort it all out and get to that 85-90% quality level of the Nvidia standard-bearer. My opinion/experience anyway.

On the Nvidia side, from ~2017 I’ve owned and used a 1060, 1070, 1080 Ti, and 2080.
Before 2015 I was 100% Nvidia back to the days of 3dfx (with the exception of a Diamond s2000 card).

I vote 3080.
 

Surly

Limp Gawd
3080. Primarily for two reasons:

1) DLSS + usable (barely) ray tracing
2) Superior ancillary features that I am very interested in (NVENC, RTX Voice)

The reality is that whatever I buy will likely be a short-term purchase (18-24 months) instead of my usual 4 years. The next gens from both companies are rumored to be massive increases again, including another big jump in ray tracing performance. Because of this I'll likely upgrade sooner rather than later, which means the main shortcoming of the 3080, its limited memory, really isn't much of a factor.
 

FlawleZ

[H]ard|Gawd
6800XT. No way I'd buy a brand-new card for 4K today with only 10GB of VRAM, especially at current prices. I own an R9 Fury; I'm not making that mistake again.
 

Archaea

[H]F Junkie
By the time 10GB isn't relevant for games, the speed of the 3080 won't be either.
The Fury X is no longer relevant in the enthusiast sector, and it isn't the amount of VRAM that made it so.

Anyone who buys the 3080TI for 20GB or the AMD series ONLY because of the VRAM size is making a poor financial decision -- speaking specifically to gamers.

The 3080TI will cost $300 extra for 10GB extra VRAM. There won't be a game out in the next five years that will require more than 10GB of VRAM to make it work. You MIGHT have to drop a single setting to get it in. Maybe. Possibly. But the game will still work. Also VRAM size has somewhat plateaued for 4k, and there are a LOT of 8GB cards on the market that game developers will want to cater to. Worst case you have to drop a texture setting from insane to high.

Just like Doom 2016 for the Fury X. It couldn't play Doom Nightmare mode. Nobody could tell the difference in gameplay between Doom Nightmare and Doom Ultra. Heck, it was unreasonably hard to tell the difference in any of the graphic settings for Doom 2016. That nightmare mode that required more than 4GB of VRAM was arbitrary.

Signed, an owner of two Fury X cards that operated in Crossfire (and I really enjoyed those cards). I used them until mid-2017, when I moved to 1080 Ti SLI cards. The only time I ever ran into a VRAM limitation during that time was when I was playing on three 32" 2560x1440 monitors (7680x1440) on max settings in Battlefront, Doom, and the first Mordor game. Drop a setting and you're back in business. The 1080 Ti has 11GB of VRAM. Think it'll last longer than the 3080?
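A back-of-the-envelope sketch of why the render targets themselves are a small slice of a 10GB budget (the buffer formats and counts below are illustrative assumptions; in practice it's textures, not framebuffers, that dominate VRAM use):

```python
# Rough VRAM cost of full-resolution render targets at 4K.
# Buffer byte-sizes below are illustrative (e.g. 8 bytes for an HDR
# colour target, 4 bytes for depth and a few G-buffer planes).
def render_target_mib(width, height, bytes_per_pixel, count):
    """Memory for `count` full-resolution buffers, in MiB."""
    return width * height * bytes_per_pixel * count / 2**20

# Assumed set: 1x HDR colour (8 B/px) + depth + three G-buffer planes (4 B/px each)
total = sum(render_target_mib(3840, 2160, bpp, 1) for bpp in (8, 4, 4, 4, 4))
print(f"~{total:.0f} MiB")  # ~190 MiB
```

Under these assumptions the whole render-target set at 4K is on the order of a couple hundred MiB — which supports the post's point that texture settings, not resolution, are what push a card past 10GB.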
 

noko

Supreme [H]ardness
My crystal ball says differently.

I could tell the difference between Nightmare and Ultra in Doom o_O
 

Archaea

[H]F Junkie
I could tell the difference between Nightmare and Ultra in Doom o_O
https://www.pcgamer.com/dooms-nightmare-graphics-image-quality-compared-and-benchmarked/

[Attached screenshots: Doom 2016 Nightmare vs. Ultra image-quality comparison]
 

noko

Supreme [H]ardness

funkydmunky

2[H]4U
3080, because I am a VR centrist and it does seem to have an advantage at 4K+. 6800XT, because it seems that reference cards are still trickling out at a much better $/performance ratio.
Answer: I wish I could get a 3080 at near MSRP. I'd probably settle for an AMD 6800XT at MSRP because of the savings and such close performance.
I think, "just wait till spring," but the 2080 Ti never came down to proper pricing, so I don't have much hope, or any pretense that today's MSRPs are any form of reality.
 

AVATARAT

Limp Gawd
6800XT

There are enough videos showing how Nvidia degrades quality on some elements (textures) to achieve more FPS.
DLSS is Nvidia's "future-proof" way to make people buy something bad but expensive...
 

Archaea

[H]F Junkie
And what does the above prove? One shot at one location is supposed to reflect the whole game? The quality of the monitor, where differences are more noticeable, also comes into play. I could definitely tell the difference. Was it a big difference? Not really, but it was a nice one.
The article and screenshots align with what I said. Show me something that aligns with what you said, where Nightmare was a nice and worthwhile visual improvement.

I went back and played some Doom 2016 when I got the 1080 Ti in 2017 so I could see Nightmare. Flipping back and forth, I don't remember being able to see a difference aside from thinking maybe I saw more sparks flying around. Maybe? I remember it being ridiculously difficult to see any difference at all.

In the heyday of the Fury X, that game was the only one I ran across where you could hit the VRAM limit on a single display. There was so much grief about 4GB not being enough for AMD's flagship card, but it never was a practical issue while the card was current gen. By the time it actually mattered, the Fury X was old news. When I bought my 2080 I read the same thing about 8GB of VRAM not being enough. That didn't pan out either. I see history repeating here, that's all.

I personally don't see $300 justifying an extra 10GB of RAM. The extra speed and RTX cores might justify the 3080 Ti, but not the RAM. In the end that's just my own opinion/experience. People will buy what they will.
 

vegeta535

Supreme [H]ardness
3080. More features and better performance. Nvidia has always been the better option when it came to VR also.
 