If you could have a 3080 or a 6800 XT, which would you rather have, and why?

The article and screenshots align with what I said. Show me something that aligns with what you said, where Nightmare was a nice and worthwhile visual improvement. I went back and played some Doom 2016 when I got the 1080 Ti in 2017 so I could see Nightmare. Flipping back and forth, I don't remember being able to see a difference aside from thinking maybe I saw more sparks flying around. Maybe? I remember it being ridiculously difficult to see any difference at all. In the heyday of the Fury X, that game was the only one I ran across where you could hit the VRAM limit on a single display. There was so much grief about 4GB not being enough for AMD's flagship card, but it was never practically an issue while the card was current gen. By the time it actually mattered, the Fury X was old news. When I bought my 2080 I read the same thing about 8GB of VRAM not being enough. That didn't pan out either. I see history repeating here - that's all. I personally don't see $300 justifying an extra 10GB of RAM. The extra speed and RT cores might justify the 3080 Ti, but not the RAM. In the end that's just my own opinion/experience. People will buy what they will.
Couldn't care less about that article; amateur at best. I believe my lying eyes more. Seriously, if you can't tell the difference, does that mean no one else can? Does the world revolve around you? If one can't tell the difference between a cheap TN panel and a high-end IPS, good for them, since they won't really need to buy an expensive GPU to push that gaming experience. There are more factors involved - preferences, and what one looks at may be totally different from what someone else looks at in the game. Anyway, with Doom 2016, Nightmare gave a much better depth/contrast overall to my experience, and I could definitely tell the settings apart.
 
Placebo. It is well documented for that title.
 
If you were building a system for 4K gaming at high settings, which of these cards would you choose? I know the AMD cards are even less available right now, but let's pretend both were easily found.

I know AMD doesn't have DLSS and that its ray tracing performance isn't as good as NVIDIA's, though from what I read on these forums, folks have differing opinions about the worth of each of these features at this point.

Next spring or summer I'll be building a system (assuming the world doesn't collapse and cards can actually be found, ha) and will have to choose between AMD or Intel for mobo and NVIDIA or AMD for GPU. Thanks.

The 3080 by a mile. Without DLSS, 4K is still not attainable at reasonable performance levels in some games. And as you said, ray tracing is basically not usable on AMD's cards. Debate the merit of those features all you want, but I at least want the option to use them when they're implemented well. On the CPU front, when it comes to gaming it makes no difference: 5800X, 5900X, 5950X, Core i9-10900K, 10700K, etc. Especially not at 4K, where they are all sufficiently powerful and you are primarily GPU limited.
 
3080 so when I want to see ray tracing I can play games on my potato 1080p TV. (J/K)

But in general I would prefer the 3080 for its extra features today. In a few years the 6800 XT may age better because of its VRAM, but I tend to upgrade every 2 years anyway, so I prefer the better card for the current gen.
 
Gamers: competition is great! Happy to see AMD back in the game etc etc

Also gamers: just get the 3080, DLSS bro
 
Dunno; when AMD's answer to DLSS arrives, I'll have a better opinion.
 
Gamers: competition is great! Happy to see AMD back in the game etc etc

Also gamers: just get the 3080, DLSS bro
Well, the 3080 is a clearly superior card for my 4K and ray tracing gaming. The AMD is decent, but it's just a bit short of being a worthy buy. The larger VRAM is nice, but it won't matter for years, as testing has shown, so it's not a selling point today the way RTX and DLSS are.
 

RTX isn't more of a selling point today than that extra VRAM. I don't care about better lighting in games I probably won't play, so give me the cheaper card with more VRAM, is what I say.

Besides, AMD's feature set since Adrenalin has been top notch; no reason not to believe something's coming.
 
A lot of new releases have had improved graphics with ray tracing since Turing launched, and it's only going to get bigger with many console games using it too.

Did you complain about resolutions getting higher in the past too, and say "meh, who cares about the eye candy"? Unified shaders? T&L? Anti-aliasing?

Ray tracing benefits you today and tomorrow. 16GB of VRAM won't matter until the card is obsolete anyway.
 
3080 so when I want to see ray tracing I can play games on my potato 1080p TV. (J/K)

But in general I would prefer the 3080 for its extra features today. In a few years the 6800 XT may age better because of its VRAM, but I tend to upgrade every 2 years anyway, so I prefer the better card for the current gen.

I think the RTX 3080 barely has enough VRAM today. While generally good enough for most titles, it's painfully close to running out with Cyberpunk 2077. Using the ray tracing ultra-preset, w/DLSS at 3840x2160 with HDR, the game was consuming upwards of 9.5GB of VRAM. The game has a couple of settings that go beyond ultra as well. Obviously, it's just one example but it's probably not the only game where this occurs. In short, there are edge cases today where you start running low on VRAM. I don't think the RTX 3080 is going to be one of those cards that ages well the way some cards have.

Now, if you don't game at 4K or have any plans to do so anytime soon, the 3080 is likely going to be fine for some time. That being said, the 4K crowd is both the group most likely to get the most benefit from a 3080 and the group most likely to replace it the second something faster comes out. But if you are the kind of person who buys high end and sits on that system for 3 years or more, then the RTX 3080 may be a disappointment down the road.
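If anyone wants to sanity-check numbers like that on their own system, here's a rough sketch of how you could log VRAM use while playing - it just polls nvidia-smi once a second (assumes nvidia-smi is on the PATH; the interval is my own choice, and note it reports allocated VRAM, which isn't necessarily what the game strictly needs):

```python
# vram_log.py - print VRAM used/total (MiB) once a second while you play.
# Assumes the NVIDIA driver's nvidia-smi utility is on the PATH.
import subprocess
import time

while True:
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    # One line per GPU, e.g. "9530, 10240"; take the first GPU.
    used, total = (int(v) for v in out.splitlines()[0].split(", "))
    print(f"{time.strftime('%H:%M:%S')}  {used} / {total} MiB")
    time.sleep(1)
```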
 
That is good; each can decide what is most important for their needs/wants. One size does not have to fit all. For me it was the 6800 XT, 6900 XT, or 3090, and I ended up with the 3090, mostly due to luck or bad luck, however one wants to look at it. I will have some fun exploring RT and DLSS, and have options for using the nice fat 24GB of VRAM. On the AMD side I will be missing the fun part of OCing the GPU to 2600MHz+; water cooling the AIBs that can do 480W I suspect will push 2800MHz+. I also like the drivers better with AMD from a usability standpoint, but can live with both. OCing the 3090 seems much more limited in its results, but I may do that as well with a shunt mod/water block.

I do find it funny: a number of folks find RT virtually the same as non-RT from a visual standpoint when playing a game normally, or find the performance penalty not worth it, and then write off those who do see the difference as just a placebo effect :D. Also, some pound others over lower FPS numbers when arguing which is the better video card, yet the massive drop in FPS with RT then becomes grand and no longer important.

My take at this time: RT is only a tool that may or may not be useful. There are plenty of games, as well as upcoming games, where RT would hinder the artistic side, the feel, or the expression of a unique world where the artists bend and shape light however they want. For a quick example, in Ori and the Will of the Wisps the artistic lighting is way better than forcing a more mechanical method of automatic rules and consistency. That's a game that will play at 120fps at 4K on the Xbox Series X.

 
I think the RTX 3080 barely has enough VRAM today. While generally good enough for most titles, it's painfully close to running out with Cyberpunk 2077. Using the ray tracing ultra-preset, w/DLSS at 3840x2160 with HDR, the game was consuming upwards of 9.5GB of VRAM. The game has a couple of settings that go beyond ultra as well. Obviously, it's just one example but it's probably not the only game where this occurs. In short, there are edge cases today where you start running low on VRAM. I don't think the RTX 3080 is going to be one of those cards that ages well the way some cards have.
And was it still at a reasonable FPS? Or is Cyberpunk at those settings a good example that, in any scenario that would use enough VRAM for 10GB to become an issue, the 3080 is already quite far from 40 fps anyway?

That one day having 12, 16, or 20GB would have made it possible for a 3080 to run at 30 fps instead of 10 fps in a game at ultra settings is arguably of little to no importance at all; whether it would run at 25-35 instead of a playable 45-65 is more what will end up being important, IMO.
 
re: VRAM
I'd love to see a 1080p->2160p scaling comparison between the Fury X, 980 Ti, and 1070 with games released between 2015 and now. Looking at three cards that have almost identical shader performance but differing memory capacity would be very informative.

but I suspect, as has been pointed out earlier in the thread, that they'll all come up short on compute power before running out of memory, save perhaps for a few select cases where the Fury suffers.

regardless, it seems a bit of a moot point considering
1. 20GB on GA102 looks to be reserved for the 3080 Ti, which will be in a completely different price bracket than the 3080 and isn't a direct alternative
2. the enormous width of GA102 gives it enough of an architectural advantage over Navi 21 at high resolutions that Navi's extra VRAM is unlikely to magically turn the tables at 4K in coming years.
 
Ed-Zachary.... the 3080 easily pushes north of 800 GB/s on the RAM when OC'd a smidge; Navi can't break 550 GB/s even with most OCs...
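Quick napkin math behind those numbers, for anyone curious. The stock per-pin data rates and bus widths are the published specs; the OC'd rates are just illustrative guesses:

```python
# GDDR bandwidth: per-pin data rate (Gbps) x bus width (bits) / 8 bits per byte = GB/s
def mem_bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

# RTX 3080: 320-bit GDDR6X, 19 Gbps stock, ~20 Gbps with a mild memory OC (illustrative)
print(mem_bandwidth_gb_s(19.0, 320))   # 760.0 GB/s stock
print(mem_bandwidth_gb_s(20.0, 320))   # 800.0 GB/s OC'd

# RX 6800 XT: 256-bit GDDR6, 16 Gbps stock, ~17.2 Gbps OC (illustrative)
print(mem_bandwidth_gb_s(16.0, 256))   # 512.0 GB/s stock
print(mem_bandwidth_gb_s(17.2, 256))   # ~550 GB/s OC'd
```

(That ignores the 6800 XT's 128MB Infinity Cache, which is there precisely to offset the narrower bus, so raw GB/s isn't the whole story.)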
 
I have a feeling that this question is going to be a recurring theme throughout this generation of "Raw Performance vs. GPU feature set". My take is as follows.

Competitive Gaming: RX 6000 GPUs.
Eye Candy: RTX 3080/3090.

AMD and Nvidia had roughly the same silicon budget (around 28B transistors) this time around, and it's interesting to see how they each approached the market differently. Nvidia devoted a significant portion of their silicon budget to advanced features (AI, Ray Tracing), while AMD devoted almost all of their silicon to raw rasterization; Nvidia is trying to change the GPU landscape, while AMD is attempting to brute force higher framerates.

Both approaches are excellent in their own way, but because you said 4K is a priority, the RTX 3080 will probably be superior due to (potential) DLSS and the higher memory bandwidth afforded by the RTX 3080's wider memory bus and GDDR6X. We will see if this changes over time.
 
I'd have gone with the RX6800XT simply because the titles I run do not use DLSS or indeed raytracing and would benefit from the pure rasterization.
 
3080. I applaud AMD's efforts to become competitive again. But I would rather have a card that supports 2nd generation RT. AMD's next card will be interesting if they can improve RT results.
 
I'd have gone with the RX6800XT simply because the titles I run do not use DLSS or indeed raytracing and would benefit from the pure rasterization.

6800XT
Same here, if I can ever find one.

 
And was it still at a reasonable FPS? Or is Cyberpunk at those settings a good example that, in any scenario that would use enough VRAM for 10GB to become an issue, the 3080 is already quite far from 40 fps anyway?

That one day having 12, 16, or 20GB would have made it possible for a 3080 to run at 30 fps instead of 10 fps in a game at ultra settings is arguably of little to no importance at all; whether it would run at 25-35 instead of a playable 45-65 is more what will end up being important, IMO.

With DLSS set to balanced, the RTX 3090 achieves roughly 55FPS on a stock Core i9-10900K or 3950X at 3840x2160. There are dips into the high 40's and occasionally it breaks into the 61-62FPS range. With G-Sync, the game actually feels pretty smooth north of 45FPS. I do not know what a 3080 would get in the same scenario. I don't have one.
 
The described scenario is with no DLSS at all; if a 3090 at DLSS balanced is under 60, I imagine it would be quite low with no DLSS on a 3080, regardless of VRAM.
 

Performance in Cyberpunk 2077 at 4K with ray tracing / maximum settings is low on every GPU available today without DLSS. You need DLSS to make it playable using those settings.
 
That's the original point: better to look at VRAM usage in scenarios that would be playable for a 3080. Situations where a 3080 having just 10GB of RAM becomes an issue already exist and will be common, but do they matter, or do they only occur in situations where a 20GB 3080 would struggle anyway? A bit like we see now with 2020 games, where cards with less than 6GB crash under some scenarios, when it feels like the same card with 6GB would only manage 25-30 fps anyway. That's something to point out, but it is more interesting to see how the cards handle themselves in scenarios where they would be able to push 40 and always stay above 30 fps, if they had the RAM.
 
Budget constraints were not mentioned to help narrow the field, but hands down, with no budget constraints the 3090 wins by a good margin in most games.
Not vs the 6900 XT; see HU reviews.

At 4K the 6900 XT is actually faster in more games. The 6800 XT can be OC'd to pretty much 6900 XT level. I know it feels like the 3090 is faster emotionally, but realistically it's not. And don't mention RT. That should NOT be a decision parameter unless you specifically and only want rays.

However, for the best of all gaming scenarios, OP, I recommend the 3080.

And I have owned both cards side by side, an EVGA 3090 and a 6900 XT, and there was ZERO benefit I could derive from having the 3090.

Cyberpunk should NOT be the game to decide your purchase on. CP2077 is a hot mess and a graphical bloatware disaster, with so many ridiculous demands placed on the system that are imperceptible to a gamer.

Turn off RT - can you tell the difference? If so, where? A rain puddle? Volumetric lighting is so messed up by all the bloated gaseous and foggy haze atmosphere that you can't even tell where the light source is.

I also discovered that NVIDIA is essentially using DLSS to cheat out a high FPS, when I noticed the 6900 XT delivering the same frame rate without the DLSS voodoo.

So I personally wouldn't base a purchase off of this graphical trainwreck of a game. Great, amazing story and it will suck you in, but it's literally the Crysis of today, and the game is in a crysis. So don't think a 3090 will be a better card, because you will see after actually owning one that you bought a semi truck and you haven't anything to pull with it.
 
6800XT, because I will not send the money I earned to Nvidia after their various corporate grabs at throttling the market.
 
Not vs the 6900 XT; see HU reviews.

At 4K the 6900 XT is actually faster in more games. The 6800 XT can be OC'd to pretty much 6900 XT level. I know it feels like the 3090 is faster emotionally, but realistically it's not. And don't mention RT. That should NOT be a decision parameter unless you specifically and only want rays.
According to the HU/TechSpot review, in individual game benchmarks at 4K the 3090 and 6900 XT each "won" two games. The 18-game average shows the 3090 ahead at 4K by about 6%.
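For what it's worth, here's roughly how that kind of multi-game average is typically computed - a sketch only, with made-up FPS numbers, and reviewers differ on whether they use a geometric or an arithmetic mean:

```python
# Sketch: summarize card A vs card B across several games with a geometric
# mean of the per-game FPS ratios (all FPS numbers below are made up).
from math import prod

fps_3090   = [98, 75, 120, 64]   # hypothetical per-game 4K averages
fps_6900xt = [91, 74, 110, 62]

ratios = [a / b for a, b in zip(fps_3090, fps_6900xt)]
geomean = prod(ratios) ** (1 / len(ratios))
print(f"3090 ahead by {100 * (geomean - 1):.1f}% on average")
```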


I also discovered that NVIDIA is essentially using DLSS to cheat out a high FPS, when I noticed the 6900 XT delivering the same frame rate without the DLSS voodoo.
All of the benchmarks and data I have seen do not support your discovery.
 
I also discovered that NVIDIA is essentially using DLSS to cheat out a high FPS, when I noticed the 6900 XT delivering the same frame rate without the DLSS voodoo.


Not. Even. Close.

From Gamers Nexus's review of the 6800 XT:

Control DLSS.png


THIS is the power of DLSS.
 
Nice cherry pick. I am not going to continue this conversation past this point - you are being incredibly ridiculous. AMD is the clear winner for raw graphical horsepower, and the price is a stellar deal compared to the bloated 3090. Sorry, you and nV both lose.

Here is a nice wide spread of common resolutions and scenarios the average person is going to play with. And no one buys FE cards anymore because they virtually don't exist, thus $1700 is the average price you are going to see for an AIB 3090.

Oh look below, on a raw raster level that precious 3090 is huffing along... behind the AMD.



The one that matters - because it shows what the fastest card truly is on the hardware level - raw raster performance across the board


Let's look at this closer - ray tracing on, without all the voodoo magic of DLSS, which is actually fake FPS, which I already know you're gonna argue with me about until time itself runs out for the universe, so don't bother.


Again at native resolution.
The 6900 XT stomps your precious beloved card, which you don't even own - but I did.

A little Paul's Hardware:


Oh lookie! 7 FPS more for $700 more, and that is not even giving value to the fact that the $700-more card has a whopping 0.9 FPS faster 1% low.



Oh 1 fps faster


Whaaaaa? Even the lower AMD is faster than the higher AMD, and both are faster than the highest NVIDIA.



oooh look $700 more for 3.3 FPS more



Oh man 6 fps more @ $700 more


I can only imagine that when AMD fires up its voodoo black magic DLSS equivalent, the 6900 XT is going to wipe the floor with NVIDIA (OK, I'm jesting a little with this one).

So I am done with this conversation. I hope the OP can decide wisely on which card to get - that he can't even get anyway.

Oh, and this: look, a 6900 XT in the box and an EVGA 3090 right there. I have used them both and know directly what they can and can't do, and I am not attacking the 3090. It is a hell of a card, I won't lie. But it's not worth $700, or even $1000, more on average to the OP over a 6800 XT/6900 XT. Sorry, I am going to draw the bullshit line right there and defend AMD as the king of this launch! (OK, maybe the 3080 is the king - probably so.)

If you really want to know my ultimate opinion, I would say get the 3080. Not the AMD stuff, not the 90. The 3080, albeit anemic in memory, has the best of all worlds: DLSS buffoonery, RTX performance, a fantastic software suite, great driver support, and it is a powerhouse.

/fin

 
Sure are a lot of 1080p and 1440p benches shown here. Interesting, when in the original discussion you claimed the 6900 XT was winning in most games at 4K. Don't move the goalposts too far.

Not many people are going to drop $1000+ on a GPU to game on a 1440p monitor, let alone on a 1080p monitor.
 

I said I am done replying to this. I didn't move the goalposts. I honestly got sick and tired of posting shit in reply and gave up while copying and pasting. I'm getting older and honestly just tired of arguing on the internet with complete strangers. I derive ZERO benefit from any of it.



OP, get whatever you want. I have already said it twice: the 3080 is the best card overall. Enjoy your GPU if and when you can get it.
 

I'm simply challenging claims you made and didn't back up. You had claimed the 6900 XT wins in more games at 4K and then proceeded to post mostly 1080p/1440p benchmarks. The OP was about 4K gaming as well.

Back on topic, Big Navi unfortunately begins to slightly choke at 4K in some games and that is where Ampere begins to gap it. At 1440p Big Navi and Ampere are somewhat similar, maybe a very slight edge to Big Navi. At 1080p Big Navi flat out wins in raw rasterization. But as I said, I'm not sure who would buy a $1000+ or even a $700+ GPU to game at 1080p.

I wanted the 6900 XT, but both the reference and AIB cards are basically vaporware at this point compared to even the 3080 and 3090, which I've had an easier time securing. I paid the 3090 premium, but at least I know that I got the GPU that is most likely best suited for my use case. I wasn't a fan of RT until I turned it on in Cyberpunk 2077, so I'm partially glad I was unable to secure a 6900 XT, as I would've never really known what I was missing out on. I'm excited to see what future games bring with RT.
 