PC Gamer - AMD RX 6800 XT runs at almost half the frame rate of Nvidia's RTX 3080 in Vulkan Ray Tracing tests

The big question is whether the 6800 XT (and other cards like the 6800 and 6900 XT) will lag considerably behind Nvidia's higher-end 3000 series because of the hardware itself, or whether this is essentially growing pains. It's no surprise that Vulkan ray tracing has, to date, been built almost exclusively to serve Nvidia's cards and needs, since they were the ones building it! AMD's latest GPUs are coming out of the gate with some difficulties in DirectX ray tracing too, thanks to a lack of optimization, so seeing the same thing in Vulkan isn't surprising.

I hope those performance deficits get made up relatively quickly, which, along with AMD showing its own answer to DLSS and the like, would mean the more open implementations of ray tracing and related tech become the preferable ones (on PC GPUs from AMD and Nvidia, plus consoles like the PS5 and Xbox Series X) rather than something suited primarily to Nvidia. I really want to support AMD here, and it's fantastic that their rasterized performance and core gaming features are there, but I don't want to see another generation where Nvidia is able to convince players, and more importantly developers and publishers, to invest in their way of doing things, because right now the lead is predicated on ray tracing, DLSS, and other features (e.g. Nvidia's GameStream encoder). Hopefully AMD has a top-notch team improving Vulkan ray tracing among other things, so the gap between the two narrows to the point of being negligible.
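To make that point concrete: as far as I know, until very late 2020 the only shipping Vulkan ray tracing path was Nvidia's vendor-specific VK_NV_ray_tracing extension (which is what Quake II RTX originally used), and the cross-vendor VK_KHR_ray_tracing_pipeline / VK_KHR_acceleration_structure extensions were only just finalized. A rough sketch of how an engine might probe for the two paths (purely illustrative, not taken from any actual game):

// Rough sketch: check a VkPhysicalDevice for the cross-vendor KHR ray tracing
// extensions vs. the older Nvidia-only VK_NV_ray_tracing path.
#include <vulkan/vulkan.h>
#include <cstdio>
#include <cstring>
#include <vector>

static bool HasExtension(const std::vector<VkExtensionProperties>& exts, const char* name) {
    for (const VkExtensionProperties& e : exts)
        if (std::strcmp(e.extensionName, name) == 0) return true;
    return false;
}

void ReportRayTracingSupport(VkPhysicalDevice gpu) {
    uint32_t count = 0;
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, nullptr);
    std::vector<VkExtensionProperties> exts(count);
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, exts.data());

    // Cross-vendor path: exposed by RDNA2 as well as Turing/Ampere.
    const bool khr = HasExtension(exts, "VK_KHR_acceleration_structure") &&
                     HasExtension(exts, "VK_KHR_ray_tracing_pipeline");
    // Legacy Nvidia-only path that the earliest Vulkan RT titles shipped on.
    const bool nv = HasExtension(exts, "VK_NV_ray_tracing");

    std::printf("KHR ray tracing: %s, NV ray tracing: %s\n",
                khr ? "yes" : "no", nv ? "yes" : "no");
}

A title written only against VK_NV_ray_tracing simply never runs on a Radeon; one written against the KHR extensions runs on both vendors, but can still be tuned entirely around Nvidia's hardware, which is pretty much what these early benchmarks are showing.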
 
This is almost exactly the same situation we were in back when DX10 and the 8800 GTX were the new hotness. The HD 2900 XT was late to the party and a good bit slower than G80 at DX10. Fast forward a couple of generations, and by the HD 4000 and especially HD 5000 series, AMD had caught and surpassed Nvidia.
 
Hey, more than 60 fps at 1080p isn't bad for your first try. I didn't expect mind-blowing RT performance from AMD the first go-around, but that graph shows more than I thought it would.

A comparison with last generation's RT performance:
[Chart: Quake II RTX 1080p benchmark results]

6800XT > RTX 2060 in Quake 2 RT
 
I still question AMD's ability to deliver quality GPU drivers.

That is actually one of the reasons I am not actively looking to buy the 6800 XT. It is a tough choice for me. I prefer NVIDIA, but the 3090 is crazy money for the performance gain, and the 3080 doesn't have enough VRAM for my 4K gaming. I'm waiting for the 3080 Ti, but it has been delayed.
 
Well, if more than five people had the damn cards, we could actually learn whether the 6000-series drivers are any good or not. Personally, I think the driver thing is overblown. I've never had any real bad issues. Then again, I'm running an old POS R9 290, so what do I know?
 
Not gonna lie, those are disappointing numbers. No way to spin that.

Yeah, it's their first gen, but still. If the card is struggling in Quake II, then you can forget Cyberpunk 2077.

I mean, maybe there is room for driver improvements, but I'm not expecting a 2x boost. Sad.
 
I suppose, but it was kind of a false hope to begin with. It was either make a card competitive in rasterization, make a card competitive in RT, or do both and be mediocre at both.

Like the 20xx series, it will take AMD a generation or two to shine with RT (hopefully).
 
No, not really a false hope. AMD had the potential there, but obviously Nvidia had a several-year head start.

That said, none of that matters when it comes time to purchase a graphics card. You look at the features and the performance and you make a decision.

And, right now, AMD is far behind in ray tracing, and we've yet to see their DLSS competitor (maybe another false hope).

BUT, since AMD is using standard shader cores rather than fully dedicated hardware, there is a small chance they could close some of the gap with a driver update or game patches. I'll give them the benefit of the doubt.
 
The more disappointing thing is that it means the consoles aren't going to be good at ray tracing, which will hold it back on PCs. If AMD were crushing it with RT, we'd see a lot more adoption. As it stands, it'll be more niche, even as they improve their performance.

I'm still a little undecided on it at this point. It is cool tech, but there are a lot of sacrifices being made to make it work in real time, and they translate to things like sparkling/noise in games when you use it.

Oh, also, Quake 2 is more of a hitter, RT-wise, than you might think, since it is ALL ray traced, while most games just use RT to augment some things. I have to run Quake 2 RTX at like 90% resolution scale on my monitor to get 60 fps, but Control runs at 80-110 fps with everything cranked and no DLSS. So while you might think that if it can't handle Quake it can't handle anything, games that only use RT to augment visuals may actually be less demanding.
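On the resolution-scale point, it's worth remembering the slider applies to both axes, so it saves more ray work than the number suggests. Quick back-of-the-envelope sketch with made-up monitor numbers, assuming roughly one primary ray per pixel (the fully path-traced Quake 2 case):

// Back-of-the-envelope: how much a "90% resolution scale" actually cuts the
// per-frame ray count in a fully path-traced game (one primary ray per pixel).
#include <cstdio>

int main() {
    const int    nativeW = 2560, nativeH = 1440;   // example 1440p monitor, not a benchmark
    const double scale   = 0.90;                   // "90% rez scale"

    const int scaledW = static_cast<int>(nativeW * scale);
    const int scaledH = static_cast<int>(nativeH * scale);
    const double ratio = static_cast<double>(scaledW) * scaledH /
                         (static_cast<double>(nativeW) * nativeH);

    std::printf("Native %dx%d -> scaled %dx%d, %.0f%% of the pixels (and primary rays)\n",
                nativeW, nativeH, scaledW, scaledH, ratio * 100.0);
    // Prints roughly 81%, i.e. a ~19% cut in primary rays from a "90%" scale.
    return 0;
}

In a hybrid title like Control, only part of the frame cost scales with the ray count, which is part of why it can hold higher frame rates than a fully path-traced game despite looking far more modern.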
 
Ray tracing to me is like what 4K was 4 or 5 years ago: not on my list of concerns. It'll most likely be a thing next gen, which is like 7-8 years from now.

If I have to play games at 24-30 fps to have ray tracing, well, they can shove it. I've lived this long without it, and Cyberpunk is a beauty to behold without it, so need it I do not. (in Yoda voice)

🤙🏻RT OFF GANG!!!
 
The 3000-series NV cards don't shine with RT either. Let's all just face facts: RT is a 30-60 fps technology today, and it probably will be for at least another generation. AMD hasn't released a driver optimized for Vulkan ray tracing yet at all... I'm willing to believe there is another 20-30% on the table there, driver-wise, which would put them about where I would expect: roughly on par with 2080-level performance.

The reality, however, is that in the real modern games, Watch Dogs: Legion and Cyberpunk, even the 3080 is a 1080p card with RT enabled. It's just the way it is. This tech is not ready to be a de facto must-enable quite yet.

The other sad reality is this: RT is now locked in for 3-5 years, depending on how long this generation of consoles sticks around before it's replaced by new models. Game developers will be targeting the consoles' RT capabilities... if we're lucky, there will be an ultra setting on PC that casts a few more rays, or, like Cyberpunk, there will be more objects in the world at the same time on PC (so more reflections and shadows to calculate). RT is basically frozen in time now to cater to the consoles. If AMD and NV stick with it, the next-next generation of consoles should usher in real RT. lol
 
From the videos AMD released of developers talking about and working on next-gen console titles, I assume most are using RT as a light sprinkling effect, which should still let them get good performance. I think developers will figure out where they can still fake some lighting and where the RT looks best. Just using RT for all lighting is sort of stupid (but easier) for developers.

Up till now, most games that have RT have had it added after the fact, with a couple of programmers just dropping it in to replace systems already put in place by the game's actual artists: brute-force RT lighting replacement. I have some hope that, going forward, games designed with RT in mind from day one will perform a lot better, since developers will use RT in spots where it is obvious and gets some "ohhh cool" reactions, but fall back to the old tricks in areas where no one will really be able to tell the difference anyway.

Used properly, I can see RT costing a lot less performance-wise if it's part of the design from day one with the actual artists involved, and not just a couple of programmers and an Nvidia rep replacing all the light maps. ;) (Not every level/area in every game has to have water puddles and huge panes of glass. :) )
 
The 3000-series NV cards don't shine with RT either. Let's all just face facts: RT is a 30-60 fps technology today, and it probably will be for at least another generation.
But you gotta admit that 30-60 fps is better than Turing's 15-30 fps. That's progress at least. In some ways it's following the same trajectory as AA in the early days: expensive at first, then gradually getting better.

I look forward to AMD's RT improvements in the 7800 XT / 7900 XT. RT should remain nonpartisan; any progress here helps the whole space.
 
That is actually one of the reasons I am not actively looking to buy the 6800 XT. It is a tough choice for me. I prefer NVIDIA, but the 3090 is crazy money for the performance gain, and the 3080 doesn't have enough VRAM for my 4K gaming. I'm waiting for the 3080 Ti, but it has been delayed.
10GB is plenty for 4K on a 3080. Nvidia uses very robust compression in VRAM.
 
I just ordered my 3080 yesterday, but I would have happily taken a 6800 XT until I saw DLSS in action.

I know they are supposed to have an alternative in development, but without the dedicated hardware... I don't think it will be competitive.
 
Haha, ray tracing. I turn that off almost all the time. I don't even notice rays when I'm ripping and tearing through a game. It's cool once or twice, but then I turn it off.

I love how nVidia has to push rays, rays, rays, because they have truly hit their limit in raster and AMD is biting chunks out of their heels. RDNA 3 is probably going to smoke nVidia in raster. So nV is trying to capitalize on RT, even through the threat of thuggery, because if you ask me, that is all the tech lead they have on AMD right now.
 
There are like only three games that use RT. RT is not even mainstream enough. This is all Nvidia marketing; they want you to focus on RT performance like it matters. AMD is working on an open-source version of what you would consider the RTX features. It will be compatible with everything from consoles to GPUs. Twelve months from now, AMD will have more competitive RT. I don't believe it's 100% hardware or GPU related so much as driver or feature optimization. If you look at the benchmarks, AIB 6800 and 6800 XT OC cards beat the 3000 series. It's so bad that Nvidia went after reviewers that didn't focus on RT performance in their benchmarks, when 98% of games use rasterization. It's kinda stupid to believe the Nvidia hype. This is coming from someone who owned a 2070 Super and has seen the benefits of DLSS 2.0 and RT. It's just ridiculous to think RT is ready for prime time when there are so few titles.
 
10GB is plenty for 4K on a 3080. Nvidia uses very robust compression in VRAM.

While I agree that 10GB is enough for now, I keep my cards for at least 3 years and I skip generation(s). My current card is an EVGA 1080 Ti FTW and the warranty is up in 4 months. It is also hard for me to give up a card with 11GB for one with 10GB.

The game that I play the most, Age of Empires III: Definitive Edition, shows that it allocates/uses over 10GB from time to time, and yes, I understand it could be allocating that memory and not actually using it, but for my peace of mind, I prefer a card with a minimum of 12GB, since I plan to keep it until the warranty is near its end.

I really think Nvidia missed the boat this time around, assuming the people upgrading to the 3080 are 1080 owners. Or maybe they are really trying to push 1080 Ti owners into buying the 3090. I don't know, but the 3090 is too much money for me. I paid $800 with tax for my 1080 Ti; at most I would pay $1k this time around, so I am hoping for the 3080 Ti if that ever comes out.
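On the allocate-vs-use distinction: a game's own VRAM readout generally reflects what the engine has requested, not what the OS actually has resident for the process. A minimal sketch of the DXGI query an engine (or an overlay running inside the game's process) could use to see actual usage versus the budget Windows grants it; this is my own illustrative snippet, not anything any particular game ships:

// Minimal sketch: query the OS-granted VRAM budget and the calling process's
// actual usage via IDXGIAdapter3::QueryVideoMemoryInfo (Windows 10, link dxgi.lib).
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter))) return 1;   // first (primary) adapter

    ComPtr<IDXGIAdapter3> adapter3;
    if (FAILED(adapter.As(&adapter3))) return 1;

    // "Local" segment group = dedicated VRAM on a discrete card.
    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    if (FAILED(adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info)))
        return 1;

    const double gib = 1024.0 * 1024.0 * 1024.0;
    std::printf("VRAM budget: %.2f GB, this process currently using: %.2f GB\n",
                info.Budget / gib, info.CurrentUsage / gib);
    return 0;
}

CurrentUsage there is what the process has actually committed against the card, which is usually a fair bit lower than what an in-game counter reports as allocated, so a title "using" over 10GB on an 11GB card doesn't automatically mean it would struggle on 10GB.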
 
I still question AMD's ability to deliver quality GPU drivers.

I still wonder what all of the whining about AMD driver issues is about.

I've used both Nvidia and AMD GPUs over the years, and for single-GPU installs they were pretty equivalent. (SLI and CrossFire were a whole other story; both AMD and Nvidia sucked there.)

I like Nvidia's driver control panel better for the configurability it provides, but I never encountered any problems with AMD's drivers.

In fact, the most recent real problem I have had with a driver was on the Nvidia side.

My Pascal Titan X launched with the 369.05 drivers and it worked fine. The next driver update was 372.54, and as soon as I installed it I started getting intermittent "Code 43: Windows has stopped this device..." errors at boot time. This went on through something like two years of new driver versions before they finally fixed it. For a while I just rolled back to the 369.05 drivers, but eventually I wanted to run titles launched after 2016 and take advantage of the driver optimizations...
 
AMD cards struggle with a renderer built by NVIDIA; news at 11!

(inb4 comments about the API. I'm talking about the renderer itself, not the extensions.)
 
More than most, I want to see AMD thrive, because they tend to be an equal or better value while also being more ethical and open in certain facets (e.g. the open-source Linux driver base, using standards instead of proprietary tech, etc.), so that's a reason I really hope to see progress here. If it appears that AMD can gain a significant amount of ray tracing performance from here (maybe not a dominant, "better than 3080 / 3080 Ti / 3090" level, but significant, say another 30-50% or so), then the argument for developers to favor AMD's way of doing RT (i.e. on both AMD cards and both next-gen consoles) will lean properly in that direction.

The big threat is if Nvidia can make it appear that ray tracing really matters to hardware and game buyers alike as THE thing to have, whether in concert with DLSS or not; then they can convince more developers to go their direction, even if only as a halo feature. We don't want that, because it means that even if AMD's next cards are a big step forward for RT, Nvidia is again able to manipulate the narrative and come out on top, either with their own answer in terms of existing/upcoming RT performance or with yet another feature. I'm most interested in a future where players, game devs, and ideally hardware manufacturers (both console and PC/GPU) are all using open, hardware- and platform/OS-independent tech for gaming. When it comes to ray tracing, that means Vulkan; as far as I am aware, there's nothing competing save for DirectX 12 Ultimate's ray tracing, which, though it's more open than some things, is still Windows-focused (though apparently it translates to Vulkan with little overhead).

With luck, AMD will be able to optimize, build renderers with their cards in mind, and make up a good bit of the current deficit. If they can do so, and the new RDNA2-powered consoles benefit as well, then software developers will focus primarily on Vulkan RT and its features for games and the like (independent of Nvidia, who will do just fine on Vulkan). It will be the best bet for wide-ranging performance across various hardware platforms and OSes. I'd much rather see that than have developers trying to decide whether to support some Nvidia-specific feature set because it seems "better", even if it's limited to a smaller slice of hardware, thus prompting fragmentation. That paradigm is Nvidia's preference, and we see all the "The Way It's Meant to Be Played" and "GameWorks" features whenever Nvidia is willing to throw money at developers to showcase their tech, which right now means any significant ray tracing at all. However, this can be disrupted if AMD provides comparable performance after revisions to the Vulkan RT renderers, drivers, and the like.
 
There are like only three games that use RT. RT is not even mainstream enough.

Not really; it is actually surprisingly mainstream for such a new technology. Battlefield V, Watch Dogs: Legion, Call of Duty, World of Warcraft, Fortnite, Minecraft, Cyberpunk 2077, Control, Shadow of the Tomb Raider, and MechWarrior 5 all support ray tracing, and say what you want about how much you like an individual game (Watch Dogs looks like a boring-ass game to me), these are big titles. These are not niche games that nobody has heard of or plays.

Now, as to whether it is worth it/desirable in those games, well, that is up to the individual playing them. But just because it isn't in every game doesn't mean it isn't mainstream. It's new, but there are plenty of big-name titles that support it.

I still wonder what all of the whining about AMD driver issues is about.

For some, like me, it can be the once (or actually more than once) bitten, twice shy thing. I hopped on the nVidia train in the GeForce 256 days and periodically tried ATI/AMD, and each time it didn't go well. It was never catastrophic, like an "OMG, the computer is melting!" kind of thing, but I had issues that I just didn't have with nVidia. The last time around was with the ATI 5870. This was back when TSMC was struggling with 40nm and the cards were hard to get, but I happened to score one early on, and nVidia had no DX11 card in response. It was neat and plenty of things ran well... but I found problems pretty fast. First was Brink. A bunch of friends wanted to play it because we thought it would be a cool squad-based shooter (boy were we wrong, what a terrible game), and it didn't run well. I couldn't hold 60 fps on the most powerful GPU out there. The problem was something I'd encountered in the past: ATI drivers just sucked at OpenGL. They were much slower in GL than in DX, and Brink used id Tech. Likewise, I got bitten by the bug to replay Jedi Academy, so I reinstalled it. Couldn't get it to work. Research showed there was an issue with OpenGL extensions and older games that nVidia had a solution for but ATI didn't.

So some time later, after some more frustrations, I bought an nVidia 580 when those came out, despite it not being much of an upgrade. The frustrations went away. Similar experience to the previous two times I'd tried ATI.

Well, that can lead to an "it just works, so I stick with it" mentality.
 

OK. I can see that.

You do realize that was almost 10 years ago now though, right? :ROFLMAO:
 
OK. I can see that.

You do realize that was almost 10 years ago now though, right? :ROFLMAO:
I do, but the thing is that nVidia just keeps working for me. I buy an nVidia card, it works well and plays my games, so I keep doing it. Like many people, I do that with a lot of things in life. If a brand/company does well, I keep buying the product unless I'm given a reason to switch. I don't go seeking to help the underdog; I just want something that works.

Plus, during many upgrades there wasn't really a choice. I like the high end, and for quite some time AMD hasn't had anything there. Like when I bought my 1080 Ti, there was no Radeon even close. If I wanted something at that level, it was nVidia or nothing.

As to why I bought nVidia this time around? The biggest reason was being able to get my hands on one. I could, and did, get a 3090. So, done deal. I wanted a new toy, I could have that one but not an AMD one, so I had it. A lesser consideration was that my monitor is G-Sync, as in it has an actual G-Sync module. It should work fine with FreeSync, but I'd rather not have any issues. Finally, the most minor consideration was the fact that all my nVidia cards have worked quite well for me while my three ATI cards have had issues, so I'd rather stick with what I know.

Just saying, that is where you get some of the AMD driver stuff from. There is a history, and yeah, maybe they've fixed it lately, but it takes time to shed a reputation. Near as I know, they've never gotten their act together with OpenGL; it's just that we don't use it much anymore, since games that aren't DX have moved on to Vulkan, which AMD seems to be good at. So even if their drivers are now equal to or better than nVidia's, it'll take time for that to become established as a reputation with people. They aren't going to jump on it because some guy says "AMD is good now!", because they heard that in the past, and it wasn't true.
 

Hey, this is almost exactly my experience! My last AMD card was the 290, and I was constantly getting black screens and flicker, and the drivers would crash or fail to install, so I gave up and switched to a GTX 680.
 
I still question AMD's ability to deliver quality GPU drivers.

I question developers delivering stable, bug-free games. Oh wait, that never happens anymore.

That is actually one of the reasons I am not actively looking to buy the 6800 XT. It is a tough choice for me. I prefer NVIDIA, but the 3090 is crazy money for the performance gain, and the 3080 doesn't have enough VRAM for my 4K gaming. I'm waiting for the 3080 Ti, but it has been delayed.

Running Windows 10? Do you have a 68xx card?

Hey, this is almost exactly my experience! My last AMD card was the 290, and I was constantly getting black screens and flicker, and the drivers would crash or fail to install, so I gave up and switched to a GTX 680.
So, I installed my 6900 XT yesterday. (Still can't believe I actually was able to buy one! Just got lucky.)

[Photo of the installed 6900 XT]

I intentionally did NOT uninstall the previous NVIDIA drivers. I intentionally did not run the AMD "clean install" option either. I left Afterburner installed. I left Intel XTU installed. And I left RivaTuner installed. I wanted to see if I had issues. (My personal box is an 8600K that I usually run at 5GHz, 32GB of 3600MHz RAM, with Win 10 Pro kept up to date. Full specs in sig.)

I had zero issues with the install. I played a good bit of Cyberpunk and Assassin's Creed Valhalla last night without a single issue. The Radeon monitoring software is pretty nice as well.
 