If Navi matches 3080 in Raster performance but not Ray Tracing... will we care?

Hmm, to the OP's question: not so much, since I don't use RT (yet), but it might matter in 2021+, so I'd want top performance. DLSS 2.0 coming to Warzone is something I still hold out hope for, which is why I'd stick with Nvidia. Now, if AMD can pull off 10-15% higher raster performance at 1080p/1440p vs the 3080 and still have plenty of OC headroom, then I'd give Navi 21 a very serious look.
 
The problem for AMD at this point is that Nvidia already has a generation head start on Ray Tracing tech. Even if AMD is able to match Nvidia @ RT, it's not going to matter because virtually all current RT implementations are proprietary, and if that's not enough, Nvidia has DLSS, which is a game changer as far as game performance goes. AMD isn't going to be able to brute force their way ahead of DLSS.

The truth is that Big Navi has already lost, and it's not even out yet. It will be at least 1 more generation before AMD can be at parity with (or beat) Nvidia as far as RT and AI-based rendering goes.
Why are they proprietary when they use DX raytracing, which both cards will be compatible with? Also, the famed 2077 everyone loves to bring up is being written with consoles in mind, which means an AMD implementation. I'm not sure why you think games won't run on AMD cards. Maybe the performance will be lower, maybe not; we have no clue, but your ramblings aren't rooted in facts.

They don't have to brute force their way past DLSS; they will do what they usually do and release something like DirectML that is open source (or at least an open standard) and can run on all hardware, which gives a much larger market when implemented (i.e., Intel and Nvidia as well as consoles).

Also, how many implementations of DLSS 2 are there to date? DLSS 1 was never that great; 2 actually seems like it could be useful, but there's not a lot of support to date. This will hopefully change in the future (or an open solution will replace it), but I think this is less of a deal breaker for most, since there are like 1-2 games where it doesn't look worse than an upscaling shader. I mean, the concept is great, but the technology is just now getting there with the 2nd version.

I think it could be super useful for things outside gaming as well, like all the at-home conference calls... Lower bitrates + DLSS could make meetings less laggy/choppy with better video quality, but again, this would be better if it were supported on all major hardware to actually be useful.
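To illustrate the concept (a toy sketch only, under my own assumptions; this is not DirectML's or DLSS's actual pipeline): the core idea of any ML upscaler is a cheap spatial upscale plus a learned correction pass, and the inference step is the part an open API could run on any vendor's hardware.

Code:
// Toy sketch of the ML-upscaling idea: render at low resolution,
// upscale cheaply, then add a learned correction. The "network" here
// is a stub; a real implementation (DLSS, a DirectML model, etc.)
// would run a trained neural net and also use motion vectors/history.
#include <cstdio>
#include <vector>

struct Image {
    int w, h;
    std::vector<float> px; // grayscale, for simplicity
    float at(int x, int y) const { return px[y * w + x]; }
};

// Cheap spatial upscale (nearest-neighbor keeps the sketch short).
Image upscale2x(const Image& in) {
    Image out{in.w * 2, in.h * 2,
              std::vector<float>(size_t(in.w) * 2 * in.h * 2)};
    for (int y = 0; y < out.h; ++y)
        for (int x = 0; x < out.w; ++x)
            out.px[size_t(y) * out.w + x] = in.at(x / 2, y / 2);
    return out;
}

// Stand-in for the learned model: predicts a per-pixel correction.
// In reality this is where tensor-core / DirectML inference would go.
std::vector<float> predictResidual(const Image& coarse) {
    return std::vector<float>(coarse.px.size(), 0.0f); // stub
}

int main() {
    Image lowRes{960, 540, std::vector<float>(960 * 540, 0.5f)};
    Image out = upscale2x(lowRes);            // 1920x1080 output
    std::vector<float> res = predictResidual(out);
    for (size_t i = 0; i < out.px.size(); ++i)
        out.px[i] += res[i];                  // add learned detail
    std::printf("output: %dx%d\n", out.w, out.h);
}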
 
Why are they proprietary when they use DX raytracing, which both cards will be compatible with? Also, the famed 2077 everyone loves to bring up is being written with consoles in mind, which means an AMD implementation. I'm not sure why you think games won't run on AMD cards. Maybe the performance will be lower, maybe not; we have no clue, but your ramblings aren't rooted in facts.

They don't have to brute force their way past DLSS; they will do what they usually do and release something like DirectML that is open source (or at least an open standard) and can run on all hardware, which gives a much larger market when implemented (i.e., Intel and Nvidia as well as consoles).

Also, how many implementations of DLSS 2 are there to date? DLSS 1 was never that great; 2 actually seems like it could be useful, but there's not a lot of support to date. This will hopefully change in the future (or an open solution will replace it), but I think this is less of a deal breaker for most, since there are like 1-2 games where it doesn't look worse than an upscaling shader. I mean, the concept is great, but the technology is just now getting there with the 2nd version.

I think it could be super useful for things outside gaming as well, like all the at-home conference calls... Lower bitrates + DLSS could make meetings less laggy/choppy with better video quality, but again, this would be better if it were supported on all major hardware to actually be useful.

While RTX RT runs through DXR, the game devs will still need to do some programming/optimizing for AMD. Hopefully AMD does a good job on the toolkit and it's similar to Nvidia's, so devs actually use it.
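For what it's worth, DXR really is vendor-neutral at the API level: an engine queries the D3D12 device for ray tracing support with the same call regardless of whose GPU is installed. A minimal sketch, assuming an ID3D12Device created elsewhere and with error handling trimmed:

Code:
// Sketch: querying DXR support through plain D3D12. This is the same
// code path whether the GPU is Nvidia or AMD; which silicon does the
// ray tracing underneath is the driver's business, not the game's.
#include <d3d12.h>

bool SupportsDXR(ID3D12Device* device) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts5, sizeof(opts5))))
        return false;
    // TIER_1_0 or better means the driver exposes DXR at all.
    return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}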
 
I have actually begun to enjoy the more natural lighting with ray tracing versus without. I very much hope that RDNA 2 can match it.

That said, RDNA 2 is a tough sell for me because Ampere doesn't just offer better ray tracing. DLSS has proven its worth on my ultrawide and 4K TV. I prefer the higher frame rates and image quality that DLSS 2.0 provides vs AMD's TAA upscale with sharpening. Additionally, my actual workflow benefits from CUDA more than OpenCL. As much good as FreeSync has done, I have personally found it not to be as smooth as a monitor with a G-Sync module. Yes, I know someone will contest that statement, but this is what I have experienced, not them.

I will have some RDNA 2 products in my home though. I have an Xbox Series X on order, and plan to get a PS5 as well. I also have a FreeSync-only 4K TV that I normally use a 5700 XT on (currently in my main rig... wife stole the RTX 2080 LOL). However, I suspect I will likely stay in the Nvidia ecosystem.
 
Well, they were very intent on not stating the model number, and AMD said on the record that they didn't say which card it was, which you only say if it's NOT the top-end card. Plus AMD has an entire event for it, and they are not going to spoil the biggest card. Plus AMD always holds back a halo product for the last reveal. It may be Navi 21, but it is not the top-end card, and teasing your top-end card would be idiotic.

Well, they basically showed that whatever card that was wasn't quite as good as a 3080. So it's not like they are going to admit that it's their top-end card even if it was; that would sort of kill all the hype for it.
 
RT is basically PhysX at this point. If AMD comes in at 5% slower than 3080 with more VRAM, lower power consumption, and $100 cheaper they will have my business.
 
I probably won't buy any AMD card at this point. I want ray tracing. But I am looking forward to AMD getting some competition going.
 
Nah, I'll pass on RT. I don't play anything with it now, and probably won't play anything with it for this gpu generation.

I *might* check out Cyberpunk, but that's a very big "might".
 
If the performance is there and the price is right, then yes. I haven't liked anything I've seen from DLSS or ray tracing yet aside from demos; the performance hit is so awful it just isn't worth it. Being on a 144 Hz 1440p screen, I think Nvidia has a long way to go, and seeing the ray tracing performance was probably the most disappointing thing about the 3080. I expected a bit more over the 2080 Ti.
 
The problem for AMD at this point is that Nvidia already has a generation head start on Ray Tracing tech. Even if AMD is able to match Nvidia @ RT, it's not going to matter because virtually all current RT implementations are proprietary, and if that's not enough, Nvidia has DLSS, which is a game changer as far as game performance goes. AMD isn't going to be able to brute force their way ahead of DLSS.

The truth is that Big Navi has already lost, and it's not even out yet. It will be at least 1 more generation before AMD can be at parity with (or beat) Nvidia as far as RT and AI-based rendering goes.

Not necessarily. There have been discussions that they can convert CUs to Ray Tracing engines. Both Xbox and PS5 will be Navi 2 based so for many games the ray tracing may be optimized for an AMD solution. I just don't get the feeling RT has reached any level of "must have" with consumers at all.
 
Well, they basically showed that whatever card that was wasn't quite as good as a 3080. So it's not like they are going to admit that it's their top-end card even if it was; that would sort of kill all the hype for it.

It just seems that so many (maybe not you specifically) have built up this wall, refusing to even allow for the possibility that AMD could actually outperform Nvidia. They showed benchmarks that may or may not be higher or lower than the 3080, but it was in the ballpark. That is all AMD needed to do at this point, with all the AMD doubters just days before claiming it would only "maybe" compete with a 3070 and land around 2080 Ti performance, which ignored all reality. In addition, the 3090 is only 13% faster than the 3080, and neither scales great at 1440p, so if this is not the top card, AMD might be able to stand toe to toe with the 3090 at 4K, beat it outright at 1440p, and offer lower pricing and power usage. But I saw people today STILL insisting that Big Navi, even with the benchmarks, will only be 15% better than a 2080 Ti.
 
I believe that AMD will have great RT performance. While we have no true proof, except a snippet of BL3 during the Lisa reveal, we do have PS5 and Xbox Series X release footage of games with RT on, and they look way above 60 fps. Supposedly that is at 4K as well. Since those are consoles, I would expect the performance to be above that for the discrete cards we will be able to buy.

Right now, supply is in their favor. Using a refined node is key to getting more cards made. As with the bump in prices of the CPUs they announced, I expect a sizable bump in GPU prices if they prove the cards are better than their console variants.

I currently have a 2070 Super, and it easily OCs to just a hair under 2080 Super speeds (+1113 RAM, +112 MHz GPU). I was considering the RX 5700 XT, but with all the driver issues and even some compatibility issues with some games I play, I decided against it. I was tired of having to mess with a driver or adjust things on the card just to make it work. I wanted something that would plug and play, and Nvidia provided that experience for me. I hope AMD will learn and fix this ongoing driver issue with their cards, but it has been a couple of decades since that has happened.

I believe the vast majority of RT will be developed with consoles first in mind, lending to AMD performance. They might not be best out of the gate on older titles, but surely their hardware will look better for RT performance as time passes, as that is what developers will consider the standard.
 
I will give zero fucks.
I just tried the WoW prepatch with RT. Went from 120 fps (locked) to 66 fps at 4K. Got softer shadows and that's all (2080 Ti).
NOPE. I game at 4K/120 Hz; RT is too big of a hit for me to lose that much fps. But if DLSS is included (like in Cyberpunk) then I'd be a bit sad. But just a tiny lil bit.
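Putting those numbers in frame-time terms (just arithmetic on the fps figures above) shows what the RT pass actually costs per frame:

Code:
// Frame-time arithmetic for the WoW numbers above: converting fps to
// milliseconds per frame shows what the RT pass actually costs.
#include <cstdio>

int main() {
    double before = 1000.0 / 120.0; // ~8.3 ms per frame at 120 fps
    double after  = 1000.0 / 66.0;  // ~15.2 ms per frame at 66 fps
    std::printf("RT added ~%.1f ms per frame\n", after - before);
    // ~6.8 ms: more than 80% on top of the original frame budget,
    // which is why a locked 120 Hz target becomes unreachable.
}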
 
These arguments against RT remind me of when people dismissed 32-bit color, T&L engines, and unified shaders. We are in the very early generations of RT. Developers are embracing it. It's in the consoles now. Like it or not, it's here to stay. I suspect in 5 years, RT performance will be excellent, and in 10 years, it will be a requirement for gaming.
 
These arguments against RT remind me of when people dismissed 32-bit color, T&L engines, and unified shaders. We are in the very early generations of RT. Developers are embracing it. It's in the consoles now. Like it or not, it's here to stay. I suspect in 5 years, RT performance will be excellent, and in 10 years, it will be a requirement for gaming.

You may also remember plenty of people getting burned buying video cards with new standards that ended up as dead ends or unsupported. Buying a 3000 series card for ray tracing is just as silly as buying a 2000 series card for ray tracing was. RT is still in beta and will take a lot of time to become mainstream, and will not become commonplace until everyone has a card that can do it. It is going to take a long time to replace all those old low-end cards, and game designers won't support it much until then.
 
Video games have gotten really good over the decades at faking light; RT will need a killer app.

Yep, very few games use fully dynamic lighting. Most games use baked lighting which of course looks very good. This is what people are referring to when they say they can’t see a difference.

We’re simply used to static baked lighting and think it looks good enough. After a few years (decades?) of proper dynamic GI either from raytracing, SVOGI, light volumes etc we will look back on games from the past and think baked lighting looks off. This is how it has always worked. People just don’t know what they’re missing yet.

A simple example of this is any game that lets you use a flashlight or arbitrarily create/destroy light sources. Those types of games are guaranteed to look a whole lot better with raytracing because they can’t use baked lighting.
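The contrast is easy to show in code. A deliberately tiny toy sketch (not any engine's real shading path): baked lighting is a lookup into data computed offline, so a light the player moves at runtime can never change it, while a dynamic term is re-evaluated against the light's current position every frame.

Code:
// Toy contrast between baked and dynamic lighting (illustrative only).
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Baked: the result was computed offline and stored (a lightmap texel).
// Move the flashlight at runtime and this value never changes.
float shadeBaked(float lightmapTexel) { return lightmapTexel; }

// Dynamic: evaluated per frame against the light's *current* position.
// This is the part ray tracing generalizes (adding shadows and bounces).
float shadeDynamic(Vec3 n, Vec3 p, Vec3 lightPos, float intensity) {
    Vec3 toLight{lightPos.x - p.x, lightPos.y - p.y, lightPos.z - p.z};
    float len = std::sqrt(dot(toLight, toLight));
    Vec3 dir{toLight.x / len, toLight.y / len, toLight.z / len};
    float ndotl = std::fmax(0.0f, dot(n, dir));
    return intensity * ndotl / (len * len); // inverse-square falloff
}

int main() {
    Vec3 n{0, 1, 0}, p{0, 0, 0};
    std::printf("baked:   %.3f (fixed forever)\n", shadeBaked(0.42f));
    std::printf("dynamic: %.3f (tracks the light)\n",
                shadeDynamic(n, p, Vec3{0, 2, 0}, 10.0f));
}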
 
While RTX RT runs through DXR the game devs will still need to do some programming / optimizing for AMD. Hopefully AMD does a good job on the toolkit and it’s similar to AMD’s so devs actually use it.
That I understand... but that's not what the argument was; it was that "virtually all current RT implementations are proprietary". Optimizing is one thing, but he said they were proprietary, which means no compatibility, not just poor optimization; it means a completely new implementation would have to be written from scratch. Those are two very different concerns, and only one of them has truth to it ;). I agree there may be some tweaking required to get the most out of AMD hardware, but that doesn't make the current implementations proprietary with AMD having zero chance of getting supported... it just means it may run a bit slower than it could unless tweaked.

Anyways, as mentioned by others, I prefer RT for global illumination; the reflections most of the time end up overdone (like AMD's RT demo, OMG too many shiny objects, lol). RT GI seems to be a much smaller hit to performance, so for me, as long as RT performance is there and not a complete turd, it won't make too much of a difference in the grand scheme of things.
 
Yep, very few games use fully dynamic lighting. Most games use baked lighting which of course looks very good. This is what people are referring to when they say they can’t see a difference.

We’re simply used to static baked lighting and think it looks good enough. After a few years (decades?) of proper dynamic GI either from raytracing, SVOGI, light volumes etc we will look back on games from the past and think baked lighting looks off. This is how it has always worked. People just don’t know what they’re missing yet.

A simple example of this is any game that lets you use a flashlight or arbitrarily create/destroy light sources. Those types of games are guaranteed to look a whole lot better with raytracing because they can’t use baked lighting.
While "very few games" is very relative, there are plenty of games that implemented SVOgi (sparse voxel octree global illumination) and others are using SDFgi (signed distance field global illumination) both of which provide dynamic GI (with SDFgi being a bit newer), then there are also some engines starting to implement GI via voxel cone tracing as well (which is similar to how it's done in hardware, but it was in written using shaders). These aren't just baked lighting techniques like they used to be 5-6 years ago. I'm not saying all games, or that they all have done a perfect implementation of GI because that's of course not true, but RTX isn't a perfect implementation either, it would be great to see a side by side with one of the newer techniques vs RT to compare visuals and performance costs, but most games that go RT for GI don't bother to implement a good version of GI without RT, so the comparisons tend to be apples to oranges (aka you're comparing GI technique to a scene with ambient lighting instead of 2 different types of dynamic GI implementations). All of this said, RT hardware can/will accelerate some of these techniques and I'd love to see some actual #'s to know how much the RT hardware helps or doesn't help. Raytracing is going to be the path forward eventually, but you seem to think that RT hardware is required and all other games use some crappy half baked hack (probably because a lot of games do!!!), but that isn't true for all games/engines, some have decent implementations that are similar to what you are seeing with RT.
 
Think of it like this.

NV gets its fps now from DLSS trickery. Raster performance without DLSS is meh.

AMD doesn't have DLSS, and it appears set to trade blows anyway. This tells me AMD is a higher raw-performance chip. No trickery like Mr. Leather.
 
Yep, very few games use fully dynamic lighting. Most games use baked lighting which of course looks very good. This is what people are referring to when they say they can’t see a difference.

We’re simply used to static baked lighting and think it looks good enough. After a few years (decades?) of proper dynamic GI either from raytracing, SVOGI, light volumes etc we will look back on games from the past and think baked lighting looks off. This is how it has always worked. People just don’t know what they’re missing yet.

A simple example of this is any game that lets you use a flashlight or arbitrarily create/destroy light sources. Those types of games are guaranteed to look a whole lot better with raytracing because they can’t use baked lighting.
Very astute - and one of the driving factors for RT is to cut down on the development costs of baking processes in modern games. Us (finally) getting user-side RT is not a consumer-focused decision - rather, the industry is driving the need because of the high cost associated with the current baking workarounds.

In other words - if they can offload costs onto the consumer and have the work done on the local machines, they save money.

Still - and eventually - we will be the beneficiaries, but at the current tech level and deployment, baked lighting isn't going anywhere for a long while.
 
More important to me is 4K raster performance. RT is not important for me at this stage, though with next-gen cards (Hopper, RDNA 3) things should look more interesting, with more RT titles and more capable HW to run RT @ 4K.
 
You may also remember plenty of people getting burned buying video cards with new standards that ended up as dead ends or unsupported. Buying a 3000 series card for ray tracing is just as silly as buying a 2000 series card for ray tracing was. RT is still in beta and will take a lot of time to become mainstream, and will not become commonplace until everyone has a card that can do it. It is going to take a long time to replace all those old low-end cards, and game designers won't support it much until then.

Within 5 years there will be 100 million or more "cards" that can ray trace, thanks to the consoles. In addition, every single RDNA 2 and Ampere GPU announced so far has it.
 
While "very few games" is very relative, there are plenty of games that implemented SVOgi (sparse voxel octree global illumination) and others are using SDFgi (signed distance field global illumination) both of which provide dynamic GI (with SDFgi being a bit newer), then there are also some engines starting to implement GI via voxel cone tracing as well (which is similar to how it's done in hardware.

I'm pretty sure there are more games implementing hardware accelerated RT today than games using light volumes or voxel tracing. Which is ironic given SVOGI has been around for over a decade.

I know of these games. What are the others?

Crysis 3 - one bounce SVOGI, only works for the sun.
Kingdom Come Deliverance - SVOGI, lots of artifacts.
Star Citizen - SDF
Forza 3 - VXGI
 
I went all in with a G-Sync monitor, so I think I'm stuck with Nvidia and its craptastic inventory. Hoping to score a 3080 in the near future...
 
It's going to be a long time before RTX actually takes off, and I don't buy thousands of dollars of equipment just for niche features.
 
I like RT. I'd pay more for it. Now, if AMD could offer similar raster performance for $150-$200 less? I might consider that.
 
Nope. Only played one RT game (Battlefield V). It was cool, I guess. Couldn't care less if my game has RT or not.
 
I'm pretty sure there are more games implementing hardware accelerated RT today than games using light volumes or voxel tracing. Which is ironic given SVOGI has been around for over a decade.

I know of these games. What are the others?

Crysis 3 - one bounce SVOGI, only works for the sun.
Kingdom Come Deliverance - SVOGI, lots of artifacts.
Star Citizen - SDF
Forza 3 - VXGI
I do know Unreal Engine supports SVOGI and is adding support for SDFGI (UE5 in early 2021), while Godot (version 4?) has SDFGI support as well. Before adding SDFGI they had something else they used (some sort of GI probes as well as some voxel cone tracing). Again, I'm not positive which games actually support and/or use these offhand, as I don't do a ton of game playing. Like I said, it would be great to find a decent implementation and have a comparison to RT. I'm sure RT will be better (either performance or visuals, or possibly both), but I don't think the difference will be that big (again, just speculation, as I don't know of any good comparisons, unless you know of any?). I wasn't meaning to argue, just saying that I do know there are some other options; I'm just not sure how widely used they are.
 
Yes I will care. Back in the old days Radeon cards had superior image quality due to better anisotropic filtering and better VGA out (among other things) and I would pick them even when geforce cards were faster with lower IQ. This is no different - I want to play new games with eye candy on - DLSS and RT where available. AMD needs to catch up.
 
I do know Unreal Engine supports SVOGI and is adding support for SDFGI (UE5 in early 2021), while Godot (version 4?) has SDFGI support as well. Before adding SDFGI they had something else they used (some sort of GI probes as well as some voxel cone tracing). Again, I'm not positive which games actually support and/or use these offhand, as I don't do a ton of game playing. Like I said, it would be great to find a decent implementation and have a comparison to RT. I'm sure RT will be better (either performance or visuals, or possibly both), but I don't think the difference will be that big (again, just speculation, as I don't know of any good comparisons, unless you know of any?). I wasn't meaning to argue, just saying that I do know there are some other options; I'm just not sure how widely used they are.

Yeah, there is a lot of engine support, but it hasn't shown up in games. I can only assume it's either harder to implement than promised or it's still too slow.
 
You may also remember plenty of people getting burned buying video cards with new standards that ended up as dead ends or unsupported. Buying a 3000 series card for ray tracing is just as silly as buying a 2000 series card for ray tracing was. RT is still in beta and will take a lot of time to become mainstream, and will not become commonplace until everyone has a card that can do it. It is going to take a long time to replace all those old low-end cards, and game designers won't support it much until then.

Exactly. I'm not buying a card for 5 years from now. I'm buying a card for right now, and right now, RT/DLSS, etc. are gimmicky at best.
 
Exactly. I'm not buying a card for 5 years from now. I'm buying a card for right now, and right now, RT/DLSS, etc. are gimmicky at best.

I would disagree; RTX when implemented well is not a gimmick, but that is only going to happen in a small handful of titles.

DLSS, however, is more of a gimmick: it spoofs a higher resolution, and frankly, in my experience it is not better than running native. It's a tool to get extra fps at the expense of eye candy; people keep thinking they can get something for free, which is a fallacy.
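The "nothing for free" point is really just pixel arithmetic: upscalers shade far fewer pixels and reconstruct the rest, which is exactly where the extra fps comes from. A quick sketch (the per-axis scale factors are the commonly cited DLSS mode values, an assumption on my part, not something from this thread):

Code:
// Pixel-count arithmetic behind DLSS-style upscaling: the fps gain
// comes from shading fewer pixels. The per-axis scale factors are the
// commonly cited DLSS mode values (an assumption, not from this thread).
#include <cstdio>

int main() {
    const double outW = 3840, outH = 2160;         // native 4K target
    const double scale[] = {1.0, 2.0 / 3.0, 0.5};  // native, Quality, Performance
    const char* name[]   = {"native", "Quality", "Performance"};
    for (int i = 0; i < 3; ++i) {
        double shaded = (outW * scale[i]) * (outH * scale[i]);
        std::printf("%-12s shades %5.1fM pixels (%.2fx fewer than native)\n",
                    name[i], shaded / 1e6, (outW * outH) / shaded);
    }
}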
 
These arguments against RT remind me of when people dismissed 32-bit color, T&L engines, and unified shaders. We are in the very early generations of RT. Developers are embracing it. It's in the consoles now. Like it or not, it's here to stay. I suspect in 5 years, RT performance will be excellent, and in 10 years, it will be a requirement for gaming.
But that's not an argument for Ampere because it still sucks at ray tracing. It's maybe an argument for future generations of Nvidia cards. Unless you are saying there is some sentimental good in being part of investing in Nvidia's future.
 
I would disagree; RTX when implemented well is not a gimmick, but that is only going to happen in a small handful of titles.

DLSS, however, is more of a gimmick: it spoofs a higher resolution, and frankly, in my experience it is not better than running native. It's a tool to get extra fps at the expense of eye candy; people keep thinking they can get something for free, which is a fallacy.

That's what I meant by gimmick. It is touted as a feature, but is rarely available in practice. The only RT game I played was Shadow of the Tomb Raider, and it was somewhat lackluster for RT. I think in 5 years the adoption rate will be higher and the effects better.
 
RT is still in beta and will take a lot of time to become mainstream, and will not become commonplace until everyone has a card that can do it. It is going to take a long time to replace all those old low-end cards, and game designers won't support it much until then.
For AAA games, how much do developers care about PC video cards versus what the Xbox/PS5 offer? I could imagine that for a game with a 2021-or-later release date in mind, they will partly assume players have PS5/Xbox-level capability (not that they will drop PS4/Xbox One support right away, but they will at least have the newer consoles in mind).
 
But that's not an argument for Ampere because it still sucks at ray tracing. It's maybe an argument for future generations of Nvidia cards. Unless you are saying there is some sentimental good in being part of investing in Nvidia's future.

I wasn't really arguing for Nvidia or against them. I'm stating that people dismissed a lot of new techs that are now standard because they weren't very well implemented at first, and this situation reminds me of that. While in the immediate future you might not need it, there will come a time when you will. With consoles embracing it, and both manufacturers supporting it, like it or not this is happening. So to refer back to my earlier post, I think I will stick with Ampere because DLSS looks better (at least to my eye) than TAA with sharpening. I really have no loyalty to any particular company. I get the parts that suit my needs and tastes. My four main systems in my house are equally split between AMD Radeon and Nvidia, depending on what worked best in each situation.

To emphasize my sig: I’m very discrete. I have no code of brand loyalty. I will build anything, anywhere; Gaming PC, Workstation, HTPC, doesn’t matter. I just love building. Wubba lubba dub dub!!!! ;)
 
I wasn't really arguing for Nvidia or against them. I'm stating that people dismissed a lot of new techs that are now standard because they weren't very well implemented at first, and this situation reminds me of that. While in the immediate future you might not need it, there will come a time when you will. With consoles embracing it, and both manufacturers supporting it, like it or not this is happening. So to refer back to my earlier post, I think I will stick with Ampere because DLSS looks better (at least to my eye) than TAA with sharpening. I really have no loyalty to any particular company. I get the parts that suit my needs and tastes. My four main systems in my house are equally split between AMD Radeon and Nvidia, depending on what worked best in each situation.

To emphasize my sig: I’m very discrete. I have no code of brand loyalty. I will build anything, anywhere; Gaming PC, Workstation, HTPC, doesn’t matter. I just love building. Wubba lubba dub dub!!!! ;)

So basically off-topic commentary then.
 
I wasn't really arguing for Nvidia or against them. I'm stating that people dismissed a lot of new techs that are now standard because they weren't very well implemented at first, and this situation reminds me of that. While in the immediate future you might not need it, there will come a time when you will. With consoles embracing it, and both manufacturers supporting it, like it or not this is happening. So to refer back to my earlier post, I think I will stick with Ampere because DLSS looks better (at least to my eye) than TAA with sharpening. I really have no loyalty to any particular company. I get the parts that suit my needs and tastes. My four main systems in my house are equally split between AMD Radeon and Nvidia, depending on what worked best in each situation.

To emphasize my sig: I’m very discrete. I have no code of brand loyalty. I will build anything, anywhere; Gaming PC, Workstation, HTPC, doesn’t matter. I just love building. Wubba lubba dub dub!!!! ;)

It certainly seems like RT will be one of those features that's just expected, even if it's limited today.

But as far as DLSS goes, the game selection is so small, and the list of games that were announced to support it and still don't is the same size or bigger. I wonder what games people are playing that make it a must-have.

Completely honest question, not trying to be a dick. Which of these game(s) is the one that makes it a gotta have feature?

Fortnite
Death Stranding
F1 2020
Final Fantasy XV
Anthem
Battlefield V
Monster Hunter: World
Shadow of the Tomb Raider
Metro Exodus
Control
Deliver Us The Moon
Wolfenstein Youngblood
Bright Memory
Mechwarrior V: Mercenaries
 