If Navi matches 3080 in Raster performance but not Ray Tracing... will we care?

It certainly seems like RT will be one of those features that's just expected, even if it's limited today.

But as far as DLSS goes, the selection of games is so small. In addition, the list of games that were announced to support it and still don't is the same size or bigger. I wonder what games people are playing that make it a must-have.

Completely honest question, not trying to be a dick: which of these games is the one that makes it a gotta-have feature?

Fortnite
Death Stranding
F1 2020
Final Fantasy XV
Anthem
Battlefield V
Monster Hunter: World
Shadow of the Tomb Raider
Metro Exodus
Control
Deliver Us The Moon
Wolfenstein Youngblood
Bright Memory
Mechwarrior V: Mercenaries


None. If Navi is within 10% of the 3080 I'll buy it even if it's 10% more expensive. I'll be happy to have an nvidia-free build. That said...we need some leaked benches, dammittohell!
 
Nope, won't care. Maybe in 5+ years when everything has it, not just a handful of games.

Yeah, once it is mainstream and supported, then it is worth buying a card for RT. Anything before that, you're doing beta testing and your mileage may vary. When all the games support it, and all the cards can do RT with no issues, then it is safe to buy a card for RT.

What happens if, 2 years from now, some massive breakthrough in RT happens that makes it work great, but is no longer compatible with the old cards' way of doing things? You are now stuck with a card that is useless for what it was bought for, and now it's time to pay up and get out of the beta test.
 
Yeah once it is mainstream
Considering the next generation of consoles is releasing soon, couldn't that be really soon?

The list of PS5 games with ray tracing, or at least visually recognizable ray tracing in their trailers, includes Demon's Souls, Gran Turismo 7, Marvel's Spider-Man: Miles Morales, Pragmata, Ratchet and Clank, and Stray.

I imagine how much it is used, and for what, could keep your statement true too; maybe it will be used by a high percentage of titles, but not that heavily within them.

That said, because those will be RDNA 2 ray tracing anyway, it keeps the argument reasonable that ray tracing shouldn't be a significant reason for choosing your next card between the 30xx series and AMD. We'll see when the reviews come out.
 
So basically off-topic commentary then.

On topic in a way that led to further explanation... though technically I'm off topic by replying to this. ;)

I also wanted to emphasize that I have no favorite in this race (i.e., I'm not a fanboy) and that it was my personal opinion based on my use case, as the topic asked.
 
Well, technically I don't think it's AMD's first try... RT can be done on Navi, albeit in software mode... and I think in preparing Navi 2 for the consoles they have a pretty good idea of its performance, as well as time to tweak it. But this whole thing does remind me of the DX11 58xx series, when AMD first came out with hardware tessellation and the performance was terrible since it was their first go at it.
Vega actually has RT hardware in it. But something about it didn't work out in production. So, it's disabled.

NVidia Broadcast is by far the best thing to come from Nvidia's AI stuff. And it just released.

DLSS is neat. But still needs another round of polish (at least, it does for Death Stranding). However, I am hopeful.
 
It certainly seems like RT will be one of those features that's just expected, even if it's limited today.

But as far as DLSS goes, the selection of games is so small. In addition, the list of games that were announced to support it and still don't is the same size or bigger. I wonder what games people are playing that make it a must-have.

Completely honest question, not trying to be a dick: which of these games is the one that makes it a gotta-have feature?

Fortnite
Death Stranding
F1 2020
Final Fantasy XV
Anthem
Battlefield V
Monster Hunter: World
Shadow of the Tomb Raider
Metro Exodus
Control
Deliver Us The Moon
Wolfenstein Youngblood
Bright Memory
Mechwarrior V: Mercenaries


Death Stranding, FFXV, BFV (Battlefield Junkie), Shadow of Tomb Raider, Metro Exodus, Control (my wife as well on this one), MWV (though I admit I wasn't aware of DLSS support as I haven't started the campaign yet). My schedule for the last year or so has been extremely hectic, so I have a lot of catching up to do. But my use case is not everyone else's use case.

I'm playing on an ultrawide high refresh rate display with Free/G-Sync, or a 4K TV. I'm an Ultra settings junkie who wants a minimum of 60 FPS. DLSS has its place for me there. Additionally, if all things were equal between an upcoming Radeon and GeForce card (price/performance/RT performance), but GeForce included DLSS, I would choose the GeForce card.
 
That's what I meant by gimmick. It is touted as a feature, but is rarely available in practice. The only RT game I played was Shadow of the Tomb Raider, and it was somewhat lackluster for RT. I think in 5 years the adoption rate will be higher and the effects better.

I personally don't believe gimmick is the right term. Lack of availability/use as you just stated is a better term. We'll see DLSS and RT more as time goes on just as we did other new 3D effects in the past. I always say it has to start somewhere and the start is always the weakest link. Eventually it will work out fine.
 
Death Stranding, FFXV, BFV (Battlefield Junkie), Shadow of Tomb Raider, Metro Exodus, Control (my wife as well on this one), MWV (though I admit I wasn't aware of DLSS support as I haven't started the campaign yet). My schedule for the last year or so has been extremely hectic, so I have a lot of catching up to do. But my use case is not everyone else's use case.

I'm playing on an ultrawide high refresh rate display with Free/G-Sync, or a 4K TV. I'm an Ultra settings junkie who wants a minimum of 60 FPS. DLSS has its place for me there. Additionally, if all things were equal between an upcoming Radeon and GeForce card (price/performance/RT performance), but GeForce included DLSS, I would choose the GeForce card.

Good reply, thanks. I have two panels, 1440p/144Hz and a 4K/60. I put the 4K one away when I was playing PUBG, and now I'm attached to the high refresh rate. I only own one (SOTTR) of the games on that list; the rest are a mix of not interested/Epic exclusives. I can't say I had any issues playing that one title and getting decent FPS/quality out of it, but with most of them being SP titles, I'd have to see a lot more support for it to be a thing. Making a hardware decision based on fewer than 15 SP games, I just can't square that in my head.

But like you said, different use case for different people. (y)
 
Frankly, until we see how DX12 RT, PS5 and Xbox SX do raytracing on AMD hardware - it's impossible to compare em.

I fully expect that most things, especially titles that will be on consoles, are gonna move towards non hardware locked raytracing.

I've said this before, but the single biggest title I care about raytracing on is Minecraft, so that will influence me quite a bit.

Better rasterization performance can overtake and is more important than any amount of DLSS IMO.
 
Frankly, until we see how DX12 RT, PS5 and Xbox SX do raytracing on AMD hardware - it's impossible to compare em.

I fully expect that most things, especially titles that will be on consoles, are gonna move towards non hardware locked raytracing.

I've said this before, but the single biggest title I care about raytracing on is Minecraft, so that will influence me quite a bit.

Better rasterization performance can overtake and is more important than any amount of DLSS IMO.

Well - I think that's the point that folks are missing. The good folks at Sony and MicroSquish have invested a penny or two and have an interest in making sure the durned things work. Smart folks with access to a whole lot more resources than YouTube reviewers have been getting reference cards - probably for a loong time now - and providing feedback that can't be ignored. Even smaller players, like Epic, have gotten to give feedback on why they want client-side RT to work... because it will save hundreds of thousands in bake time, so I really doubt that AMD is 'guessing in the dark' here. However - the super strict NDAs between these giants are probably why we're not seeing the hype and leaks that preceded Ampere. I think it's a fair bet that Big Navi will be capable... even if the final YouTubers' benches show they're still playing catch-up to Nvidia.

And that's the thing; aside from one fanboy or another of either card... if the real-world performance of both cards is 'capable' then we as consumers win. And even then a 7 frame lead, while it looks good on a bench graph, is frankly meaningless to gamers.

I wrote this in a different thread, but think it applicable to this one:

"I think this [is] the biggest takeaway for me during this 3080 debacle. Unless you are determined to game at 4k, and have the screen, Mobo CPU and RAM to support it - why buy [Ampere]? **​
I think people are so enthused by the potential of the card - and get blown away by the numbers - that they're losing perspective on the context.​
From what I can tell, the greatest benefit of the 3k cards is making 4k gaming a realistic goal - where you are not sacrificing playable /competitive frames for high pixels. The real crux of this is that to really need 4k, you should also be running screens larger than 27" (where 1440 is probably perfect). The irony is that 32 inch monitors are not all that great atm. and anything larger is probably a TV.​
So the 3080 (etc) and the next Gen consoles are kicking open the door for RT and large screen, high pixel and high refresh rate monitors - but industry hasn't shown up yet with the products that can really show off the tech.
Hopefully that changes soon - because if these forums are any indication... The demand is there (presuming people don't get suckered into buying a 24 inch 4k - which to me sounds like playing at 1080p on a 15 inch laptop!)​
** 1440 players will benefit for sure - but 1080 monitor owners should take a pass."​
 
I would say we do not know enough about AMD raytracing performance nor if developers can effectively use it. I don't think the older games using DXR 1.0 will be able to fully use AMD raytracing effectively.
We know PLENTY about AMD ray tracing performance. OK, we know next to nothing, but what we do know is the only thing we will have to know.

What we do know is that AMD will set the base standard for raytracing because both next gen consoles will support ray tracing and AMD is the chip in both of them. So there are really only so many options: RT flops, it sucks on console, and devs just don't bother for the next 5 years; or RT is used on console and PC does it better. If PC does it better, it's a pretty safe bet that Nvidia will be faster at it given they have a head start.

Outside of the epeen competition, I highly suspect that AMD cards will be satisfying for raytracing on pc, maybe not at 4k, but I suspect that will have more to do with DLSS than nvidia's raw raytracing performance.
Well, they were very intent on not stating the model number, and AMD said on the record that they didn't say which card it was, which you only say if it's NOT the top-end card. Plus AMD has an entire event for it, and they are not going to spoil the biggest card. Plus AMD always holds back a halo product for the last reveal. It may be Navi 21, but it is not the top-end card. And teasing your top-end card would be idiotic.

You tease flattering numbers. Unless they have a huge upset where their top-end card just kicks all kinds of ass, those leaked numbers are from their top-end card. Recent history suggests their card will not kick all kinds of ass, especially since we already know it is one big hunk of silicon rather than some clever chiplet design. We can all just be happy that it looks like it will be competitive. Personally, I'll be very happy with AMD having an awesome CPU but still having to work for it in the GPU arena. When you can sit on top being comfy, your product goes to shit. At best, when that happens you get the 2080 release, where you were charged out the ass for what amounted to a proof of concept for compelling technologies.
 
We know PLENTY about AMD ray tracing performance. OK, we know next to nothing, but what we do know is the only thing we will have to know.

What we do know is that AMD will set the base standard for raytracing because both next gen consoles will support ray tracing and AMD is the chip in both of them. So there are really only so many options: RT flops, it sucks on console, and devs just don't bother for the next 5 years; or RT is used on console and PC does it better. If PC does it better, it's a pretty safe bet that Nvidia will be faster at it given they have a head start.

Outside of the epeen competition, I highly suspect that AMD cards will be satisfying for raytracing on pc, maybe not at 4k, but I suspect that will have more to do with DLSS than nvidia's raw raytracing performance.


You tease flattering numbers. Unless they have a huge upset where their top-end card just kicks all kinds of ass, those leaked numbers are from their top-end card. Recent history suggests their card will not kick all kinds of ass, especially since we already know it is one big hunk of silicon rather than some clever chiplet design. We can all just be happy that it looks like it will be competitive. Personally, I'll be very happy with AMD having an awesome CPU but still having to work for it in the GPU arena. When you can sit on top being comfy, your product goes to shit. At best, when that happens you get the 2080 release, where you were charged out the ass for what amounted to a proof of concept for compelling technologies.
I am interested in many aspects of RDNA2 cards. Gaming performance, yes; compute performance, yes; will AMD expand ProRender to use the HW capability of RDNA2, as in very soon? AMD's software-side support for compute-type workloads and professional applications. As for the three games whose performance was revealed, that says not much about overall game performance, RT, price, availability, drivers, etc. Nvidia released a full product that works now, with a few hiccups which look mostly resolved; the problem is folks not being able to get them in a reasonable or easy enough manner. For AMD, it is about more than just game performance; at least for me, that is the gist. Game RT is not as important, let's say, as RT support in professional programs. If AMD releases a Frontier-like card for less than $1000, without the BS driver hell of that card (works great now, but with a registry hack to get the gaming driver to work), with 32 GB, and it performs between the 3080/90 or better than the 3090 => not sure what the hell Nvidia could do with their upper lineup. AMD would have pwned them so hard, Jensen's jacket would turn into rabbit fur.
 
We know PLENTY about AMD ray tracing performance. OK, we know next to nothing, but what we do know is the only thing we will have to know.

What we do know is that AMD will set the base standard for raytracing because both next gen consoles will support ray tracing and AMD is the chip in both of them. So there are really only so many options: RT flops, it sucks on console, and devs just don't bother for the next 5 years; or RT is used on console and PC does it better. If PC does it better, it's a pretty safe bet that Nvidia will be faster at it given they have a head start.

Outside of the epeen competition, I highly suspect that AMD cards will be satisfying for raytracing on pc, maybe not at 4k, but I suspect that will have more to do with DLSS than nvidia's raw raytracing performance.


You tease flattering numbers. Unless they have a huge upset where their top-end card just kicks all kinds of ass, those leaked numbers are from their top-end card. Recent history suggests their card will not kick all kinds of ass, especially since we already know it is one big hunk of silicon rather than some clever chiplet design. We can all just be happy that it looks like it will be competitive. Personally, I'll be very happy with AMD having an awesome CPU but still having to work for it in the GPU arena. When you can sit on top being comfy, your product goes to shit. At best, when that happens you get the 2080 release, where you were charged out the ass for what amounted to a proof of concept for compelling technologies.
Well, I guess one of us will be right. It just seems that so many like you can't get past their previous history under Raja and refuse to believe that they will be able to make the leap that everything we have seen suggests. There are still some Nvidia fanboys insisting, despite the benchmarks, that the benchmarks are fake and it will only compete with the 3070, which is just ludicrous.
 
Nope, don't care about RT really. Will be concerned if games like Cyberpunk gimp non-Nvidia cards though. Otherwise, if Big Navi is within spitting distance of the 3080 with better power usage, then it's a no brainer and I would have no problem replacing my aging GTX 980 with a 6900 XT/XTX or whatever they end up calling it.

Though that also assumes no major driver problems as well.

To me Ray Tracing is still in its infancy and I haven't seen too many compelling implementations of it that make it a must have feature at this stage. Certainly a nice to have, but not my end goal in buying a new card.
 
You may also remember plenty of people getting burned buying video cards with new standards that ended up as dead ends or unsupported. Buying a 3000 series card for ray tracing is just as silly as the people that bought the 2000 series cards for ray tracing. RT is still in beta and will take a lot of time to become mainstream, and will not become commonplace until everyone has a card that can do it. It is going to take a long time to replace all those old low-end cards, and game designers won't support it much until then.
Yes, new tech takes a while to become mainstream, and, as you may already realize (as I do), with RT being on both consoles, this takes it mainstream much faster. If a console player has a desire to play a PC game with RT that is not on consoles, he/she will opt for what they're more familiar with, i.e., RT on graphics cards. I honestly cannot wait to see what is going to happen in the interim and long term!
 
Nope, don't care about RT really. Will be concerned if games like Cyberpunk gimp non-Nvidia cards though. Otherwise, if Big Navi is within spitting distance of the 3080 with better power usage, then it's a no brainer and I would have no problem replacing my aging GTX 980 with a 6900 XT/XTX or whatever they end up calling it.

Though that also assumes no major driver problems as well.

To me Ray Tracing is still in its infancy and I haven't seen too many compelling implementations of it that make it a must have feature at this stage. Certainly a nice to have, but not my end goal in buying a new card.
Looks like Cyberpunk will play great even on a 5700 XT without RT. It only falls apart, from the beta tests, when RT is used. Will the added eye candy be a game changer? Like furry animals in The Witcher 3, probably not.
 
Looks like Cyberpunk will play great even on a 5700 XT without RT. It only falls apart, from the beta tests, when RT is used. Will the added eye candy be a game changer? Like furry animals in The Witcher 3, probably not.
Good to hear. Then I am not worried about it at all. If Big Navi is cheaper, more power efficient, and trades blows or is close enough to the 3080 then it is a clear winner to me.
 
What we do know is that AMD will set the base standard for raytracing because both next gen consoles will support ray tracing and AMD is the chip in both of them.

The nice thing is that even if consoles suck at raytracing and only use it sparingly it's still a huge win for PCs. Whether you're casting a few rays or tons of rays you still need to add support for raytracing in your render pipeline. Once it's already in there it's much easier to scale up the number of rays or add new raytraced based effects for the PC version of the game. Consoles getting raytracing hardware is the best thing to happen to raytracing on the PC.
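To make the "scaling up is easy once it's in the pipeline" point concrete, here's a minimal sketch (hypothetical struct and preset names, invented numbers, not any engine's actual code): adding the ray tracing pass is the hard part; once it exists, a console build and a PC build can share it and simply dial different ray budgets or flip on an extra effect.

```cpp
#include <cstdint>
#include <cstdio>

// Hypothetical per-platform ray budget (illustrative only).
struct RayTracingPreset {
    uint32_t shadowRaysPerPixel;
    uint32_t reflectionRaysPerPixel;
    bool     enableGlobalIllumination;
};

// A console build might ship the sparing preset; a PC "ultra" build just picks
// bigger numbers and enables one more effect -- the RT pass itself is unchanged.
constexpr RayTracingPreset kConsolePreset{1, 1, false};
constexpr RayTracingPreset kPcUltraPreset{2, 4, true};

void renderFrame(const RayTracingPreset& rt) {
    // Placeholder for the actual passes; the point is the same code path runs
    // regardless of how many rays the preset asks for.
    std::printf("shadow rays: %u, reflection rays: %u, RTGI: %s\n",
                rt.shadowRaysPerPixel, rt.reflectionRaysPerPixel,
                rt.enableGlobalIllumination ? "on" : "off");
}

int main() {
    renderFrame(kConsolePreset); // console-style sparing use of RT
    renderFrame(kPcUltraPreset); // scaled-up PC version of the same pipeline
}
```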
 
Well, I guess one of us will be right. It just seems that so many like you can't get past their previous history under Raja and refuse to believe that they will be able to make the leap that everything we have seen suggests. There are still some Nvidia fanboys insisting, despite the benchmarks, that the benchmarks are fake and it will only compete with the 3070, which is just ludicrous.

It could kick ass at RT. But I doubt it. Regardless of the performance of either team green or team red in the PC world, RT is going to be defined by the consoles and that's that for the vast majority of games. Team red will be good enough, or RT is toast for 5 years because console.

I'm pretty certain that 4k rasterized performance will be good. Mainly because sony and MS would have obliterated them if the console chips were failures, and they are going to do some form of reasonably ok looking 4k at some usable frame rate, AND the stand alone cards are going to be beefier parts.

Honestly, I think lots of people will see it as an upset if Navi is even just solidly competitive. Heck, they could launch it, have it beat the 3080 by a small but respectable margin, but be a much more efficient mining card, and then it won't matter one bit in the gaming arena, as AMD's lower production volume plus miner demand will make it nonexistent for pretty much its whole life span.


I am interested in many aspects of RDNA2 cards. Gaming performance, yes; compute performance, yes; will AMD expand ProRender to use the HW capability of RDNA2, as in very soon? AMD's software-side support for compute-type workloads and professional applications. As for the three games whose performance was revealed, that says not much about overall game performance, RT, price, availability, drivers, etc. Nvidia released a full product that works now, with a few hiccups which look mostly resolved; the problem is folks not being able to get them in a reasonable or easy enough manner. For AMD, it is about more than just game performance; at least for me, that is the gist. Game RT is not as important, let's say, as RT support in professional programs. If AMD releases a Frontier-like card for less than $1000, without the BS driver hell of that card (works great now, but with a registry hack to get the gaming driver to work), with 32 GB, and it performs between the 3080/90 or better than the 3090 => not sure what the hell Nvidia could do with their upper lineup. AMD would have pwned them so hard, Jensen's jacket would turn into rabbit fur.

Your interests are much more broad than most. Which makes you a niche user. While I do see the potential for this gen of cards to be meaningful in big ways to several types of content creation, I don't know that it makes up an appreciable market. Except maybe the possibility for data center tensor compute. I suspect NV will have something interesting there that AMD won't be able to compete with solidly just based on NV's recent acquisitions. Dominance there won't be determined by a single chip's architecture and performance I'm guessing.
 
The nice thing is that even if consoles suck at raytracing and only use it sparingly it's still a huge win for PCs. Whether you're casting a few rays or tons of rays you still need to add support for raytracing in your render pipeline. Once it's already in there it's much easier to scale up the number of rays or add new raytraced based effects for the PC version of the game. Consoles getting raytracing hardware is the best thing to happen to raytracing on the PC.

Absolutely. That is why I'm inclined to think RT is not going to be a flop. It probably won't change the face of gaming, but it'll be a nice to have eye candy feature.
 
What happens if, 2 years from now, some massive breakthrough in RT happens that makes it work great, but is no longer compatible with the old cards' way of doing things? You are now stuck with a card that is useless for what it was bought for, and now it's time to pay up and get out of the beta test.
If you feel that you need the extra performance then you buy a new card? It's called the early adopter tax for a reason. My 2080ti has been more than worth it as it has given me top tier performance and allowed me to play with the early ray traced and DLSS titles for 1+ years now (even if it's just been a handful of games).

By the same token what if RDNA2's ray tracing performance ends up being weaker from the get go and never substantially improves while Ampere, or even Turing, is better? You could also just as easily ask the question, "What if DLSS really takes off"? All of a sudden a cheap NV GPU will perform as well as a more expensive AMD GPU. Is it worth it to save a few bucks if it has similar raster performance but no DLSS alternative, no NV Broadcast alternative, no NVENC alternative, etc? Sure to some buyers a 5-10% price difference could be a deciding factor but most buyers will pay a little more for something that is feature complete.

I've been making use of Broadcast's features in Discord; the background noise cancellation and "camera man" features are pretty incredible when my friends and I are streaming together. When I'm in a web meeting it looks professional as hell and my coworkers comment on it nearly every time. It's damn near sci-fi future tech which kind of makes AMD's, "Hey our GPUs can play Battlefield" ring pretty hollow. As such I hope AMD announces much more than some minute raster performance differences. I'm glad we're only a couple of weeks away from knowing for sure.
 
If you feel that you need the extra performance then you buy a new card? It's called the early adopter tax for a reason. My 2080ti has been more than worth it as it has given me top tier performance and allowed me to play with the early ray traced and DLSS titles for 1+ years now (even if it's just been a handful of games).

By the same token what if RDNA2's ray tracing performance ends up being weaker from the get go and never substantially improves while Ampere, or even Turing, is better? You could also just as easily ask the question, "What if DLSS really takes off"? All of a sudden a cheap NV GPU will perform as well as a more expensive AMD GPU. Is it worth it to save a few bucks if it has similar raster performance but no DLSS alternative, no NV Broadcast alternative, no NVENC alternative, etc? Sure to some buyers a 5-10% price difference could be a deciding factor but most buyers will pay a little more for something that is feature complete.

I've been making use of Broadcast's features in Discord; the background noise cancellation and "camera man" features are pretty incredible. When I'm in a web meeting it looks professional as hell and my coworkers comment on it nearly every time. It's damn near sci-fi future tech which kind of makes AMD's, "Hey our GPUs can play Battlefield" ring pretty hollow. As such I hope AMD announces much more than just 0-10% raster performance differences. I'm glad we're only a couple of weeks away from knowing for sure.
I think AMD will have a DLSS alternative. If they don't, then that's dumb on their part.
 
I would take RTX and DLSS today even in a handful of games rather than get a card from AMD that misses these important features. I only play a game once and on day 1 if these features exist then I want them.
 
I think AMD will have a DLSS alternative. If they don't, then that's dumb on their part.

They might. What's the state of DLSS and patents? A quick search showed a Sony patent on a system to acquire the training images, which implies something like that will be used in the PS5.
 
Don’t they have Radeon Sharpening?
Yes, in general it's better than DLSS 1 but not as good as DLSS 2.0. Of course, this is generalized; some games/scenes do better or worse. It also works with all games instead of just a handful, so arguably you could consider it better overall if you include the games DLSS isn't supported in, which is a lot.
 
Yes, in general it's better than DLSS 1 but not as good as DLSS 2.0. Of course, this is generalized; some games/scenes do better or worse. It also works with all games instead of just a handful, so arguably you could consider it better overall if you include the games DLSS isn't supported in, which is a lot.
Wonder if Nvidia will have an all-in-one solution similar to Radeon Sharpening. I remember a few months ago there were murmurs that DLSS 3.0 would support any game that supports TAA, but nothing has popped up since then. Guessing if it's in the pipeline, then it's still many months away at least.
 
Wonder if Nvidia will have an all-in-one solution similar to Radeon Sharpening. I remember a few months ago there were murmurs that DLSS 3.0 would support any game that supports TAA, but nothing has popped up since then. Guessing if it's in the pipeline, then it's still many months away at least.
I mean, Radeon Image Sharpening is open source and works on Nvidia products as well ;). They said it could easily be added to any game supporting TAA, not that it would just magically work (at least that was my recollection), so it's still a far cry from being universal. AMD's solution needs none of the additional data that DLSS requires as inputs (hence the requirement to have TAA support, as the engine then stores more than just color info per pixel). If it's in the pipeline, then it's probably another generation or two away, unless they somehow intercept graphics pipeline calls and stuff in the required data via the drivers so it can do DLSS.
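To illustrate the difference in required inputs, here's a rough sketch (made-up function names and buffer types for illustration only, not AMD's or Nvidia's actual APIs): a sharpening filter only needs the finished image, while a DLSS-style temporal upscaler also needs the motion vectors, depth, and camera jitter that a TAA-ready engine already produces, which is exactly the data a driver can't easily fabricate on its own.

```cpp
#include <cstdint>
#include <vector>

// Illustrative CPU-side stand-ins; real engines pass GPU resources.
using ColorBuffer  = std::vector<uint32_t>; // packed RGBA, one entry per pixel
using MotionBuffer = std::vector<float>;    // per-pixel motion vectors (x,y pairs)
using DepthBuffer  = std::vector<float>;    // per-pixel depth

// A sharpening filter is a pure post-process: finished image in, crisper image out.
// Because it needs nothing but the color buffer, it can be applied to any game.
ColorBuffer sharpen(const ColorBuffer& finalColor, float /*sharpness*/) {
    return finalColor; // stub: a real filter would run a contrast-adaptive kernel here
}

// A DLSS-style temporal upscaler needs far more than color: motion vectors, depth,
// and the sub-pixel jitter for the frame. A TAA-capable engine already produces
// all of these, which is why TAA support is the usual prerequisite.
ColorBuffer temporalUpscale(const ColorBuffer& /*lowResColor*/,
                            const MotionBuffer& /*motionVectors*/,
                            const DepthBuffer& /*depth*/,
                            float /*jitterX*/, float /*jitterY*/,
                            uint32_t targetWidth, uint32_t targetHeight) {
    return ColorBuffer(static_cast<size_t>(targetWidth) * targetHeight); // stub output
}

int main() {
    ColorBuffer frame(1920u * 1080u);          // rendered 1080p image
    ColorBuffer crisp = sharpen(frame, 0.5f);  // sharpening: image in, image out
    MotionBuffer motion(1920u * 1080u * 2);    // extra data a TAA pipeline already has
    DepthBuffer depth(1920u * 1080u);
    ColorBuffer upscaled = temporalUpscale(crisp, motion, depth,
                                           0.25f, -0.25f, 3840u, 2160u);
    return upscaled.empty() ? 1 : 0;
}
```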
 
I would take RTX and DLSS today even in a handful of games rather than get a card from AMD that misses these important features. I only play a game once and on day 1 if these features exist then I want them.

Since we're talking hypotheticals: how much more raster performance would it take to pass on DLSS support? It's supported in 14 titles now, with support announced for another 8.
 
There are roughly 60 million gamers that play Warzone daily, it's pretty much all I play. I sold my 1440p 144hz monitor and switched to 280hz 1080p just for Warzone. I dialed in my memory timings just for Warzone. I do an hour of aim training (Kovaaks) every day, working through the Aimer7 training guide just for Warzone. I'm buying a 5600x just for Warzone. From my perspective, ALL I care about is what kind of framerate I can get in that game.

Nvidia has the following giant advantages in Warzone (And Fortnite):
Reflex support
Nvidia Freestyle Color/Brightness filters are pretty much required in order to see people while fighting inside buildings.
DLSS 2.0 is coming to Black Ops for sure but we don't know if Warzone will be getting that feature. If Warzone gets DLSS 2.0 then I pretty much can't buy an AMD card.

I've personally purchased over a dozen ATI graphics cards and I loved them. I consider the R9 290 the best performance/value graphics card ever made when you look at how it aged.

AMD needs to start tackling feature support. If Big Navi rasterization in Warzone blows the 3080 out of the water then I'd be tempted, but I literally cannot play the game without modifying the color filters with Freestyle. I've heard that I could use ReShade to do a similar color filter in Warzone as Nvidia Freestyle, but there is so little information out there on exactly what settings to use. On the other hand, every streamer and their dog has 10 videos about their favorite Freestyle color settings in Warzone.

Lisa's post that Warzone at 4K had 88 fps on Big Navi did get me interested, but that's still not quite matching the 3080. I've been waiting for a 3080 to be delivered for over a month and it's starting to drive me crazy.

EDIT: No one uses or gives a shit about raytracing in Warzone.
 
I would take RTX and DLSS today even in a handful of games rather than get a card from AMD that misses these important features. I only play a game once and on day 1 if these features exist then I want them.

And how many games so far have had these features on Day 1? Maybe Control (don't remember)? Every other game was a patch some months after the "Day 1" release. If you're playing on "Day 1," then RTX is practically useless.
 
And how many games so far have had these features on Day 1? Maybe Control (don't remember)? Every other game was a patch some months after the "Day 1" release. If you're playing on "Day 1," then RTX is practically useless.

Certainly not having them at all is 100% useless.
 
Certainly not having them at all is 100% useless.

I was being generous. In a Day 1 situation, history has shown it to be 99.9966% useless (based on the total number of Steam games available and assuming Day 1 RTX only for Control). 99.9879% if you just look at games released in 2019 AFTER RTX became a thing.

Statistically equivalent.
 
There are roughly 60 million gamers that play Warzone daily, it's pretty much all I play. I sold my 1440p 144hz monitor and switched to 280hz 1080p just for Warzone. I dialed in my memory timings just for Warzone. I do an hour of aim training (Kovaaks) every day, working through the Aimer7 training guide just for Warzone. I'm buying a 5600x just for Warzone. From my perspective, ALL I care about is what kind of framerate I can get in that game.

Nvidia has the following giant advantages in Warzone (And Fortnite):
Reflex support
Nvidia Freestyle Color/Brightness filters are pretty much required in order to see people while fighting inside buildings.
DLSS 2.0 is coming to Black Ops for sure but we don't know if Warzone will be getting that feature. If Warzone gets DLSS 2.0 then I pretty much can't buy an AMD card.

I've personally purchased over a dozen ATI graphics cards and I loved them. I consider the R9 290 the best performance/value graphics card ever made when you look at how it aged.

AMD needs to start tackling feature support. If Big Navi rasterization in Warzone blows the 3080 out of the water then I'd be tempted, but I literally cannot play the game without modifying the color filters with Freestyle. I've heard that I could use ReShade to do a similar color filter in Warzone as Nvidia Freestyle, but there is so little information out there on exactly what settings to use. On the other hand, every streamer and their dog has 10 videos about their favorite Freestyle color settings in Warzone.

Lisa's post that Warzone at 4K had 88 fps on Big Navi did get me interested, but that's still not quite matching the 3080. I've been waiting for a 3080 to be delivered for over a month and it's starting to drive me crazy.

EDIT: No one uses or gives a shit about raytracing in Warzone.
Always interesting for me to read about the 'pro' and 'competitive' play fascination with high refresh 24 inch 1080p. You guys have to be sitting leaned forward into the monitor and peering at pixels hoping for the slightest edge.

...

No criticism on your choice: I've been a happy 1080 (well, okay, 1200) gamer for years on my HP 24. But having crossed the 50yo threshold - I'm looking for a 4k 32 IPS and will be happy (thrilled even) with a 120 refresh rate. The larger size allows me to sit at my typical, comfortable 'work' distance with similar, if not better pixel pitch... and in games like Squad - seeing distant targets rendered a bit bigger will be a huge boost.

But not required, for COD, I'd guess, given their notoriously small maps.

...The thing is, however, that the newest cards aren't really needed for 1080p. Perhaps those thinking they can see the difference (and react to) something at 240 frames vs 120 frames may want the power to push small form factor that hard - but for the 'average' 24 inch user, the extra horses are wasted.

These newest cards really are just kicking open the door to larger format 4k and high frame 1440.
 
Always interesting for me to read about the 'pro' and 'competitive' play fascination with high refresh 24 inch 1080p. You guys have to be sitting leaned forward into the monitor and peering at pixels hoping for the slightest edge.

...

No criticism on your choice: I've been a happy 1080 (well, okay, 1200) gamer for years on my HP 24. But having crossed the 50yo threshold - I'm looking for a 4k 32 IPS and will be happy (thrilled even) with a 120 refresh rate. The larger size allows me to sit at my typical, comfortable 'work' distance with similar, if not better pixel pitch... and in games like Squad - seeing distant targets rendered a bit bigger will be a huge boost.

But not required, for COD, I'd guess, given their notoriously small maps.

...The thing is, however, that the newest cards aren't really needed for 1080p. Perhaps those thinking they can see the difference (and react to) something at 240 frames vs 120 frames may want the power to push small form factor that hard - but for the 'average' 24 inch user, the extra horses are wasted.

These newest cards really are just kicking open the door to larger format 4k and high frame 1440.

You've definitely never played warzone... No one plays COD anymore except to level the guns they need in Warzone. The Warzone map is gigantic, it's actually larger than the fortnite map but significantly smaller than the Pubg map.

I have a 27 inch 1080p 280hz monitor and I like the larger size vs the 24/25 inch. Squad is awesome by the way, I liked it a lot but it's missing that BR addiction. I'm 39 years old and my reflexes are slowing down but I'm doing everything I possibly can to keep it at bay. I cut pretty much all sugar, have a consistent yoga practice and I'm in great shape. Take all the right supplements to help retain quick reflexes. It's honestly a little insane how hard I am willing to work to be competitive in a video game. The thing is, there are a LOT of people like me, look at the sales numbers for the 3080. To get 200+ frames in Warzone you need a 2080 Ti or a 3080 and I want 280 fps if I can get it.

The input lag on your older 16:10 hp display would be a deal breaker for me.

Here is a wonderful video demonstrating the advantage of a 240hz monitor over lower refresh rates.
 
You've definitely never played warzone... No one plays COD anymore except to level the guns they need in Warzone. The Warzone map is gigantic, it's actually larger than the fortnite map but significantly smaller than the Pubg map.

I have a 27 inch 1080p 280hz monitor and I like the larger size vs the 24/25 inch. Squad is awesome by the way, I liked it a lot but it's missing that BR addiction. I'm 39 years old and my reflexes are slowing down but I'm doing everything I possibly can to keep it at bay. I cut pretty much all sugar, have a consistent yoga practice and I'm in great shape. Take all the right supplements to help retain quick reflexes. It's honestly a little insane how hard I am willing to work to be competitive in a video game. The thing is, there are a LOT of people like me, look at the sales numbers for the 3080. To get 200+ frames in Warzone you need a 2080 Ti or a 3080 and I want 280 fps if I can get it.

The input lag on your older 16:10 hp display would be a deal breaker for me.

Here is a wonderful video demonstrating the advantage of a 240hz monitor over lower refresh rates.


I play COD and don't have any issues getting into MP matches where people are playing the objective. Sure, every so often I'll run into a match where people are just trying to get kills or level up, but you make it sound like that's all anyone does. Sounds like you're living in a bubble and think everyone has the same intentions you do when playing anything outside of Warzone.
 