If Navi matches 3080 in Raster performance but not Ray Tracing... will we care?

I was being generous. In a Day 1 situation, history has shown it to be 99.9966% useless (based on the total number of Steam games available and assuming Day 1 RTX only for Control). 99.9879% if you just look at games released in 2019 AFTER RTX became a thing.

Statistically equivalent.
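For anyone curious how numbers like that fall out, here is a rough reconstruction of the arithmetic. The game counts below are my own guesses at the figures being used, not numbers from the post:

```python
# Back-of-envelope reconstruction of the quoted percentages.
# Both catalogue sizes are assumptions, not figures from the original post.
total_steam_games = 29_400   # rough Steam catalogue size at the time (assumed)
steam_games_2019 = 8_264     # rough count of 2019 Steam releases (assumed)
day1_rtx_games = 1           # only Control counted as shipping with Day 1 RTX

print(f"{1 - day1_rtx_games / total_steam_games:.4%} of all Steam games")  # ~99.9966%
print(f"{1 - day1_rtx_games / steam_games_2019:.4%} of 2019 releases")     # ~99.9879%
```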

You’re being generous by assuming most people play games on day 1? Hmmm. I don’t know what the stats are but I certainly don’t fall in that category.
 
I play COD and don’t have any issues getting into MP matches where people are playing the objective. Sure, every so often I’ll run into a match where people are just trying to get kills or level up, but you make it sound like that’s all anyone does. Sounds like you’re living in a bubble and think everyone has the same intentions you do when playing anything outside of Warzone.

4 days ago, when the new shotgun came out, every single MP match I joined was a revolving door of people getting three kills and then leaving so they could unlock it. That lasted for an hour, and then everyone was leveling the shotgun in Deathmatch. That lasted for a few hours, until every streamer, YouTuber, and their dog was posting videos about how overpowered the new shotgun (JAK) is in Warzone. The next day MP was filled with Warzone players copying the builds from the streamers.

I understand that there are still some people who play COD MP, but Warzone is the most popular game in the world right now and has become the de facto COD going forward.
 
I would take RTX and DLSS today, even in a handful of games, rather than get a card from AMD that misses these important features. I only play a game once, and on day 1, if these features exist then I want them.

You’re being generous by assuming most people play games on day 1? Hmmm. I don’t know what the stats are but I certainly don’t fall in that category.

In context, I was replying to this guy. I agree I generally don't play day one but some people do. Even so we both know that RTX features are far from universal, and the adoption rate has been pretty woeful.
 
Pretty sure my post is being misread. I mentioned that when I am playing the game, then on day 1 of my playthrough, if these features exist then I want them. I never claimed that all my playthroughs of games are on day 1 of release.
 
Pretty sure my post is being misread. I mentioned that when I am playing the game, then on day 1 of my playthrough, if these features exist then I want them. I never claimed that all my playthroughs of games are on day 1 of release.

Fair enough, but "Day 1" generally has a certain connotation to it.
 
Since we're talking hypotheticals: how much more raster performance would it take to pass on DLSS support? It's supported in 14 titles now, with support announced for another 8.
How many of those titles are DLSS 2.0 and are an improvement over just using a rescale filter? That number drops from 14 to like 2? Maybe 3? Just having support doesn't make it better. It being better makes it better ;). Anyway, this thread is about RT, but similar logic applies: how many games are better with RT, and to what extent? Seeing as this is highly subjective there isn't a clear answer, which is why you get so many different opinions. Some think a 20-30% performance drop isn't worth having it on when, in motion, it's hard to notice the difference. Others think the highest fidelity is more important to them. Neither is right or wrong, it's subjective (as are lots of things dealing with image quality).
I don't care much for RT if the game has a decent lighting implementation... If it has a crappy lighting implementation and you turn on RT for ambient occlusion it makes a good difference. If it has a good AO implementation already and RT is just used for more puddles... I couldn't care less.
 
If all you play are competitive online games then ray tracing will literally never matter. RTSS and Reflex are vital, but nonsense like G-Sync, FreeSync, and ray tracing are just increases in input lag and frame rate losses.
 
If all you play are competitive online games then ray tracing will literally never matter. RTSS and Reflex are vital, but nonsense like G-Sync, FreeSync, and ray tracing are just increases in input lag and frame rate losses.

On the flip side, if all you play are non-competitive single player games then RT and VRR would be worth the increase in input lag, etc. I can't remember the last time I played an online shooter. The next game I play will be AC: Valhalla.
 
No, I don't think people will care. RT is just a gimmick which isn't ready for prime time. Just my opinion, but as of right now I don't think it's useful in games.

Give it a good 2-3 generations and then it will be, imo.
 
You've definitely never played Warzone... No one plays COD anymore except to level the guns they need in Warzone. The Warzone map is gigantic; it's actually larger than the Fortnite map but significantly smaller than the PUBG map.

I have a 27-inch 1080p 280Hz monitor and I like the larger size vs the 24/25-inch. Squad is awesome by the way; I liked it a lot but it's missing that BR addiction. I'm 39 years old and my reflexes are slowing down, but I'm doing everything I possibly can to keep it at bay. I cut pretty much all sugar, have a consistent yoga practice, and I'm in great shape. I take all the right supplements to help retain quick reflexes. It's honestly a little insane how hard I am willing to work to be competitive in a video game. The thing is, there are a LOT of people like me; look at the sales numbers for the 3080. To get 200+ frames in Warzone you need a 2080 Ti or a 3080, and I want 280 fps if I can get it.

The input lag on your older 16:10 HP display would be a deal breaker for me.

Here is a wonderful video demonstrating the advantage of a 240Hz monitor over lower refresh rates.

That was an interesting video. From an 'educate the consumer' standpoint - it was very poorly edited... But the gist came through.

Bottom line / tl;dr:
- As a 60Hz gamer on an old card, you are at a disadvantage. Not really a bombshell report.
- A 60Hz gamer with a good card pumping out high frames is still on a playable system, where personal skill and ability can compensate for the tech. Again, not a bombshell; it's literally the mantra from before higher-Hz monitors came out.
-- Thus far we are still in the 'what the OG gamers have always said is true' category. Moving on to the 'new tech is bypassing yew, gramps' territory:
- A 144Hz monitor and high frames seem to be the best target for most people's upgrades. Players get significant and measurable improvements in response time over 60Hz gaming, both in the hardware and the wetware.
- 240Hz provides diminishing returns over 144: still far better than 60, but only marginally better than 144 for competitive play, at a higher cost (the quick frame-time math below illustrates why).

Needless to say - thanks for the link (cemented my desire to get a 120(+) Hz 32-inch 4K IPS).
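To put rough numbers on that diminishing-returns point, here is a quick frame-time comparison. This is my own back-of-envelope illustration, not figures from the video:

```python
# Frame time in milliseconds at each refresh rate; the gap shrinks as Hz climbs.
for hz in (60, 144, 240):
    print(f"{hz:3d} Hz -> {1000 / hz:.1f} ms per refresh")

# 60 Hz  -> 16.7 ms
# 144 Hz ->  6.9 ms
# 240 Hz ->  4.2 ms
# Going 60 -> 144 shaves ~9.7 ms off every refresh; 144 -> 240 shaves only ~2.8 ms more.
```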
 
... The Warzone map is gigantic; it's actually larger than the Fortnite map but significantly smaller than the PUBG map.

I...
Oh - and I'm glad you specified that the WZ map was PUBG-sized - I was all ready with a quip about the 'largest COD map evah' (720m^2)
 
While RTX ray tracing runs through DXR, game devs will still need to do some programming/optimizing for AMD. Hopefully AMD does a good job on the toolkit and it's similar to Nvidia's so devs actually use it.
This could be naive on my part, but my understanding is that since the Big Navi architecture is being used for both the upcoming AMD cards and the next-gen consoles, for multi-platform games development focus will first be on consoles (AMD hardware), and it should translate to PC nicely. Nvidia tech becomes the afterthought when your primary market uses AMD (console users).
I highly doubt that RT was not a primary concern for Sony and MS for their next-gen systems, but I could be wrong again. AMD likely has a few tricks up their sleeves other than better raster performance at lower cost and power consumption.

If not though, 3080 here I come.
 
This could be naive on my part, but my understanding is that since the Big Navi architecture is being used for both the upcoming AMD cards and the next-gen consoles, for multi-platform games development focus will first be on consoles (AMD hardware), and it should translate to PC nicely. Nvidia tech becomes the afterthought when your primary market uses AMD (console users).
I highly doubt that RT was not a primary concern for Sony and MS for their next-gen systems, but I could be wrong again. AMD likely has a few tricks up their sleeves other than better raster performance at lower cost and power consumption.

If not though, 3080 here I come.

We’ve heard that for years (decades?) and it’s never been close to true. I’d put zero weight on that in my purchase decision.
 
We’ve heard that for years (decades?) and it’s never been close to true. I’d put zero weight on that in my purchase decision.
Sorry, I am not following - are you saying that there has been console hardware in the past, using chips nearly identical in design to the PC counterparts, and there were driver issues?

Don't get me wrong, I am still waiting to see results before I make a purchase decision, but if the next-gen console chips are, for the first time, nearly identical to what is on the PC side, I imagine that would be a point of serious consideration when comparing driver compatibility and developer focus.
AMD has been used for nearly 2 decades in console hardware, but I don't think their chips between consoles and PC were ever so aligned as this time around.
Of course I don't know for sure; this is all assumption, but I do not feel my reasoning is too far off?
 
Sorry, I am not following - are you saying that there has been console hardware in the past, using chips nearly identical in design to the PC counterparts, and there were driver issues?
Xbox has been running AMD/ATI hardware for at least 15 years, and both consoles have since 2013 I think, and it does not feel like that translated into a big advantage to owning AMD hardware on PC. I think there is some truth that it makes it safe to choose an AMD video card and that AAA ports should run on them, but not necessarily that they will run any better than on an NVIDIA GPU, especially if NVIDIA keeps the lion's share of the PC market.

And if there is a transition toward Vulkan, from what I understand the differences/drivers could become less and less of an issue.
 
Xbox has been running AMD/ATI hardware for at least 15 years, and both consoles have since 2013 I think, and it does not feel like that translated into a big advantage to owning AMD hardware on PC. I think there is some truth that it makes it safe to choose an AMD video card and that AAA ports should run on them, but not necessarily that they will run any better than on an NVIDIA GPU, especially if NVIDIA keeps the lion's share of the PC market.

And if there is a transition toward Vulkan, from what I understand the differences/drivers could become less and less of an issue.
Well, here's what I think is different: they may have used AMD components, but were the architectures and hardware environments as close to 1:1 as the new consoles and cards will be?
It's my impression that they were not (but again, I am not sure; that's why I am sharing my thoughts from a standpoint of ambiguity).
For example, the GameCube and PS3/PS4 systems did not really resemble PC systems, so why would development on those systems translate as easily? Again, I likely am not seeing the full picture; I am just under the assumption that console hardware now resembles PC hardware more than ever, by a substantial margin over previous generations, and that this should translate to overlap in development/compatibility.
This is my reasoning, but I am sure there is a lot I am incorrectly assuming. I would like to know where I am wrong, rather than simply that I am wrong.
 
I think the general consensus with the consoles is that the APIs are different enough and the hardware is custom enough that once you port to PC and its APIs... AMD doesn't necessarily always enjoy an advantage. But that's also assuming that the PC version of a game is actually a port of the console code. PC versions are often developed alongside. I'd say the only clear porting situations are something like Horizon Zero Dawn, which was a single-system exclusive game and the PC version was never planned.
 
Well, here's what I think is different: they may have used AMD components, but were the architectures and hardware environments as close to 1:1 as the new consoles and cards will be?
It's my impression that they were not (but again, I am not sure; that's why I am sharing my thoughts from a standpoint of ambiguity).
For example, the GameCube and PS3/PS4 systems did not really resemble PC systems, so why would development on those systems translate as easily? Again, I likely am not seeing the full picture; I am just under the assumption that console hardware now resembles PC hardware more than ever, by a substantial margin over previous generations, and that this should translate to overlap in development/compatibility.
This is my reasoning, but I am sure there is a lot I am incorrectly assuming. I would like to know where I am wrong, rather than simply that I am wrong.

PS4/XB1 are using an AMD APU and AMD has not gained any advantage over Nvidia due to it. DX12/Vulkan closed the gap some because AMD can't manage to write an efficient driver. Windows 8 didn't save Bulldozer, PS4 didn't magically catapult AMD GPUs, and neither will PS5. If AMD wants to beat Nvidia they'll need to do it by having a better product.
 
We’ve heard that for years (decades?) and it’s never been close to true. I’d put zero weight of that in my purchase decision.
There’s a sizable difference this time though: in previous years, the software was completely different on the consoles and so the hardware was used differently. DXR runs on fixed hardware built for that purpose, so those accelerators should be very similar if not equal on consoles and PC. For once I would actually expect the DXR code to be quite similar, if not with the PS5, certainly with the Xbox, which also uses DX12.
 
There’s a sizable difference this time though: in previous years, the software was completely different on the consoles and so the hardware was used differently. DXR runs on fixed hardware built for that purpose, so those accelerators should be very similar if not equal on consoles and PC. For once I would actually expect the DXR code to be quite similar, if not with the PS5, certainly with the Xbox, which also uses DX12.

I am kind of like a bank loan when it comes to new features. I don’t care if you might make 100k in a year; come back when you can prove it.

So far AMD has shown us jack shit for ray tracing besides a terrible demo that I wish I hadn’t seen. Even if what you wrote is true, I have little faith the consoles have nearly enough power to properly ray trace games. If I am wrong, by the time it matters we’ll be on the next gen of cards.

I don’t see any scenario where I would factor it in, especially given AMD’s terrible driver history trying to do just rasterization.
 
If AMD raster = Nvidia, if power draw is roughly the same, if price is within $50 of Nvidia, and if RT is better with Nvidia, I think the majority of buyers will go Nvidia. Reviews will show RT performance in games and that will be a determining factor. People are swayed by benchmarks. +/- $50 won't make much difference to buyers in the $600+ range.
 
I would suspect the first generation of RT games will run like crap on AMD hardware unless updated to DXR 1.1. So RT performance, from my perspective, will not be very clear on real potential. If some new Xbox Series X and/or PS5 titles make it to the PC quickly using RT then we should know more. Not sure how AMD is going to handle RT initially; will it actually be turned on or delayed? For the most part, RT for me would be more important not in games but in rendering programs such as ProRender.

Raster performance, followed by features and price, reigns supreme is my take.
 
Reviews will show RT performance in games and that will be a determining factor. People are swayed by benchmarks. +/- $50 won't make much difference to buyers in the $600+ range.
Reviewers did not even judge the 3080/3090 on RT performance for the most part. In almost every conclusion I read/listened to, raster was the basis of the overall verdict. That said, I was very much wondering if we were going to see something otherwise this release cycle.
 
What if AMD beats Ampere in raster and matches Turing in RT?

https://www.igorslab.de/en/3dmark-in-ultra-hd-benchmarks-the-rx-6800xt-without-and-with-raytracing/

It's 3dmark but better than nothing I suppose.

 
What if AMD beats Ampere in raster and matches Turing in RT?

https://www.igorslab.de/en/3dmark-in-ultra-hd-benchmarks-the-rx-6800xt-without-and-with-raytracing/

It's 3dmark but better than nothing I suppose.


Based on that I'm sure that everyone will buy AMD.

There's been a loud contingent the last two years explaining how RT is a necessity, and that even the mid-range Nvidia cards were a better buy than similarly priced, but better performing in raster, AMD competitors. Based solely on their ability to do RT.

If the 6800xt has better RT than a 2080ti, and better raster than a 3080, sounds like a buy.
 
Based on that I'm sure that everyone will buy AMD.

There's been a loud contingent the last two years explaining how RT is a necessity, and that even the mid-range Nvidia cards were a better buy than similarly priced, but better performing in raster, AMD competitors. Based solely on their ability to do RT.

If the 6800xt has better RT than a 2080ti, and better raster than a 3080, sounds like a buy.

Yes, now that both players are in the game, hopefully the silly arguments about whether RT is awesome/useless can die.
 
If that's accurate, I'm definitely picking up a 6900XT instead of Nvidia's offerings. How close is that 6800XT to the 3090? Appears to be danger close to me.
I guess that depends on the benchmark. In Firestrike Extreme 4K, AMD beats the 3080 by almost 20%... The 3090 was around 12-13% faster than the 3080 in Ultra; not sure how different that would have been with Extreme... so according to this chart, if it's to be believed, the 6800XT should beat the 3090 in Firestrike Extreme 4K.

In Timespy 4K Ultra, the 3080 and 3090 are basically within margin of error, so this points to the 6800XT being at least on par, if not slightly ahead, here (again, if the chart is correct and things haven't changed).

In Port Royal 4K the 3090 is about 15% faster than a 3080, while the 3080 is 22% faster than the 6800XT... so you're looking at a good ~35% margin/difference between the 6800XT and 3090 with RT enabled in this benchmark. That all said, we're still going by lots of rumors and it's not a direct comparison, so large quantities of salt. But also, this is a 6800XT, not a 6900XT, so expect the 6900XT to do slightly better in general. Also, these are just benchmarks, not real games at real settings in real situations, but it still does look promising. If the 6800XT can beat the 3090 in raster and costs the same as a 3080... but has slightly less RT performance, at what point do you consider the cost you are paying for extra RT when it can be used vs. how much raster you are giving up or how much extra you pay for similar/less raster?
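For what it's worth, here is how those two quoted Port Royal margins combine if each one is measured against the slower card in its pair; multiplied out they land closer to ~40% than ~35%, though it depends on what each percentage was taken relative to. This is just my own back-of-envelope check:

```python
# Compounding the quoted Port Royal margins (my own arithmetic, assuming each
# percentage is expressed relative to the slower card in that comparison).
margin_3090_over_3080 = 0.15    # "3090 is about 15% faster than a 3080"
margin_3080_over_6800xt = 0.22  # "3080 is 22% faster than the 6800XT"

combined = (1 + margin_3090_over_3080) * (1 + margin_3080_over_6800xt) - 1
print(f"Implied 3090 lead over 6800XT: ~{combined:.0%}")  # ~40%
```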
 
If Igor's numbers are accurate, Nvidia is toast, lol.
I mean, it's a benchmark, so while it (and other leaks) appears promising, I wouldn't put too much stock into it just yet, until we get some verified benchmarks in real games and workloads. I just hope that Blender/AMD supports RT properly on AMD cards... then I can use it for the few times I do Blender work (not a huge deal, but a speedup in this would be a nice-to-have), and I have all that raster performance for when I want to game. I'll probably wait a little while for all the dust to settle, but I'm excited for the first time in a while for competition.
 
The thing about synthetic benchmarks is that they are very easy to optimize for. Not saying that these aren't indicative of AMD producing really good cards, but I'm going to wait for real-world gaming performance to see where things are before getting too hyped. If these benchmarks carry over to real-world gaming results then I'd strongly consider moving to a 6900XT if it's priced well.
 
All of this optimism, even if the performance numbers are true, is betting that AMD will price competitively with (or lower than) the Nvidia 3000 series.

I hope so.

Just sayin': just because the cards seem to be very good, that does not mean the PRICE will be good.
 
All of this optimism, even if the performance numbers are true, is betting that AMD will price competitively with (or lower than) the Nvidia 3000 series.

I hope so.

Just sayin': just because the cards seem to be very good, that does not mean the PRICE will be good.
If AMD's GPUs match or exceed Nvidia's, I fully anticipate the prices will reflect the performance, just like they did with Zen 3. Unless AMD wants to re-enter the high-end GPU market with a bang and shove a fist up Nvidia's ass by providing better performance at lower cost.
 
That's not very good for business. If I pay Nvidia max wallet, why won't I pay AMD if the performance is there?

I think you are twisting my words by implying I think AMD should offer their GPUs at lower costs if they perform better. Which I'm not. I'm just saying that is a strategy that AMD could take if they wanted to completely steal the show this generation. At the beginning of that post I said I anticipate AMD to price their cards according to performance and what their competitor provides for the price at that performance tier.
 
Why such a big discrepancy between the two 3DMark benchmarks? Which one is closer to real-world?
 