If Navi matches 3080 in Raster performance but not Ray Tracing... will we care?

DarkSideA8 (Gawd, joined Apr 13, 2005, 988 messages)
Given that ray tracing is the newest 'shiny' that everyone wants, but that few games actually do well at the moment, is ray tracing likely to be the 'killer app' feature that Nvidia wants us to believe it is?

https://www.tomsguide.com/uk/news/amd-big-navi-benchmarks-revealed-and-its-good-news-for-rtx-3080

The above link suggests that the 3080 will beat Navi when RT is enabled... but in some titles, without RT enabled, Navi wins.

So the question is: is RT at this point a 'need to have' feature, or a 'nice to have' feature?

And does the 'futureproofing' potential of the 3080, for when new games come out with good RT (and not just shine pasted on for the sake of shine), argue in favor of waiting for the 3080 come November, when its stock still isn't available but Navi's is?
 
I think it's a damn good start. Being back to trading blows with Nvidia again is huge, even if raytracing isn't taken into account. And it WILL be a major win for the consumer, forcing Nvidia to compete in price.

Personally I like turning on the bells and whistles to see how good the new shinies look, and with G-Sync (or FreeSync), maintaining a 60 FPS minimum isn't absolutely required for smooth gameplay.
 
I would say we do not know enough about AMD's ray tracing performance, nor whether developers can use it effectively. I don't think the older games using DXR 1.0 will be able to fully exploit AMD's ray tracing.
 
In my opinion, Ampere was an architectural failure for the supposed $2 billion+ spent on it. For example, my MSI Gaming Trio 2080 Ti trails the 3080 by only 7-17 fps in 4K gaming, which is what I need gaming on either my faux-4K Epson 4000 or the new true-4K JVC NX7 at a 100" screen (4K was meant for the big screens). With that said, folks, my 2080 Ti draws 250 W vs. the 3080's 350 W, and the 3080 has twice the CUDA cores! Ampere is creeping up to Turing pricing too; by the time the 20 GB 3080 gets here you will be close to or at the $999 price point.
This again is just my opinion.
3080 vs. my 2080 Ti: 7-17 fps more at 4K, twice the CUDA cores but not twice the performance, 100+ W more power draw, and both it and the 3090 are space hogs. Unless they release a super-duper performance-enhancing driver at Navi's launch, it is a fail in my eyes for Nvidia. My opinion, not stated as fact, other than that yeah, it isn't that much faster than my 2080 Ti in the much-needed 4K space.

Now let's say the Radeon 6900 XT drops, performs at the same level as the 3080 or just trades blows with it, and is 100 W lighter at the 250 W mark. Navi wins. They also have TSMC producing their cards on a proven node, so cards should be plentiful at launch, unlike on Samsung's low-yield 8 nm. Nvidia pulled an Intel. At $649? YA BURNT.
If the last presentation showed anything, it's that Lisa likes to leave the best for last. So after showing off the 6700 to 6900 XT cards, we now introduce the 6900/50 XTX: well ahead of the 3080, below a 3090, at $799. That is the best-case scenario that could happen for all of us, finally. I really feel the 6900 XT will go head to head with the 3080, be available in better quantity because of TSMC, and be efficient.
 
Nope, not one bit - at least not for the next few years. If you were to put a dollar price on Raytracing -- for example, if card 1 had raytracing and card 2 had no ability to do raytracing at all -- I might pay... $20-30 more. That may change a few years down the line when there are more than 2-3 games that support RT in any meaningful way.
 
I bought a 5700 for its price/performance at 1440p. Paid just under $300, and couldn't be happier with it. Without paying significantly more, I'm not going to be able to use ray tracing. IMO the biggest issue with RT is the limitations on mid-range cards. Nvidia could fix this with the 3070 and 3060, but I doubt they will.
 
As a 3D artist, I would love to see AMD's RT implementation get path-tracing renderer integration in apps like Blender. Nvidia's OptiX renderer makes even the fastest CPUs look like toys when you can fit the rendered scene in VRAM, but 10 GB is not a lot of room for a complex scene. So if AMD has a card with OptiX-like performance (even half the render speed) but tons of RAM (32 GB would be amazing), plus solid 4K 100+ Hz performance, it would be a no-brainer for me. As it stands, the 3090 looks to be my next card, but I'll take a performance hit if the RAM or price is right. Preferably both.
 
I’d personally care. But it depends just how much weaker the RT performance would be and how strong the rasterization performance is to see if I’d care enough.
 
Ray tracing is in this year's biggest upcoming game. We don't know how "good" it is, but that's a big reason for people to want it.


At this time, even if AMD has similar raster performance, you're going to get more with Nvidia, which makes it worth the little bit of extra cash IMO:

DLSS
A working video encoder (maybe after six years of being broken, AMD will have it fixed for Navi)
NV Broadcast (check it out if you haven't; it's actually pretty amazing with what it can do)
Compatibility with both Gsync and Freesync displays
 
It never mattered what Nvidia was doing with GTX or RTX. Navi was the first 7 nm card with 8 GB of GDDR6 and it made its own market, but the negative driver/hardware reviews never helped get the ball rolling. My card has been great since Aug 2019.
 
Given that I have a hard time telling whether ray tracing is turned on or not without comparison stills in most of the "showcase" games like Control, I won't really care either way.
I'd agree if you said that about Tomb Raider or Metro, but the RT effects in Control are extremely noticeable and look fantastic.
 
I care more about DLSS than about RTX itself. More and more developers are going to start using it.

Same for me. I've had an RTX card for a while now and just haven't bothered to play any RTX titles. I'll get around to it once I buy Control, probably.
 
If performance is the same but price is much cheaper then I'm fine.

If performance AND price are the same but AMD doesn't have RTX, then nope. I'll go team green.

I also have slightly more faith in nVidia’s drivers.

For me, take the $750 3080. If performance is equal, AMD would have to be ~$675. If its RT is half as fast, it'd need to be $500. If no RT, then team green.
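The discount rule of thumb above is easy to write out explicitly. A minimal sketch of that arithmetic, where the price, thresholds, and percentages are just my reading of the post, not any actual AMD or Nvidia pricing:

```python
# Sketch of the "how much would AMD have to undercut Nvidia" rule of thumb.
# All figures are illustrative assumptions taken from the post above.
NVIDIA_PRICE = 750  # the 3080 price point used in the post

def required_amd_price(rt_ratio):
    """Price an equal-raster AMD card would need, given its RT performance
    as a fraction of Nvidia's (1.0 = equal, 0.5 = half, 0.0 = none)."""
    if rt_ratio >= 1.0:
        return NVIDIA_PRICE * 0.90     # equal RT: ~10% discount, ~$675
    if rt_ratio >= 0.5:
        return NVIDIA_PRICE * (2 / 3)  # half RT: ~a third off, ~$500
    return None  # no RT at all: team green regardless of price

print(required_amd_price(1.0))  # ~675
print(required_amd_price(0.5))  # ~500
```

The thresholds are obviously personal taste; the point is only that "RT support" has an implicit dollar value for each buyer.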
 
I'm with Dayaks. I'm not going to pay more for fan-boying purposes. AMD has to earn my business, whether that's through performance or cost-effectiveness. That said, I would certainly like to have an all-AMD box if possible, and not giving money to Nvidia would give me some measure of satisfaction.

Heck the only reason I got a 2080ti was because a bro pretty much gave it to me. In my last build, I went from Titan Xp (sli) to a single Radeon VII before switching to the 2080ti.
 
I'm still confused; either that or nobody bothered to read the article in your link. All it says is that ray tracing wasn't on and that the 3080 may still beat AMD... This gives zero indication of any real difference at all. I would speculate that AMD will be behind since it's their first try, but nobody knows by how much.

"With that in mind, there’s a good chance that the RTX 3080 would trounce Big Navi if ray-tracing was enabled."
The "with that in mind" part was based simply on AMD having it disabled in their graph. It could simply be that they are still ironing out the drivers, or that they wanted to hold something back for the release, or that it's lacking. This is just an article with a title that doesn't match its content, like all the other clickbait going around. I'll wait for benchmarks and pricing before I decide to spend my money, not go on some clickbait that adds zero information to what is already known.
 
Well, technically I don't think it's AMD's first try... RT can be done on Navi, albeit in software mode... and I think in preparing Navi 2 for the consoles they have a pretty good idea of its performance, as well as time to tweak it. But this whole thing does remind me of the DX11 58xx series, when AMD first came out with hardware tessellation and the performance was terrible since it was their first go at it.
 
Yar, RT isn't that important to me. I'll admit that Control's ray tracing looked great (even on my 2080S), but it's not something that would push me to one card over the other.

Price/performance ratio and acoustics are probably the biggest factors for me.
 
I mean, just because something is done in software doesn't make it not a first try for their hardware... And the consoles use the same architecture, so they are included in the first try. I'm not saying AMD doesn't know the performance; they are well aware by now. I'm saying clickbait article "authors" don't.
 
I believe that AMD will have great RT performance. While we have no true proof except a snippet of BL3 during the Lisa Su reveal, we do have PS5 and Xbox Series X release footage of games with RT on, and they look way above 60 fps, supposedly at 4K as well. Since those are consoles, I would expect the performance to be above that for the discrete cards we will be able to buy. Right now, supply is also in their favor: using a refined node is key to getting more made. As with the bump in prices of the CPUs they announced, I expect a sizable bump in GPU prices if they prove the cards are better than their console variants. I currently have a 2070 Super and it easily OCs to just a hair under 2080 Super speeds (+1113 RAM, +112 MHz GPU). I was considering the RX 5700 XT, but with all the driver issues and even some compatibility issues with games I play, I decided against it. I was tired of having to mess with a driver or adjust things on the card just to make it work. I wanted something that would plug and play, and Nvidia provided that experience for me. I hope AMD will learn and fix this ongoing driver issue with their cards, but it has been a couple of decades since that has happened.
 

I have difficulty holding out hope for AMD and RT given their history with non-RT drivers, as said in your post, and their history in general with trying to implement features.

On the flip side, the RT I generally care about (global illumination) has minimal impact on performance.

Half the time RT makes a game look worse at this point, so I totally understand people who don't factor it in.
 
Depends on what people play. If I was playing Minecraft a lot, I would care.


THIS, and yes it does matter to me at this point. 2077/Legion/Control/Minecraft/Ghostrunner (all games I play or will play) and the software ecosystem Nvidia offers outclass AMD hands down right now. It's not just DXR performance that is important to see on the 28th, but denoising as well; I am curious how AMD will handle that, and their DLSS/Tensor-core alternative is another thing they must have a very strong answer for. Go watch EposVox's 58-minute review of Ampere, folks, and his other Ampere content from the last several weeks. Not needing to buy a $$ green screen, RTX Voice, etc. AMD's abandonment of the streamer community is not a good look. I remember reading [H] reviews and seeing how important it was to have hardware that could turn on all the eye candy; it still is to many of us.
 

Lots wrong with that article. First, what AMD showed is NOT the highest-end card; in fact, it might not even be the second-biggest SKU. Second, I think people will go in droves with slightly worse ray tracing in exchange for lower cost, better rasterization, more RAM, and lower power. Not everyone, as some will never leave Nvidia, but a huge chunk.
 

Yeah, honestly DLSS is such a stupidly huge advantage in any game that supports it - to which AMD seems to have no true answer yet.
 
In my opinion, Ampere was an architectural failure

I imagine the 3050/3060/3070 will be a big part of judging that; everything else can be quite niche, if this is somewhat representative:
https://store.steampowered.com/hwsurvey/videocard/

Even today, among the 1000- and 2000-series Nvidia cards, the 80-class and above are only around 8% of users.

If they managed to more than double the performance per dollar, with a slightly smaller TDP, going from the 2080 Ti to a 3070...

This is probably a gross approximation:
https://www.gpucheck.com/compare-mu...0k-3-70ghz-vs-intel-core-i7-8700k-3-70ghz/low

But the cost-per-frame difference from the 2080 Ti to the 3070/3080 is really big.

It could be a failure high end to high end for the pure gaming scenario; we will see with mature cards/drivers/games in 1-2 years too. In games where driver optimization should not be an issue, like id Software's Vulkan titles (https://www.thefpsreview.com/2020/09/16/nvidia-geforce-rtx-3080-founders-edition-review/6/), the difference in 4K between a 3080 and a 2080 Ti is massive, near 50% with RT on; that could stay a niche case or be representative of what 2022 titles will look like. Meanwhile it can still be a success in the segment where almost everyone buys their cards.

As for the OP's question: not until screenshots/videos of ray tracing need an "RTX on/off" label for us to know whether it is on. Video games have gotten really good over the decades at faking light. It will need a killer app, a game that not only looks quite a bit better with it but is also really popular; 2077 is an obvious candidate, but as I imagine it was made very much with PS5/Xbox in mind for its ray tracing, I could see it running well on AMD hardware.
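The cost-per-frame comparison above is easy to make concrete. A minimal sketch, where every price and FPS figure is a hypothetical placeholder rather than a measured benchmark; plug in real review numbers to get a meaningful ranking:

```python
# Cost per frame = card price / average FPS at a chosen resolution.
# All prices and FPS values below are hypothetical placeholders,
# NOT benchmark results.
def cost_per_frame(price_usd, avg_fps):
    """Dollars paid per frame of average performance."""
    return price_usd / avg_fps

cards = {
    "2080 Ti": (1199, 60),  # (price in USD, assumed avg 4K FPS)
    "3070":    (499, 62),
    "3080":    (699, 80),
}

for name, (price, fps) in cards.items():
    print(f"{name}: ${cost_per_frame(price, fps):.2f} per frame")
```

Even with made-up inputs like these, the structure of the argument is visible: a cheaper card at similar FPS dominates on this metric regardless of which tier it sits in.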
 
People keep bringing up Cyberpunk 2077, a game that hasn't launched yet, with unknown launch performance and final feature set. In a preview of RT on a 2080 Ti with DLSS 2, without reflections, at 1080p (internal rendering as low as 540p), it would drop to the low 30s. The 3080 probably has at best 2x the RT performance.
https://wccftech.com/cyberpunk-2077...dlss-2-0-enabled-on-an-rtx-2080ti-powered-pc/

Much can happen (or not) in performance for this particular game; I see no guarantee that RT would make the defining difference in it. For the lower-end SKUs of Ampere, the 3060 or 3070, RT is probably a moot point or limited; DLSS, on the other hand, looks promising, with some rendering issues at times. It seems like the same reasoning from Turing is repeating, as in "future games will demand good RT for good gameplay." In my opinion, it is the good gameplay and interesting story of Control that make the game fun to play, not extra-pretty effects. I would appreciate those effects, but not at a sacrifice to overall IQ and performance; lowering resolution, performance, etc., can have an overall negative effect as well. I also agree that in some games turning on RT degrades IQ, with very unrealistic reflections that are in your face and just don't look real anyway. Sometimes moderation is a better step forward.
 
The idea that a DLSS competitor is a must-have at this point is ridiculous. The same goes for other niche features like RTX Voice and the green-screen tech.

If you're playing one of the 14 titles that support DLSS, then it's a nice feature; likewise, if you're a streamer, the noise/background reduction is nice to have.

But let's be honest: 99% of the people holding up those features as must-haves either won't use them or aren't playing a title that supports them.

In addition, for every title that currently supports DLSS, there's one where support was announced and never materialized. Maybe in a generation or two, with much wider support, but basing your purchase on such a limited selection, especially given the limited uptake after two years of Turing, seems foolish.

Released Games w/ DLSS:
Fortnite
Death Stranding
F1 2020
Final Fantasy XV
Anthem
Battlefield V
Monster Hunter: World
Shadow of the Tomb Raider
Metro Exodus
Control
Deliver Us The Moon
Wolfenstein Youngblood
Bright Memory
Mechwarrior V: Mercenaries

Released games w/out DLSS:

We Happy Few
Kinetik
Outpost Zero
PlayerUnknown’s Battlegrounds
Remnant from the Ashes
Scum
Darksiders III
Dauntless
Fear the Wolves
Fractured Lands
Hellblade: Senua’s Sacrifice
Hitman 2
Islands of Nyne: Battle Royale
ARK: Survival Evolved
Justice
JX3
Amid Evil

Announced Games

Cyberpunk 2077
Call Of Duty: Black Ops Cold War
Watch Dogs: Legion
Atomic Heart
Vampire: The Masquerade – Bloodlines 2
Boundary
Serious Sam 4: Planet Badass
Stormdivers

Canceled Games

The Forge Arena
 
If the Big Navi matches the 3080 in raster performance I will care, full stop, no further qualification or caveat needed. I won't buy one, but it'll sure as heck get my attention, because it'll be the first time since I've been in this hobby that AMD has been competitive at the high end.

And competition is good for the consumer.
 
Did you get into the hobby after 2015?
 
How do you know this? Are you an AMD insider? Why would they not use their top card with their top CPU?
 
Well, they were very intent on not stating the model number, and AMD said on the record, "we didn't say which card it was," which you only say if it's NOT the top-end card. Plus AMD has an entire event for it, and they are not going to spoil the biggest card. Plus AMD always holds back a halo product for the last reveal. It may be Navi 21, but it is not the top-end card. And teasing your top-end card would be idiotic.
 
The problem for AMD at this point is that Nvidia already has a generation's head start on ray-tracing tech. Even if AMD is able to match Nvidia at RT, it's not going to matter, because virtually all current RT implementations are proprietary, and if that's not enough, Nvidia has DLSS, which is a game changer as far as game performance goes. AMD isn't going to be able to brute-force their way past DLSS.

The truth is that Big Navi has already lost, and it's not even out yet. It will be at least one more generation before AMD can reach parity with (or beat) Nvidia in RT and AI-based rendering.
 