Nvidia's RT performance gains not as expected?

AdoredTV is a fanboy.
In some games it's hard to tell if raytracing is doing anything, but look at examples like Spider-Man, Cyberpunk, and Control, or the Quake, Quake 2, and Doom RT ports, and Minecraft.

Hell, raytracing in Minecraft is so effective. I always feel compelled to dig out bases on the coastline and build a glass wall in the water so light shines through the water and the glass.

Also, I was rebuking the claim that the RT gains are subpar. Going from Cyberpunk running like ass with compromised settings to completely maxed at 60 minimum is a decent accomplishment. It's about 60 minimum, around ~70 average in the areas I tested with a 7950X on Psycho RT. Keep in mind Psycho RT ran horribly before. Ultra is what last-gen cards typically run, and it's over 100fps.

The 4090 is just that powerful, but that isn't the point here. If we compare Ampere vs RDNA 2: say RDNA 2 (6800 XT) takes a massive 60% performance loss from turning on RT, while Ampere (RTX 3080) suffers a much smaller loss, say 35% instead of 60%. Ampere is much faster than RDNA 2 in RT, so it makes sense that it loses less performance, and we'll assume they are equal in raster. So when the RTX 4080 12GB relaunches as the 4070 Ti or whatever, that card has 92 RT TFLOPS and the 3090 Ti has 78 RT TFLOPS. If they end up equal in raw raster performance, I expect the 4070 Ti to take a smaller hit from turning on RT than the 3090 Ti due to the difference in RT TFLOPS (again, 92 vs 78). If the 4070 Ti performs exactly the same as the 3090 Ti in both raster AND RT, taking the same performance penalty, then what is the point of the extra RT TFLOPS?
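To make that reasoning concrete, here's a minimal Python sketch of the same logic. It assumes, purely for illustration, a fixed per-frame RT cost that scales inversely with RT TFLOPS; the 100 fps raster baseline and the 8.5 ms RT cost are made-up numbers, not benchmarks:

```python
# Minimal sketch: assume RT adds a fixed per-frame cost that scales
# inversely with RT TFLOPS (illustrative only, not a real model).

def rt_hit_percent(raster_fps, rt_cost_ms):
    """Performance loss (%) from adding rt_cost_ms to each frame."""
    raster_ms = 1000 / raster_fps
    rt_fps = 1000 / (raster_ms + rt_cost_ms)
    return (1 - rt_fps / raster_fps) * 100

# Made-up baseline: 100 fps raster, 8.5 ms of RT work at 78 RT TFLOPS.
base_fps, base_cost_ms = 100, 8.5
print(f"78 RT TFLOPS card: -{rt_hit_percent(base_fps, base_cost_ms):.0f}%")            # ~46% hit
print(f"92 RT TFLOPS card: -{rt_hit_percent(base_fps, base_cost_ms * 78 / 92):.0f}%")  # ~42% hit
```

Same raster performance in, but the card with more RT TFLOPS takes the smaller hit, which is the whole point of the extra TFLOPS.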
 
I think the point is that per ray tracing core (or whatever Nvidia calls them) there hasn't been much of an improvement. Basically all of the improvement is that Nvidia decided to put a whole lot of RT cores on there, maybe relatively more than they did the previous two gens. It's another aspect of how this generation had to brute-force the hardware to get the performance up to a level that's comfortable with modern games at 4K.

They accomplished a huge uplift in performance over their previous gen, that their competitor cannot match. What's the problem?

Oh, the problem is that it is Nvidia.

Adored Guy: "The performance improvement per core from Nvidia's 3xxx to 4xxx is only about 10%. But AMD's RDNA increase per core is 60%... AMD better?!" It's pretty lolz, really. Maybe it's per core per MHz. Anything in the plus is good.
And no, I didn't watch the video, so fill me in if I got the gist of it wrong.
 
I think the point is that per ray tracing core (or whatever Nvidia calls them) there hasn't been much of an improvement. Basically all of the improvement is that Nvidia decided to put a whole lot of RT cores on there, maybe relatively more than they did the previous two gens. It's another aspect of how this generation had to brute-force the hardware to get the performance up to a level that's comfortable with modern games at 4K.
Not sure how to get a good notion of per-RT-core performance to start with, but is that the case?

RT cores:
2080 Ti: 68
3090 Ti: 84
4090: 128

RT TFLOPS:
2080 Ti: 42.9
3090 Ti: 78.1
4090: 191.0

RT TFLOPS per core:
2080 Ti: 0.63
3090 Ti: 0.93 (+47%)
4090: 1.49 (+60%)


When people talk about the RT performance of a card, what do they use?

To put it more simply: a 4080 has fewer RT cores (76) than a 3090 (82) and close to a 3080's count (68), yet it destroys a 3090 Ti:
[Chart: NVIDIA GeForce RTX 4080 V-Ray RTX benchmark scores]


Fewer cores, but 112.7 RT TFLOPS versus 78.1 for the 3090 Ti. That +44% is almost a 1:1 match for the real-life +40% performance uplift in V-Ray; why it does not translate the same way in RT games is not a bad question.
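For what it's worth, here is a quick Python sketch that just recomputes the per-core figures and the 4080-over-3090 Ti scaling from the specs quoted above (values as posted, nothing new):

```python
# Recompute per-core RT TFLOPS and the 4080 vs 3090 Ti scaling
# from the specs quoted above.
specs = {                      # card: (RT cores, RT TFLOPS)
    "2080 Ti": (68, 42.9),
    "3090 Ti": (84, 78.1),
    "4080":    (76, 112.7),
    "4090":    (128, 191.0),
}

for card, (cores, tflops) in specs.items():
    print(f"{card}: {tflops / cores:.2f} RT TFLOPS per core")

scaling = specs["4080"][1] / specs["3090 Ti"][1] - 1
print(f"4080 over 3090 Ti by raw RT TFLOPS: +{scaling:.0%}")   # ~+44%, vs ~+40% in V-Ray
```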

In Metro Exodus it does show up (the RTX 3080 loses 25% to 43 with RT on versus -13% to 37% for the 4080); in Cyberpunk as well, the drop for the 4080 is much smaller than for the 3080/3090.

In Dying Light 2 at RT Ultra a 4080 does seem to beat a 3090 Ti by about that number, while being much closer otherwise in that title.
 
Maybe they are talking about IPC, since the 40 series is close to 3 GHz stock.

I have no idea how much low-hanging fruit there is on the tensor and RT cores. I remember reading the tensor cores do relatively simple, fast matrix math. It may be one of those things where it is more cost-effective to iterate small IPC boosts and go wider as die space allows.

I am happy with the die allocation this time around. Still big enough boosts to bring in more rt features and performance and rop/tmu counts that would make me have a stroke 20 years ago.
 
TPU just posted their benchmarks for Portal RTX. This probably helps paint a clearer picture of the gen-over-gen RT performance increases. The RTX 3070 is roughly equal to the RTX 2080 Ti in raster, but here it's giving it a good stomping when it comes to path tracing (if we can even consider 16fps a good stomping lol). Likewise, the RTX 4080 is only around 15% faster than the 3090 Ti in raster, and here it's really flexing its muscles at 39% faster, which is in line with the V-Ray benchmark. So I guess in order to fully realize the RT performance uplifts you would need to be playing fully path-traced games like this, Quake RTX, or Minecraft RTX. The 4070 Ti should be around 18% faster than the 3090 Ti if we go purely by RT TFLOPS numbers, so it should land around 33-34fps on that graph.

https://www.techpowerup.com/review/portal-with-rtx/3.html
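A rough back-of-the-envelope for that 4070 Ti prediction, as a Python sketch; the 3090 Ti Portal RTX figure below is a placeholder, to be swapped for the actual number from the TPU chart:

```python
# Scale a 3090 Ti Portal RTX result by the RT TFLOPS ratio to guess
# where the 4070 Ti might land.
tflops_4070ti, tflops_3090ti = 92, 78
portal_fps_3090ti = 28   # placeholder value, not taken from the review

predicted = portal_fps_3090ti * tflops_4070ti / tflops_3090ti
ratio = tflops_4070ti / tflops_3090ti - 1
print(f"Predicted 4070 Ti: ~{predicted:.0f} fps (+{ratio:.0%} over the 3090 Ti)")
```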
 

That +40% at native 1440p over a 3090 Ti would be in line with the raw RT TFLOPS increase, and significantly bigger than the average +15-16% boost at 1440p/4K in non-RT scenarios.

But when numbers get that small it can get noisy. Still, for a complete ignoramus like me, my first reflex would be that very RT-heavy titles à la Portal are a much better proxy for RT performance, if we don't have a direct breakdown of how many ms the RT steps take (which I imagine we could have if we looked for it, and which would give a much more direct answer).
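In the absence of a proper breakdown, one crude way to approximate the "ms spent on RT" is to difference the RT-off and RT-on frame times, assuming the non-RT part of the frame is unchanged (a big simplification, since denoising, BVH builds, etc. muddy this). A sketch with made-up numbers:

```python
# Crude estimate of "ms spent on RT": difference the RT-on and RT-off
# frame times, assuming the rest of the frame is unchanged (it isn't
# quite -- denoising, BVH builds, etc. blur the picture).

def rt_cost_ms(fps_rt_off, fps_rt_on):
    return 1000 / fps_rt_on - 1000 / fps_rt_off

# Made-up numbers for two cards in the same scene:
print(f"Card A: {rt_cost_ms(120, 70):.1f} ms on RT per frame")   # ~6.0 ms
print(f"Card B: {rt_cost_ms(120, 90):.1f} ms on RT per frame")   # ~2.8 ms
```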
 
The DLSS 3 numbers for the 4090 are in the image quality comparison. Wonder why they didn't add them for the 4080 and 4090 to the chart?
 
AdoredTV is a fanboy.
In some games it's hard to tell if raytracing is doing anything, but look at examples like Spider-Man, Cyberpunk, and Control, or the Quake, Quake 2, and Doom RT ports, and Minecraft.

Hell, raytracing in Minecraft is so effective. I always feel compelled to dig out bases on the coastline and build a glass wall in the water so light shines through the water and the glass.

Also, I was rebuking the claim that the RT gains are subpar. Going from Cyberpunk running like ass with compromised settings to completely maxed at 60 minimum is a decent accomplishment. It's about 60 minimum, around ~70 average in the areas I tested with a 7950X on Psycho RT. Keep in mind Psycho RT ran horribly before. Ultra is what last-gen cards typically run, and it's over 100fps.
Yes indeed, impressive 60 FPS with a $1600+ card. Of course other games are a lot better.

How about 4070 and 4060 performance? Since now Ampere RT is crap and Turing cards are basically useless RT cards. Who is really benefiting? On a 3070, game at 4K high settings, or turn on RT and game at 1080p with lower frame rates at times. DLSS below Quality starts to degrade fast, and even more so at lower resolutions.

Then there are games that use RT and run well enough on AMD and Nvidia hardware while looking better than the so-called RT-heavy games like Cyberpunk with its cardboard city (my opinion). Anyway, if the majority of users get a better experience, more fps, higher resolutions, less lag, and higher noticeable settings without using RT, then RT has failed so far. Not talking about 4090 users who have extra performance to spare, but the huge majority of GPU owners.

What I see is Nvidia failed to deliver time and time again their RT promises.
 
Yes indeed, impressive 60 FPS with a $1600+ card. Of course other games are a lot better.

How about 4070 and 4060 performance? Since now Ampere RT is crap and Turing cards are basically useless RT cards. Who is really benefiting? On a 3070, game at 4K high settings, or turn on RT and game at 1080p with lower frame rates at times. DLSS below Quality starts to degrade fast, and even more so at lower resolutions.

Then there are games that use RT and run well enough on AMD and Nvidia hardware while looking better than the so-called RT-heavy games like Cyberpunk with its cardboard city (my opinion). Anyway, if the majority of users get a better experience, more fps, higher resolutions, less lag, and higher noticeable settings without using RT, then RT has failed so far. Not talking about 4090 users who have extra performance to spare, but the huge majority of GPU owners.

What I see is Nvidia failed to deliver time and time again their RT promises.

They are increasing RT performance a lot with each gen, it seems. But now they create another problem, in that the newest RT creations make the last-gen stuff kinda crap. Just look at Portal RTX: anyone without Lovelace is not gonna have a comparable experience to those who do, and Turing users are completely left in the dust, except for the 2080 Ti, and even that's borderline. Once the RTX 5090 comes out, I bet the 30 series will no longer be viable at whatever RT creation Nvidia debuts with it, and the 4090 will be considered "crap" and probably barely muster 60fps at 1440p. So in order to continuously enjoy RT, one has to upgrade to the fastest GPU of the newest gen or get left behind.
 
They are increasing RT performance a lot with each gen, it seems. But now they create another problem, in that the newest RT creations make the last-gen stuff kinda crap. Just look at Portal RTX: anyone without Lovelace is not gonna have a comparable experience to those who do, and Turing users are completely left in the dust, except for the 2080 Ti, and even that's borderline. Once the RTX 5090 comes out, I bet the 30 series will no longer be viable at whatever RT creation Nvidia debuts with it, and the 4090 will be considered "crap" and probably barely muster 60fps at 1440p. So in order to continuously enjoy RT, one has to upgrade to the fastest GPU of the newest gen or get left behind.

Welcome to progress with a new technology. There's not a lot of ways around it. The same shit happened with shaders. The GeForce 3 had programmable shaders, but they sucked, and the cards after it grew leaps and bounds in shader capabilities. If a game wanted to hit the shaders hard for effects, it just wasn't going to run well on older shit. Even if the ROPs and TMUs were fine, the shaders were too weak to pull off the new effects.

Going to be how it is with RT for a while. We are pushing hardware really hard. In 4 generations it went from no hardware support at all, to hardware that could technically do it in realtime but not really usefully, to hardware that could do realtime RT but at lower resolutions, to hardware that can, barely, pull it off at 4K in realtime. It's been a big stretch of development and I expect it to continue. Progress can be painful. Ultimately though it will be really nice, not just because of the realism of RT, but because it makes things way easier on artists. A lot of shit you have to manually do in rasterization is just a part of ray tracing, meaning it is easier and faster for artists to design what they want. They don't have to worry about making an object look properly shadowed, just put the objects and lights in the scene and the shadows happen. Stuff like that.

Yeah, about like when I was a kid, but video cards aren't $100-$300 anymore.
Well two things to keep in mind:

1) $300 when the Voodoo launched is closer to $600 today (rough CPI math sketched after this post).

2) Graphics cards do a lot more today. The Voodoo basically had 3 chips: An ROP, a TMU and a DAC. It took polygons that were already converted to screen space, put one texture on them, rasterized them, and then converted that to analogue for your VGA monitor. That's it. All animation, lighting, setup, translation, etc, etc all had to be done by the CPU. If you look at a modern GPU, the space taken up by the ROPs and TMUs is pretty tiny, despite there being tons, sometimes more than a hundred, on the chip. If they just did what they did back then but faster, the chips could be much smaller. However they now have absolutely massive vector processors for the shaders, and now with RT a whole bunch of specialized hardware for that, and a bunch of matrix units for denoising/upscaling/etc. They are big, they are complicated, and they are produced on high end lithography that costs a lot of money.
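A quick sanity check on point 1, using approximate annual CPI-U averages (the exact figures depend on the index you pick), sketched in Python:

```python
# Rough CPI check on point 1 (approximate annual CPI-U averages).
cpi_1996, cpi_2022 = 156.9, 292.7   # approximate index values
voodoo_price_1996 = 300
today = voodoo_price_1996 * cpi_2022 / cpi_1996
print(f"${voodoo_price_1996} in 1996 is roughly ${today:.0f} today")   # ~$560
```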
 
Welcome to progress with a new technology. There's not a lot of ways around it. The same shit happened with shaders. The GeForce 3 had programmable shaders, but they sucked, and the cards after it grew leaps and bounds in shader capabilities. If a game wanted to hit the shaders hard for effects, it just wasn't going to run well on older shit. Even if the ROPs and TMUs were fine, the shaders were too weak to pull off the new effects.

Going to be how it is with RT for a while. We are pushing hardware really hard. In 4 generations it went from no hardware support at all, to hardware that could technically do it in realtime but not really usefully, to hardware that could do realtime RT but at lower resolutions, to hardware that can, barely, pull it off at 4K in realtime. It's been a big stretch of development and I expect it to continue. Progress can be painful. Ultimately though it will be really nice, not just because of the realism of RT, but because it makes things way easier on artists. A lot of shit you have to manually do in rasterization is just a part of ray tracing, meaning it is easier and faster for artists to design what they want. They don't have to worry about making an object look properly shadowed, just put the objects and lights in the scene and the shadows happen. Stuff like that.


Well two things to keep in mind:

1) $300 when the Voodoo launched is closer to $600 today.

2) Graphics cards do a lot more today. The Voodoo basically had 3 chips: An ROP, a TMU and a DAC. It took polygons that were already converted to screen space, put one texture on them, rasterized them, and then converted that to analogue for your VGA monitor. That's it. All animation, lighting, setup, translation, etc, etc all had to be done by the CPU. If you look at a modern GPU, the space taken up by the ROPs and TMUs is pretty tiny, despite there being tons, sometimes more than a hundred, on the chip. If they just did what they did back then but faster, the chips could be much smaller. However they now have absolutely massive vector processors for the shaders, and now with RT a whole bunch of specialized hardware for that, and a bunch of matrix units for denoising/upscaling/etc. They are big, they are complicated, and they are produced on high end lithography that costs a lot of money.
That's what people don't understand with raytracing - Once we can have consumer products do full path tracing and we can ditch raster - It's really going to be a huge boon for developers, just like the Voodoo was back in the 90's and they could finally do raster on the cheap without needing a $100k SGI box.
 
That's what people don't understand with raytracing - Once we can have consumer products do full path tracing and we can ditch raster - It's really going to be a huge boon for developers, just like the Voodoo was back in the 90's and they could finally do raster on the cheap without needing a $100k SGI box.

Ray tracing is the future for sure. That's also the keyword though: Future. It is going to take a while, don't forget games still need to be made for consoles and there's no way in hell those are doing full path tracing. Maybe the PS6 in 2028? So at least 6 years off, and if the PS6 isn't capable of doing it with good results and we have to wait for the PS7 then we're going to be waiting for that transition until 2035+. In the meantime though the "hybrid" approach of raster and RT will have to do.
 
That's what people don't understand with raytracing - Once we can have consumer products do full path tracing and we can ditch raster - It's really going to be a huge boon for developers, just like the Voodoo was back in the 90's and they could finally do raster on the cheap without needing a $100k SGI box.
Which means very little as of today, when your best, top-dollar card, the 4090, can still struggle in RT with a simple game (Portal) compared to the much more complex games of today. How much are folks paying for a feature that is limited in usability due to performance, especially on older and lower-end cards? Not saying I do not like having these options; I use RT as much as I can if performance and IQ are good (yep, RT can degrade IQ if you have to make too many other compromises to get acceptable performance).

Using RT smartly with a hybrid approach is, to me, an improvement, and is doable. Metro Exodus shows the potential, and Crytek's method is also very interesting, needing little hardware beyond shaders. Smart use is when the tech works well, such as in Callisto: small areas, close-ups where an accurate reflection is noticeable, subsurface scattering. Not an accurate reflection of something blocks away that you cannot even see (Cyberpunk), or reflections in a warzone where all the cars are freshly washed and polished and troops are coming in from the mud onto a perfectly cleaned and polished floor! Reminds me of the highly tessellated objects in Crysis 2, where the concrete road blocks were more tessellated than the rest of the scene. All kindly influenced by Nvidia. Games such as Far Cry 6 use RT where it makes a noticeable difference, in a smarter way, while maintaining acceptable performance. What I find interesting is that current-gen console games are using RT in a usable fashion, yet AMD RT is supposedly weak. I say if one uses RT in stupid ways for things that make very little difference, yes, Nvidia will appear to be better, but at times only usably so on Nvidia's top-end showboat cards and useless on everything else they sell.

RT is not the goalpost, at least not the one I see; it is the overall experience/price and usable feature set, not gimmicks promoted as better that are actually a degradation when used.
 
Ray tracing is the future for sure. That's also the keyword though: Future. It is going to take a while, don't forget games still need to be made for consoles and there's no way in hell those are doing full path tracing. Maybe the PS6 in 2028? So at least 6 years off, and if the PS6 isn't capable of doing it with good results and we have to wait for the PS7 then we're going to be waiting for that transition until 2035+. In the meantime though the "hybrid" approach of raster and RT will have to do.
Have to start somewhere. RT isn't just going to spring up overnight and run with no problems.
 
Which means very little as of today, when your best, top-dollar card, the 4090, can still struggle in RT with a simple game (Portal) compared to the much more complex games of today. How much are folks paying for a feature that is limited in usability due to performance, especially on older and lower-end cards? Not saying I do not like having these options; I use RT as much as I can if performance and IQ are good (yep, RT can degrade IQ if you have to make too many other compromises to get acceptable performance).

Using RT smartly with a hybrid approach is, to me, an improvement, and is doable. Metro Exodus shows the potential, and Crytek's method is also very interesting, needing little hardware beyond shaders. Smart use is when the tech works well, such as in Callisto: small areas, close-ups where an accurate reflection is noticeable, subsurface scattering. Not an accurate reflection of something blocks away that you cannot even see (Cyberpunk), or reflections in a warzone where all the cars are freshly washed and polished and troops are coming in from the mud onto a perfectly cleaned and polished floor! Reminds me of the highly tessellated objects in Crysis 2, where the concrete road blocks were more tessellated than the rest of the scene. All kindly influenced by Nvidia. Games such as Far Cry 6 use RT where it makes a noticeable difference, in a smarter way, while maintaining acceptable performance. What I find interesting is that current-gen console games are using RT in a usable fashion, yet AMD RT is supposedly weak. I say if one uses RT in stupid ways for things that make very little difference, yes, Nvidia will appear to be better, but at times only usably so on Nvidia's top-end showboat cards and useless on everything else they sell.

RT is not the goalpost, at least not the one I see; it is the overall experience/price and usable feature set, not gimmicks promoted as better that are actually a degradation when used.
You're right. NV should quit promoting RT until AMD catches up and exceeds them. RT sucks and shouldn't be used unless the lowest of cards and crappy consoles can use it to the max.

What a joke.
 
Have to start somewhere. RT isn't just going to spring up overnight and run with no problems.
Exactly! RT has some serious teething problems, or maybe the costs associated with it are starting to hit really hard: pricing, performance, implementation, etc.
 
You're right. NV should quit promoting RT until AMD catches up and exceeds them. RT sucks and shouldn't be used unless the lowest of cards and crappy consoles can use it to the max.

What a joke.
Yeah, not really sure what he's getting at. Obviously, we all get that full raytracing isn't possible right now. I guess no one should make any software or build any hardware to bother.
 
You're right. NV should quit promoting RT until AMD catches up and exceeds them. RT sucks and shouldn't be used unless the lowest of cards and crappy consoles can use it to the max.

What a joke.
AMD is doing fine in RT; consoles show this clearly. Smarter use in combination with other techniques to make a better experience counts more than just an old game dressed up with path tracing. A holistic approach that works on the hardware more people buy and use, versus the Goldilocks unobtainium hardware used as a sort of carrot so you finance their pockets more than the tech, so to speak. Smart RT usage in things that work.
 
AMD is doing fine in RT; consoles show this clearly. Smarter use in combination with other techniques to make a better experience counts more than just an old game dressed up with path tracing. A holistic approach that works on the hardware more people buy and use, versus the Goldilocks unobtainium hardware used as a sort of carrot so you finance their pockets more than the tech, so to speak. Smart RT usage in things that work.
Got it. Nvidia sucks. AMD is clearly doing it better, because you like them.
 
Yeah, not really sure what he's getting at. Obviously, we all get that full raytracing isn't possible right now. I guess no one should make any software or build any hardware to bother.
So you don't know what I am getting at, and then no one should make any RT software or build any hardware to bother??? If you really want to know what real RT performance looks like, on Nvidia or AMD or Intel, V-Ray rendering, which is highly optimized for Nvidia hardware, is the benchmark to look at. Everything else is mostly a hybrid approach, with a few unique, interesting, simple path-traced games.
 
So you don't know what I am getting at, and then no one should make any RT software or build any hardware to bother??? If you really want to know what real RT performance looks like, on Nvidia or AMD or Intel, V-Ray rendering, which is highly optimized for Nvidia hardware, is the benchmark to look at. Everything else is mostly a hybrid approach, with a few unique, interesting, simple path-traced games.
No, I think I get it. You're upset that the completely free update to Portal really only runs well on Nvidia hardware since nvidia is the only one that has invested any real effort into putting the work in. Not sure why so many people are butthurt at a free update that nvidia paid for, that is for nvidia hardware. No one is forcing you to buy anything.
 
Have to start somewhere. RT isn't just going to spring up overnight and run with no problems.

For sure. I guess I'm just pointing out the obvious that's all lol. It's exciting but yeah this is gonna take a while and people who buy a GPU specifically for RT shouldn't expect it to keep up for too long. As amazing as the 4090 is right now I'm fully expecting it to get destroyed in RT after 2 more generations.
 
No, I think I get it. You're upset that the completely free update to Portal really only runs well on Nvidia hardware since nvidia is the only one that has invested any real effort into putting the work in. Not sure why so many people are butthurt at a free update that nvidia paid for, that is for nvidia hardware. No one is forcing you to buy anything.
You either worry too much, are too concerned, or are maybe upset about what others think. Seriously, who cares? Great, if Portal is your thing, go for it with RT and I hope you enjoy it. Does it represent what current and previous generations of cards will do for the next generation of games? Nope, other than that they will suck for the sake of RT, I guess. As for what I buy, once again, why would you even be concerned about it? People should maybe just buy what is best for their usage.
 
They are increasing RT performance a lot with each gen, it seems. But now they create another problem, in that the newest RT creations make the last-gen stuff kinda crap. Just look at Portal RTX: anyone without Lovelace is not gonna have a comparable experience to those who do, and Turing users are completely left in the dust, except for the 2080 Ti, and even that's borderline. Once the RTX 5090 comes out, I bet the 30 series will no longer be viable at whatever RT creation Nvidia debuts with it, and the 4090 will be considered "crap" and probably barely muster 60fps at 1440p. So in order to continuously enjoy RT, one has to upgrade to the fastest GPU of the newest gen or get left behind.
Which indicates to me that RT performance is not one of the main reasons to buy a GPU, since its usefulness diminishes quickly generation to generation. Maybe even a minor selling point. I am talking about my usage here of 3-5 years per video card. For others with shorter upgrade cycles, RT could be a bigger factor in the decision. Personally I never thought RT was a dominant selling point for any video card. Over 99.9%, if not more, of all PC games have no real-time RT involved. The ones that do may not even give a significant reason to enable it, between the performance cost, the overall IQ degradation due to the performance requirements, etc. Anyway, as in the video (paraphrasing from memory), RT is not the goalpost, even when others try to make it so (like tessellation previously).
 
You either worry too much, are too concerned, or are maybe upset about what others think. Seriously, who cares? Great, if Portal is your thing, go for it with RT and I hope you enjoy it. Does it represent what current and previous generations of cards will do for the next generation of games? Nope, other than that they will suck for the sake of RT, I guess. As for what I buy, once again, why would you even be concerned about it? People should maybe just buy what is best for their usage.
And in current games, which generally are raster + raytracing, the Nvidia cards perform basically a complete generation better than what Intel or AMD is offering. So again, I'm not sure what you're getting at here. It's not as if Nvidia only works in a fully raytraced game.
 
And in current games, which generally are raster + raytracing, the Nvidia cards perform basically a complete generation better than what Intel or AMD is offering. So again, I'm not sure what you're getting at here. It's not as if Nvidia only works in a fully raytraced game.
Believe it or not, for most folks, in the games with RT it is better to play with RT off, or should I say that is the choice that gets made. When the predominant owners of RTX cards, on 2060s, 3050s, and 3060s, leave RT off because it makes the overall experience worse when on, RT is not the goalpost for buyers; it is more a luxury that one couldn't care less about. If it takes a 4090 to use RT not even effectively but just in a usable fashion at higher resolutions, that makes RT even less of a factor in buying a GPU. Yes, I would take better RT performance if the experience-to-price ratio were better, but a huge performance impact with very little visual difference or significance makes RT less of a factor overall in my buying decision. Besides, I am playing plenty of games that don't even have RT, besides the games where RT is totally worthless in my eyes, like Battlefield 5: tacked on, out of place. If one is going to play 80% to 90% of games without RT, or with it off due to performance reasons and insignificant IQ differences, that makes RT a rather low priority.

Then top it off with RT games that do play well on AMD, like Far Cry 6, Resident Evil, the Crysis Remasters, etc., just in how they are implemented (SMARTLY), and the RT differences between AMD and Nvidia are just not so important in my eyes. RT seems more of a nice-to-have if usable; otherwise one is not missing much, unless one is really a fan of Quake II, Portal, and maybe a few others. To each their own.

RT becomes a factor for my buying decision if it significantly or mostly can increase my gaming experience. So far it has had almost zero impact overall.
 
AMD is doing fine in RT, consoles show this clearly. Smarter use in combination of other techniques to make a better experience count more than just an old game dressed up, path trace. A holistic approach that works on the hardware that more people buy and use vice the Goldie locks unobtanium hardware used as sort of a carrot so you can finance more their pockets than the tech so to speak. Smart RT usage in things that work.
Ray tracing on consoles is a disaster. They can barely run at 30 FPS with dynamic resolution scaling, and that is usually only with reflections and maybe ambient occlusion at extremely low resolution. The consoles are not powerful enough to do the most important things like global illumination and projection. I don't call that "fine."
 
Ray tracing on consoles is a disaster. They can barely run at 30 FPS with dynamic resolution scaling, and that is usually only with reflections and maybe ambient occlusion at extremely low resolution. The consoles are not powerful enough to do the most important things like global illumination and projection. I don't call that "fine."

Doesn't Metro Exodus use RTGI on consoles targeting 60fps with somewhat OK results? https://www.eurogamer.net/digitalfoundry-2021-metro-exodus-enhanced-edition-ps5-vs-xbox-series-x

" it's using the latest rendition of the 4A Engine, rebuilt from the ground up to support hardware accelerated ray tracing, which is used to deliver phenomenally realistic global illumination. And similar to its console brethren, 60 frames per second is the target."

"In terms of typical resolutions in the open world, the Sony machine operates between 1296p and 1512p, while Series X runs in similar scenarios at 1512p to 1728p."

I know the actual quality of the RTGI also isn't the greatest, but I don't really think it's fair to call the end result a disaster. Had this game used FSR 2.2 it could probably deliver even better results, all while delivering 60fps with RTGI.
 
Doesn't Metro Exodus use RTGI on consoles targeting 60fps with somewhat OK results? https://www.eurogamer.net/digitalfoundry-2021-metro-exodus-enhanced-edition-ps5-vs-xbox-series-x

" it's using the latest rendition of the 4A Engine, rebuilt from the ground up to support hardware accelerated ray tracing, which is used to deliver phenomenally realistic global illumination. And similar to its console brethren, 60 frames per second is the target."

"In terms of typical resolutions in the open world, the Sony machine operates between 1296p and 1512p, while Series X runs in similar scenarios at 1512p to 1728p."

I know the actual quality of the RTGI also isn't the greatest, but I don't really think it's fair to call the end result a disaster. Had this game used FSR 2.2 it could probably deliver even better results, all while delivering 60fps with RTGI.
As far as I know RTGI is the only thing it does on consoles. Add to that the game is internally rendered at 1080p and then dynamically upscaled. I doubt FSR would make much of a difference considering that approach is already similar.
 
As far as I know RTGI is the only thing it does on consoles. Add to that the game is internally rendered at 1080p and then dynamically upscaled. I doubt FSR would make much of a difference considering that approach is already similar.

Where are you reading that it's an internal 1080p? The article claims it's anywhere from 1296p to 1728p depending on the console. Unless you mean the RTGI itself is being rendered at 1080p or something, or the game is rendering at 1080p then upscaling to 1728p.
 
A 6700 XT seems able to play Metro Exodus Enhanced well enough, not at RT Ultra / shader quality Extreme, but at 1440p (like you need for high-quality upscaled 4K) at High-type settings, close enough to 60fps at 1440p to make it work with dynamic shaders/resolution. Why would the Xbox Series X / PS5 not be able to as well?
 
Ray tracing on consoles is a disaster. They can barely run at 30 FPS with dynamic resolution scaling, and that is usually only with reflections and maybe ambient occlusion at extremely low resolution. The consoles are not powerful enough to do the most important things like global illumination and projection. I don't call that "fine."


Really? There are quite a few RT titles that run just fine for a console. We are not talking 4090-level RT hardware, yet these consoles can do RT in a smart way.

Now, those who run RT for everything with a smile are one data point each; just because you do does not mean most do.
 
AMD's claimed RT performance ("up to", whatever that means), 4K max settings (no upsampling):
https://www.amd.com/en/graphics/radeon-rx-graphics

Resident Evil Village 7900XTX 138fps
On my 3090 I get around 95fps outside and 120 or so inside

Doom Eternal 7900XTX 135fps
On my 3090, around 115fps during a fight

Looks like waiting for reviews of RDNA 3 performance is in order. From what I see, it would be an upgrade in RT over the 3090, and definitely over the 6900 XT. How the RT performance compares to the 4080/4070 Ti (higher and lower price) remains to be seen.
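For a loose sense of scale, here's a small Python sketch comparing AMD's "up to" numbers against the 3090 figures above; different systems and scenes, so treat it as ballpark only:

```python
# Loose comparison of AMD's "up to" marketing numbers vs the 3090
# figures measured above -- different systems/scenes, ballpark only.
amd_7900xtx = {"Resident Evil Village": 138, "Doom Eternal": 135}
my_3090_fps = {"Resident Evil Village": [95, 120], "Doom Eternal": [115]}

for game, claimed in amd_7900xtx.items():
    for fps in my_3090_fps[game]:
        print(f"{game}: 7900 XTX claim is +{claimed / fps - 1:.0%} over {fps} fps on my 3090")
```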
 
I don't really care that much about RT; even the regular old reflections in Spider-Man look fine in motion, which is what you're in 99 percent of the time. RTGI looks great and seems to be the most transformative feature. The rest of it I can live without, TBH. If I have a choice between a higher frame rate at 4K and RT, I'm leaning framerate.
 
The 4070 Ti released; it can give some clues:

[Chart: V-Ray RTX benchmark scores]



The 4070 Ti has 60 RT cores.
The 3090/3090 Ti have 82-84 RT cores.
 