Metro Exodus and Ray Tracing...

Archaea

[H]F Junkie
Joined
Oct 19, 2004
Messages
11,826
I swapped out a 1080 Ti for a 2080 in a pretty lateral move so I could experience ray tracing.

I downloaded the Star Wars ray tracing demo. Since you can't interact with it and it's capped at 24 frames per second, it might as well be pre-recorded FMV video - how could I tell the difference? Meh.

I loaded up the RTX benchmark in 3DMark (Futuremark) - double meh... it doesn't look any better than anything else they show in their benchmarks, and it runs even slower.

I tried Metro Exodus on the Xbox Game Pass for PC (Beta) subscription service. I turned on DLSS and ray tracing and thought it looked good - but I'm not sure it's necessarily next-gen looking. I mean, Vermintide 2 looked as good or better, for instance, and that doesn't have ray tracing.

Anyway - I toggled ray tracing off and on while turning a desk lamp on and off, and lo and behold...
COULD
NOT
TELL
THE
DIFFERENCE


So I looked it up to see whether I should even be able to tell the difference. I'd heard it was subtle, but I couldn't tell any difference at all.

Yes, it's subtle. In this TechSpot review's screenshots I can't even tell the difference.

https://www.techspot.com/article/1793-metro-exodus-ray-tracing-benchmark/

So I spent the next half hour turning ray tracing on and off (all settings at Extreme, ray tracing at Ultra) and finally found one room where it made a difference: a room with a small window and shreds of curtain in the way. That room showed a fairly sizable difference, but nothing else I saw while switching back and forth at random intervals made the game look any better or any more realistic - subjectively.

Here is the one scene I found that made a sizable difference:

RTX Off
Metro Exodus (Windows) 6_23_2019 3_47_47 PM.png
RTX On
Metro Exodus (Windows) 6_23_2019 3_48_30 PM.png
 
In Metro Exodus, ray tracing does make lighting more consistent, and in some places it fixes shadows and removes fake SSAO shadows where the light position suggests there should be none.
Even with RTX ON this is still mostly a rasterized game, and ray tracing mostly helps improve global illumination a little. It does not implement reflections or any other fancy effects, and because of the rather monochrome graphical themes it is hard to spot global illumination at work. It is more like dynamic light baking, and it is not even done fully. It is hard to expect much of a difference on static maps where they already did static light baking.
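For anyone curious what "ray-traced GI" actually computes versus a faked SSAO term, here is a minimal illustrative Python sketch (the scene, point positions, and sample counts are all made up for illustration - this is not the game's or any real engine's code). Instead of a constant ambient value, it casts random hemisphere rays from a surface point and measures what fraction escape the occluders:

```python
import math, random

def ray_hits_sphere(origin, direction, center, radius):
    # Solve |origin + t*direction - center|^2 = radius^2 for t > 0.
    # direction is unit length, so the quadratic's leading coefficient is 1.
    oc = [origin[i] - center[i] for i in range(3)]
    b = 2.0 * sum(oc[i] * direction[i] for i in range(3))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return False
    t = (-b - math.sqrt(disc)) / 2.0
    return t > 1e-6

def ambient_occlusion(point, occluders, samples=256):
    # Cast random rays over the upper hemisphere; the fraction that
    # escape all occluders approximates how much sky light reaches us.
    unoccluded = 0
    for _ in range(samples):
        # Uniform direction on the upper hemisphere (z >= 0)
        z = random.random()
        phi = 2.0 * math.pi * random.random()
        r = math.sqrt(max(0.0, 1.0 - z * z))
        d = (r * math.cos(phi), r * math.sin(phi), z)
        if not any(ray_hits_sphere(point, d, c, rad) for c, rad in occluders):
            unoccluded += 1
    return unoccluded / samples

random.seed(42)
occluders = [((0.0, 0.0, 1.5), 1.0)]             # a sphere hovering overhead (toy scene)
open_point = (5.0, 0.0, 0.0)                      # far from the sphere
shaded_point = (0.0, 0.0, 0.0)                    # directly underneath it
print(ambient_occlusion(open_point, occluders))   # close to 1.0 (mostly unoccluded)
print(ambient_occlusion(shaded_point, occluders)) # noticeably lower
```

The point in the open comes out near 1.0 while the point under the sphere lands around 0.75 - exactly the kind of subtle, physically consistent darkening the screenshots show, and the kind of thing screen-space SSAO can only guess at from the depth buffer.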

And even then, many of the dynamic global illumination effects that ray tracing enables can also be approximated with rasterization methods that do not require RTX hardware, e.g. VXGI: https://developer.nvidia.com/vxgi
Reflections can also, for the most part, be done without RT, as many games have done for years - even before shaders were a thing.

Does this mean RT is a dead end? Absolutely not!
As a consumer technology, hardware-accelerated RT is a new thing, and obviously Nvidia did not go full batshit crazy with it. As with the first shader GPUs (DX8 era) we can expect a few effects here and there, improved dynamic lighting, etc., but not an eye-popping difference.
 
To me the RTX-off shot looks better than with it on, due to too much contrast with RTX on; maybe with HDR it would render better. Of the many screenshots and videos I've seen, it just does not strike me as a significant game changer or enhancement - at times nice, at other times unrealistic, as in overdone reflections. Anyway, a great write-up and honest experience.
 
i am kinda excited for the future of DXR - but am also surprised at how close the fake pre-baked graphics look next to ray tracing. i kinda expected an evolution but instead got more of a side grade.. even looking at Quake 2 - there are other modern ways of rendering it that come very close but offer way better fps.

But am patiently waiting for Control, Doom Eternal or Cyberpunk.. hoping to be positively surprised.

On a side note.. should have waited this generation out.. boring performance and under-baked ray tracing.
 
Shadows, occlusion and light casting are all much improved with ray tracing in this game. It's supposed to be subtle. This is why developers are overdoing the reflections in current games: normies who are used to everything popping are not going to be impressed by what it actually takes to have truly realistic lighting.
 
i am kinda exited for the future of DXR - but am also surprised at how close the fake pre-baked graphics looks next to raytracing...

This is key. They have gotten so good at faking things out, that the difference is not large.

I expect that in the future, when RT cards are common from NVidia/AMD/Intel, less effort will go into faking everything out (it's a lot of work), such that something like "High" settings will be raster and "Ultra" will be ray tracing only, and the delta will be larger.

But this will probably make some people mad, who will claim it's some kind of conspiracy to get them to buy RT cards, when really it will just be developers reducing their workload.
 
well, if they have to make a High version that is pre-baked.. then a lot of the reduced workload goes out the window..

but i agree - once they make a fully ray-traced game.. that will free up resources for artwork..

or, well - bigger profits for developers.. let's not pretend that we live in a fairy land where all the savings go to bettering the product.. :)
 
Normies.. :) well, i expected to be told that i looked at it wrong, and you did not disappoint


Much improved - subtle difference.. - tanking the framerate, sounds like a fair trade - groovy!!!
I wasn't picking on you or the OP specifically; I just know how the internet in general thinks. If it wasn't comments pointing out how a floor looks wet because of the overdone reflections, it would be the "no difference" observation, followed by people convinced that the advancement in rendering tech isn't worth it.
 
But we need all those extra transistors on the die taking up space, making it bigger, to justify the $500 premium over the previous generation. God forbid they would give us the card we wanted... a $700 GTX 2080 Ti.
 
I don't think you are going to find anything exciting in this gen of GPUs.
The idea is too new and the games are few.

Next-gen Nvidia will maybe be where it should be.

I think that crazy Atomic game with a next-gen card is going to be wild.
 
But we need all those extra transistors on the die taking up space, making it bigger, to justify the $500 premium over the previous generation. God forbid they would give us the card we wanted... a $700 GTX 2080 Ti.

There has been a fair bit of analysis on this; the RTX components appear to use only about 15% of the die area. Excluding them wouldn't have given you what you wanted.
 
I don't think Metro was ever going to be the poster boy for RTX.

And I don't think RTX was ever really planned out too well. I think it was a knee-jerk result of finding out what AMD had in the pipe (Navi, PS5, Xbox Scarlett). Nvidia was afraid and jumped the gun.

Looking forward to that new game coming... Control, I think it's called. Sometime in August, I think. It's supposed to touch on all aspects of what RTX can do.
 
Part of the issue is of course that the "faked" rendering paths are very mature and actually incredibly good. We've spent decades working on how to make things look realistic without raytracing.

I believe titles will use the hardware more effectively as understanding matures, especially in cases where the hardware is actually dedicated. That would allow offloading of expensive operations while the main shaders remain occupied with duties they are well equipped to handle already.

I'm not at all trying to be an NV apologist here, but I do think we're really just seeing the bottom of a new learning curve, coupled of course with first-gen hardware limitations. It is absolutely true that the experience with current cards and titles is a bit wanting, no argument.
 
That is ludicrously backwards. NVidia RTX is years in the making. AMD is playing catch up.

We've been waiting on Navi how long? And now we know it's going into both consoles and will have raytracing? Of which one if not both will be out next year?
 
I don't think Metro was ever going to be the poster boy for RTX.
...

Yeah, no.

LOL
 
RTX is completely made up; they are just removing a few shadows. In a flea-circus move (can you see them?), nvidia will soon pull the rug out and reveal it was all a marketing sham. how foolish the shills will feel then.

i have this info from TOP insiders.
 
Yeah, no.

LOL
Just like everything Intel is doing is an immediate reaction to AMD. They're just scared of the competition, am I right?
RTX is completely made up they are just removing a few shadows.
...
 
I think it was a pro-active result of finding out what AMD has in the pipe (Navi, PS5, Xbox Scarlett). Nvidia was afraid and jumped the gun.

Not even remotely within the dimension of being likely. Nvidia has no reason to react to anything AMD is doing right now. Navi is not the Ryzen of GPUs. Navi could be good, it could even be great, but it's not enough to make Nvidia react.

Being in the PS4 and XB1 did roughly jack all for AMD this generation, and it won't do anything for them next generation. It's not like their name is stamped on the box; their logos don't appear when you boot the consoles. The only mention of AMD is when Sony and MS talk specs, and the vast majority of people get that "deer in the headlights" look when you try to talk about specs.

Consoles are also such low-margin products that Nvidia is unlikely to be concerned. They're not going to bend over backwards to give Sony and MS the deals they'd want. The only company, besides themselves, with a chance of forcing Nvidia to react is Intel, and how their dedicated-GPU plans pan out remains to be seen.
 
Yeah, no.

LOL

They had ONE launch game. And it was a joke.

It still is a joke. That tells me they were NOT ready to launch in the slightest.

And every game coming out after PS5/Xbox launch is going to be designed for AMD hardware. PC gaming is dying. I don't see how RTX can prop itself up for long.

I think I mentioned it in another thread - we'll know a lot more when CoD:Modern Warfare hits (with its RTX support)
 
I have a 2080, but RT was never a factor in my choice of the card. RT is NOT FEASIBLE with this gen of cards, pure and simple. Probably not with the next gen of cards either, at higher resolutions. Anyone who bought an RTX card expecting to enjoy RT in its (half-baked) glory at this point in time was sadly deluded. But what you get (from the 2080 upwards) are the fastest cards available. Pretty sure that was the main criterion for the majority of 2080+ buyers, not enjoying RT in any reasonable form (unless they are on 1080p monitors, but who would buy a 2080 or 2080 Ti to game at 1080p?).
 
They had ONE launch game. And it was a joke.
...

The PS4 and XB1 had AMD hardware as well. It didn't matter then; it won't now. PC gaming is getting BIGGER. There is no sign of it dying, and people like you have been predicting its death for the past 30 years.
 
The PS4 and XB1 had AMD hardware as well. It didn't matter then; it won't now. PC gaming is getting BIGGER. There is no sign of it dying, and people like you have been predicting its death for the past 30 years.
yeah but if they keep saying it sooner or later (most likely) it will become true :rolleyes::rolleyes::D:D
 
The only way PC gaming outright dies is if the consoles move to a type of CPU that is not compatible with current x86 processors.

And even then it would take more than one or two generations of unique proprietary hardware to kill off PC gaming. As you recall, one generation of the PS3's Cell processor didn't even put a dent in PC gaming.

As it is, consoles are more like high-end gaming PCs than ever before, and more compatible too. Next-gen Sony and Xbox will feature eight-core Ryzen processors, Navi GPUs, larger hard drives, and play-anywhere titles that also work cross-platform against PC players.
 
The only way PC gaming out right dies is if the consoles move to a different type of CPU that is not compatible with current CISC processors.
...

We don't really program to the metal anymore, so it's the OS and APIs that matter more than HW.

Essentially having Windows and DirectX on Xbox matters more than which GPU it uses for porting to PC purposes.

Porting Xbox --> Windows PC is trivial compared to porting from the PS4.

The real impact of consoles on PC gaming is when lazy ports are done and the assets are console-level assets that are inadequate on PC. Or when controls like KB/mouse are not properly tuned, or when options are much more limited than what is considered proper for a PC game.

Which GPU "faction" powers the console is irrelevant.
 
I don't think Metro was ever going to be the poster boy for RTX.
...

So we're still waiting for a real showcase title for RTX a full year after release now. Bravo, Nvidia, well played....
 
So we're still waiting for a real showcase title for RTX a full year after release now. Bravo, Nvidia, well played....

Software lags behind hardware. How many YEARS did AMD talk about tessellation support before it became a thing in games? Real-time ray tracing is new for games, and it will take a while for it to be more than a "cool" marketing thing. People have been saying this since the RTX cards were announced, and nothing has changed.
 
I swapped out a 1080TI for a 2080 in a pretty lateral move so I could experience Ray Tracing.
...

Yeah, sadly it's pretty much a tacked-on gimmick at best for now; the cards just don't have the power to do much more than that. They also know very few people have cards capable of running RT, so I doubt a ton of effort goes into making it better.
 
same here, i have seen those blind test videos. I couldn't tell for shit, and almost everyone at the end said it makes little to no difference and they wouldn't mind leaving it off as it didn't really improve their gameplay experience.
 
Shadows, occlusion and light casting are all much improved with ray tracing in this game. It's supposed to be subtle. This is why developers are overdoing the reflections in current games: normies who are used to everything popping are not going to be impressed by what it actually takes to have truly realistic lighting.

Thing is, if doing it "properly" with ray tracing is such a subtle improvement over the "hacked" way we've been doing it that people don't notice the difference, then why bother? Remember, what ultimately matters for graphical fidelity is how good a game looks to players. It doesn't matter whether that's achieved through brute-force computation or clever trickery; what matters is the end result. So for it to be worthwhile, particularly with large FPS losses, it needs to offer an improvement that isn't subtle. That doesn't mean it needs to be gaudy or unrealistic, just very noticeable. If instead you have to scrutinize stills to maybe see a difference, well then just leave it off.
 
Thing is, if doing it "properly" with ray tracing is such a subtle improvement over doing it the "hacked" way we have been that people don't notice the difference then why bother?
...

Because once the hardware power is there, it should be easier on developers. Even if the results are near identical, if one way is easier and the other is harder, the easier solution is what should be used - "work smarter, not harder," as the saying goes. Ray tracing was a HUGE boon to the movie industry; even with all the raw power needed to render scenes, it made things so much easier on CG studios. I've been saying since these cards were announced that we're probably 5-10 years from it being more than a "gimmick." Someone had to get the ball rolling or it would take even longer for RT to be viable in games, but anyone expecting miracles right now is in for some serious disappointment.
 
There has been a fair bit of analysis on this, the RTX components only appear to be using about 15% of the die area. Excluding it wouldn't have given you what you wanted.
Yup, but many people do not realize Nvidia increased prices simply because they wanted to increase prices :)
Releasing the RTX series at the same price as the previous series, when you have tons of overproduced Pascal GPUs lying in warehouses, would not be a smart move, and shareholders would definitely not want that.

A higher price means less demand with more or less similar profit, and more room for price drops. Both Nvidia and AMD wanted higher pricing on their consumer cards.

But simple-minded haters need a simple reason to get angry, and delving into GPU manufacturing costs, market issues, and the company's (Nv's) misjudged moves around the crypto-mining boom would be all too bothersome. So "Nvidia increased prices because of ray tracing" and "without ray tracing the cards would be twice as fast" it is... =(
 
So we're still waiting for a real showcase title for RTX a full year after release now. Bravo, Nvidia, well played....
You are confused, dude; Nvidia is a GPU company, not a game developer.

Nvidia might provide technical support, but all the real effort still falls on the developer. At software dev companies, effort translates directly to money, and given the number of gamers with RTX hardware and its performance (which does not allow these effects to be enabled without a significant performance drop), it often does not seem worth it.

Designing a game solely around ray tracing especially sounds like a very bad idea. It will happen, but not until a year or two after the new consoles are released.

EDIT:
Oh, and Nvidia did release some software to showcase ray tracing... heck, path tracing even. With help/inspiration from an independent developer they released Quake 2 RTX, and it has far superior lighting and reflections to anything else except reality itself =)
 
Because, once the hardware power is there for it, it should be easier on the developers.
...

And that's fine, when they're powerful enough to be useful in 5-10 years. The issue is that they aren't right now, and I think many people are a little miffed at the price hike for what is, at this point, a more or less useless feature. Now I fully realise it wasn't causal - "ray tracing happened, so the price jumped" - but nVidia certainly tries to use it as a selling point to justify the higher price. They also hyped it hard, and continue to hype it. It isn't like they added the feature, noted it was new, and talked about possible applications for high-end computation (as they did when they first added FP64); no, they went on about how amazing it would be for games and all that.

Also, you have to understand some skepticism from people when it comes to ray tracing, since we have been hearing about it for a loooooong time. Ray tracing as we think of it now came about in 1979, and ever since, poorly educated CS students have gone on about how it's O(log n) and thus the One True Way. Likewise, real-time ray tracing has been talked about for a long time; Intel got real big on it in 2008 as a kind of "you should want more CPU cores, not more GPU" thing.

Combine that with the fanboyism you see, and it shouldn't be a surprise that some people, like me, shit on it a bit - not because I hate the concept of ray tracing, but because I know how intense it is to do right. That goes double if you want the idea of a really easy solution where shadows, caustics, occlusion, indirect lighting, and such aren't things you have to think about but are just done automatically by the rendering engine. Traditional eye-based ray tracing doesn't do a good job of that; you need something like photon mapping, which is of course much more intense and thus even further from a real-time implementation.
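For context on the "O(log n)" pitch: the claim is about traversal cost under an acceleration structure such as a BVH, where each ray tests only the bounding volumes along a tree path instead of every one of the n primitives. A toy 1D analogue in Python (intervals stand in for bounding boxes and a point query stands in for a ray; the numbers and names are purely illustrative, not any real engine's code):

```python
import random

# Each "primitive" is a 1D interval [lo, hi] - a stand-in for a bounding box.
# A "ray" is just a query point x; we count how many primitive/bounds tests
# each approach performs to find every interval containing x.

def brute_force_hits(x, intervals):
    tests = 0
    hits = []
    for lo, hi in intervals:
        tests += 1
        if lo <= x <= hi:
            hits.append((lo, hi))
    return hits, tests

class Node:
    """A tiny BVH: median split on interval centers; bounds = union of children."""
    def __init__(self, intervals):
        self.lo = min(i[0] for i in intervals)
        self.hi = max(i[1] for i in intervals)
        if len(intervals) <= 2:
            self.leaf, self.left, self.right = intervals, None, None
        else:
            intervals.sort(key=lambda i: (i[0] + i[1]) / 2)
            mid = len(intervals) // 2
            self.leaf = None
            self.left = Node(intervals[:mid])
            self.right = Node(intervals[mid:])

def bvh_hits(x, node, counter):
    counter[0] += 1                       # one bounds test per visited node
    if not (node.lo <= x <= node.hi):
        return []                         # prune this whole subtree
    if node.leaf is not None:
        counter[0] += len(node.leaf)
        return [i for i in node.leaf if i[0] <= x <= i[1]]
    return bvh_hits(x, node.left, counter) + bvh_hits(x, node.right, counter)

random.seed(0)
intervals = [(c, c + 0.01) for c in (random.uniform(0, 100) for _ in range(10_000))]
hits_bf, tests_bf = brute_force_hits(50.0, intervals)
counter = [0]
hits_bvh = bvh_hits(50.0, Node(list(intervals)), counter)
assert sorted(hits_bf) == sorted(hits_bvh)  # same answer either way
print(tests_bf, counter[0])                 # 10000 brute-force tests vs. far fewer node visits
```

Both paths find the same hits, but brute force touches all 10,000 primitives while the tree visits a handful of nodes. The catch in real engines is everything around that log: building and refitting the structure every frame for dynamic scenes, plus denoising the sparse samples, which is where the cost actually lives.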
 
You are confused dude, Nvidia is GPU company, not game developer.
...

Am I confused, though? It sure seems like they're trying to create a solution to a problem that doesn't exist and then charge people through the nose for it.
 
i am kinda exited for the future of DXR - but am also surprised at how close the fake pre-baked graphics looks next to raytracing, i kinda expected an evolution but instead got more of a side grade..
...

I would like to remind you about Battlefield V, where they shipped patches to reduce the amount of "ray tracing" - I wonder how people feel about having experienced that as fake tracing as well (great binary thinkers).

You can wait for titles that show off the difference better, but will that happen without developers being paid to make the non-RT path look worse, as in the case with tessellation? So where in this first generation is something you can justify being excited about? Can you even call it ray tracing if the effect from ray tracing is minimal, while the underlying tweaks used to exaggerate it hurt image quality on the other side?

This is key. They have gotten so good at faking things out, that the difference is not large.
...

Is that the future where you still have a spare kidney you'd like to sell so you can afford your next Nvidia purchase - you know, for those rays that need to be traced?
 
Yup, but many people do not realize Nvidia increased prices because they simply wanted to increase prices :)
...

I'll pretend you didn't just say that someone voting with their wallet is simple-minded. The fact of the matter is that the price increased dramatically, especially at the high end, and not everyone is going to pay it. The reason it increased is irrelevant. When you upcharge some 70% over the previous generation, you should expect some push back from consumers.
 