More DLSS...

So UE5 looks good, but it's not as good as DLSS. It looks great in static shots, but in heavy motion there are disocclusion artifacts and blurring. See this video I recorded.

It's on a 1080p native monitor with scaling at 50% (so an actual render resolution of 960x540). It does look very nice for 540p. But notice as I spin the camera around, there is a halo around the character.
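For reference, the scaling percentage is applied per axis, so 50% of 1920x1080 works out to 960x540, which is only a quarter of the pixels. A quick sketch of that math (the helper name is just mine):

```python
# Minimal sketch: render resolution from output resolution and screen percentage.
# The 50% case matches the test above: 1080p output -> 960x540 internal.
def render_resolution(out_w, out_h, screen_percentage):
    scale = screen_percentage / 100.0          # applied per axis, not to the pixel count
    return int(out_w * scale), int(out_h * scale)

print(render_resolution(1920, 1080, 50))       # -> (960, 540), i.e. 1/4 of the pixels
```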




This is common with temporal-based solutions and probably can't be fixed completely. DLSS is also temporal-based, but it's more complex, and the artifacts are definitely less noticeable than Unreal 5's.

540p is a pretty low internal res, which can also highlight issues with DLSS. The point being... I don't think DLSS is amazing yet either, and it seems to be game dependent. In a couple of games it's pretty darn great, in several it's just okay, and in a few it's pretty bad.
 
The same artifacts are there at higher resolutions. But in that test I was running on a GTX 1060, so only 30 fps, which is the main issue. At 60 fps and above it does look better.
 
I was also thinking more forward. If I'm a developer making a game on UE5, why would I bother with DLSS if the built-in scaler is about 90% of the way there and is hardware and platform agnostic? The answer is you wouldn't, especially as Epic further develops that scaler. Don't forget this is just how it looks in preview release form; it's not even complete yet. What I'm getting at is that it doesn't bode well for the idea that DLSS is here to stay if a wildly popular engine like Unreal is going to have an option that competes this well and is completely hardware agnostic.

That said, Nvidia's not done with DLSS. The versions coming up might completely turn the tables on that thought process :)
 
Yes, that is a good point. I do think DLSS today is slightly better, but TAAU can also improve, and it's already pretty good.

And if you were developing in Unreal 5, and it comes with a solution that is good enough and works on everything without any effort on your part, then why not just use that?
 
It's also worth noting you get DLSS 2.X-enabled games right now. Plus, game devs can implement DLSS into just about any engine with just a little work. Bah, I think I stirred the pot too much on that post. Sorry guys :)
 
I went from a 3440 x 1440 monitor to a 3840 x 2160 48" LG C1 OLED recently and I have to say, DLSS image quality scales much better at 4K output resolution than 1440p or less. It's almost as if the tech is really optimized for UHD output. DLSS has better clarity and is just as good or better than native. Even DLSS performance can look decent if the output resolution is 4K while 1440p output looks obviously upscaled.
 
Yeah, I have to agree. I have a 4K monitor on my RTX 3060 rig, and I can actually play Cyberpunk 2077 at 4K (granted with low settings and DLSS performance, but it works).

I'm getting around 60 fps, but with FreeSync that is decent enough. It's not as good as actual 4K, obviously, but looks better than 1440p native for sure. I mean it's softer but I kind of like that console look.
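That tracks with the input resolutions each DLSS mode actually renders at. A rough sketch, using the commonly cited per-axis scale factors (approximate, and the exact ratios can vary slightly per title):

```python
# Rough sketch: approximate per-axis render scales for DLSS 2.x modes
# (commonly cited values; exact ratios can vary slightly per game).
DLSS_MODES = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

def internal_res(out_w, out_h, mode):
    s = DLSS_MODES[mode]
    return round(out_w * s), round(out_h * s)

# Performance mode at 4K still reconstructs from a full 1080p frame,
# while at 1440p output it drops to 720p, which is why that looks obviously upscaled.
print(internal_res(3840, 2160, "Performance"))  # -> (1920, 1080)
print(internal_res(2560, 1440, "Performance"))  # -> (1280, 720)
```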
 
I was also thinking more forward. If I'm a developer making a game on UE5, why would I bother with DLSS if the built-in scaler is about 90% of the way there and is hardware and platform agnostic? The answer is you wouldn't, especially as Epic further develops that scaler. Don't forget this is just how it looks in preview release form; it's not even complete yet. What I'm getting at is that it doesn't bode well for the idea that DLSS is here to stay if a wildly popular engine like Unreal is going to have an option that competes this well and is completely hardware agnostic.

That said, Nvidia's not done with DLSS. The versions coming up might completely turn the tables on that thought process :)
DLSS is just as easy since it's a plug-in for Unreal Engine (Unity too now).
 
DLSS is just as easy since it's a plug-in for Unreal Engine (Unity too now).
That is a good point, except it will not address all the player hardware out there. It may actually be easier just to use DLSS if an engine makes it easy, but FSR options are probably prudent since FSR is also easy and covers more hardware. As for games supporting each? Intel will eventually make a mark as well; if successful, it will most likely cut into Nvidia's market share, which developers will have to address for the best game support and sales. If Nvidia can keep getting the quality up and into more games, it may just win by default. Or, more exactly, AI-based sampling and other AI methods become dominant, Nvidia basically forces AMD and Intel to adopt something similar, or a standard for AI-based reconstruction becomes the norm.

DLSS basically gives a generational performance bump (or more) when it's available and well implemented.
 
DLSS is just as easy since it's a plug-in for Unreal Engine (Unity too now).
Sorta. You still have to troubleshoot to ensure elements of your game are treated properly by DLSS to prevent things like ghosting/trailing, blurring, removal of an effect, etc. But we don't know whether you'd have to do similar for TSR anyway. So, true enough :)
 
Intel's DLSS-like version, XeSS, which is AI based and will work with AMD and Nvidia cards while Intel GPUs have added hardware-level acceleration, may prove to be the one that wins out overall.
https://www.pcgamer.com/intel-xess-xe-super-sampling/

Will or can Nvidia make DLSS available to non-RTX owners on AMD and future Intel GPUs? While the world waits for Intel GPUs, how available and good they are is another factor, and since they're on TSMC 6nm, availability may just cloud everything unless Intel can make some versions on its own process (?). This may take some time. As for AMD, FSR is better than DLSS 1.0, as in more usable and beneficial, but it is not in the same league as DLSS 2+ (at least in my experience). If AMD follows Intel's lead and makes it AI based as well (most likely temporal like DLSS, requiring similar integration into games), then you would have three standards competing, with two of them available on most current hardware.
 
Intel's DLSS-like version, XeSS, which is AI based and will work with AMD and Nvidia cards while Intel GPUs have added hardware-level acceleration, may prove to be the one that wins out overall.
https://www.pcgamer.com/intel-xess-xe-super-sampling/

Will or can Nvidia make DLSS available to non-RTX owners on AMD and future Intel GPUs? While the world waits for Intel GPUs, how available and good they are is another factor, and since they're on TSMC 6nm, availability may just cloud everything unless Intel can make some versions on its own process (?). This may take some time. As for AMD, FSR is better than DLSS 1.0, as in more usable and beneficial, but it is not in the same league as DLSS 2+ (at least in my experience). If AMD follows Intel's lead and makes it AI based as well (most likely temporal like DLSS, requiring similar integration into games), then you would have three standards competing, with two of them available on most current hardware.
I'll wait on anything from Intel in terms of graphics stuff with... well, trepidation isn't quite the right word.
 
Yeah, I'll believe it when I see it. Not like Intel has a great track record for high-end graphics, but you never know.
 
Impressive! Rise Of The Tomb Raider with DLSS never looked this good before. This game always had an aliasing issue until now. It looks like a remastered game, using DSR 4K resolution, DLSS Quality, a monitor resolution of 1440p, and maxed-out settings except no motion blur, film grain, or vignette blur. One game I started playing, and it is very nice having this option now; visually, a whole new experience.

ROTR_DLSS.jpg

ROTR_DLSS2.jpg
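The numbers help explain why that combination works so well (the 0.667 factor below is the commonly cited approximation for DLSS Quality, not an exact spec):

```python
# Rough sketch of why DSR 4K + DLSS Quality on a 1440p panel works out so nicely:
# DLSS Quality renders at roughly 2/3 per axis of the (DSR) target resolution,
# which lands almost exactly on the monitor's native 2560x1440.
dsr_target = (3840, 2160)     # DSR 4K output fed to the game
quality_scale = 0.667         # approximate DLSS Quality per-axis factor

internal = tuple(round(d * quality_scale) for d in dsr_target)
print(internal)               # -> (2561, 1441), i.e. ~native 1440p

# So the GPU renders roughly a native-res frame, DLSS reconstructs it to 4K,
# and DSR downsamples that back to 1440p for display.
```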
 
"We're aware a recent update to Tomb Raider (2013), ROTTR, & SOTTR on Steam is causing multiple issues. We have reverted the patch while we investigate further. If you are still experiencing problems after reverting the patch, please let us know here. Sorry for the inconvenience!."

I guess you forgot to mention this update broke the goddamn game for many people. Gaming becomes a bigger clusterfuck with each passing year. I have never seen so many games launched with massive bugs, stuttering issues, and then broken updates and half-assed drivers as has been the case recently. But hey, don't worry, as so many of you will give those same games great reviews and even go so far as to claim you magically have no issues, so it "must be your pc".
 
I guess you forgot to mention this update broke the goddamn game for many people. Gaming becomes a bigger clusterfuck with each passing year. I have never seen so many games launched with massive bugs, stuttering issues, and then broken updates and half-assed drivers as has been the case recently. But hey, don't worry, as so many of you will give those same games great reviews and even go so far as to claim you magically have no issues, so it "must be your pc".

"We're aware a recent update to Tomb Raider (2013), ROTTR, & SOTTR on Steam is causing multiple issues. We have reverted the patch while we investigate further. If you are still experiencing problems after reverting the patch, please let us know here. Sorry for the inconvenience!."
Sounds like they're taking care of it already.
 
Impressive! Rise Of The Tomb Raider with DLSS never looked this good before. This game always had an aliasing issue until now. It looks like a remastered game, using DSR 4K resolution, DLSS Quality, a monitor resolution of 1440p, and maxed-out settings except no motion blur, film grain, or vignette blur. One game I started playing, and it is very nice having this option now; visually, a whole new experience.


Nice! I haven't gone through the trilogy yet but own them. Sounds like it's time to do so!
 
What the fuck are you rolling your goddamn eyes about? All they did was admit the patch was an issue and pull it. And look, I can roll my eyes too. :rolleyes:
Settle down. You already quoted the relevant section as I said. I'm sure they'll re-release the patch once it's fixed. People can play in the meantime :rolleyes:.
 
Settle down. You already quoted the relevant section as I said. I'm sure they'll re-release the patch once it's fixed. People can play in the meantime :rolleyes:.
Well, don't act like a douchebag for no reason. I thought you meant there was some update to the issue, as I can't read your mind.
 
I didn't act like anything. You got upset over a silly emoji...
You know damn well what the point of using that rolling-eye emoji was, so cut out the innocent act. If someone asks you a simple question, then answer it without acting like a tool.
 
You can still play with the new ROTR patch; it is listed as a beta. I was wondering why DLSS mysteriously disappeared and wasted some time troubleshooting until I went to the forums. I upgraded to the beta to reestablish the patch. I only played for about 30 minutes or so and had no problem. In the game's Steam properties, under Betas, it is listed as build1013-1013. I did try to play it in VR and had all sorts of problems; not sure if it was the patch or it had already reverted to the previous version. Shadow of the Tomb Raider also looks really good with the updated DLSS version.
 
Frankly, I am astounded! Way better than I expected. I played over an hour more, using a DSR resolution of 3620x2036, the DLSS Quality setting, and a 1440p display. This gave me close to 144 fps, and the visuals are way better than I've ever remembered; seriously, it feels like a totally different game, at least visually.


ROTR_DLSS3.jpg ROTR_DLSS4.jpg ROTR_DLSS5.jpg
 
Nice! I haven't gone through the trilogy yet but own them. Sounds like it's time to do so!
They're great games, fun to play and get lost in. The puzzles are interesting, and most are not too hard, nor are they super easy either. Plus the usual fighting of animals and bad guys with all sorts of guns, gadgets, fire bombs...
 
They're great games, fun to play and get lost in. The puzzles are interesting, and most are not too hard, nor are they super easy either. Plus the usual fighting of animals and bad guys with all sorts of guns, gadgets, fire bombs...
:) Thanks for the synopsis.
 
I feel if we have to show magnified static screenshots to see even a tiny difference, we're firmly in "mission accomplished" land.

Take 20 FPS more, or render the moustache on an NPC 30m away with greater fidelity. Well, okay. Tough call.
 
Honestly, aside from better performance, DLSS 2.0 and beyond with the Quality setting, and sometimes even Balanced, looks better than native with any AA solution. I always hated modern TXAA as it added blur, and other forms of AA had shimmering. However, DLSS looks so much better, with no blur on Quality, plus more frames! And as I game at 4K, every frame matters when maxing games out.
 
I usually play on performance mode since I need all the frames I can get (144Hz 4K monitor). You do lose some picture quality but honestly it still looks great.
 
Honestly, aside from better performance, DLSS 2.0 and beyond with the Quality setting, and sometimes even Balanced, looks better than native with any AA solution. I always hated modern TXAA as it added blur, and other forms of AA had shimmering. However, DLSS looks so much better, with no blur on Quality, plus more frames! And as I game at 4K, every frame matters when maxing games out.
A good way to reduce FXAA blur and its other issues (e.g. the way it screws up UI elements when forced globally) is to combine it with small levels of DSR.

As far as getting the best AA goes, combining a post-processing-based AA method with supersampling is the best AA you can force. It should work with pretty much any game, as long as it can run at a higher resolution than your monitor supports.
Of course, DSR can anti-alias by itself without any other AA method, but if you are already using it, then adding FXAA does visibly improve anti-aliasing.

FXAA or MLAA or SMAA. I mention FXAA because:
- it is the most popular of this bunch and most games have support for it.
- it is also supported by Nvidia cards by default and can be forced without using any external programs like shader injectors (e.g. ReShade)
- it is the fastest algorithm of this type
- it is pretty universal when it comes to removing aliasing at the expense of image sharpness...
- and because it blurs the image the most, supersampling helps it the most ;)
Otherwise MLAA is also a good pick, especially for users of AMD GPUs, for which there is an option to enable it in a similar way to how FXAA can be enabled on Nvidia cards. MLAA is generally less blurry and more targeted at specific shapes of edges. It anti-aliases lines very well without too much bloat, at the expense of a slightly higher performance impact and being slightly less universal (apparently FXAA is better at anti-aliasing things like shaders and shadows, etc., though I never confirmed it myself).
SMAA is another algorithm recommended by some people. Its main feature (at least of one of its variants), which is working with subpixels, is not useful when combining with supersampling. Also, SMAA is the least popular of the bunch, is slower than FXAA, and lacks support from GPU vendors.

DLSS works wonders when it comes to anti-aliasing but unfortunately cannot be forced in unsupported games. Note: something like DLAA (DLSS at native resolution) can be forced in any game that supports DLSS by using DSR, as sketched below.
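A rough sketch of that trick, assuming the commonly cited per-axis render scales for each DLSS mode (approximate values; DSR factors are quoted the way Nvidia lists them, as area multipliers):

```python
# Hedged sketch of the "DLAA via DSR" idea: pick a DSR factor whose per-axis scale
# is the inverse of the DLSS mode's render scale, so the game still renders at
# native resolution while DLSS handles the reconstruction and anti-aliasing.
DLSS_SCALE = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50}  # approximate

def dsr_factor_for_native(mode):
    per_axis = 1.0 / DLSS_SCALE[mode]
    return per_axis, per_axis ** 2   # DSR factors are usually quoted as area multipliers

for mode in DLSS_SCALE:
    axis, area = dsr_factor_for_native(mode)
    print(f"{mode}: ~{axis:.2f}x per axis (DSR ~{area:.2f}x)")
# Quality: ~1.50x per axis (DSR ~2.25x), e.g. a 4K DSR target on a 1440p monitor
```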

Edit://
I read my post, and the part about why FXAA is a good choice was nonsense, so I have rewritten it, along with changing wording all over the post :)
 