More DLSS...

TBH, a few games (Control, for example) actually looked better at 4K with Quality DLSS vs. native resolution. If it comes down to DLSS vs. TAA, I pick DLSS any day for both image quality and performance.
This was my experience. I don’t know about looked better, but on a 3080, I was up near my 120hz refresh with DLSS... and I’m good with that.
 
I don't know why some people lose their minds when the phrase "better than Native" is used about DLSS. Yes the original DLSS was bad, but DLSS 2.0 is much better. And surely when people say "better than Native" you have to ask what they mean by Native.

For me DLSS 2.0 has been better than Native in every game that I have tried. And by Native I mean Native with no AA applied or Native with TAA/FXAA applied.

Native without any AA applied - no, can't stand the jaggies.
Native with TAA - TAA can be terrible but some implementations are ok.
Native with FXAA - Blurry mess.

DLSS 2.0 solves the first problem without the limitations of the second two solutions. So it looks better and has better performance.

Does it look better than Native with better forms of AA applied? Nope, but it was never meant to.
 
TBH, a few games (Control, for example) actually looked better at 4K with Quality DLSS vs. native resolution. If it comes down to DLSS vs. TAA, I pick DLSS any day for both image quality and performance.
I just bought Control a couple of days ago. How do you select "quality" DLSS? There is only a simple binary on/off checkbox for DLSS for me in the game.
 
TBH, a few games (Control, for example) actually looked better at 4K with Quality DLSS vs. native resolution. If it comes down to DLSS vs. TAA, I pick DLSS any day for both image quality and performance.
IIRC you can pick the anti-aliasing method in Control, which means you can get image quality superior to both TAA and DLSS.
 
I don't know why some people lose their minds when the phrase "better than Native" is used about DLSS. Yes the original DLSS was bad, but DLSS 2.0 is much better. And surely when people say "better than Native" you have to ask what they mean by Native.

For me DLSS 2.0 has been better than Native in every game that I have tried. And by Native I mean Native with no AA applied or Native with TAA/FXAA applied.

Native without any AA applied - no, can't stand the jaggies.
Native with TAA - TAA can be terrible but some implementations are ok.
Native with FXAA - Blurry mess.

DLSS 2.0 solves the first problem without the limitations of the second two solutions. So it looks better and has better performance.

Does it look better than Native with better forms of AA applied? Nope, but it was never meant to.

Because the phrase is marketing junk thrown out to sell a product, there is no qualifier, no follow up, no better than TAA/FXAA, just better than native and a staunch insistence from some that it is universally true regardless of other options (MSAA).
 
I just bought Control a couple of days ago. How do you select "quality" DLSS? There is only a simple binary on/off checkbox for DLSS for me in the game.
Wasn't Control an early release of DLSS, before they had more options for how much of a sacrifice you were willing to make to visuals in the quest for more frames? It might only have limited DLSS options, but you can modify the resolution scaling, which will also significantly impact both fps and image quality.
 
Wasn't Control an early release of DLSS, before they had more options for how much of a sacrifice you were willing to make to visuals in the quest for more frames? It might only have limited DLSS options, but you can modify the resolution scaling, which will also significantly impact both fps and image quality.
Control released with the original DLSS, but it was updated to DLSS 2.0. I guess they never fully updated it to have Quality, Balanced, and Performance modes. My guess is it just defaults to one of those. It's still DLSS 2.0, though.
 
I don't know why some people lose their minds when the phrase "better than Native" is used about DLSS. Yes the original DLSS was bad, but DLSS 2.0 is much better. And surely when people say "better than Native" you have to ask what they mean by Native.
I think some people have a mental image that DLSS was trained on 4K native footage of the very game it is currently running, and is trying its best to reproduce that from a lower-resolution source - which would make it impossible to ever be as good, and certainly not better.

In reality, it is trained on renders that have 16 times more pixels than 4K and is trying to reach that. That makes it possible, even probable, for things it knows well and can "easily" predict - like text and the lines of buildings - to look better than native, at least when the image is stable, and for the number of things that look better than native to grow over time.
 
I just bought Control a couple of days ago. How do you select "quality" DLSS? There is only a simple binary on/off checkbox for DLSS for me in the game.
What resolution do you game at? For me at 4k with DLSS on, I had 3 render resolution choices for DLSS under my base resolution (monitor), which were essentially Quality, Balanced and Performance.

Make sure your game is updated, as DLSS 2.0 is supported on it now, not that 1.9 garbage it had before.
 
What resolution do you game at? For me at 4k with DLSS on, I had 3 render resolution choices for DLSS under my base resolution (monitor), which were essentially Quality, Balanced and Performance.

Make sure your game is updated, as DLSS 2.0 is supported on it now, not that 1.9 garbage it had before.
Ah! You could look at the render resolutions in that way, I suppose. Not sure what differentiates quality/balanced/performance otherwise (am new to the world)
 
Ah! You could look at the render resolutions in that way, I suppose. Not sure what differentiates quality/balanced/performance otherwise (am new to the world)
They typically list it with those names, but in control its a little more transparent and you can see the resolutions DLSS uses. So the higher the DLSS resolution, the higher the quality.
 
And remember - static images are not motion. I notice DLSS when standing there. Most games, you don't stand there the entire time. You are doing things.
For sure. For me, TAA is like that but the opposite. You look at screenshots and say, "Man, that looks good with TAA. A little blurring maybe, kinda like FXAA, but smooth overall. Probably a good tradeoff." Then in a game, in motion, it becomes so much more noticeable, and I don't like it and disable it. While screenshots are useful for doing comparisons (because, let's face it, with the compression on videos they just aren't that useful), we need to remember that a game in motion is really where it matters. I don't mind something that looks impressive when you are snooping around, zooming in on textures and standing still, but that isn't how we play games. The important part is how it looks during gameplay.
 
I think I developed an easy system to find what graphical settings work best for you:
  • Turn everything except resolution down to the lowest settings, then play around with the texture-related settings first. See if the performance/quality tradeoffs are for you, and pick the highest you can settle on.
  • Then just play with the rest of the settings one by one and see if you even like the addition of things like motion blur, bloom, AA methods, etc.

It's a lot easier and more productive doing lowest->highest than going in reverse, I find.
 
My first take with DLSS:

In Control version 1.13, at 1440p, Max settings, no film grain or motion blur, RT max, system in sig:
  • Performance: FPS goes up to around 120fps with DLSS at max rendering resolution (presumably Quality mode); without DLSS, FPS is around 80fps -> this is with max RT quality
  • Finding DLSS a mixed bag; in particular, specular highlights have gross, constantly moving noise, which is distracting
  • Hair is rendered better with DLSS, cleaner
  • Text is worse with DLSS, which is interesting; text can be blurry until motion, and then it cleans up. A delayed effect, but more noticeable than rendering at the native resolution of 1440p
  • Game uses MSAA unless DLSS is selected. By turning MSAA down you get some performance back. Not sure if the game has some other form of AA, since aliasing is not bad even with MSAA turned off. Anyway, about a 10fps increase with MSAA off: 90fps
  • There can be flickering on vertical lines or objects, which goes away when DLSS is turned off
  • Mostly, the added noise and artifacts from DLSS are distracting enough for me not to use it in this game. I am sure others would not even notice this, but maybe something else. Since performance is fine without it for this title at this resolution, I just leave it off
I will be testing Control at 4K as well since I have a 4K monitor hooked up to the 3090 as well, just have not gotten around to it.

In Metro Exodus, 1440p, Ultra settings, Ultra RT: DLSS causes a performance loss??? Just using the benchmark tool; it makes no sense to me what is going on in this case. Have not tried it in the game. Saving that for later.
 
Right now, DLSS is an on/off switch with various levels of detail. The more I think about it, the more I hope that Nvidia will push this towards a variable shading technology as seen in some newer DX12 features.

How do we do this?

  • Base level of DLSS on 2 factors
    • Number of Tensor Cores
    • Desired Framerate

This would allow DLSS to vary intensity based on desired performance level, not based on a predefined resolution. Since Variable Rate Shading is still a new technology, expect the combination of VRS and DLSS (or other AI-upscaling tech) to become a HUGE part of next generation visuals as it can provide a huge performance benefit without compromising visuals to any noticeable degree. So instead of continuing to brute force performance with more shader power augmented by Tensor/AI processing, GPU tech will become more focused with certain render areas being given a higher priority than others.

Thoughts?
 
Right now, DLSS is an on/off switch with various levels of detail. The more I think about it, the more I hope that Nvidia will push this towards a variable shading technology as seen in some newer DX12 features.

How do we do this?

  • Base level of DLSS on 2 factors
    • Number of Tensor Cores
    • Desired Framerate

This would allow DLSS to vary intensity based on desired performance level, not based on a predefined resolution. Since Variable Rate Shading is still a new technology, expect the combination of VRS and DLSS (or other AI-upscaling tech) to become a HUGE part of next generation visuals as it can provide a huge performance benefit without compromising visuals to any noticeable degree. So instead of continuing to brute force performance with more shader power augmented by Tensor/AI processing, GPU tech will become more focused with certain render areas being given a higher priority than others.

Thoughts?
The resulting detail of DLSS is entirely dependent upon the input resolution. DLSS is always doing its very best to give you the best possible image it can; lower DLSS modes simply start with a lower input resolution. So dynamic resolution scaling would be the way to achieve what you describe. That's already working in Unreal Engine, and could presumably be implemented in any game engine if devs bothered to focus on it as a goal.
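The dynamic-resolution idea above can be sketched as a simple feedback loop: each frame, nudge the render scale toward a target frame time, and let the upscaler fill in the rest. This is a minimal illustration, not any real engine's controller; the function name, gain, and clamp values are made up for the example.

```python
# A minimal sketch of the dynamic-resolution idea: instead of fixed DLSS
# modes, nudge the per-axis render scale each frame toward a target frame
# time. The gain and clamp values here are purely illustrative.
def update_render_scale(scale, frame_ms, target_ms, gain=0.05,
                        lo=0.33, hi=1.0):
    """Raise the render scale when there is headroom, lower it when over budget."""
    error = (target_ms - frame_ms) / target_ms  # positive = headroom
    scale += gain * error
    return max(lo, min(hi, scale))              # clamp to sane bounds
```

An engine running something like this every frame converges on whatever internal resolution sustains the target FPS, which is essentially what Unreal's dynamic resolution does before the upscaler reconstructs the output image.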
 
DLSS really doesn't help enough to matter in this crappy remaster, because you're going to be so damn CPU-limited at times that you can't maintain even 60 FPS. I have a 9900K and 2080 Ti, and I still drop down to 60% and even 50% GPU utilization, with frame rates in the freaking 50s and 40s at 1440p in spots. I can be looking in one direction getting 80 to 90 FPS, then turn the other direction and cut the frame rate in half, and GPU usage just plummets. What kind of piss-poor optimization keeps a 9900K from even getting 60 FPS?
 
I feel there has not been enough attention on the fact that there's now a UE4 DLSS plugin. DLSS on its own is pushing me more towards getting a new Nvidia card, despite the many negatives there are with Nvidia in general (many of them pointed out in my 3060 12GB thread). Today I saw this from the new System Shock developer:

[attached screenshot of the System Shock developer's post]


Beyond the performance benefits, notice how easy he makes implementing the DLSS plugin into UE4 games sound. That will be a powerful factor for game adoption, which will drive userbase, and deservedly so. If you keep reading the WCCFtech post, the performance gains achieved by developer CyberPunch on their game are kind of ridiculous(ly impressive and huge).

AMD needs to have their super resolution DirectML competitor out... like last month.
 
IDK about needing an answer now, GPUs on shelf is a much bigger priority than this. Most people still run the 1000 series or older so can't use DLSS anyway.
 
[attached screenshot of the System Shock developer's post]

Beyond the performance benefits, notice how easy he makes implementing the DLSS plugin into UE4 games sound. That will be a powerful factor for game adoption, which will drive userbase, and deservedly so. If you keep reading the WCCFtech post, the performance gains achieved by developer CyberPunch on their game are kind of ridiculous(ly impressive and huge).

AMD needs to have their super resolution DirectML competitor out... like last month.
It's worth noting here that DLSS "Performance" means the internal resolution is 25% of the output pixel count (half the resolution per axis). So for 4K output, the game renders at about 1080p.

And generally speaking, Performance mode usually has a lot of visual issues: not only softness, but blurring, graininess, etc. It's very possible a person might prefer simply scaling 1080p on their monitor or with their GPU scaler. And DLSS does incur a small performance hit, so simply upscaling 1080p with your monitor or GPU should perform even a little better. It could look better too, particularly in motion.

*And for that matter, both AMD and Nvidia support integer scaling. So 1080p on a 4K screen could be pretty crisp. Haven't ever seen it myself, however. Never had a 4K screen or used one with a gaming GPU.
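The resolution math above is easy to check. As a sketch, here are the per-axis scale factors commonly cited for the DLSS 2.x modes (these are community-reported numbers, not an official NVIDIA table):

```python
# Per-axis render-scale factors commonly cited for DLSS 2.x quality modes.
# Community-reported values, not an official NVIDIA specification.
DLSS_SCALE = {
    "Quality": 2 / 3,            # ~66.7% per axis, ~44% of output pixels
    "Balanced": 0.58,            # ~58% per axis
    "Performance": 0.5,          # 25% of the output pixel count
    "Ultra Performance": 1 / 3,  # ~11% of the output pixel count
}

def render_resolution(out_w, out_h, mode):
    """Internal resolution DLSS renders at before upscaling to out_w x out_h."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

# 4K output in Performance mode renders internally at 1080p:
print(render_resolution(3840, 2160, "Performance"))  # (1920, 1080)
```

So Performance at 4K really is upscaled 1080p, which is why comparing it against plain monitor or integer scaling of 1080p is a fair experiment.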
 
I played Cyberpunk with Ultra Performance so I could use ray tracing.

The render resolution may have been 720P (or something like that) and it was playable.

Blurry, yes, and some artifacts but 1000x better than running in 720p on a 1440p monitor and using monitor scaler.
 
Blurry, yes, and some artifacts but 1000x better than running in 720p on a 1440p monitor and using monitor scaler.
Yeah, I think that's what chameleoneel was referring to - indeed, Performance mode on DLSS doesn't look great; Balanced is usually the sweet spot, gaining a bunch of FPS while looking about the same as native. As for DLSS vs. monitor scaling, that's an interesting question. Technically, even if you use DLSS to render at 720p, that's not what you see on screen, so there's no way simply upscaling 720p on the monitor would look better. DLSS Performance from a 720p source would still be blurry, but 720p through the monitor scaler has to be blurrier for sure, as there's barely any work being done to the image at that point.
 
I played Cyberpunk with Ultra Performance so I could use ray tracing.

The render resolution may have been 720P (or something like that) and it was playable.

Blurry, yes, and some artifacts but 1000x better than running in 720p on a 1440p monitor and using monitor scaler.
Yeah, I think that's what chameleoneel was referring to - indeed, Performance mode on DLSS doesn't look great; Balanced is usually the sweet spot, gaining a bunch of FPS while looking about the same as native. As for DLSS vs. monitor scaling, that's an interesting question. Technically, even if you use DLSS to render at 720p, that's not what you see on screen, so there's no way simply upscaling 720p on the monitor would look better. DLSS Performance from a 720p source would still be blurry, but 720p through the monitor scaler has to be blurrier for sure, as there's barely any work being done to the image at that point.
As I said, I haven't seen it myself, because I don't have a 4K monitor. But it would be an interesting experiment to compare. While standing still, DLSS Performance or Ultra Performance probably looks generally better. But integer scaling on the GPU, or using Nvidia's special GPU scaling mode (accessed in the sharpening options by choosing one of the new upscaled resolution options, which then appear in your resolution list in the driver control panel), will not have all of the motion artifacts. The net result could be a better experience, and it will vary depending on the game. DLSS in some games is only okay, while in others it's pretty darned good.

I do have 2K monitors. I will test it out tonight: compare DLSS Performance and Ultra Performance in Death Stranding and Nioh 2 against Nvidia's 5-tap sharpened GPU scaling and Nvidia's integer scaling.

P.S. I've never seen "Balanced" look the same as native. It usually looks pretty good, but not indistinguishable from native. Nioh 2's DLSS does look nearly indistinguishable in Quality mode (only small artifacts you don't usually notice when actually playing), aside from the subtle softness to edges from the AA it adds. When using regular rendering without DLSS, the game has no AA of any kind and is super sharp.
 
Has anyone else played Metro Exodus - Enhanced Edition?

I just tried it with DLSS Performance, and I can't tell a difference in fidelity between DLSS Performance and Native 4K. And the performance on my RTX 3080 is really smooth.

Seriously... this is unreal! What sort of voodoo magic is this?
 
Has anyone else played Metro Exodus - Enhanced Edition?

I just tried it with DLSS Performance, and I can't tell a difference in fidelity between DLSS Performance and Native 4K. And the performance on my RTX 3080 is really smooth.

Seriously... this is unreal! What sort of voodoo magic is this?
Post some side by side screenshots pls. I had a very hard time noticing the difference in still images of Mechwarrior 5 with DLSS on Quality vs native 1440P. The motion artifacting is a bit more noticeable, but have to look for it.
 
Nvidia DLSS is now supported in VR, starting with No Man's Sky

Nvidia is bringing Deep Learning Super Sampling (DLSS) to virtual reality. The framerate-boosting technology has already worked wonders in delivering higher frame rates to desktop gaming and now has its sights set on demanding PC VR games...the first VR games to support DLSS are:

No Man's Sky
Wrench
Into The Radius

https://www.pcgamer.com/nvidia-dlss-vr-support/
 
I think upscaling done well is good tech, and the human eye, playing a game, would really never notice the difference. It's really hard to tell image quality differences with the naked eye, even if there are some. You really have to be good at it. Unless you have a blur fest, it's not something one would notice.
 
That's interesting that it gets WORSE with a higher render resolution. Counterintuitive, considering a higher render resolution means more information for the algorithm to work with for the upscaling. I'd be curious whether it's just an issue with them slapping it onto Warzone after the fact, or if other games with DLSS 2.0 have this issue and no one has really noticed yet...
 
Meh, DLSS in the games I play screws up point of aim.

Lots of refinement still needs to be done.
 
Nvidia DLSS 2.2 was quietly released in Rainbow Six Siege a few days ago...users on Reddit (and elsewhere) are reporting that it's the best implementation yet of the already excellent Deep Learning Super Sampling image reconstruction technique, delivering a sharp image that's also virtually free of any ghosting or artifacts

more interestingly, they've had success in applying the new DLSS 2.2 (2.2.6.0 to be precise) dll file to other games that support DLSS, such as Call of Duty: Black Ops Cold War, Nioh 2, No Man's Sky, Death Stranding, Cyberpunk 2077, Metro Exodus, Necromunda: Hired Gun...not only did this work, it actually improved the minor issues that DLSS 2.1 still had in those titles...
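For anyone trying the DLL swap described above, it's worth keeping a backup so the change is reversible. A minimal sketch (the function name and paths are mine for illustration, not from any official tool, and some games or anti-cheat systems may reject a swapped DLL):

```python
# Sketch: replace a game's nvngx_dlss.dll, keeping a .bak copy for rollback.
# swap_dlss_dll and all paths are illustrative - point them at your installs.
from pathlib import Path
import shutil

def swap_dlss_dll(game_dir, new_dll):
    """Copy new_dll over game_dir/nvngx_dlss.dll, backing up the original."""
    target = Path(game_dir) / "nvngx_dlss.dll"
    if not target.is_file():
        raise FileNotFoundError(f"no nvngx_dlss.dll in {game_dir}")
    backup = target.with_name(target.name + ".bak")
    shutil.copy2(target, backup)   # keep the original so the swap is reversible
    shutil.copy2(new_dll, target)  # drop in the newer DLSS DLL
```

Restoring is just copying the .bak file back, which is handy given that (as noted below) some games refuse to run with a swapped DLL.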
 
IDK how they did it, but when I apply R6S's nvngx_dlss.dll file to Metro Exodus, it refuses to toggle DLSS on (even if it was already on prior to swapping nvngx_dlss.dll).
 
I just bought Exodus EE yesterday, looks like you have to manually enable "Enhanced Edition" from the beta options. I'll try it again in a moment.
Hm, on Epic it just added a whole new game to my library of games specifically called Enhanced Edition.
 