DLSS Performance Gains Explored

AlphaAtlas

[H]ard|Gawd
Staff member
Joined
Mar 3, 2018
Messages
1,713
TechReport posted an excellent deep dive into the performance and image quality tradeoffs of Nvidia's DLSS. Using a 55" 4K OLED as his display, Jeff Kampman finds that the image quality reduction of DLSS is less noticeable in motion than it is in still frames, while the performance boost of DLSS gets you "a much more enjoyable sequence of motion." In addition to seat-of-the-pants impressions, TechReport also uploaded frame time analysis graphs that show substantial objective performance gains with DLSS turned on, as well as 4K comparison videos showing the difference between DLSS and TAA.


It's valid to note that all we have to go on so far for DLSS is a pair of largely canned demos, not real and interactive games with unpredictable inputs. That said, I think any gamer who is displeased with the smoothness and fluidity of their gaming experience on a 4K monitor-even a G-Sync monitor-is going to want to try DLSS for themselves when more games that support it come to market, if they can, and see whether the minor tradeoffs other reviewers have established for image quality are noticeable to their own eyes versus the major improvement in frame-time consistency and smooth motion we've observed thus far.
 
The fact that image quality doesn't suffer much with movement is a "duh." The same principles apply in MPEG and other video compression codecs: you can let image quality drop on fast-moving objects because your eyes can't resolve detail as well on them.

And as I said before in another thread, "image quality suffered on static scenes until appropriate motion vectors can be calculated for the anti-aliasing AI."

It's essentially a refinement of temporal AA, which analyzes where objects will be in future frames and decides on the noise factor of stacked objects based on their future placement. But this leans more heavily on noise prediction by using AI, which is sometimes wrong.
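
For anyone curious what that motion-vector analysis looks like in practice, here is a minimal sketch of the temporal accumulation step behind TAA-style techniques (my own illustration in Python/NumPy, not NVIDIA's or any engine's actual code): the previous frame is reprojected along per-pixel motion vectors and blended with the current one, which is also why missing or wrong motion vectors show up as ghosting or noise.

[CODE]
import numpy as np

def taa_accumulate(current, history, motion, alpha=0.1):
    """Blend the current frame with the history buffer reprojected along
    per-pixel motion vectors.

    current, history: (H, W, 3) float arrays; motion: (H, W, 2) pixel offsets
    from the previous frame to this one (a simplified stand-in for a real
    engine's motion-vector buffer)."""
    h, w, _ = current.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Look up where each pixel was last frame, clamped to the screen edges.
    prev_x = np.clip(xs - motion[..., 0], 0, w - 1).astype(int)
    prev_y = np.clip(ys - motion[..., 1], 0, h - 1).astype(int)
    reprojected = history[prev_y, prev_x]
    # Exponential blend: mostly reprojected history, a little of the new frame,
    # which smooths aliasing over time at the cost of some softness in motion.
    return alpha * current + (1 - alpha) * reprojected
[/CODE]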
 
The issue I have with all this is that if the 2080s can do 4K properly, then on a 20-ish inch monitor you wouldn't need any sort of AA. Maybe for 1440p and 1080p, but 4K rarely benefits from AA. The whole reason DLSS exists is that 1/4 of the Turing chip is the AI cores, which are useless for anything besides ray-tracing image cleanup and AA.
 
The issue I have with all this is that if the 2080s can do 4K properly, then on a 20-ish inch monitor you wouldn't need any sort of AA. Maybe for 1440p and 1080p, but 4K rarely benefits from AA. The whole reason DLSS exists is that 1/4 of the Turing chip is the AI cores, which are useless for anything besides ray-tracing image cleanup and AA.

Have you seen a chain-link fence like in Half-Life? Or power lines? Anisotropic filtering doesn't clean it all up.
 
Have you seen a chain-link fence like in Half-Life? Or power lines? Anisotropic filtering doesn't clean it all up.
Do you have a link? I'd be interested in what it looks like in this case. Not being sarcastic, I'm truly interested.
 
Do you have a link? I'd be interested in what it looks like in this case. Not being sarcastic, I'm truly interested.
[H] used to post these image quality shots.

But here is one. The problem isn't necessarily how it looks on screen but the shimmering that is present when you move. True AA helps remove shimmering when your temporal movement stays below the AA level.

[Attached image: 3b1fab1c-1ecc-4169-9987-10163bbcdc29.jpg]

[Attached image: referenceshot.jpg]
 
[H] used to post these image quality shots.

But here is one. The problem isn't necessarily how it looks on screen but the shimmering that is present when you move. True AA helps remove shimmering when your temporal movement stays below the AA level.

[Attached images]
Ahhh, I gotcha. I was trying to figure out what you were saying, but I get it now. I thought it was some sort of odd artifacting that was showing up. I see now what you are referring to. AA will never really be something you can get rid of, as it always makes the image quality better. I agree.
 
I think DLSS is the one real architectural benefit Turing has over its predecessors.

Ray Tracing is for the birds. The performance impact is too steep for it to be practical. I don't think too many people are willing to drop their resolution to a quarter of what they are playing at now, to get the same frame rate with raytracing.

DLSS has promise though, if it performs as they say it does.
 
I think DLSS is the one real architectural benefit Turing has over its predecessors.

Ray Tracing is for the birds. The performance impact is too steep for it to be practical. I don't think too many people are willing to drop their resolution to a quarter of what they are playing at now, to get the same frame rate with raytracing.

DLSS has promise though, if it performs as they say it does.

Completely agree on both counts.
 
I think DLSS is the one real architectural benefit Turing has over its predecessors.

Ray Tracing is for the birds. The performance impact is too steep for it to be practical. I don't think too many people are willing to drop their resolution to a quarter of what they are playing at now, to get the same frame rate with raytracing.

DLSS has promise though, if it performs as they say it does.

Exactly. Everyone is so hyped on ray tracing, and here I am hoping DLSS actually ends up being a thing. Even if it's just something in between 2560x1440 and true 4K, I'll take it. If it means I can play Cyberpunk at what is, quality-wise, "almost 4K," it's better than "GPU-scaled 2560x1440."
 
Aren't there two modes to DLSS, only one of which actually runs at native res? The mode where it's showing baller performance seems to be a high-quality upsampling from 1440p to 4K.

Unreal has a temporal upsampling mode that's surprisingly sharp. You can easily lop 20% off the actual resolution and it's generally pretty difficult to notice. I guess the "fair" benchmark would be comparing DLSS against 67% screen resolution at 4K, and then the other DLSS mode that confusingly seems to be called DLSS 2x against native 4K.
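
Rough pixel-count math on that comparison (my own arithmetic, not numbers from the article): a 67% per-axis resolution scale at 4K shades roughly the same number of pixels as native 1440p, which is why that would be the apples-to-apples baseline for the upsampling mode.

[CODE]
# Back-of-the-envelope pixel counts for the "fair benchmark" comparison above.
native_4k = 3840 * 2160                               # 8,294,400 px
qhd       = 2560 * 1440                               # 3,686,400 px
scaled_67 = round(3840 * 0.67) * round(2160 * 0.67)   # ~67% per-axis scale at 4K

for name, px in [("4K native", native_4k), ("1440p", qhd), ("67% scale", scaled_67)]:
    print(f"{name:>10}: {px:>9,} px ({px / native_4k:.0%} of 4K)")
# 1440p and a 67% scale both land around 44-45% of 4K's pixel count.
[/CODE]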
 
I think DLSS is the one real architectural benefit Turing has over its predecessors.

Ray Tracing is for the birds. The performance impact is too steep for it to be practical. I don't think too many people are willing to drop their resolution to a quarter of what they are playing at now, to get the same frame rate with raytracing.

DLSS has promise though, if it performs as they say it does.

I'm sure DLSS will be refined as well. The main issue now is that not many games support it. But longer term, it may make the currently obscene pricing of the 2070/2080 a bit better of a value if they can utilize DLSS and achieve higher frame rates over older-generation cards. Obviously, image quality with DLSS seems to have some shortcomings, but again, I believe the tech will get better with time. Overall it looks promising.
 
I'm sure DLSS will be refined as well. The main issue now is that not many games support it. But longer term, it may make the currently obscene pricing of the 2070/2080 a bit better of a value if they can utilize DLSS and achieve higher frame rates over older-generation cards. Obviously, image quality with DLSS seems to have some shortcomings, but again, I believe the tech will get better with time. Overall it looks promising.

Huh, so there is no option to force DLSS in the driver settings?
 
Huh, so there is no option to force DLSS in the driver settings?
No, and that's the whole reason I don't think much will come from this. It's going to fade away like TXAA. It requires developers to submit their code to Nvidia and for Nvidia to add it to their drivers on a per-game basis. It's not a global solution; it's a gimmick.
 
No, and that's the whole reason I don't think much will come from this. It's going to fade away like TXAA. It requires developers to submit their code to Nvidia and for Nvidia to add it to their drivers on a per-game basis. It's not a global solution; it's a gimmick.
Pretty sure nVidia will do their best to ensure devs of major titles choose DLSS.
The added benefit is that it's proprietary tech that could hurt AMD a whole lot more than HairWorks did.
Hell, we could even end up with AAA games not being playable at 4K/60 on AMD cards because they lack the DLSS performance boost to push them over the edge.
I honestly hope that it doesn't take off, or that AMD can cope with it, or we might end up with the 20X0 prices as a new baseline.

I'm not really against nVidia, but this shows the typical signs of a company that will abuse a monopoly as much as it can.
 
You get the performance of 1440p coupled with 4K visuals. Not bad IMO. Or 64x SSAA at native resolution (DLSS 2x). That's going to be insane IQ if that ever makes it.
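
For a sense of why 64x SSAA at native 4K could realistically only ever ship as an approximation, some quick arithmetic (my own numbers, not from NVIDIA): brute-forcing it would mean shading on the order of tens of billions of samples per second.

[CODE]
# Shaded-sample counts if 64x supersampling at 4K were done the brute-force way.
pixels_4k   = 3840 * 2160        # 8,294,400 pixels per frame
samples_64x = pixels_4k * 64     # ~531 million samples per frame
print(f"Samples per frame : {samples_64x:,}")
print(f"Samples per second: {samples_64x * 60:,} (at 60 fps)")  # ~31.9 billion
[/CODE]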
 
No, and that's the whole reason I don't think much will come from this. It's going to fade away like TXAA. It requires developers to submit their code to Nvidia and for Nvidia to add it to their drivers on a per-game basis. It's not a global solution; it's a gimmick.


That's a bit disappointing. Why would they do it this way, and not just throw an API out there and allow game devs and engines to call the feature in the drivers, the old-fashioned way?
 
That's a bit disappointing. Why would they do it this way, and not just throw an API out there and allow game devs and engines to call the feature in the drivers, the old-fashioned way?
For DLSS to work in a game, the AI has to be taught how to handle AA in that specific game. The devs do not have the AI training capability in-house to make this happen.
 
Ah,

Thanks for the explanation.

When people say AI, I tend to think of it as some sort of self-teaching machine learning ability, not as something that needs to be statically set up in advance. (That's just code, right? :p )

Makes sense now.

This is a variety of self-teaching. I imagine they run the game through many training epochs using a recurrent neural network.

Although I would imagine at some point someone has to manually verify it is actually cleaning up what they would expect it to. They could probably use some form of image recognition to automate part of that process, but I am not really sure how they are handling the validation portion.
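
To make the "run the game through many epochs" idea concrete, here is a minimal sketch of a supervised training loop for an upscaling network (my own illustration in PyTorch; the tiny architecture, the L1 loss, and the random stand-in tensors are all assumptions for the example, not NVIDIA's actual pipeline, which is reportedly trained per game against very high quality supersampled reference frames).

[CODE]
import torch
import torch.nn as nn

class UpscaleNet(nn.Module):
    """Toy 2x upscaler: a couple of convolutions followed by a pixel shuffle."""
    def __init__(self, scale=2):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3 * scale * scale, 3, padding=1),
        )
        self.shuffle = nn.PixelShuffle(scale)  # rearranges channels into a larger image

    def forward(self, x):
        return self.shuffle(self.body(x))

net = UpscaleNet()
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
loss_fn = nn.L1Loss()

for step in range(100):                   # stand-in for many epochs over real frame pairs
    low_res = torch.rand(4, 3, 64, 64)    # hypothetical aliased low-res crops
    target  = torch.rand(4, 3, 128, 128)  # hypothetical supersampled reference crops
    opt.zero_grad()
    loss = loss_fn(net(low_res), target)
    loss.backward()
    opt.step()
[/CODE]

The manual-validation point still stands: a loss going down only says the network matches the reference frames it was shown, not that it behaves sensibly on scenes it never saw.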
 
So correct me if I am wrong, but DLSS is essentially just using resolution scaling to lower the resolution, then using game-specific preset AI tricks to sharpen and clean up the image as it scales back to the native resolution?
 
So correct me if I am wrong, but DLSS is essentially just using resolution scaling to lower the resolution, then using game-specific preset AI tricks to sharpen and clean up the image as it scales back to the native resolution?
That would be a very basic explanation, but from what has been presented to me, yes. The image it's basically trying to match is an 8K 64x MSAA render in terms of edge smoothness, from what I understand. Shit, I may be wrong; I do not have access to NVIDIA to ask these basic questions. You know, trade secrets and 5-year NDAs and all...
 
No, and that's the whole reason I don't think much will come from this. It's going to fade away like TXAA. It requires developers to submit their code to Nvidia and for Nvidia to add it to their drivers on a per-game basis. It's not a global solution; it's a gimmick.
I read they don't add it to the driver.
You need GFE installed, which detects the games you have installed and will download the relevant data.
No clue how much data, but I imagine it will get sizeable if you have a lot of games installed.
 
I read they don't add it to the driver.
You need GFE installed, which detects the games you have installed and will download the relevant data.
No clue how much data, but I imagine it will get sizeable if you have a lot of games installed.
It's the same difference. Point being, options like DSR/FXAA work on thousands of games; DLSS works on 25 games, with probably a dozen more in the future.
 
It's the same difference. Point being, options like DSR/FXAA work on thousands of games; DLSS works on 25 games, with probably a dozen more in the future.
I'm with you, it's a poor show.
And I don't want GFE on my system.
 
As per TechSpot, in one of the demos DLSS was equivalent to rendering at 1800p and upscaling to 4K. Basically it's like a 3K mode!?

Wonder how it will be in real games.
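
The "3K mode" framing roughly checks out on pixel counts alone (my own arithmetic, based on the 1800p figure mentioned above): 1800p is about 70% of the pixels of native 4K, so DLSS would be shading noticeably fewer pixels and inferring the rest.

[CODE]
# How an 1800p internal render compares to native 4K, pixel-wise.
px_4k    = 3840 * 2160    # 8,294,400 px
px_1800p = 3200 * 1800    # 5,760,000 px
print(f"1800p renders {px_1800p / px_4k:.0%} of 4K's pixels")   # ~69%
[/CODE]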
 
The issue I have with all this is that if the 2080s can do 4K properly, then on a 20-ish inch monitor you wouldn't need any sort of AA. Maybe for 1440p and 1080p, but 4K rarely benefits from AA. The whole reason DLSS exists is that 1/4 of the Turing chip is the AI cores, which are useless for anything besides ray-tracing image cleanup and AA.
I think it's more that at 4K, where the 20xx card is being taxed, you don't really need the AA. And at 1080p or 1440p, where you need the AA, the card isn't being taxed and you can run any AA you want.

It's great that DLSS is much more efficient than TAA, but so are MLAA and FXAA. Except that MLAA and FXAA work with so many more games, and don't require a 20xx-series card and a special arrangement with Nvidia.
 