Nvidia releases DLSS 2.5.1

The differences are amazing. It looks like lower-quality DLSS settings are now more viable, as well as lower resolutions with DLSS. I am not sure AMD will ever catch up to Nvidia's upscaling without using AI. Game world complexity is just too complicated and too dynamic, in my opinion, for a streamlined algorithm or scripted batch of algorithms to match the capability of dedicated hardware-assisted AI. As for DLSS 3, that looks like something one has to test out for themselves. Videos of it, even Nvidia's own, do not look too good to me, and I see significant enough differences to want to use it only as a last resort. Then again, unless I actually see it in action and play with it, I may like or even love it in the end.
 
Comparing the 2nd image, which is 2560x1440 with DLSS 2.5.1 vs 2.4.3, both set to Quality, I can see some decent image quality improvements. Cyberpunk was a good game to test this on.

The holograms have a good bit more detail with 2.5.1. The holograms by the trees on the old version become very difficult to see, but with 2.5.1 they show up a lot more. Likewise for the holograms on the overpass. Reflections on the building windows in the upper left are more visible as well. The street sign becomes far more legible. Tree leaves become more defined.

And it apparently has a performance increase. Seems like a good upgrade.

I still prefer not using any upscaling, but until ray tracing becomes much easier for GPUs to do, this will have to be used in most games that have it. It is also a great tool for when your GPU gets a bit older and cannot hit high FPS anymore.
 
https://www.techpowerup.com/review/nvidia-dlss-2-5-1/

Looks better while offering the same or better FPS.
I looked at it and I wouldn't call it better or even equally as good. Good enough, for sure, but the problem with DLSS and FSR is things like ghosting, which you see in motion and not in a photo. Compared to native, the DLSS images look more blurry and some areas are missing details.
 
I looked at it and I wouldn't call it better or even equally as good. Good enough, for sure, but the problem with DLSS and FSR is things like ghosting, which you see in motion and not in a photo. Compared to native, the DLSS images look more blurry and some areas are missing details.
I meant it looks better than 2.4.3, which was the previous commercial version of DLSS.
 
I looked at it and I wouldn't call it better or even equally as good. Good enough, for sure, but the problem with DLSS and FSR is things like ghosting, which you see in motion and not in a photo. Compared to native, the DLSS images look more blurry and some areas are missing details.

I don't think it will ever look as good as it does when set to off. But it can give you a good 15-35% performance increase depending on the game. The ghosting issues vary from game to game; they are less noticeable in some games.
 
I don't think it will ever look as good as it does when set to off. But it can give you a good 15-35% performance increase depending on the game. The ghosting issues vary from game to game; they are less noticeable in some games.
I feel it will in many ways, especially if renderers, artists, and game engines start to assume the presence of a technology like that (say text, power lines, and other fine details the AI could get really good at).

If it were just 15-35% more performance it would probably not be worth it, but in the article's example we go from 33 FPS with native TAA to 59 FPS in Quality mode and 70 FPS in Balanced mode at 4K (or from 63 to 104/111). That's a 65-112% type of performance increase.

The question becomes: what settings do you turn down to get 70 FPS instead of 33 FPS at 4K, and does it look as good with those lower settings as it does with DLSS?
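The uplift percentages can be sanity-checked with a couple of lines of Python (the FPS figures are the ones quoted from the TechPowerUp article above):

```python
def uplift(base_fps: float, new_fps: float) -> int:
    """Percentage performance increase going from base_fps to new_fps."""
    return round((new_fps / base_fps - 1) * 100)

# 4K figures from the article: native TAA -> DLSS Quality / Balanced
print(uplift(33, 59))    # Quality:  79% faster
print(uplift(33, 70))    # Balanced: 112% faster
print(uplift(63, 104))   # Quality:  65% faster
print(uplift(63, 111))   # Balanced: 76% faster
```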
 
I always think ... if my job was to make screenshots to examine and magnify, in very specific areas of very specific games, I might pick one set of settings.
In actual play, in most games, I would pick another.

It's the experience. Everything is a tradeoff. Running "native" is a tradeoff as well. Artisanal pixels. Except they, too, were the product of a number of tradeoffs, and it runs slower. May or may not matter to you. Pick what you'd like your experience to be.

I personally don't seem to notice shadowing issues much. I do see failures in ambient lighting / GI. I also see FPS as really critical, almost always.

It is cool to have tools to dial all this in. No magic bullets, just more bullets.
 
I meant it looks better than 2.4.3, which was the previous commercial version of DLSS.
I'm not at all excited for DLSS. FSR is more promising because it works on all GPUs and even works on Linux. It's not as good as DLSS, but I feel the people who would benefit from this the most are users of older GPUs, where losing image quality for a performance boost might get you playable frame rates. If DLSS wants to impress me, then it should look exactly like the original image, because it's also limited to newer Nvidia GPUs, especially DLSS 3, which is only for the RTX 4000 series.
 
I'm not at all excited for DLSS. FSR is more promising because it works on all GPUs and even works on Linux. It's not as good as DLSS, but I feel the people who would benefit from this the most are users of older GPUs, where losing image quality for a performance boost might get you playable frame rates. If DLSS wants to impress me, then it should look exactly like the original image, because it's also limited to newer Nvidia GPUs, especially DLSS 3, which is only for the RTX 4000 series.
FSR works on everything, but DLSS 2.5.1 works all the way back to the RTX 2000 series at least. That's still going back 4 years, and unless you are running ray tracing, neither tech is really needed unless you are running something woefully underpowered. Like, I appreciate that I can use FSR on my 5700G to make 1080p "playable" for some things, but still, it's a stretch.
 
https://www.techpowerup.com/review/nvidia-dlss-2-5-1/

Looks better while offering the same or better FPS.
I'm excited. Despite how much some want to hate on things like DLSS, I think it is great. While it would be nice if GPUs were powerful enough to do everything we want at native rez, they aren't, and we are already pushing some insane amounts of power to get what we do. When you are at the limits of what hardware can do, you have three options:

1) Lower resolution.
2) Lower frame rate.
3) Back off on effects/details.

None are ideal. But image reconstruction gives us a 4th option: basically backing off on rez and then reconstructing the missing information. Is it 100% as good? No, but it is better than just lowering the resolution. If we want a future where we have photorealistic rendering, high resolutions, perfectly smooth 1000 fps motion, and so on, it is going to involve some kind of image reconstruction. We just aren't getting there via brute force.
 
I'm excited. Despite how much some want to hate on things like DLSS, I think it is great. While it would be nice if GPUs were powerful enough to do everything we want at native rez, they aren't, and we are already pushing some insane amounts of power to get what we do. When you are at the limits of what hardware can do, you have three options:

1) Lower resolution.
2) Lower frame rate.
3) Back off on effects/details.

None are ideal. But image reconstruction gives us a 4th option: basically backing off on rez and then reconstructing the missing information. Is it 100% as good? No, but it is better than just lowering the resolution. If we want a future where we have photorealistic rendering, high resolutions, perfectly smooth 1000 fps motion, and so on, it is going to involve some kind of image reconstruction. We just aren't getting there via brute force.
The biggest problem is that turning down the resolution never looks good. Playing at 1080p native on a 1440p screen looks terrible: the scaling doesn't match, and things are warped or cut off or both, which just makes a bad situation worse. The same goes for trying to play 720p native on a 1080p screen. There are some exceptions to this, as some monitors will internally rescale the image or add a black border, but those aren't exactly common, unless you are using a TV as a monitor, in which case some of them will do it pretty handily. But even then you are going to have all sorts of ghosting issues, as those scaling algorithms were designed for 24 and 29 fps TV/DVD playback.
1080p will scale correctly to 4K, so things will be the correct proportions, but it's going to be blocky as hell, and 1440p will have the same scaling problems as other dissimilar resolutions. Basically, DLSS and FSR are imperfect solutions to a bad situation, but they're better than the alternatives. I would argue that both DLSS and FSR deliver a better visual look and gaming experience upscaling than you could achieve at native by turning down settings.
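The proportion argument above comes down to integer vs. non-integer scale factors, which a tiny Python check makes concrete (the resolutions are just the common ones being discussed):

```python
def scale_factor(src: tuple[int, int], dst: tuple[int, int]) -> tuple[float, float]:
    """Horizontal and vertical scale factors when showing src on a dst panel."""
    return dst[0] / src[0], dst[1] / src[1]

# 1080p on a 4K panel: each source pixel maps cleanly to a 2x2 block of panel pixels
print(scale_factor((1920, 1080), (3840, 2160)))   # (2.0, 2.0)

# 1080p on a 1440p panel: 1.33x per axis, so pixels can't map 1:1 (blur/shimmer)
print(scale_factor((1920, 1080), (2560, 1440)))
```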
 
The biggest problem is that turning down the resolution never looks good. Playing at 1080p native on a 1440p screen looks terrible: the scaling doesn't match, and things are warped or cut off or both, which just makes a bad situation worse. The same goes for trying to play 720p native on a 1080p screen. There are some exceptions to this, as some monitors will internally rescale the image or simply add a black border, but those aren't exactly common, unless you are using a TV as a monitor, in which case some of them will do it pretty handily. But even then you are going to have all sorts of ghosting issues, as those scaling algorithms were designed for 24 and 29 fps TV/DVD playback.
I mean, I've seen better and worse scaling algorithms, and some games do an OK job of rendering at a lower resolution internally and then upscaling. None look as good as a good image reconstruction algorithm, though. It isn't perfect, but it is what I will choose if it's available and native performance isn't what I want. A good example is Control. That game is gorgeous with ray tracing, but even on my 3090 it chugs. However, DLSS Ultra Quality is enough to get performance back up to a level that makes me happy, and it looks great.
 
There are some exceptions to this as some monitors will internally rescale the image
From my understanding they all do (if they accept a lower-than-native signal); unlike a CRT, an LCD must rescale.

1080p will scale correctly to 4K, so things will be the correct proportions, but it's going to be blocky as hell, and 1440p will have the same scaling problems as other dissimilar resolutions,
TVs and monitors tend to use the same upscaling method regardless of the input res, and the higher the resolution, the better. Some have integer scaling, but that's not the norm unless it has changed.

I could be all wrong.
 
From my understanding they all do (if they accept a lower-than-native signal); unlike a CRT, an LCD must rescale.


TVs and monitors tend to use the same upscaling method regardless of the input res, and the higher the resolution, the better. Some have integer scaling, but that's not the norm unless it has changed.

I could be all wrong.
Each brand has its own method of doing it. Last I checked, Sony had one of the better upscaling engines on their TVs, while TCL and Vizio were fighting for the bottom of the list, but I am sure LG and Samsung were right up there, so the three probably trade places from year to year.
 
Remember that back in the day, hardcore PC enthusiasts would call Anti-Aliasing "Fake Resolution" and say they always turn it off because running at higher res always looks better.

Now those same people probably hate the idea of jagged edges and shader aliasing.


I welcome our new upscaling overlords. I hope it gets better, and with certain algorithms, better than native.
 
So how do you use this? Just download it and throw it in the game directory?
 
I can't tell the difference between on and off when in motion. Standing still? Maybe if I stare at it hard enough from 3" away. The FPS increase is always an improvement.
 
So how do you use this? Just download it and throw it in the game directory?
TechPowerUp hosts compiled DLL files:
https://www.techpowerup.com/download/nvidia-dlss-dll/

Instructions here:
https://www.tomshardware.com/news/dlss-client-library-manual-update

Basically, go to the game's directory, locate the nvngx_dlss.dll file, and replace it with the one from TechPowerUp.

You can get the DLLs from Nvidia directly, but last I checked it required you to be registered in their developer program.
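If you'd rather script the backup-and-replace step than do it by hand, here is a minimal Python sketch. The function name and the example paths are illustrative, not anything official; point it at wherever your game actually keeps nvngx_dlss.dll.

```python
import shutil
from pathlib import Path

def swap_dlss_dll(game_dir: str, new_dll: str) -> Path:
    """Back up a game's nvngx_dlss.dll and replace it with a newer build.

    Returns the path of the backup so you can roll back later.
    """
    target = Path(game_dir) / "nvngx_dlss.dll"
    if not target.exists():
        raise FileNotFoundError(f"no nvngx_dlss.dll found in {game_dir}")
    backup = target.with_name("nvngx_dlss.dll.bak")
    shutil.copy2(target, backup)   # keep the shipped DLL for rollback
    shutil.copy2(new_dll, target)  # drop in the downloaded DLL
    return backup

# Example (paths are illustrative):
# swap_dlss_dll(r"C:\Games\SomeGame\bin\x64", r"C:\Downloads\nvngx_dlss.dll")
```

Note that game patches or a "verify files" pass will usually restore the original DLL, so keep the downloaded copy around.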
 
Remember that back in the day, hardcore PC enthusiasts would call Anti-Aliasing "Fake Resolution" and say they always turn it off because running at higher res always looks better.

Now those same people probably hate the idea of jagged edges and shader aliasing.


I welcome our new upscaling overlords. I hope it gets better, and with certain algorithms, better than native.
And what is "fake" in computer graphics in the first place? Everything!

It's all a discrete numeric approximation of reality. We're coming up with a bunch of different tools we can dial in to provide the experience the player wants, with the gear they have.
 