More DLSS...

DLSS is Always Learning, New Improvements Coming Soon

Upcoming DLSS improvements include:

- AI network enhancements for DLSS Frame Generation that better take advantage of game engine data, improving UI stability and image quality during fast movement
- An updated AI network for DLSS Super Resolution Ultra Performance mode, improving fine detail stability and overall image quality
- An updated AI network for NVIDIA DLAA that improves image quality, reduces ghosting, and improves edge smoothness in high-contrast scenarios

Cyberpunk 2077, The Witcher 3: Wild Hunt, and Portal with RTX will be the first games to receive these new features in upcoming patches.

https://www.nvidia.com/en-us/geforce/news/ces-2023-rtx-dlss-game-updates/
 
Nvidia DLSS 2.5.1 Review - Significant Image Quality Improvements

The new DLSS 2.5.1 version automatically disables all of the built-in sharpening effects in every game where you apply it, and the in-game sharpening sliders do nothing with the newest DLSS version installed. This change alone will fix many image quality issues, as some games, such as God of War or Red Dead Redemption 2, apply some level of sharpening even when the sharpening slider is set to 0. If you need a sharpening filter in your game, you can still apply it manually in NVIDIA Control Panel or with any other external sharpening tool.

https://www.techpowerup.com/review/nvidia-dlss-2-5-1/
 
DLSS 2.5.1 seems to benefit the image quality of DLAA greatly. I always found that DLAA looked oversharpened and aliased. It looked worse than TAA in Forza Horizon 5 to my eyes.

I tested the new DLL on Spider-Man Remastered, Miles Morales, No Man's Sky and Forza Horizon 5 and DLAA looks much improved and stable especially in motion.
 
This just shows how much can come down to personal taste, or be game-engine specific. I swapped 2.5.1 into FH5 with DLAA (4K max settings) and thought it was blurry, much worse than the MSAA I had used before DLSS/DLAA was even a thing in FH5. I settled on 2.4.12 v2, which fixed the ghosting issue FH5 had with DLAA but still gave me the sharper image I prefer.
 
Yep, it's all subjective, and honestly I haven't used MSAA instead of DLSS or TAA in a long time now. DLAA is still temporal AA, so it's never going to be as sharp as MSAA. Anyway, I'd rather have a softer, cinematic look with stable graphics in motion than sharp graphics with more annoying, immersion-breaking pixel crawling and shimmering artifacts here and there.

I guess what I was trying to say is that DLSS 2.5.1 really improved DLAA's image stability in terms of edge aliasing, moiré patterns, transparencies and sub-pixel elements, which are more noticeable in motion, and the improvements in DLAA are already approaching SSAA-level image stability. That, and DLSS 2.5.1 removes the crappy artificial sharpening that degraded DLAA.
 
Is 2.5.1 just a driver update, or do the games have to patch in support as well?
The game needs an update, or you need to replace the .dll yourself (and doing it yourself may make things worse depending on the game).
 
Just got notification that DLSS 3 is available on Cyberpunk 2077. Anyone got a chance to play around with it yet?
 

That mostly comes down to them removing the sharpening pass. This was done specifically because it was causing an over-sharpening artifact called "haloing" that would snap in and out of view during movement. It was super apparent on fine details or high-contrast areas of the image. The trees in Marvel's Spider-Man were a great example: the branches would almost light up during movement due to the haloing from the sharpening pass.

As mentioned earlier, you can still apply a sharpen pass in other ways.
 
Downloading the game again. Going to try it out this afternoon.
Max settings on everything including RT. DLSS quality with frame generation. Game looks great and holds a steady 97 FPS (max gsync setting is 97 of 100hz for my display) @ 3440x1440. Just played a little bit so have to keep going to see if any artifacts are noticeable.
 
Just drove around for a bit with 4K DLSS Quality, Max Settings, Frame Generation. Pretty steady around 100 FPS. Super smooth.

I don't recommend 4K Max settings, Frame Generation, NO DLSS. You get around 60 FPS, but the game feels goopy and mushy, probably due to the increased frametimes.
 
I'm just impressed to get 60fps with no DLSS.
Above 100 does feel much better.
 

I did; the frame rates are crazy fast.
Don't forget that for DLSS 3 to work on Windows 10, you have to enable "Hardware-Accelerated GPU Scheduling" in Windows settings, then reboot.
If you don't do this, DLSS 3 doesn't show up in the game's settings.

[Screenshot: Windows Display settings page]

Graphics settings is the bottom option on the Display settings page.
 
One thing I noticed using DLSS 3 is that GPU utilization drops to 85-92%, saving about 50 W of power: 225 W with DLSS vs. 275 W native in Cyberpunk 2077. And the DLSS image looks better because of the ability to apply sharpening.

More than double the frames and less power usage, so the GPU runs ~8°C cooler, and the game looks better, all because of DLSS.
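For what it's worth, the figures quoted above work out to roughly an 18% reduction in GPU power draw; a quick sanity check:

```python
# Power figures quoted above for Cyberpunk 2077 (native vs. DLSS 3)
native_w = 275
dlss_w = 225

saving_w = native_w - dlss_w             # absolute saving in watts
saving_pct = 100 * saving_w / native_w   # relative saving

print(f"{saving_w} W saved (~{saving_pct:.0f}% less power)")  # 50 W saved (~18% less power)
```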
 
DLSS 3 revisit since launch



Over time, the UI-element flicker and the artifacts on sudden whole-screen changes (like a camera cut) should get ironed out everywhere, since it already seems possible to do so (Cyberpunk and some titles fixed one or both), but that has yet to happen, as even new titles still show the issues.

Newer titles that manage to be heavily CPU-limited even at low resolution (like the Potter game) make the tech more attractive even for the highest-end GPUs.
 
You can use sharpening filters without DLSS. There are options for it in the driver control panel, or if you use GeForce Experience, you can adjust it from Nvidia's on-screen overlay.
 
Nvidia's DLSS 2 vs. AMD's FSR 2 in 26 Games, Which Looks Better? - The Ultimate Analysis

 


Lol, AMD getting bodied. My eyes aren't too bad; when I switched to a 4K monitor I thought FSR Quality was pretty good, so I was convinced it was just good tech. Turns out that's the only decent use case, and for the most part it's still inferior to DLSS. We'll see if they use deep-learning-assisted algorithms in the future.
 
One interesting (although quite complicated) comparison would be at matched FPS: how often does FSR/DLSS look significantly better than native if you try to stay at, say, 90 fps, tweaking settings or reducing the native resolution to get there without using them?

I wonder whether it would not have been the slight to very clear winner versus native in all the tested games.

It is one thing to say the image is slightly to much worse in some ways and better in others, but you get 100 fps instead of 60 fps.

But if you can have a clearly better image at 60 fps instead of native 60 fps, let alone clearly better at 70 fps instead of 60 native, then it is a pure, clear win.
 
Is DLSS Really "Better Than Native"? - 24 Game Comparison, DLSS 2 vs FSR 2 vs Native


My experience in Rise of the Tomb Raider when DLSS became available was phenomenal. DLSS was clearly better than native, mostly due to the very poor AA that game has.

So far FSR has been blah; FSR 1 was unusable due to bad motion artifacting, so I've been running native when using AMD cards. Having an LG 42-inch OLED, which has a 21:9 aspect-ratio option, allows a 3440x1440 resolution, which I don't scale but center on the display. It looks and runs great, and the image is virtually the same size as the 3440x1440 monitor sitting right next to it. The black borders in a way make it better: on an OLED they are pitch black.
 
I've been playing Death Stranding Director's Cut lately and I've swapped the dll with version 2.5.1, and the result is 100% better than native to me, at least with that one game. The AA is superior with the DLSS Quality setting, things like power lines are rendered more completely compared to native AA, and there's no more annoying ghosting, especially with those floating cryptobiotes, when using DLSS 2.5.1.

A side bonus is that the GPU uses less power with DLSS compared to native when frame-capped just below my monitor's max refresh.
 
This is a good example at 14:00


It's impressive how much better the image can look if you aim for about the same performance with DLSS versus native, something that seems rarely talked about when reviewing the tech; the focus is more on looking worse but with way more frames.
It would indicate that there's a significant window where you get both better performance and better quality.
 