> Yeah, I'm sure it was a real joke. And by the way, it is "you are" not "your" and "than" not "then", but please act like my intelligence is in question here.

I spent the last year with Wake Forest fighting a brain infection, so yes, there are side effects to my intelligence.
> So there's a more direct answer to FSR in the newer drivers. You literally just enable it in the control panel and use a lower resolution. It's a spatial upscaler just like FSR.

Nvidia's solution looks cleaner than FSR. When I enable it with the "overlay" option, green "NIS" letters appear, so let's call it NIS for short.
It's alright, I guess - but DLSS is clearly way ahead. I tried it with 1080p to 1440p and it doesn't look great...
> If you're seeing artifacting with NIS, it's likely because it's regular Lanczos. Part of why FSR is favoured so heavily is that it resolved many of the "artifacts" or issues with using Lanczos, notably ringing, which could be mistaken for the classic over-sharpening artifact of the same name.

Nvidia already used Lanczos when using the GPU, rather than the monitor, to upscale lower resolutions. Maybe the difference is in the number of taps...
The sharpening filter used by AMD in both FSR and Radeon Image Sharpening is called CAS (Contrast Adaptive Sharpening).
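To make the "contrast adaptive" part concrete, here is a loose sketch of the idea (illustrative constants and a plain Python loop, not AMD's actual shader): the sharpening weight shrinks where local contrast is already high, which is what keeps CAS from producing the halos of a fixed unsharp mask.

```python
def cas_sharpen(img, sharpness=0.5):
    """Simplified contrast-adaptive sharpen on a 2D list of 0..1 values.

    The per-pixel sharpening amount falls to zero at full-range edges and
    is strongest in flat areas. Constants here are illustrative only,
    not AMD's exact ones.
    """
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            c = img[y][x]
            ring = [img[y - 1][x], img[y + 1][x], img[y][x - 1], img[y][x + 1]]
            mn, mx = min(ring + [c]), max(ring + [c])
            # adaptive amount: 1.0 in flat areas, 0.0 at full-range edges
            amp = max(min(mn, 1.0 - mx), 0.0) / max(mx, 1e-6)
            wgt = -(amp ** 0.5) * (0.05 + 0.1 * sharpness)  # negative ring weight
            out[y][x] = (c + wgt * sum(ring)) / (1.0 + 4.0 * wgt)
    return out
```

A flat region passes through unchanged, while pixels near a soft edge get pushed apart, which is the sharpening.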
It would be nice if devs let you control the sharpness slider when using FSR. Not everyone likes oversharpened images.
> Ok, so "Lossless Scaling" updated to add NIS. So I gave it a test using the AMID EVIL demo. If this is Nvidia's response to FSR... they have lost the plot. It looks like hot garbage. The ringing artifacting is crazy!

Paste a screenshot (at least the part with the most offending area, if it's very large) so we can assess it too.
You can also do it in the Lossless Scaling app, which just added NIS, so I'll have to give that a whirl later for comparison's sake.
> Nvidia already used Lanczos when using the GPU, rather than the monitor, to upscale lower resolutions. Maybe the difference is in the number of taps...

According to this, the version in the latest driver uses 6 taps versus 2 in the old version, along with 4-directional scaling and adaptive sharpening filters.
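The tap count matters because of where ringing comes from: the Lanczos kernel has negative side lobes, and a wider window (more taps) keeps more of them. A small sketch of the standard windowed-sinc weight shows this (the loop and sampling grid are just for illustration):

```python
import math

def lanczos_weight(x, taps):
    """Lanczos windowed-sinc weight with window size `taps` (the 'a' parameter)."""
    if x == 0.0:
        return 1.0
    if abs(x) >= taps:
        return 0.0
    px = math.pi * x
    return taps * math.sin(px) * math.sin(px / taps) / (px * px)

# The negative side lobes are what cause ringing around hard edges;
# compare how much negative weight each window size carries.
for a in (2, 6):
    xs = [i / 10 for i in range(-10 * a, 10 * a + 1)]
    negative_mass = sum(w for w in (lanczos_weight(x, a) for x in xs) if w < 0)
    print(f"taps={a}: sampled negative lobe mass = {negative_mass:.2f}")
```

More taps give a sharper reconstruction, but the deeper and more numerous negative lobes overshoot at hard edges, which is the ringing being discussed here.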
View attachment 414172
I do not particularly like oversharpened images either and test FSR at 0% RCAS.
> Paste screenshot (at least part of it with the most offending part if it's very large) so we can assess it too.
I am comparing FSR and NIS (done via drivers, so I cannot make screenshots), and yes, NIS has permanent sharpening built in even at 0% sharpening, while FSR does not.
The main difference between the two is that Nvidia does not hide the sharpening, but this also means the image looks more consistent; otherwise both upscaling methods look very similar.
I would say FSR is higher quality and would stick with that conclusion, except for one slight annoyance: at certain times, certain scene elements stick out way too much with FSR. When I test FSR in Cyberpunk, people look like cartoon cutouts, while they do not with NIS or Lanczos (the previous GPU upscaler, though better looking than IrfanView's Lanczos), or with my monitor's scaler, bilinear, or integer/point scaling. Only FSR does that...
View attachment 414162
View attachment 414163
View attachment 414164
View attachment 414165
Does FSR look good here? To me, FSR's clipping artifacts look ridiculous.
NIS does have ringing artifacts, but they make the image look more consistent, the same way the older GPU upscaling, also Lanczos based, looked more consistent.
You can see the same effect on pretty much all screenshots of FSR games.
It is of course much more pronounced at 2x scale (the "Performance" preset) than when doing something like 1440p -> 2160p, which is the more sensible use case, but the effect being less visible does not make it gone.
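For reference, FSR 1.0's quality presets correspond to fixed per-axis scale factors (the values AMD published for FSR 1.0); a quick sketch of the render resolutions involved:

```python
# FSR 1.0 presets scale each axis by a fixed factor; "Performance" is
# the 2x-per-axis case mentioned above, and 1440p -> 2160p is "Quality".
FSR_PRESETS = {"Ultra Quality": 1.3, "Quality": 1.5, "Balanced": 1.7, "Performance": 2.0}

def fsr_render_resolution(out_w, out_h, preset):
    """Internal render resolution for a given output resolution and preset."""
    factor = FSR_PRESETS[preset]
    return round(out_w / factor), round(out_h / factor)

print(fsr_render_resolution(3840, 2160, "Performance"))  # (1920, 1080)
print(fsr_render_resolution(3840, 2160, "Quality"))      # (2560, 1440)
```

So the "2x" case upscales a quarter of the output pixel count, which is why artifacts are so much more visible there.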
Really, for now it all comes down to what is available and what looks best to the person using it. Obviously, when DLSS can be enabled it is the best option (even for quality: DLSS can always be made to run at native resolution with DSR, which gets rid of any blurriness DLSS might normally have), and when it is not available it is some sort of compromise. Except maybe when using integer scaling, because that is technically exactly like playing at native resolution, just on a monitor with lower resolution and less screen-door effect.
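Integer scaling really is just pixel duplication, which is why it cannot invent new colour values or ringing; a minimal sketch:

```python
def integer_scale(img, k):
    """Nearest-neighbour integer scaling: each source pixel becomes a
    k-by-k block, so no new colour values (and no ringing) are introduced."""
    out = []
    for row in img:
        wide = [p for p in row for _ in range(k)]  # duplicate horizontally
        out.extend([wide[:] for _ in range(k)])    # duplicate vertically
    return out

print(integer_scale([[1, 2], [3, 4]], 2))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```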
There is a quick solution to the sharpen-mask-like effect of NIS upscaling.
It is not something tools like Lossless Scaling will be able to do (unless they implement it directly into their pipeline), but genuine Nvidia card users can just enable FXAA.
FXAA's inherent blur cancels out NIS's inherent sharpening.
It also improves anti-aliasing beyond what TAA normally gives, so it is a win-win.
Not quite DLSS, but I guess for games without DLSS it will have to do.
> You can see how the NIS lost some colour information due to ringing.

Where?
> They are going to be exceedingly close because they use the same underlying Lanczos method. Honestly, we are being pretty picky to find the differences.

Differences are very subtle or very noticeable, depending on the case.
> But this is a DLSS thread. So I'll stop distracting; we can always pixel-peep further in the FSR thread.

By DLSS you mean literal pixel magic?
It's really hard to say anything based on those two shots. You upscaled them poorly (multiple times, by the looks of it, or the forum actually downscaled them since they were too large):
> It's really hard to say anything based on those two shots. You upscaled them poorly (multiple times, by the looks of it, or the forum actually downscaled them since they were too large):

Well, crap, I thought I did the best I could. If anyone can recommend a place I can upload the PNGs where they won't get messed with, I'd be happy to upload them and you guys can zoom and pixel-peep away.
View attachment 414532
The Imgur ones are just a bunch of JPEG compression artifacts, and chroma subsampling kills the rest of what little detail there was.
> You don't even need that. Google it; there are sites that show you how to swap newer DLSS 2.x versions into different games that have DLSS 2.x. Nice that Nvidia is starting to do it themselves, but not needed.

I know how to manually swap them out. The POINT was that it was supposedly going to be a game update, but the game itself is not updating the file, and even if you go through GFE it will show the old DLSS version in the game files. You have to use something like Process Explorer to even see if the damn game is using the newer version, which is silly. I don't want to use GFE, and I hope this is not going to be the way they implement new DLSS versions going forward.
> I know how to manually swap them out. The POINT was that it was supposedly going to be a game update, but the game itself is not updating the file, and even if you go through GFE it will show the old DLSS version in the game files.

If game developers do not have anything else to update in the game, it is very unlikely they will bother releasing a patch just for a new DLSS version, especially since in order to ship the game with the new file they would have to somehow test it.
> According to the Nvidia site, several games should have been updated to the 2.3 version, but only Shadow of the Tomb Raider has for me.

I take it that means launching the game through GFE to get the update. If anybody installs GFE, remember to go into the settings and disable automatic "optimizing" for your games, disable ShadowPlay, disable streaming, disable Freestyle, and disable Ansel. Ansel in particular can play havoc with your games.
I fired up Cyberpunk 2077 earlier and there was no update for that either, even though it was supposed to be updated 4 days ago according to Nvidia: "In Cyberpunk 2077, which updates to NVIDIA DLSS 2.3 today, it more smartly uses motion vectors to reduce ghosting on fine detail in motion."
EDIT: Ok, what the hell is this nonsense? Apparently I have to use GFE to get the DLSS 2.3 update.
Cyberpunk 2077 Will Not Be Updated to Implement DLSS 2.3. It is the NVIDIA GeForce Experience App Responsible For Injecting the DLSS 2.3 (.dll) Version into the Game Files
> Yesterday, Nvidia added DLDSR in the new drivers. It's an AI-assisted version of DSR. It supposedly allows you to downscale with a trivial performance loss. The example they show is downscaling 1620p to 1080p.

That's not quite how it functions.
I am trying exactly that and it doesn't seem to be working. It looks just like regular DSR and gives me a big performance hit, just like DSR. **Actually, using regular DSR is smoother for me. With DLDSR, the frametimes are jacked up and it gives me a slight rubber-banding effect.
View attachment 432715
> That's not quite how it functions.

Well, then the screenshot and wording on the page are not clear.
It's reported to give DSR 4x quality when using DLDSR 2.25x, with the normal performance hit of DSR 2.25x.
Its benefit is better image quality than the old DSR; it doesn't change the performance hit.
However, many people see no quality difference when using it above a 1080p screen resolution.
Seems there are bugs to work out.
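The factor arithmetic behind those numbers: DSR/DLDSR factors apply to the total pixel count, so each axis scales by the square root of the factor, which is how 2.25x of 1080p lands on 1620p. A quick sketch:

```python
def dsr_render_resolution(native_w, native_h, factor):
    """DSR/DLDSR factors multiply the pixel count, so each axis grows by sqrt(factor)."""
    axis_scale = factor ** 0.5
    return round(native_w * axis_scale), round(native_h * axis_scale)

print(dsr_render_resolution(1920, 1080, 2.25))  # (2880, 1620), the "1620p" case
print(dsr_render_resolution(1920, 1080, 4.0))   # (3840, 2160)
```

So the claim above is that DLDSR rendering at 2880x1620 can look comparable to plain DSR rendering at 3840x2160, while only paying the 2.25x render cost.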