Nvidia DSR and GeForce Experience

biggles

2[H]4U
Joined
Jul 25, 2005
Messages
2,215
So, I went into GeForce Experience to see what it recommended for F1 2018. This is on a 3060 Ti and a 2560x1080 100 Hz monitor. It said the optimum resolution is 3620x1527 DSR. Does this mean GeForce Experience thinks the 3060 Ti is basically too powerful for the monitor's native resolution, hence the higher suggested res? Running the in-game benchmark with the recommended settings, it reports 99 avg FPS, so the GFE settings seem fine. Just curious about this since I thought DSR was something only used for older games. I believe DSR is like supersampling, so it's good for removing aliasing. Does it do anything else for image quality? Do people here use DSR manually, or only when GFE recommends it?
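For what it's worth, DSR factors multiply the total pixel count, so each axis scales by the square root of the factor; if I have the semantics right, that 3620x1527 suggestion is exactly the 2.00x factor on a 2560x1080 panel. A quick sanity check (hypothetical helper name, plain Python):

```python
import math

def dsr_resolution(width, height, factor):
    """Compute the DSR render resolution for a given pixel-count factor.

    DSR factors (1.20x, 1.50x, 1.78x, 2.00x, 2.25x, 3.00x, 4.00x)
    multiply the total pixel count, so each axis scales by sqrt(factor).
    """
    scale = math.sqrt(factor)
    return round(width * scale), round(height * scale)

print(dsr_resolution(2560, 1080, 2.00))  # (3620, 1527) -- matches GFE's suggestion
print(dsr_resolution(2560, 1440, 4.00))  # (5120, 2880) -- 1440p at 4x is "5K"
```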
 
GFE likes to recommend that when it feels your power exceeds the needs of the game by a significant amount, because they always want you to feel like an upgrade could do something for you.

I personally never use it. IMO either the image quality change is irrelevant, or I'd prefer more FPS. I'm sure there are shining examples of when it looks great, but generally the other AA options work fine.

YMMV!
 
GFE won't recommend DSR resolutions if you have DSR disabled in the control panel.

The suggested settings in GFE target 60 FPS, last I checked. If you want more FPS then you can customize the target quality settings in GFE for that game. Or just tweak the game settings yourself.

I have not used DSR since getting a 4K display. When I had a 2560x1440 monitor I would sometimes manually DSR it up to 5K. I would stick to integer multipliers with DSR so you don't need to add excessive sharpening, but see what works for you.
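The integer-multiplier advice comes down to pixel alignment: at a whole-number per-axis factor (e.g. 2x per axis, the 4.00x DSR setting, 2560x1440 up to 5120x2880), every native pixel is the exact average of a block of rendered pixels, so the downscale is clean and little to no extra sharpening is needed. A minimal box-filter sketch in plain Python (not Nvidia's actual filter):

```python
def box_downscale(img, n):
    """Downscale by an integer factor n: each output pixel is the exact
    mean of an n x n block, so no source pixel straddles two output
    pixels and no sharpening/smoothing pass is needed afterwards."""
    h, w = len(img), len(img[0])
    return [[sum(img[y * n + dy][x * n + dx]
                 for dy in range(n) for dx in range(n)) / (n * n)
             for x in range(w // n)]
            for y in range(h // n)]

# A 4x4 checkerboard collapses to uniform gray at 2x: detail averages
# out cleanly instead of ghosting unevenly across output pixels.
checker = [[255, 0, 255, 0],
           [0, 255, 0, 255],
           [255, 0, 255, 0],
           [0, 255, 0, 255]]
print(box_downscale(checker, 2))  # [[127.5, 127.5], [127.5, 127.5]]
```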
 
I found that some DSR + 2x MSAA is faster than 4x MSAA and looks better.
 
DSR is just a proprietary name for the same process as supersampling. You render at a higher resolution and then downscale it to the target resolution.
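A tiny 1D sketch shows why that render-high-then-downscale process removes aliasing: the high-resolution render captures sub-pixel coverage that the averaging bakes into each output pixel (toy example, not Nvidia's actual filter):

```python
def render(positions, edge=2.5):
    """Toy 1D 'scene': a hard black-to-white edge at x = edge."""
    return [255 if x >= edge else 0 for x in positions]

def supersample(width, factor, edge=2.5):
    """Render at factor x the resolution, then average each group of
    'factor' samples down to one output pixel (supersampling)."""
    fine = render([(i + 0.5) / factor for i in range(width * factor)], edge)
    return [sum(fine[i * factor:(i + 1) * factor]) / factor for i in range(width)]

# Native sampling gives a hard step; 4x supersampling turns the edge
# pixel into a partial-coverage value, i.e. the edge is antialiased.
print(render([i + 0.5 for i in range(5)]))  # [0, 0, 255, 255, 255]
print(supersample(5, 4))                    # [0.0, 0.0, 127.5, 255.0, 255.0]
```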

I use it fairly often on my 1440p display, as a bump to 1.25x or 1.5x can often remove most of the aliasing without significantly tanking performance. In addition to the scale factor, you can adjust the smoothness filter Nvidia applies to the end product, which is meant to soften the output. I turn it down to about 10% or so, because at higher percentages it softens everything so much that it easily negates any advantage of rendering at the higher resolution.

On my 4K display, I just about never use it, as it can quickly destroy whatever performance headroom you had going for you.

Also - most of us don't use GeForce Experience (bloatware, spyware, telemetry, etc.) or its recommendations - a little trial and error in the Nvidia Control Panel and the game's settings never hurt anyone!
 
The blurriness comes from scaling using a non-integer multiplier. NVIDIA also idiotically uses bilinear sampling, if I remember correctly, just like they do with their (non-DLSS) upscaling method.
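That alignment issue is easy to see with a toy coverage calculation: at an integer factor every output pixel covers whole source pixels, while at a non-integer factor some source pixels straddle two output pixels, so any resampler (bilinear included) has to blend across them, hence the added blur. Sketch in plain Python (1D for simplicity):

```python
def coverage(src_len, out_len):
    """For each output pixel, list (source_index, weight) pairs of the
    source pixels it covers when downscaling by src_len / out_len."""
    scale = src_len / out_len
    result = []
    for o in range(out_len):
        lo, hi = o * scale, (o + 1) * scale
        weights = []
        i = int(lo)
        while i < hi and i < src_len:
            w = min(hi, i + 1) - max(lo, i)  # overlap of source pixel i
            weights.append((i, round(w, 2)))
            i += 1
        result.append(weights)
    return result

print(coverage(4, 2))  # 2.0x: every output pixel covers whole source pixels
print(coverage(6, 4))  # 1.5x: source pixels 1 and 4 are split across outputs
```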
 
I'd also recommend using NVCleanstall to further strip out the telemetry.
 
Nvidia's Shadowplay recording in GeForce Experience is now broken more often than not.
Every time you try to launch it, the infamous 0x0003 error pops up after the first few successful uses following a driver installation.
It's still one of the smoothest, best recording suites... when it works... but that seems to happen less and less.
 
I have not run into this issue.
 