Monitor settings questions: 144/160 Hz and 8/10-bit

ChrisUlrich


LG 34GP83A


That's the monitor I have.

I can run 144 Hz and 10-bit.
I can run 160 Hz and 8-bit.

My gaming never sees 144 fps, let alone 160 fps.

Is there any reason to leave it at 160 Hz and 8-bit over 144 Hz and 10-bit?
 

No. And honestly you wouldn't be able to tell 8-bit from 10-bit. I'd leave it at 144 and call it a day.
 
Why leave the monitor at 144 Hz? Is there a benefit to 144 Hz 10-bit over 160 Hz 8-bit, considering you just mentioned not noticing any difference between 8-bit and 10-bit?
 
In all honesty it really does not matter. All video games are still rendered in 8-bit sRGB unless they have an HDR option, so the output has to be normalized for the 10-bit color space if you leave the monitor on that setting, which can result in an oversaturated image. Feel free to set it to 160 Hz. Whichever refresh rate you choose, make sure you are using G-SYNC; that way it doesn't matter if you can't reach a framerate close or equal to your monitor's refresh rate.
 
So it sounds like I should keep it at 8-bit/160 Hz since it's primarily a gaming monitor.

I play mostly new games, so they support HDR. But this monitor in particular doesn't have great HDR support.
 
If I don't enable 10-bit on my monitor, I see banding in a lot of desktop wallpapers. My monitor does 10-bit at 144 Hz, so I always enable 10-bit.
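If you want to rule out the wallpapers themselves, a plain gray ramp makes banding very obvious. A rough Python sketch (assuming Pillow and NumPy are installed; the resolution and filename are just examples):

```python
# Rough sketch: write a 1920x1080 horizontal gray ramp to a PNG.
# Set it as the wallpaper: a smooth ramp like this makes banding in the
# display pipeline easy to spot. Resolution and filename are examples.
import numpy as np
from PIL import Image

width, height = 1920, 1080
ramp = np.linspace(0, 255, width).astype(np.uint8)   # 0..255 from left to right
img = np.tile(ramp, (height, 1))                     # repeat the row down the screen
Image.fromarray(img, mode="L").save("gray_ramp_test.png")
```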
 
If you use an Nvidia card, this might happen because Nvidia, despite having dithering capabilities, doesn't enable them by default, and the settings are buried deep inside the Windows registry. Any modification of the graphics card's color lookup tables (LUTs) will result in reduced color gradation, because you cannot do color correction of an 8-bit desktop with only 8-bit color precision. Otherwise, the Windows desktop and the wallpapers you use are 8-bit.
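A quick way to see why that is: apply even a mild gamma tweak to the 256 possible levels and count how many distinct 8-bit outputs survive. A rough Python sketch (the 0.9 gamma is just an arbitrary example):

```python
# Rough sketch: a gamma adjustment done with only 8-bit output precision
# merges some of the 256 input levels onto the same output value, and
# those merged levels show up as banding. The 0.9 gamma is arbitrary.
gamma = 0.9
levels_out = {round(255 * (v / 255) ** gamma) for v in range(256)}
print(len(levels_out), "distinct levels left out of 256")
```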

If you do not use any ICC profile and did not change any color settings in the Nvidia control panel, then you can check "Override to reference mode" under "Adjust desktop color settings" to prevent changes to the video card's LUT. Otherwise, colors need to be set to 50%/50%/1.00 for brightness/contrast/gamma, and then there should be no banding.

Note: Without this reference mode, Windows or applications, and more specifically games (especially old ones), can modify gamma, and that will cause slight banding at 8-bit. Otherwise, applications with 10-bit output should actually be dithered down to 8-bit on 8-bit displays, so no issues should happen there. There is also no benefit from using 10-bit on today's LCD monitors, because they always use 8-bit panels + A-FRC, and the dithering algorithms on GPUs are identical to A-FRC.
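To illustrate what that dithering buys you, here is a rough Python sketch (NumPy assumed) that reduces an ideal 10-bit ramp to 8-bit, once by plain rounding and once by averaging a bunch of randomly dithered 8-bit frames as a crude stand-in for temporal dithering / A-FRC:

```python
# Rough sketch: a 10-bit ramp quantized straight to 8-bit keeps a fixed
# rounding error, while averaging several randomly dithered 8-bit frames
# (a crude stand-in for temporal dithering / A-FRC) recovers most of the
# in-between levels.
import numpy as np

rng = np.random.default_rng(0)
ramp10 = np.arange(1024) / 1023.0                      # ideal 10-bit gradient, 0..1
plain8 = np.round(ramp10 * 255) / 255.0                # plain 10->8-bit reduction

frames = [np.clip(np.round(ramp10 * 255 + rng.uniform(-0.5, 0.5, ramp10.size)), 0, 255)
          for _ in range(64)]                          # 64 dithered 8-bit "frames"
dithered = np.mean(frames, axis=0) / 255.0             # what the eye averages over time

print("plain 8-bit mean error:   ", np.abs(plain8 - ramp10).mean())
print("dithered 8-bit mean error:", np.abs(dithered - ramp10).mean())
```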

It is possible to enable dithering on Nvidia and get better color precision. In fact, dithering is still recommended for color calibration, because the LUTs have 16-bit precision, so it is best to enable dithering before calibrating and keep it enabled. That might be with 10-bit output, but it might just as well be 8-bit; heck, 8-bit might yield a better result depending on how the monitor handles 10-bit.

PS: On AMD, dithering is enabled by default, so no issues there.
 
Is there any reason to leave it at 160 Hz and 8-bit over 144 Hz and 10-bit?
If you do not use any color correction (ICC profiles or adjusting colors in the Nvidia control panel), then set "Override to reference mode" under "Adjust desktop color settings" and you can use 8-bit 160 Hz "safely" from the point of view of avoiding any possible banding on the desktop and in games. (Note: with this setting, to avoid banding you cannot change brightness/contrast/gamma; set them to 50%/50%/1.00 if you have changed them.)

The downside of this is that you cannot change gamma in some (mostly older) games. Of course, if you are sensitive to banding then you should really be using ReShade with deband filters, especially for older games, as most of them had terrible color precision and look terrible without a deband shader anyway. And if you use ReShade, there are gamma correction shaders as well.

The other option to use 160 Hz without losing the ability to modify LUTs (you will need that if you have one of those fancy color calibration probes and want to calibrate your display) is to use dithering. The procedure is somewhat complicated... or rather bothersome: https://hub.displaycal.net/forums/topic/how-to-enable-dithering-on-nvidia-geforce-with-windows-os/ but it yields great results. I get better results on my 10-bit display with dithering, even in 8-bit mode, than in native 10-bit without dithering!
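If you would rather poke at that with a script than click through regedit, something like the sketch below at least lists what sits under the per-display Nvidia driver keys so you can find the dither-related values the guide describes. Treat the key path as an assumption taken from that guide, and take the actual value names and numbers from the guide itself, not from this sketch:

```python
# Rough sketch (Windows only): enumerate the per-display subkeys of the
# Nvidia driver's DisplayDatabase and print their values, so the
# dither-related entries described in the DisplayCAL guide can be found.
# The key path is an assumption based on that guide; which values to
# change, and to what, should come from the guide itself.
import winreg

BASE = r"SYSTEM\CurrentControlSet\Services\nvlddmkm\State\DisplayDatabase"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, BASE) as base:
    i = 0
    while True:
        try:
            display_id = winreg.EnumKey(base, i)       # one subkey per display
        except OSError:
            break
        i += 1
        print(display_id)
        with winreg.OpenKey(base, display_id) as key:
            j = 0
            while True:
                try:
                    name, value, _ = winreg.EnumValue(key, j)
                except OSError:
                    break
                j += 1
                print("   ", name, "=", value)
```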

But why bother, when the quick and easy way to avoid all of this is simply using 10-bit?
160 Hz will still give slightly lower input lag than 144 Hz. The difference is rather minuscule, but it applies even if you run games at framerates lower than 144 fps.
The obvious reason is that the higher the refresh rate, the faster the screen is redrawn. You only gain something like 0.68 ms of input lag reduction at the bottom of the screen, and half that in the middle, so not much.
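For what it's worth, that figure is just the difference between the two refresh periods:

```python
# Quick check of the number above: the screen is scanned top to bottom
# once per refresh, so the saving at the bottom of the screen is roughly
# the difference in refresh periods, and about half of that mid-screen.
period_144 = 1000 / 144                 # ~6.94 ms per refresh
period_160 = 1000 / 160                 # 6.25 ms per refresh
print(period_144 - period_160)          # ~0.69 ms at the bottom of the screen
print((period_144 - period_160) / 2)    # ~0.35 ms mid-screen
```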

Another advantage is movie playback. You pretty much cannot avoid judder, but the higher the refresh rate, the less noticeable it becomes. This is why 60 Hz monitors suck even for normal desktop usage. You would think 60 fps YouTube videos in particular would run correctly on a 60 Hz monitor, but of course they do not :C

Lastly, the mouse will be slightly smoother.

BTW, I am not sure if you are aware of this, but for VRR (G-Sync/FreeSync) your games should run at framerates below the monitor's refresh rate, at least 3-4 fps less, so 140 fps for 144 Hz and 156 fps for 160 Hz. It is of course harder to hit that ceiling the higher your framerate is. This can be achieved with frame rate limiters. I am not sure how the Nvidia frame rate limiter performs, as I have not tested it or read about it.

tl;dr
If any of this seems too complicated, I'd say the safer setting from an image quality point of view is 144 Hz 10-bit, just to avoid potential banding.
Otherwise, if you did not change any color settings, use 160 Hz 8-bit and enable "Override to reference mode" on the "Adjust desktop color settings" page, which should be slightly better for games, movies and the mouse cursor.

PS: I have no idea if you actually need 10-bit for HDR or not, as I do not have an HDR monitor. It might still be preferred if your monitor has a DCI-P3 gamut... and if it does, you should enable the sRGB emulation mode in your monitor to avoid an over-saturated image.
 
Is there an app to enable dithering without trudging through the registry? To me it is totally asinine for Nvidia to do that. Why not put that ability in the Nvidia control panel? Makes no sense. What is the best way to enable dithering?

Both my panels are 10-bit panels, so I just figured I'd enable 10-bit on my 2070. I have ICC profiles installed for both monitors. Unreal that shit that should be simple as hell is still hidden and out of sight in 2021.
 