Adjusting color via drivers vs monitor

Enhanced Interrogator

So, if you were to lower contrast via CCC (Catalyst Control Center) because your whites are too bright, are you losing information in the sub-white zone? In other words, are some of the steps on the 0-255 RGB scale being merged?

What are the pros/cons of doing it on your monitor's OSD instead?
 
Even though this is probably irrelevant to your question...

If I'm using my own screens, I use the OSD to adjust brightness, contrast, and the rest. If it's someone else's screen, I just use the video drivers to adjust those settings. I do find the drivers better for lightly tweaking brightness/colour, but I have to redo it every time I update the drivers.
 
Yeah, I'm not noticing any banding.

I'm just wondering if any of you know how the color adjustment really works. Over DVI, you get a 24-bit signal, 0-255 per RGB channel. I'm wondering how reducing the contrast from 100 to 90 doesn't crush detail somewhere in there. I guess it just makes the differences between colors in the upper range smaller? I'd think that would cause banding somewhere in the gradient, but I'm not seeing it.
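
Here's a quick Python sketch of what I mean (purely illustrative, not anything a driver actually runs): if you just scale an 8-bit ramp by 90% and round back to integers, some neighboring input levels land on the same output level, which is exactly the kind of merging I'd expect to see as banding.

```python
# Purely illustrative: naive 8-bit contrast scaling, not any driver's real code.
scale = 0.9  # contrast slider lowered from 100 to 90

ramp = list(range(256))                    # full 8-bit grayscale ramp
scaled = [round(v * scale) for v in ramp]  # scale and round back to 8-bit

unique = len(set(scaled))
print(f"{unique} distinct output levels, so {256 - unique} steps got merged")
# -> 231 distinct output levels, so 25 input steps got merged
```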
 
EnhancedInterrogator said:
So, if you were to lower contrast via CCC, because your whites are too bright, are you losing information in the sub-white zone? In other words, are some of the steps on the 0-255 RGB scale being merged?
With NVIDIA, yes. With AMD, no. AMD uses a higher precision lookup table and temporal dithering to emulate more shades, which is why you don't see any banding.
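
Roughly, the idea looks like this (a toy sketch of error-diffusion-style temporal dithering, not AMD's actual implementation): the driver holds a higher-precision target shade and alternates nearby 8-bit levels frame to frame, so the eye averages them into an in-between shade.

```python
# Toy sketch of temporal dithering (my illustration, not AMD's actual code):
# keep a higher-precision target shade and alternate nearby 8-bit levels
# across frames so their average converges on the target.
def temporal_dither(target: float, frames: int) -> list[int]:
    levels, error = [], 0.0
    for _ in range(frames):
        out = round(target + error)   # fold accumulated error into this frame
        error += target - out         # carry the leftover to the next frame
        levels.append(out)
    return levels

seq = temporal_dither(100.25, 8)
print(seq, sum(seq) / len(seq))  # averages out to exactly 100.25 over 8 frames
```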

EnhancedInterrogator said:
What are the pros/cons of doing it on your monitor's OSD instead?
That depends on the monitor. Some monitors also do temporal dithering, while others will just duplicate shades and cause banding. If the monitor's contrast setting doesn't produce banding, then it's best to use that because some full screen games won't use the video card's color settings.
 
/Sigh

You just sold me a video card. Go back to your genius think tank, Toasty! <3
 
Thanks Toasty, that's the kind of information I was looking for.

I didn't mention my setup at first because it's kind of unique and would have just added confusion to the conversation. It's a 290X > DVI > HDFury 1 DAC > Samsung CRT. Somewhere along the line, the gamma is thrown off a little. So I found that if I crank the monitor's contrast to max, lower the contrast in CCC so the whites aren't blinding, and give a slight gamma boost in CCC, I get the best picture. I still need to hook up a card with a built-in DAC to compare, though.
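
For anyone curious, here's a rough model of what I think my CCC settings are doing to each channel before the signal hits the DAC (my guess at the math, not AMD's actual pipeline): the contrast cut scales everything toward black, then the gamma boost lifts the midtones back up.

```python
# Rough model of my CCC settings (my guess at the math, not AMD's pipeline):
# contrast scales toward black, then a gamma boost lifts the midtones back up.
def driver_lut(v: int, contrast: float = 0.9, gamma: float = 1.1) -> int:
    x = v / 255.0
    x *= contrast           # lowered contrast: whites top out below full blast
    x **= 1.0 / gamma       # slight gamma boost: brightens the midtones
    return round(x * 255.0)

print([driver_lut(v) for v in (0, 64, 128, 192, 255)])
# whites now peak around 232 instead of 255
```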
 