Tip for nVidia users using HDMI: getting an accurate color format

BatJoe

Gawd
Joined
Apr 4, 2012
Messages
836
This is only for nVidia users who are using HDMI on their display.

If your display supports Full RGB, which most do, you may notice you aren't getting the right colors: the blacks don't seem so black and the whites don't seem so white. This is because, even though your digital color format under Display->Adjust Desktop Color Settings is set to RGB, it's actually Limited RGB by default, not Full, and you'll see there is no option for Full RGB. Don't use YCbCr444 either, as it oversaturates reds and magentas.
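To make it concrete, here's a rough sketch (illustrative only, just simple linear scaling in Python, not nVidia's actual pipeline) of what a display expecting full-range 0-255 sees when the card quietly sends limited-range 16-235:

Code:
# Illustrative only: what happens when the GPU encodes to limited range (16-235)
# but the display interprets the signal as full range (0-255).

def full_to_limited(level):
    """Map a full-range level (0-255) into limited range (16-235), linear scaling."""
    return round(16 + level * 219 / 255)

for name, level in [("black", 0), ("mid gray", 128), ("white", 255)]:
    sent = full_to_limited(level)
    # A display in "high"/full-range mode shows the value as-is,
    # so black arrives at 16 (dark gray) and white at 235 (dim white).
    print(f"{name:8s} intended {level:3d} -> sent {sent:3d}")

Black never gets darker than 16 and white never gets brighter than 235, which is exactly the washed-out look described above.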

Full RGB is available only under Video->Adjust Video Color Settings, but that setting only affects videos, not your games, applications, etc.

The trick to enable Full RGB for everything is a registry hack, and luckily there's a tool that will do it for you. Find it here

You must use this tool every time you install a new driver.
Next, complain to nVidia to make this an option in future drivers here:
http://surveys.nvidia.com/index.jsp?pi=6e7ea6bb4a02641fa8f07694a40f8ac6
 
Yup, I've posted about this many times. However, there are still some who don't realize it. Nvidia does NOT output RGB properly over HDMI. AMD doesn't either, but their drivers give you a toggle to correct it. I've been reporting this to Nvidia with every driver release for a while now.
 
Yes, very helpful for those who don't know. It makes colors look like crap when not set correctly.
 
Yes, I can't believe this hasn't been resolved yet. Do nVidia engineers even use their own product?!
 
That they don't add an option for this is utterly baffling. Do they not want their cards to be taken seriously for video and color? If you don't have full RGB output to the TV, it creates tons of banding as it compresses the colors into 16-235. My TV detects when a PC is connected and sets the HDMI black level to "high", which means it expects a 0-255 signal. When the Nvidia card instead sends out 16-235, everything looks washed out and terrible, so whoever at Nvidia designed this behavior for novices clearly hasn't plugged a card into a TV to see how it behaves in reality.
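For anyone curious how much actually gets lost, here's a quick back-of-the-envelope sketch (again just simple linear scaling in Python, not the driver's real math): squeezing 256 input levels into the 220 codes of 16-235 forces neighboring levels to share a code, which is where the banding in smooth gradients comes from.

Code:
# Rough illustration of the banding complaint: 256 full-range levels have to fit
# into the 220 codes of limited range (16-235), so some neighbors collapse together.

codes = {round(16 + v * 219 / 255) for v in range(256)}
print(len(codes), "distinct output levels out of 256 inputs")  # 220

# Adjacent inputs that land on the same output code become visible bands
# once the display stretches the signal back out to full range.
collisions = [v for v in range(255)
              if round(16 + v * 219 / 255) == round(16 + (v + 1) * 219 / 255)]
print(len(collisions), "pairs of neighboring levels merged")  # 36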
 
The problem is magnified by so many new monitors using HDMI/VGA with no DVI option (Samsung's new monitors and BenQ as well). I've seen this behavior on several Samsung and BenQ monitors this past year, and in every case I had to modify the driver to make it work (before the utility from the OP) or set the monitor to the lower setting and live with it.
 