1080p RGB (Full Range) 8/10-bit + HDR?

1. Does any TV support 10-bit RGB full range at 1080p? So far I have only seen 10-bit in YCbCr444 with limited range even at 1080p - RGB was always locked to 8-bit.
2. Is HDR metadata even compatible with RGB, or does it have to be YCbCr444 with limited range?

This is for desktop usage / gaming at 1080p and not 4K movies (where it's better to switch the display to 4K@60 10-bit YCbCr420 anyway).
 
There is effectively no visual difference between RGB and YCbCr444. Over HDMI, YCbCr is how televisions generally accept 10- and 12-bit color, which is why RGB tends to be locked to 8-bit on them: with chroma subsampling (4:2:2 or 4:2:0), YCbCr doesn't need as much bandwidth as RGB or full 4:4:4.
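For a rough sense of the numbers, here's a back-of-the-envelope sketch of the bandwidth arithmetic (my own approximation using nominal CTA-861 timings and TMDS clock limits, not a spec-accurate model): at 1080p60 even 12-bit RGB fits comfortably, while at 4K60 anything above 8-bit RGB/4:4:4 needs chroma subsampling.

```python
# Rough HDMI TMDS clock arithmetic (nominal figures; assumptions, not a spec model).
# Approximate limits: ~340 MHz for HDMI 1.4, ~600 MHz for HDMI 2.0.

PIXEL_CLOCK_MHZ = {        # CTA-861 total timings
    "1080p60": 148.5,      # 2200 x 1125 x 60
    "2160p60": 594.0,      # 4400 x 2250 x 60
}

def tmds_clock_mhz(mode: str, fmt: str, bpc: int) -> float:
    """Approximate TMDS character rate needed for a given mode/format/bit depth."""
    px = PIXEL_CLOCK_MHZ[mode]
    if fmt in ("RGB", "YCbCr444"):
        return px * bpc / 8          # deep color scales the clock directly
    if fmt == "YCbCr422":
        return px                    # up to 12-bit packs into the same 24 bits/pixel
    if fmt == "YCbCr420":
        return px / 2 * bpc / 8      # two pixels per TMDS character period
    raise ValueError(fmt)

for mode, fmt, bpc in [("1080p60", "RGB", 12),
                       ("2160p60", "RGB", 8),
                       ("2160p60", "RGB", 10),
                       ("2160p60", "YCbCr420", 10),
                       ("2160p60", "YCbCr422", 12)]:
    mhz = tmds_clock_mhz(mode, fmt, bpc)
    status = "fits" if mhz <= 600 else "exceeds"
    print(f"{mode} {fmt} {bpc}-bit -> {mhz:.2f} MHz ({status} HDMI 2.0's ~600 MHz)")
```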

The way YCbCr is encoded, values below 16 and above 235 are reserved as footroom and headroom, so YCbCr is effectively always "limited" range. Instead of carrying absolute values in each color channel, YCbCr carries a luma signal plus two color-difference signals derived from the blue and red channels, and the video standards built around it reserve those extra codes for overshoots. Some devices have a "superwhite" function that extends the usable range from 16-235 to 16-240. If your graphics driver settings match the display, you won't get clipping errors such as black crush, because the values are translated correctly when decoded.
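For reference, a minimal sketch of the encoding using the BT.709 coefficients (the exact matrix and rounding depend on the standard in use) shows where the 16-235 and 16-240 codes come from:

```python
# Full-range 8-bit RGB -> limited-range 8-bit YCbCr, BT.709 coefficients.
# Illustrative only; real converters also handle gamma, chroma siting, etc.

KR, KB = 0.2126, 0.0722            # BT.709 luma weights for red and blue
KG = 1.0 - KR - KB                 # green weight

def rgb_to_ycbcr8(r, g, b):
    """r, g, b are full-range 0-255; returns limited-range 8-bit Y', Cb, Cr."""
    rn, gn, bn = r / 255.0, g / 255.0, b / 255.0
    y  = KR * rn + KG * gn + KB * bn          # luma, 0..1
    cb = (bn - y) / (2 * (1 - KB))            # blue-difference, -0.5..0.5
    cr = (rn - y) / (2 * (1 - KR))            # red-difference, -0.5..0.5
    # Limited-range quantization: luma occupies 16-235, chroma 16-240,
    # leaving footroom/headroom below and above for overshoots.
    return (round(16 + 219 * y),
            round(128 + 224 * cb),
            round(128 + 224 * cr))

print(rgb_to_ycbcr8(0, 0, 0))        # black -> (16, 128, 128)
print(rgb_to_ycbcr8(255, 255, 255))  # white -> (235, 128, 128)
print(rgb_to_ycbcr8(255, 0, 0))      # red   -> (63, 102, 240)
```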

HDR can be transmitted using RGB with monitors that support it. You're not going to find it in any television, though.

Most games actually have their LUT baked for limited range, especially if there is a console version. You can read more about that here:
https://reshade.me/forum/shader-presentation/2948-extended-levels-w-b-point-histogram-port
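The remap such a levels shader applies is essentially a black/white point stretch. A hypothetical illustration (not the linked shader's actual code) of mapping a limited-range image back out to full range:

```python
# Hypothetical levels stretch: remap 16-235 (limited range) to 0-255 (full range).

def stretch_levels(value, in_black=16, in_white=235):
    """value is an 8-bit channel sample; output is remapped to 0-255 and clamped."""
    out = (value - in_black) * 255.0 / (in_white - in_black)
    return max(0, min(255, round(out)))

print(stretch_levels(16))    # 0   (limited-range black becomes full black)
print(stretch_levels(235))   # 255 (limited-range white becomes full white)
print(stretch_levels(128))   # 130 (mid grey shifts up slightly)
```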
 
I'm worried about YCbCr444 in desktop applications like the browser. In 8-bit mode, will it cause more banding than RGB full range? I assume in 10-bit mode it will be a non-issue, but what if the application is only 8-bit aware?
 
I can select RGB 12-bit and YCbCr444 12-bit on my LG 42LD450N. Dunno which is better though.
 
Both 10-bit RGB and YCbCr are broken on LG OLED / Nvidia, but 12-bit works fine. There's no need to use YCbCr at all on an HTPC since it's an unnecessary conversion: the video gets converted to RGB somewhere in the chain anyway, so it's better to output RGB.

RGB 12-bit is only selectable at 30 Hz and below, but the Nvidia driver doesn't remember the setting when changing refresh rates, so when the video player switches to 24 Hz it uses 8-bit. The only way to get 12-bit is to manually switch to 12-bit 24 Hz before playback.

However, it seems that 8-bit with madVR's high-quality dithering is actually superior to 10-bit input on most TVs, for both BT.2020 and HDR. I compared 8-bit with dithering to 12-bit input on the TV and could not tell the difference in a 16-bit gradient test image. Switching off dithering at 8-bit resulted in wider banding.
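To see why, here's a toy sketch (simple noise dithering under my own assumptions, not madVR's actual error-diffusion): quantizing a shallow gradient to 8-bit without dither leaves wide flat bands, while adding sub-LSB noise before rounding breaks them up into something the eye averages out.

```python
# Quantize a shallow gradient to 8-bit with and without dither and compare band widths.
import numpy as np

rng = np.random.default_rng(0)
gradient = np.linspace(100.0, 104.0, 3840)   # a ramp spanning only ~4 8-bit codes

plain    = np.round(gradient)                # straight quantization -> wide bands
dithered = np.round(gradient + rng.uniform(-0.5, 0.5, gradient.size))  # noise dither

def longest_band(codes):
    """Length of the longest run of identical output codes (a visible band)."""
    longest = run = 1
    for a, b in zip(codes[:-1], codes[1:]):
        run = run + 1 if a == b else 1
        longest = max(longest, run)
    return longest

print("longest band without dither:", longest_band(plain))     # ~960 pixels
print("longest band with dither:   ", longest_band(dithered))  # a few pixels
```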
 