RGB vs YCbCr444 in Nvidia Control Panel

Geezus (Limp Gawd, joined Apr 9, 2018, 420 messages)
I'm connecting my Gigabyte 15x to my LG OLED B9.

Is there any difference between the two? Any benefits using one over the other?
 
You're going to need to be a lot more specific in your question, because it doesn't make any sense as it is.

RGB what? Analog? Digital? What bit depth?

4:4:4 what? YCbCr? YPbPr? YCoCg?
 
You're going to need to be a lot more specific in your question, because it doesn't make any sense as it is.

RGB what? Analog? Digital? What bit depth?

4:4:4 what? YCbCr? YPbPr? YCoCg?
I'm new to using a TV as a display, so I'm kinda in new territory. These are the options I'm given. Does it make a difference in image quality? In RGB I have the options of Limited and Full. YCbCr444 only gives me the option of Limited. There is also YCbCr422, but I can already tell that option makes it look washed out.

Basically I'm asking what the optimal settings would be given the hardware and color settings I have. My Gigabyte 15x has a 1070MQ. Lemme know if you need any further info.

I also wasn't sure if this is more of a video card issue, since it's in the Nvidia control panel, or if it should be in the Display subforum.
[Screenshots of the Nvidia Control Panel output color format and output dynamic range options]
 
I always use RGB 4:4:4 Pixel Format (Full RGB) but I have an AMD card.
 
RGB Full. Even though that's 4:4:4, it will be 8-bit, since you need HDMI 2.1 to go above that. Still better than the other options at this stage, until HDMI 2.1 arrives.
 
RGB Full. Even though that's 4:4:4, it will be 8-bit, since you need HDMI 2.1 to go above that. Still better than the other options at this stage, until HDMI 2.1 arrives.
Thanks. Side question: my LG B9 has HDMI 2.1. Since Nvidia only supports G-Sync on my LG B9 with RTX cards, and I only have 10-series cards (1080 Ti and 1070MQ), would I benefit from anything else by getting a current-gen RTX? As far as 10- or 12-bit color, or 4K 120 Hz? Also, is there anything rumored on the 3000-series cards that would benefit from 2.1 that the 2000 series doesn't have?
 
Thanks. Side question: my LG B9 has HDMI 2.1. Since Nvidia only supports G-Sync on my LG B9 with RTX cards, and I only have 10-series cards (1080 Ti and 1070MQ), would I benefit from anything else by getting a current-gen RTX? As far as 10- or 12-bit color, or 4K 120 Hz? Also, is there anything rumored on the 3000-series cards that would benefit from 2.1 that the 2000 series doesn't have?

Not yet; the 3000 series will likely have HDMI 2.1 support. The only thing you'd benefit from now is better frame rates with a 2080 Super/2080 Ti at 4K.
 
I'm new to using a TV as a display, so I'm kinda in new territory. These are the options I'm given. Does it make a difference in image quality? In RGB I have the options of Limited and Full. YCbCr444 only gives me the option of Limited. There is also YCbCr422, but I can already tell that option makes it look washed out.

There's basically no difference between RGB 4:4:4 and YCbCr 4:4:4 IF the latter supports full range. When it doesn't, you're limited to a color range of 16-235 instead of 0-255. But you'll always want to use RGB on computer monitors, because it's been the standard since forever. TVs, on the other hand, have used a range of different color spaces due to the different formats they have to support, and may not support true RGB.

YCbCr 4:2:2 and 4:2:0 shouldn't be used unless that's the only thing the display supports. If you have to use these modes, it's probably a good time to look for a new display.
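A quick sketch of the limited vs full range math referred to above (my own illustration, not from the thread); these are the standard 8-bit video-level mappings:

```python
# Standard 8-bit mappings between PC "full range" (0-255) and
# video "limited range" (16-235) levels.

def limited_to_full(v):
    """Expand a limited-range (16-235) code value to full range (0-255)."""
    return round((v - 16) * 255 / 219)

def full_to_limited(v):
    """Squeeze a full-range (0-255) code value into limited range (16-235)."""
    return round(v * 219 / 255) + 16

# When the GPU output range and the TV's black level setting match,
# limited black (16) ends up displayed as true black and limited
# white (235) as full white.
assert limited_to_full(16) == 0
assert limited_to_full(235) == 255

# When they don't match (e.g. GPU sends limited but the TV expects
# full range), code 16 is displayed as dark gray instead of black,
# which is the "washed out" look described in the posts above.
```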
 
Not yet; the 3000 series will likely have HDMI 2.1 support. The only thing you'd benefit from now is better frame rates with a 2080 Super/2080 Ti at 4K.
Absolutely. Plus I'm missing out on G-Sync, because the LG OLEDs only support it with RTX cards. Sounds like I now need a new video card to use my display to its full potential, which is the opposite of how it's usually been.
 
Absolutely. Plus I'm missing out on G-Sync, because the LG OLEDs only support it with RTX cards. Sounds like I now need a new video card to use my display to its full potential, which is the opposite of how it's usually been.

Didn't realize they locked it down to Turing only... that's dumb.
 
I thought you could force it on with the G-Sync options even with a non-G-Sync monitor? I have a Freesync v1 monitor that I've been able to use with it.
G-Sync over HDMI on LG OLED TVs seems to be locked down to Turing only, apparently, after looking it up when I read his post.
 
YCbCr uses limited-range color; that's simply how the color space works. If you use a YCbCr format, make sure the black level on your TV is set accordingly. RGB can be set to limited or full range, so again, set your black level accordingly.
 
G-Sync over HDMI on LG OLED TVs seems to be locked down to Turing only, apparently, after looking it up when I read his post.
Oh yeah... I forgot that NVIDIA limited it to DisplayPort for the force G-Sync option, which I'm using.
 
G-Sync over HDMI on LG OLED TVs seems to be locked down to Turing only, apparently, after looking it up when I read his post.
Yeah, kinda sucks, because that was one (but not the main) reason I bought this TV. It doesn't look like it's a Pascal limitation, but something Nvidia could get to work, because the hardware is capable on both the LG OLED and Pascal cards.
 
YCbCr uses limited-range color; that's simply how the color space works. If you use a YCbCr format, make sure the black level on your TV is set accordingly. RGB can be set to limited or full range, so again, set your black level accordingly.

When you use YCbCr, most TVs autodetect the correct black level, so it's a pretty much foolproof method for people who aren't savvy with stuff like this.

That said, for PC use you want RGB, because most PC content is RGB. This way you avoid unnecessary back-and-forth conversions. But for the Blu-ray player in your home theater, you want YCbCr if it's in the settings, because that's the format movies are in. Again, to avoid unnecessary conversions.
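To illustrate the back-and-forth conversion point, here's a rough Python sketch of an RGB ↔ YCbCr round trip (BT.709 coefficients assumed; this is my own illustration, not something from the posts):

```python
# BT.709 RGB <-> YCbCr conversion, working on full-range values.

def rgb_to_ycbcr(r, g, b):
    """Convert RGB to Y'CbCr using BT.709 luma coefficients."""
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    cb = (b - y) / 1.8556
    cr = (r - y) / 1.5748
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr):
    """Inverse of rgb_to_ycbcr (BT.709)."""
    r = y + 1.5748 * cr
    b = y + 1.8556 * cb
    g = (y - 0.2126 * r - 0.0722 * b) / 0.7152
    return r, g, b

# In floating point the round trip is essentially lossless:
r, g, b = ycbcr_to_rgb(*rgb_to_ycbcr(200, 100, 50))
assert abs(r - 200) < 1e-6 and abs(g - 100) < 1e-6 and abs(b - 50) < 1e-6

# Real hardware, however, quantizes to 8-bit integers at each step,
# so every extra RGB <-> YCbCr hop can add small rounding errors --
# the "unnecessary conversions" the post above says to avoid.
```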
 
What happens when you play a Blu-ray on your PC and have RGB enabled with full range? Will the black level be artificially enhanced in the movie?
The black level options on the LG B9 are Low, High, and Auto. If I leave it on Auto and switch from Full to Limited, it automatically picks the right one.

If I set it to Limited in the NV control panel and set the black level to High, it looks washed out (as it should).
 
I'm connecting my Gigabyte 15x to my LG OLED B9.

Is there any difference between the two? Any benefits using one over the other?
The madVR FAQ has a nice section on this:

Windows internally always "thinks" in RGB 0-255. Windows considers black to be 0 and white to be 255. That applies to the desktop, applications, games and videos. Windows itself never really thinks in terms of YCbCr or 16-235. Windows does know that videos might be YCbCr or 16-235, but still, all rendering is always done at RGB 0-255. (The exception proves the rule.)

So if you switch your GPU control panel to RGB 0-255, the GPU receives RGB 0-255 from Windows, and sends RGB 0-255 to the TV. Consequently, the GPU doesn't have to do any colorspace (RGB -> YCbCr) or range (0-255 -> 16-235) conversions. This is the best setup, because the GPU won't damage our precious pixels.

If you switch your GPU control panel to RGB 16-235, the GPU receives RGB 0-255 from Windows, but you ask the GPU to send 16-235 to the TV. Consequently, the GPU has to stretch the pixel data behind Windows' back in such a way that a black pixel is no longer 0, but now 16. And a white pixel is no longer 255, but now 235. So the pixel data is condensed from 0-255 to 16-235, and all the values between 0-15 and 236-255 are basically unused. Some GPU drivers might do this in high bitdepth with dithering, which may produce acceptable results. But some GPU drivers definitely do this in 8bit without any dithering which will introduce lots of nasty banding artifacts into the image. As a result I cannot recommend this configuration.

If you switch your GPU control panel to YCbCr, the GPU receives RGB from Windows, but you ask the GPU to send YCbCr to the TV. Consequently, the GPU has to convert the RGB pixels behind Windows' back to YCbCr. Some GPU drivers might do this in high bitdepth with dithering, which may produce acceptable results. But some GPU drivers definitely do this in 8bit without any dithering which will introduce lots of nasty banding artifacts into the image. Furthermore, there are various different RGB <-> YCbCr matrixes available. E.g. there's one each for BT.601, BT.709 and BT.2020. Now which of these will the GPU use for the conversion? And which will the TV use to convert back to RGB? If the GPU and the TV use different matrixes, color errors will be introduced. As a result I cannot recommend this configuration.

Summed up: In order to get the best possible image quality, I strongly recommend to set your GPU control panel to RGB Full (0-255).
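The banding the FAQ warns about can be shown numerically. A small sketch of my own, assuming the standard 8-bit squeeze formula for 0-255 → 16-235:

```python
# Squeezing full-range 0-255 into limited-range 16-235 in plain
# 8-bit, with no dithering: 256 input levels must share only 220
# output codes, so some neighboring levels collapse together.

def squeeze(v):
    """Map a full-range (0-255) code value to limited range (16-235)."""
    return round(v * 219 / 255) + 16

outputs = [squeeze(v) for v in range(256)]

# Only 220 distinct output codes (16..235) exist for 256 inputs...
assert len(set(outputs)) == 220

# ...so exactly 36 pairs of adjacent input levels become identical
# after the squeeze -- visible as banding in smooth gradients.
collisions = sum(1 for a, b in zip(outputs, outputs[1:]) if a == b)
assert collisions == 36
```

Dithering hides this by spreading the rounding error across neighboring pixels, which is why the FAQ says high-bitdepth conversion with dithering "may produce acceptable results" while plain 8-bit conversion does not.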
 