I'm new to using a TV as a display, so I'm kinda in new territory. These are the options I'm given. Does it make a difference in image quality? In RGB I have the options of Limited and Full. YCbCr444 only gives me the option of Limited. There is also YCbCr422, but I can already tell that option makes it look washed out.

You're going to need to be a lot more specific in your question, because it doesn't make any sense as it is.
RGB what? Analog? Digital? What bit depth?
4:4:4 what? YCbCr? YPbPr? YCoCg?
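As an aside on the washed-out look of the 4:2:2 option mentioned above: in 4:2:2 the two chroma planes are stored at half horizontal resolution. A toy sketch (not how any real encoder is implemented, just the averaging idea):

```python
import numpy as np

# Toy illustration of 4:4:4 vs 4:2:2: in 4:2:2 each chroma plane is stored
# at half horizontal resolution, so fine colored detail is averaged with
# its neighbor and comes back smeared.
def subsample_422(cb):
    """cb: one chroma plane, shape (h, w) with w even."""
    pairs = cb.reshape(cb.shape[0], -1, 2)
    half = pairs.mean(axis=2)              # average each horizontal pair
    return np.repeat(half, 2, axis=1)      # upsample back for display

row = np.array([[200.0, 40.0, 200.0, 40.0]])  # alternating chroma values
print(subsample_422(row))  # -> [[120. 120. 120. 120.]]: the detail is gone
```

Luma (brightness) stays at full resolution in 4:2:2, which is why text edges survive but their color smears.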
RGB Full. Even though that's 4:4:4, it will be 8-bit, since you need HDMI 2.1 to get above that. Still better than the other options at this stage, until HDMI 2.1 arrives.

Thanks. Side question: my LG B9 has HDMI 2.1. Since NVIDIA only supports G-Sync for my LG B9 and I only have 10-series cards (1080 Ti and 1070MQ), would I benefit from anything else by getting a current-gen RTX, as far as 10- or 12-bit or 4K 120 Hz? Also, is there anything rumored on the 3000-series cards that benefits from 2.1 that the 2000 series doesn't have?
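The "8-bit until HDMI 2.1" point in the reply above can be checked with quick arithmetic. This sketch assumes the standard CTA-861 4K60 timing (4400x2250 total pixels including blanking) and HDMI 2.0's 18 Gbit/s aggregate TMDS rate, which includes 8b/10b encoding overhead:

```python
# Rough TMDS bandwidth check for 4K60 over HDMI 2.0.
# Assumptions: CTA-861 4K60 timing of 4400x2250 total pixels (with blanking),
# and HDMI 2.0's 18 Gbit/s aggregate limit, which counts 8b/10b overhead.
def tmds_gbps(h_total, v_total, fps, bits_per_channel, channels=3):
    pixel_clock = h_total * v_total * fps               # Hz
    payload = pixel_clock * channels * bits_per_channel # data bits/s
    return payload * 10 / 8 / 1e9                       # 8b/10b -> Gbit/s on the wire

HDMI_2_0_LIMIT = 18.0  # Gbit/s across the three TMDS lanes

for depth in (8, 10, 12):
    rate = tmds_gbps(4400, 2250, 60, depth)
    verdict = "fits" if rate <= HDMI_2_0_LIMIT else "needs HDMI 2.1"
    print(f"4K60 RGB {depth}-bit: {rate:.2f} Gbit/s -> {verdict}")
```

8-bit 4K60 RGB lands at about 17.8 Gbit/s, just under the limit; 10-bit does not fit, which is why the GPU caps you at 8-bit over HDMI 2.0.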
Not yet; the 3000 series will likely have HDMI 2.1 support. The only thing you'd benefit from is better frame rates with a 2080 Super/2080 Ti at 4K.

Absolutely. Plus I'm missing out on G-Sync, because the LG OLEDs only support it with RTX cards. Sounds like I now need a new video card to use my display to its full potential, which is the opposite of how it's usually been.
The G-Sync through HDMI on LG OLED TVs seems to be locked down to Turing only, apparently, after looking it up when I read his post.

Didn't realize they locked it down to Turing only.. that's dumb..

I thought you could force it on with the G-Sync options even with a non-G-Sync monitor? I have a FreeSync v1 monitor that I've been able to use with it.

Oh yeah... I forgot that NVIDIA limited the force G-Sync option to DisplayPort, which is what I'm using.

Yeah, kinda sucks, because that was one (but not the main) reason I bought this TV. It doesn't look like a Pascal limitation, but something NVIDIA could get working, since the hardware is capable on both the LG OLED and Pascal cards.

Pascal limitation? They simply want you to buy a new card.

Haha, I guess my $700 card wasn't enough for them.
YCbCr uses limited range color. This is how the color space works. If you use a YCbCr format then make sure the black level on your TV is set accordingly. RGB can be set to limited or full range, so again, set your black level accordingly.
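What "set your black level accordingly" means can be sketched numerically. Suppose (hypothetically) the PC sends full-range RGB (black = 0, white = 255) but the TV decodes it as limited range (black = 16, white = 235):

```python
# Sketch of a range mismatch: the PC sends full-range RGB (black = 0) but
# the TV's black level is set for limited range (black = 16). Everything
# from 0-16 clips to black ("crushed shadows").
def tv_decode_limited(v):
    """Displayed brightness, 0.0..1.0, for an 8-bit input code v."""
    return min(max((v - 16) / 219, 0.0), 1.0)

for v in (0, 8, 16, 235, 255):
    print(v, round(tv_decode_limited(v), 3))
```

Input codes 0 through 16 all display as the same black, so any shadow detail in that range is simply gone; the matching mismatch in the other direction makes blacks look grey instead.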
The black level options on the LG B9 TV are Low, High, and Auto. If I leave it on Auto and switch from Full to Limited, it automatically picks the right one.

What happens when you play a Blu-ray on your PC with RGB enabled at full range? Will the black level be artificially enhanced in the movie?
I'm connecting my Gigabyte 15x to my LG OLED B9. Is there any difference between the two? Any benefits to using one over the other?

The madVR FAQ has a nice section on this:
Windows internally always "thinks" in RGB 0-255. Windows considers black to be 0 and white to be 255. That applies to the desktop, applications, games and videos. Windows itself never really thinks in terms of YCbCr or 16-235. Windows does know that videos might be YCbCr or 16-235, but still, all rendering is always done at RGB 0-255. (The exception proves the rule.)
So if you switch your GPU control panel to RGB 0-255, the GPU receives RGB 0-255 from Windows, and sends RGB 0-255 to the TV. Consequently, the GPU doesn't have to do any colorspace (RGB -> YCbCr) or range (0-255 -> 16-235) conversions. This is the best setup, because the GPU won't damage our precious pixels.
If you switch your GPU control panel to RGB 16-235, the GPU receives RGB 0-255 from Windows, but you ask the GPU to send 16-235 to the TV. Consequently, the GPU has to stretch the pixel data behind Windows' back in such a way that a black pixel is no longer 0, but now 16. And a white pixel is no longer 255, but now 235. So the pixel data is condensed from 0-255 to 16-235, and all the values between 0-15 and 236-255 are basically unused. Some GPU drivers might do this in high bitdepth with dithering, which may produce acceptable results. But some GPU drivers definitely do this in 8bit without any dithering which will introduce lots of nasty banding artifacts into the image. As a result I cannot recommend this configuration.
If you switch your GPU control panel to YCbCr, the GPU receives RGB from Windows, but you ask the GPU to send YCbCr to the TV. Consequently, the GPU has to convert the RGB pixels behind Windows' back to YCbCr. Some GPU drivers might do this in high bitdepth with dithering, which may produce acceptable results. But some GPU drivers definitely do this in 8bit without any dithering which will introduce lots of nasty banding artifacts into the image. Furthermore, there are various different RGB <-> YCbCr matrixes available. E.g. there's one each for BT.601, BT.709 and BT.2020. Now which of these will the GPU use for the conversion? And which will the TV use to convert back to RGB? If the GPU and the TV use different matrixes, color errors will be introduced. As a result I cannot recommend this configuration.
Summed up: in order to get the best possible image quality, I strongly recommend setting your GPU control panel to RGB Full (0-255).
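Two of the failure modes in the quoted FAQ can be sketched numerically: the 0-255 to 16-235 squeeze losing levels (banding), and a BT.601/BT.709 matrix mismatch shifting colors. This is a toy sketch of the math, not any driver's actual code path:

```python
import numpy as np

# 1) The 0-255 -> 16-235 squeeze: 256 input levels are crammed into 220
#    output codes, so without dithering some neighboring inputs collapse
#    into the same output value -- visible as banding in gradients.
def full_to_limited(v):
    return 16 + int(v * 219 / 255 + 0.5)   # round half up, avoid ties

outputs = [full_to_limited(v) for v in range(256)]
levels_lost = 256 - len(set(outputs))
print(f"squeeze spans {min(outputs)}..{max(outputs)}, "
      f"{levels_lost} input levels merged away")

# 2) RGB <-> YCbCr matrix mismatch: encode with the BT.709 luma
#    coefficients, decode with BT.601, and a pure color comes back shifted.
#    Kr/Kb are the standard luma coefficients; values normalized 0..1.
def rgb_to_ycbcr(rgb, kr, kb):
    r, g, b = rgb
    y = kr * r + (1 - kr - kb) * g + kb * b
    return np.array([y, (b - y) / (2 * (1 - kb)), (r - y) / (2 * (1 - kr))])

def ycbcr_to_rgb(ycc, kr, kb):
    y, cb, cr = ycc
    r = y + 2 * (1 - kr) * cr
    b = y + 2 * (1 - kb) * cb
    g = (y - kr * r - kb * b) / (1 - kr - kb)
    return np.array([r, g, b])

BT709 = (0.2126, 0.0722)
BT601 = (0.299, 0.114)

pure_red = np.array([1.0, 0.0, 0.0])
mismatched = ycbcr_to_rgb(rgb_to_ycbcr(pure_red, *BT709), *BT601)
print("pure red after 709-encode/601-decode:", np.round(mismatched, 3))
```

With matching matrices the round trip is exact; with mismatched ones, pure red comes back dimmed and with negative (out-of-gamut) green, which is exactly the kind of color error the FAQ warns about.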