LG 48CX

Potentially stupid question: What's the "correct" setting for connecting to a PC via HDMI 2.0 @ 4k60:

A: RGB 8-bit, Full dynamic range
B: YCbCr 4:2:2 10-bit

On my B6, I find option A works better when the display is set to PC mode (which makes sense, since that tells the TV to expect an RGB signal), and I'm wondering if the same is true for the C9/CX series. When I try option B, I simply run into too many issues with black levels and text readability.
 
Potentially stupid question: What's the "correct" setting for connecting to a PC via HDMI 2.0 @ 4k60:

A: RGB 8-bit, Full dynamic range
B: YCbCr 4:2:2 10-bit

On my B6, I find option A works better when the display is set to PC mode (which makes sense, since that tells the TV to expect an RGB signal), and I'm wondering if the same is true for the C9/CX series. When I try option B, I simply run into too many issues with black levels and text readability.

Always go with Full range 444 whenever you can for PC usage. 10Bit 422 should only be used for HDR...but as I mentioned earlier, it is actually NOT required. You can simply keep the TV in RGB 8Bit 444 at all times and still run HDR that way too.
 
Potentially stupid question: What's the "correct" setting for connecting to a PC via HDMI 2.0 @ 4k60:

A: RGB 8-bit, Full dynamic range
B: YCbCr 4:2:2 10-bit

On my B6, I find option A works better when the display is set to PC mode (which makes sense, since that tells the TV to expect an RGB signal), and I'm wondering if the same is true for the C9/CX series. When I try option B, I simply run into too many issues with black levels and text readability.


Option B is poor for text readability, because you're limiting chroma resolution.

Chroma is only noticeable for high-contrast color changes (like dark-colored text on a white background).

You're less likely to notice the reduction in chroma in high-speed games, and this issue is so unlikely to show up during video playback that all DVD/Blu-ray/4K Blu-ray discs are already encoded with 4:2:0 chroma.

I'm not sure I would run HDR at 8-bit only, but that's on you to decide. I run my C7 in non-HDR mode 8-bit for desktop use, and only turn on HDR 10-bit if I'm gaming.
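If you want to see roughly how much data each chroma format actually carries, here's a quick back-of-the-envelope sketch in Python (illustrative only; real link bandwidth also depends on bit depth, blanking, and encoding overhead):

```python
# Rough arithmetic for how much pixel data each chroma format carries in a
# 3840x2160 frame. Illustrative only.

W, H = 3840, 2160
luma_samples = W * H                    # one Y (or R/G/B) sample per pixel

chroma_factor = {
    "4:4:4": 1.0,                       # Cb and Cr at full resolution
    "4:2:2": 0.5,                       # Cb/Cr halved horizontally
    "4:2:0": 0.25,                      # Cb/Cr halved both ways
}

for fmt, f in chroma_factor.items():
    total = luma_samples * (1 + 2 * f)  # one luma plane + two chroma planes
    print(f"{fmt}: {total / (3 * luma_samples):.0%} of the 4:4:4 sample count")
# 4:4:4: 100%, 4:2:2: 67%, 4:2:0: 50%
```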
 
Apparently every HDMI cable I own is cheap janky 18 Gbps crapola... so anybody testing the Club3D adapter, make sure you are using their 48 Gbps HDMI, otherwise you're probably gonna run into blanking and blackout issues.

My Club3D HDMI arrives Thursday and I will let y'all know if it solves the problems (which I think it will).
I think it was a mistake for Club3D to sell this adapter separately from their premium HDMI cable.
 
Apparently every HDMI cable I own is cheap janky 18 Gbps crapola... so anybody testing the Club3D adapter, make sure you are using their 48 Gbps HDMI, otherwise you're probably gonna run into blanking and blackout issues.

My Club3D HDMI arrives Thursday and I will let y'all know if it solves the problems (which I think it will).
I think it was a mistake for Club3D to sell this adapter separately from their premium HDMI cable.
HDMI cables are a crapshoot. I always chuckle to myself when people talk about how it is digital and it either works or it doesn't. Those people have not dealt with long runs or high bandwidth HDMI needs and experienced very inconsistent behavior or just flat out unexplainable stuff.

I ended up with a fiber optic cable for my theater projector run after giving up on several "high quality" long HDMI cables and that resolved all of my problems.
 
HDMI cables are a crapshoot. I always chuckle to myself when people talk about how it is digital and it either works or it doesn't. Those people have not dealt with long runs or high bandwidth HDMI needs and experienced very inconsistent behavior or just flat out unexplainable stuff.

I ended up with a fiber optic cable for my theater projector run after giving up on several "high quality" long HDMI cables and that resolved all of my problems.

Any 6'-10' fiber optic 48 Gbps cables available? I would like to order some more high bandwidth HDMI and THROW OUT all my old janky console cables.
 
Any 6'-10' fiber optic 48 Gbps cables available? I would like to order some more high bandwidth HDMI and THROW OUT all my old janky console cables.

As far as I know, they really don't have HDMI fiber optic cables that short at this time.

This is my go to optical cable: https://www.monoprice.com/product?p_id=13700

I don't think anyone has 48 Gbps optical yet.

I see some available on Amazon, but it depends on how much you trust these no-name vendors.
 
Option B is poor for text readability, because you're limiting chroma resolution.

Chroma is only noticeable for high-contrast color changes (like dark-colored text on a white background).

You're less likely to notice the reduction in chroma in high-speed games, and this issue is so unlikely to show up during video playback that all DVD/Blu-ray/4K Blu-ray discs are already encoded with 4:2:0 chroma.

I'm not sure I would run HDR at 8-bit only, but that's on you to decide. I run my C7 in non-HDR mode 8-bit for desktop use, and only turn on HDR 10-bit if I'm gaming.

That's kinda what I figured; I'm working from home at the moment doing code development, so I really can't stand 4:2:2 10-bit. RGB also seems to do better when processing near-black levels than YCbCr, based on the black level tests I've done on both. I'm not running HDR at the moment since Windows' HDR implementation sucks (any OSD notification blanks the display for several seconds; not fixed as of 1909).

Which brings us to the next question: what's the fundamental difference between RGB and YCbCr anyway (besides being different formats)? Disregarding bandwidth limitations for a second, what actually differs between RGB and YCbCr at the same resolution/refresh/bit-depth settings? I'm generally good with this sort of thing, but this is one thing no one has ever explained (well) to me.
 
https://www.hdmi.org/spec/hdmi2_1

They said on that site that they would probably be available sometime in the first half of 2020 but that might have been just wishful thinking.

From that site:

" Existing HDMI High Speed Cables with Ethernet can only deliver some of the new features, and the new Ultra High Speed HDMI Cable is the best way to connect HDMI 2.1 enabled devices to ensure delivery of all the features with improved EMI characteristics. "

So be careful like Fleat said.

Other fiber HDMI cables that come out later claiming HDMI 2.1 might work without being "officially certified", and the Ultra HDMI cable spec info on that site could be some marketing on top of the certification, but I'd get one of those Ultra ones when available to be safe.

[Image: Ultra High Speed HDMI Cable certification label]
 
That's kinda what I figured; I'm working from home at the moment doing code development, so I really can't stand 4:2:2 10-bit. RGB also seems to do better when processing near-black levels than YCbCr, based on the black level tests I've done on both. I'm not running HDR at the moment since Windows' HDR implementation sucks (any OSD notification blanks the display for several seconds; not fixed as of 1909).

Which brings us to the next question: what's the fundamental difference between RGB and YCbCr anyway (besides being different formats)? Disregarding bandwidth limitations for a second, what actually differs between RGB and YCbCr at the same resolution/refresh/bit-depth settings? I'm generally good with this sort of thing, but this is one thing no one has ever explained (well) to me.


Both can resolve the same number of colors (YCbCr at 4:4:4 chroma = RGB), but since YCbCr was already being used for chroma subsampling in lossy video formats, they just reused it for the PC.

See here:

https://en.wikipedia.org/wiki/Chroma_subsampling

When you're talking about reducing chroma resolution, you need something as descriptive as YCbCr, so why reinvent the wheel?
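For the curious, the conversion between the two is just a fixed matrix transform. Here's a minimal sketch using the standard BT.709 luma weights, with values normalized to 0-1 and range offsets/quantization left out (so it's only meant to show the relationship, not any particular device's exact math):

```python
# Minimal sketch of the BT.709 RGB <-> YCbCr (4:4:4) transform,
# with R, G, B normalized to [0, 1].

KR, KB = 0.2126, 0.0722     # BT.709 luma weights for R and B
KG = 1.0 - KR - KB          # weight for G

def rgb_to_ycbcr(r, g, b):
    y  = KR * r + KG * g + KB * b
    cb = (b - y) / (2 * (1 - KB))   # lands in [-0.5, 0.5]
    cr = (r - y) / (2 * (1 - KR))
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr):
    b = y + cb * 2 * (1 - KB)
    r = y + cr * 2 * (1 - KR)
    g = (y - KR * r - KB * b) / KG
    return r, g, b

print(rgb_to_ycbcr(1.0, 1.0, 1.0))                 # pure white -> Y~1, Cb~Cr~0
print(ycbcr_to_rgb(*rgb_to_ycbcr(0.2, 0.5, 0.8)))  # round-trips back to ~(0.2, 0.5, 0.8)
```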
 
Option B is poor for text readability, because you're limiting chroma resolution.

Chroma is only noticeable for high-contrast color changes (like dark-colored text on a white background).

You're less likely to notice the reduction in chroma in high-speed games, and this issue is so unlikely to show up during video playback that all DVD/Blu-ray/4K Blu-ray discs are already encoded with 4:2:0 chroma.

I'm not sure I would run HDR at 8-bit only, but that's on you to decide. I run my C7 in non-HDR mode 8-bit for desktop use, and only turn on HDR 10-bit if I'm gaming.

Someone in this thread mentioned that using HDR at 4:4:4 8Bit actually gives a better image than using HDR at 4:2:2 10Bit. Of course 4:4:4 10Bit would look the best but no GPU can output that atm. I personally have tried the first two and I honestly cannot tell a difference between them at all. Perhaps someone with a keener eye can spot the differences, but if I can't notice anything between 10Bit 4:2:2 and 8Bit 4:4:4, I'd rather just leave the TV in 4:4:4 mode at all times and never worry about switching.
 
Someone on YouTube said he tried a fiber optic 48 Gbps cable with the CAC-1085 and it didn't work. There are non-certified 48 Gbps fiber optic cables, and Monoprice makes the most expensive ones with polymer instead of glass.

Which brings us to the next question: what's the fundamental difference between RGB and YCbCr anyway (besides being different formats)? Disregarding bandwidth limitations for a second, what actually differs between RGB and YCbCr at the same resolution/refresh/bit-depth settings? I'm generally good with this sort of thing, but this is one thing no one has ever explained (well) to me.
RGB uses primary colour intensities per pixel, so qualifying it with "full colour resolution" or "no subsampling" is nonsensical. YCbCr is generated by sampling (or subsampling) some form of an RGB-like image.

Both RGB and YCbCr444 are full colour resolution formats, but most software operates in RGB internally. Converting RGB to YCbCr444 is a floating point operation and isn't lossless, though the signal to the display itself is lossless.
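To illustrate that "not lossless" point: if each stage is quantized back to 8-bit codes, some values come back shifted by a code or so after a round trip. A small sketch, assuming full-range BT.709 coefficients (a real GPU driver may use limited range or different rounding, so treat this purely as an illustration):

```python
# Hypothetical illustration: RGB -> YCbCr 4:4:4 -> RGB is exact in floating
# point, but quantizing the intermediate Y/Cb/Cr to 8-bit codes introduces
# rounding, so not every RGB triplet survives the round trip bit-exactly.

KR, KB = 0.2126, 0.0722
KG = 1.0 - KR - KB

def to_ycbcr8(r, g, b):
    y  = KR * r + KG * g + KB * b
    cb = 128 + (b - y) / (2 * (1 - KB))
    cr = 128 + (r - y) / (2 * (1 - KR))
    return round(y), round(cb), round(cr)      # quantize to 8-bit codes

def to_rgb8(y, cb, cr):
    b = y + (cb - 128) * 2 * (1 - KB)
    r = y + (cr - 128) * 2 * (1 - KR)
    g = (y - KR * r - KB * b) / KG
    return round(r), round(g), round(b)

errors = 0
for r in range(0, 256, 8):
    for g in range(0, 256, 8):
        for b in range(0, 256, 8):
            if to_rgb8(*to_ycbcr8(r, g, b)) != (r, g, b):
                errors += 1
print(f"{errors} of {32**3} sampled RGB triplets do not round-trip exactly")
```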
 
That's kinda what I figured; I'm working from home at the moment doing code development, so I really can't stand 4:2:2 10-bit. RGB also seems to do better when processing near-black levels than YCbCr, based on the black level tests I've done on both. I'm not running HDR at the moment since Windows' HDR implementation sucks (any OSD notification blanks the display for several seconds; not fixed as of 1909).

Which brings us to the next question: what's the fundamental difference between RGB and YCbCr anyway (besides being different formats)? Disregarding bandwidth limitations for a second, what actually differs between RGB and YCbCr at the same resolution/refresh/bit-depth settings? I'm generally good with this sort of thing, but this is one thing no one has ever explained (well) to me.

This is a pretty good explanation (though a little dated on the HDMI format support specs since it's from 2 years ago) from the OLED sub on Reddit:

https://www.reddit.com/r/OLED/comments/81avu0/rgb_vs_ycbcr_for_oled/

RGB was causing black crush only when there was a mismatch between how the Blu-ray (or other media device) was outputting and how the TV had its Black Level setting.
RGB comes in two variations, Limited (16-235) and Full (0-255). Whichever you set as the output, the TV must match it in its Black Level setting. Otherwise you'll get either washed-out blacks (when the output is RGB Limited but the TV's black level is set to Full), or crushed blacks (when the output is RGB Full but the TV's black level is set to Limited).
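A tiny numeric sketch of that mismatch (the helper names are made up, and it's only the 16-235 scaling described above):

```python
# Sketch of the limited (16-235) <-> full (0-255) mapping and what a
# range mismatch does to black. Purely illustrative.

def full_to_limited(v):            # 0-255 -> 16-235
    return round(16 + v * (235 - 16) / 255)

def limited_to_full(v):            # 16-235 -> 0-255
    return round((v - 16) * 255 / (235 - 16))

# Matched settings: black stays black.
print(limited_to_full(full_to_limited(0)))     # 0

# Mismatch A: source sends Limited, TV expects Full -> "black" arrives as
# code 16 and is displayed as dark grey (washed-out blacks).
print(full_to_limited(0))                      # 16

# Mismatch B: source sends Full, TV expects Limited -> codes at or below 16
# map to zero or below and get clipped by the TV (crushed shadows).
print(limited_to_full(10))                     # negative -> clipped to 0
```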

As for how the content is passing through from mastering phase until you watch it, everything starts out in YCbCr (also known as YUV or YCC) and ends up in RGB to be displayed by the individual pixels of your TV (unless you have a CRT TV, in which case it remains YUV).

The bottom line is, everything digital is RGB, everything in the movie/TV/broadcast industry is YUV. Everything that's being displayed by a computer is generated in RGB, everything that is being sent to your TV (be it, TV, Blu-Ray, etc.) is YUV.

So at some point between post-production/mastering and when you watch it on the TV, you need to convert from YUV to RGB. The only question is when to do it. The shortest answer is: as late as possible. Preferably at the very end, on the TV side.

Chroma subsampling was something added to decrease the bandwidth required to push picture information in any given media format. At 4K resolution you can't, for example, send HDR 4K @60Hz as RGB; there's not enough bandwidth in the HDMI spec.
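To put rough numbers on that bandwidth limit, here's a sketch using the standard CTA-861 4K60 timing and HDMI 2.0's 18 Gbps TMDS rate with 8b/10b encoding (approximate, but it shows why 8-bit 4:4:4 just squeaks in and 10-bit doesn't):

```python
# Back-of-the-envelope check of why 4K60 RGB / 4:4:4 tops out at 8-bit on
# HDMI 2.0. Uses the 4K60 timing of 4400x2250 total pixels (incl. blanking).

PIXEL_CLOCK = 4400 * 2250 * 60          # 594 MHz
HDMI20_PAYLOAD = 18e9 * 8 / 10          # 14.4 Gbps of actual video data

for bpc in (8, 10, 12):
    rate = PIXEL_CLOCK * 3 * bpc        # 3 channels (RGB or YCbCr 4:4:4)
    fits = "fits" if rate <= HDMI20_PAYLOAD else "does NOT fit"
    print(f"{bpc}-bit 4:4:4: {rate / 1e9:.2f} Gbps -> {fits}")
# 8-bit: 14.26 Gbps fits; 10-bit: 17.82 Gbps and 12-bit: 21.38 Gbps do not.
```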

As it's been said, all movies and TV shows that come from either a Blu-ray player or a streaming service are "done" in YUV 4:2:0. So the ideal way is to give that exact signal to the TV, and then the TV will take that signal, work some magic, and transform it into RGB to send to each individual pixel.
Now, for some weird and complicated reasons, HDMI standards limit the number of available combinations of resolution, bit depth, chroma subsampling and color space. So, for example, you can't send 4K @24Hz, 10bit, YUV420, Rec.2020 but you CAN send 4K @60Hz, 10bit, YUV420, Rec.2020 or 4K @24Hz, 12bit, YUV422, Rec.2020, both of which require more bandwidth.
Because of this, with the recent launch of 4K content there's been a lot of talk about this, because UHD Blu-ray is served as 4K @24Hz, 8bit, YUV420, Rec.709 and HDR UHD Blu-ray is 4K @24Hz, 10bit, YUV420, Rec.2020, the exact setup that can't be sent via HDMI to a TV. So instead of sending the content untouched to the TV, you have to do some chroma upsampling on the media device and upsample from YUV420 to YUV422, YUV444 or RGB (if the framerate allows it), then send it to the TV where another conversion will take place.
So, to answer your initial question: always use YCbCr when the content is movies/TV/broadcasts, and always use RGB when the content is games.

https://en.wikipedia.org/wiki/YCbCr
YCbCr, Y′CbCr, or Y Pb/Cb Pr/Cr is a family of color spaces used as a part of the color image pipeline in video and digital photography systems. Y is the luma component and Cb and Cr are the blue-difference and red-difference chroma components. Y′ (with prime) is distinguished from Y, which is luminance, meaning that light intensity is nonlinearly encoded based on gamma-corrected RGB primaries.

Y′CbCr color spaces are defined by a mathematical coordinate transformation from an associated RGB color space. If the underlying RGB color space is absolute, the Y′CbCr color space is an absolute color space as well; conversely, if the RGB space is ill-defined, so is Y′CbCr.
 
Of course 4:4:4 10Bit would look the best but no GPU can output that atm.
Actually 8-bit dithered RGB at 120 Hz is still better than 10-bit RGB. You get less banding from a temporally dithered lower bit-depth signal, especially at high refresh rates. If the source were to render to a 12-bit+ surface and output a 10-bit dithered signal, that would be better than 8-bit dithered.
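A conceptual sketch of what temporal dithering is doing (this is only an illustration of the idea, not any vendor's actual algorithm, and the 10-bit-to-8-bit mapping is just made up for the example):

```python
# Temporal dithering in a nutshell: show a 10-bit value on an 8-bit link by
# flipping between the two nearest 8-bit codes over time, so the average the
# eye integrates lands near the original value.

import random

def temporally_dither(value_10bit, frames):
    """Emit one 8-bit code per frame whose time-average ~= value_10bit / 4."""
    target = value_10bit / 4.0                 # ideal 8-bit level (fractional)
    low, frac = int(target), target - int(target)
    return [low + (1 if random.random() < frac else 0) for _ in range(frames)]

codes = temporally_dither(601, frames=120)     # 601 / 4 = 150.25
print(sum(codes) / len(codes))                 # hovers around 150.25

# A simple 2-frame alternating pattern at 30 Hz flips at ~15 Hz, which is
# visible flicker; at 120 Hz the flipping is far too fast to see.
```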
 
Actually 8-bit dithered RGB at 120 Hz is still better than 10-bit RGB. You get less banding from a dithered lower bit-depth signal, especially at high refresh rates. If the source were to render to a 12-bit+ surface and output a 10-bit dithered signal, that would be better than 8-bit dithered.

Interesting. Did you test it out on an X27? As far as I'm aware that's the only way to get 10Bit 444 atm (98Hz). The X27 is not a true 10Bit panel though.
 
Someone on YouTube said he tried a fiber optic 48 Gbps cable with the CAC-1085 and it didn't work. There are non-certified 48 Gbps fiber optic cables, and Monoprice makes the most expensive ones with polymer instead of glass.


RGB uses colour intensities per pixel, so qualifying it with "full colour resolution" or "no subsampling" is nonsensical. YCbCr is generated by sampling (or subsampling) some representation of an RGB image.

Both RGB and YCbCr444 are full colour resolution formats, but most software operates in RGB internally. Converting RGB to YCbCr444 is a floating point operation and could result in minor loss, though the signal to the display itself is lossless.

Really interesting! Do you happen to have a link to that YouTube video? I wonder if that is some foreshadowing of trying to get full 4K 120Hz working with these fiber optic cables.

One other thing I just noticed, RUIPRO does actually offer the 8K cables and I would consider them a trusted brand in fiber optic cables. They also offer shorter lengths now. You won't like the price though.
 
Interesting. Did you test it out on an X27? As far as I'm aware that's the only way to get 10Bit 444 atm (98Hz). The X27 is not a true 10Bit panel though.
Tested it on both the X27 and CX. The X27 does 8-bit + FRC if you send it a 10-bit signal, so there is no sense in wasting bandwidth on 10-bit. Just do 8-bit + dithering at the source. The CX looks identical in 8-bit + dithering, 10-bit, and 12-bit (does it do 10-bit + FRC?).
At 30 Hz you will definitely see 15 Hz dithering flicker. At 120 Hz it's imperceptible.
 
Ok so I got the Club3D HDMI cable today, and the black screen / no signal issues are gone when using it with the CAC-1085 adapter. So if you guys use a different HDMI cable, not one from Club3D, then you will have the no-signal issues if you change any resolution.
But I still get the washed out color when I turn on HDR in Windows though.
Interesting, are you able to run 4K @ 120Hz with 4:4:4/RGB?
 
Tested it on both the X27 and CX. The X27 does 8-bit + FRC if you send it a 10-bit signal, so there is no sense in wasting bandwidth on 10-bit. Just do 8-bit + dithering at the source. The CX looks identical in 8-bit + dithering, 10-bit, and 12-bit (does it do 10-bit + FRC?).

How are you able to compare 8bit 444 to 10 and 12bit 444 on a CX when HDMI 2.0 doesn't have the bandwidth? 30Hz?
 
Reduce the resolution or drop to 30 Hz. You can see dithering flicker at 30 Hz on a monitor so it's not suitable. If the display performs 5:5 pulldown to convert 30 Hz to 120 Hz, which the CX does, then it's fine.

Good stuff. I'll definitely do some comparisons myself once Ampere comes out, IF nvidia allows for 10Bit 444 over HDMI. If they don't, well I guess I'm forced to just use 8Bit anyway ha.
 
Good stuff. I'll definitely do some comparisons myself once Ampere comes out, IF nvidia allows for 10Bit 444 over HDMI. If they don't, well I guess I'm forced to just use 8Bit anyway ha.
Why do you want to use YCbCr444 at all when you can use RGB and avoid conversion? 10-bit RGB works.
 
I mean even if Club3D's own HDMI cable works with the CAC-1085, HDR still looks washed out, and for me that's the entire reason for buying the adapter (4K/120/HDR). It seems like right now its only benefit is getting 120Hz on the desktop and avoiding 4:2:0 text quality.
 
Over HDMI? Hey, if I can use 4K 120Hz 10Bit RGB Full over HDMI on Ampere then there's no question that I will use it.
Sorry I was thinking of DisplayPort. I believe 10-bit and 12-bit RGB / YCbCr444 use the same bandwidth on HDMI so 10-bit is redundant. We will have to see how this affects the CX as LG says it accepts 10-bit 4:4:4. Maybe only the internal transfer within the chipset is limited to 40 Gbps / 10-bit, but it can receive a 12-bit signal.
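Some very rough numbers on the HDMI 2.1 side (active pixels only, 16b/18b FRL line coding assumed, blanking and packet overhead ignored, so take the margins loosely):

```python
# Very rough sketch of 4K120 4:4:4 payload needs vs. HDMI 2.1 FRL link rates.

ACTIVE = 3840 * 2160 * 120              # active pixels per second

def payload(frl_gbps):
    return frl_gbps * 16 / 18 * 1e9     # FRL uses 16b/18b line coding

for bpc in (8, 10, 12):
    need = ACTIVE * 3 * bpc
    print(f"{bpc}-bit: ~{need / 1e9:.1f} Gbps "
          f"(40G link: {'ok' if need <= payload(40) else 'tight/no'}, "
          f"48G link: {'ok' if need <= payload(48) else 'no'})")
# ~23.9 / ~29.9 / ~35.8 Gbps vs. ~35.6 Gbps (40G) and ~42.7 Gbps (48G) payload.
```

With real blanking and protocol overhead the margins shrink further, which may be why 10-bit and 12-bit often end up needing the same FRL tier in practice.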
 
I mean even if Club3D's own HDMI cable works with the CAC-1085, HDR still looks washed out, and for me that's the entire reason for buying the adapter (4K/120/HDR). It seems like right now its only benefit is getting 120Hz on the desktop and avoiding 4:2:0 text quality.
Really the $55.99 is worth it if I can get the text to look a lot less blurry. I work numerous hours on the CX and the fuzzy text is starting to annoy me.
 
Sorry I was thinking of DisplayPort. I believe 10-bit and 12-bit RGB / YCbCr444 use the same bandwidth on HDMI so 10-bit is redundant. We will have to see how this affects the CX as LG says it accepts 10-bit 4:4:4. Maybe only the internal transfer within the chipset is limited to 40 Gbps / 10-bit, but it can receive a 12-bit signal.

Right. So then 10Bit RGB would look superior to 8Bit RGB on the CX in HDR, right? Or would it still be the same story, where 8Bit is better or there's no difference between the two?
 
Sweet. I can retire my 49KU6300 when this comes out.

Seeing the speculation about the use of 'C' in the model name, I think LG is smart enough to realize that a lot of us clamoring for a smaller OLED want to give it double duty as a monitor or gaming display. Regardless, I think it would be a huge misstep for LG to go the Samsung route with their smaller displays.
I really want 4K, and after reading your post and checking my non-existent budget, I found the Series 8 version of what you have.
Thanks for the tip.
Samsung - 43" Class - 8 Series - 4K UHD TV - Smart - LED - with HDR
 
Part of the reason I bought a 2 year warranty from Best Buy is because of this. I thought about buying the 5-year warranty, but my tastes change so frequently that I will more than likely have moved on to something better by then. If I can get 2 years worth of use out of this, I will be happy. At that point, I will be more than willing to buy whatever is the comparable tech to OLED at that time.

OLED has set the bar stupidly high when it comes to picture quality and gaming tech. It can only get better from here.
I live in Maine; you buy a TV in Maine and it comes, by law, with a 4-year guarantee. Best Buy hates it, Wally World hates it. The only difference is you are responsible for bringing it in, and you must have the original receipt or have purchased with a BB card.
 
Got the Club3D adapter today and it has been a nightmare with my Janky Ass Old Ass PS4 era HDMI cable... got one of those fancy 48 Gbps ULTRA PRON edition HDMIs coming from Club3D and will see on Thursday.

Which fancy 48 Gbps Ultra HDMI did you get? Link?
 
Yes, right now I'm running RGB (not 4:4:4, prob the same), Full Dynamic range, 12bit, 120Hz in the Nvidia Control Panel with the Club3D HDMI cable and the adapter, of course.
I was able to run the same. No washed out HDR-enabled colors, but I didn't want to wait for Club3D cables and returned the adapter. I will wait for a more vanilla plug-and-play solution with Ampere; it should prove useful for getting decent frame rates at 4K HDR in recent titles too.
 
Right. So then 10Bit RGB would look superior to 8Bit RGB on the CX in HDR, right? Or would it still be the same story, where 8Bit is better or there's no difference between the two?
There wouldn't be any difference, with 10-bit RGB potentially being worse than 8-bit dithered RGB at 120 Hz.
 
I have an important request. Can someone PLEASE do a semi in-depth review of the Club3D adapter and Club3D HDMI cable?

There is literally no real "consistent" information out there. I am only hearing the bad, BUT with a few people reporting good results if the proper cable is used. I am interested in hearing from someone who is unbiased and honest, and not from people who have unrealistic expectations and a general attitude of "tech sourness" and will just trash this adapter.

What I would like to know is: the setup, daily usage, and any type of software/hardware changes that have to be made constantly, daily or hourly, depending on use. An example of this would be gaming.

I am trying to decide if I should buy this.
 
What I would like to know is: the setup, daily usage, and any type of software/hardware changes that have to be made constantly, daily or hourly, depending on use. An example of this would be gaming.
If you use HDR 4K 120 Hz 8-bit dithered RGB and never change resolution or disable HDR, there shouldn't be any issues. On a 2080 Ti with DSC you can do 10-bit RGB. If you need to change resolution / refresh rate in video players to match the content, you may run into the "no signal" issue.
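For context on the DSC remark, a rough sketch of the DisplayPort side of such an adapter (active pixels only; the 12 bits/pixel DSC figure is just an assumed example target, not a measured value):

```python
# Why DSC is needed for 4K120 10-bit RGB over DP 1.4: HBR3 carries 32.4 Gbps
# raw, and 8b/10b coding leaves roughly 25.92 Gbps of payload.

DP14_PAYLOAD = 32.4e9 * 8 / 10          # ~25.92 Gbps
ACTIVE = 3840 * 2160 * 120              # active pixels per second

uncompressed = ACTIVE * 3 * 10          # 10-bit RGB, ~29.9 Gbps
dsc_12bpp = ACTIVE * 12                 # assumed DSC target of 12 bits/pixel

print(f"uncompressed 10-bit RGB: {uncompressed / 1e9:.1f} Gbps, "
      f"fits: {uncompressed <= DP14_PAYLOAD}")    # False
print(f"with DSC @ 12 bpp: {dsc_12bpp / 1e9:.1f} Gbps, "
      f"fits: {dsc_12bpp <= DP14_PAYLOAD}")       # True
```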
 
Looks like I'm moving to Maine.

hahaha I was just thinking the same thing.

What's funny is that in Europe, by law, all products must carry a 2-year warranty. And, as a result, household electronics and goods are in many instances made to a higher level of quality. And you or anyone else can go Google that.

One famous example of this is Corning glassware; the recipe used for European Corning products is the original formula from the 1950s that originated in America. The current American "formula" is made out of cheaper materials. I might have the manufacturer's name wrong, but that is a true and ongoing fact.
 