LG 48CX

I don't have the Club3D adapter. I'm just using a straight HDMI connection from my PC to my CX. I tested some HDR games with 8-bit RGB and 10-bit 4:2:2 and saw no difference between the two modes. Unless the CX is still subsampling the RGB signal to 4:2:2?
4K 60 Hz 12-bit RGB is not subsampled on the CX. 120 Hz 8-bit RGB and up is subsampled, though the HDMI Diagnostics shows the CX is receiving an RGB signal.
 
I got the CAC-1732 cable. The signal still drops out, but less frequently. HDR is still washed out, but less frequently.

4K 120 Hz, 8-bit / 10-bit RGB renders at 4:2:2.
I switched off HDMI Ultra HD Deep Colour on the TV and was still able to select 4K 60 Hz 10-bit / 12-bit RGB.

This means the adapter is handshaking YCbCr and still reporting RGB to the GPU, leading to washed out colours in RGB HDR. It's still unknown whether 120 Hz RGB can be displayed without subsampling by the CX with a native HDMI 2.1 GPU.
 
How are you guys seeing the signal info the TV is receiving? I figured out how to make it show the resolution and audio format, but not any detail beyond that.
 
Press 11111 while "Channel Tuning" is highlighted. Then select the "HDMI Mode" label to view more detailed information.
There seems to be a bug where it always reports YCbCr422 10-bit / 12-bit as 8-bit, even though I don't see any 8-bit banding. This happens even when directly connected to the GPU, so it's likely just a reporting bug.
 
Regarding getting the best possible text rendering (Windows 10): I have done most things discussed here, both TV settings and Windows settings, but even if I adjust for the size difference by increasing viewing distance, I just can't get it good enough. I should perhaps add that I am comparing against a really good IPS on a much smaller screen (Acer X27), but still, the LCD is just better in this regard. This perhaps isn't a complete surprise, as these TVs were probably never intended to be used as work monitors, but it's still a bit disappointing.

So, even though I think I know the answer and have already tried it, I'll still pose the question: what are the best settings on the TV and in Windows for the best possible text rendering? One thing I have yet to experiment with is scaling. I know from previous experience that a bit of scaling can sometimes actually improve text rendering even if you don't need it for eyesight reasons. Has anyone experimented with that to get better text (again, not because it's needed to actually see things, but to get them to render better)?

Thanks.
 
Regarding getting the best possible text rendering (Windows 10): I have done most things discussed here, both TV settings and Windows settings, but even if I adjust for the size difference by increasing viewing distance, I just can't get it good enough. I should perhaps add that I am comparing against a really good IPS on a much smaller screen (Acer X27), but still, the LCD is just better in this regard. This perhaps isn't a complete surprise, as these TVs were probably never intended to be used as work monitors, but it's still a bit disappointing.

So, even though I think I know the answer and have already tried it, I'll still pose the question: what are the best settings on the TV and in Windows for the best possible text rendering?

Thanks.
8-bit RGB with PC icon is the only combination that does not subsample, resulting in sharp text. You can do 12-bit RGB without subsampling on HDMI 2.1, but only at 60 Hz as far as I can tell. There will still be some subpixel rendering issues because of the RGBW layout. Reset ClearType to default if you've changed anything.
 
I found a workaround for the washed out HDR. The TV reports RGB / YCbCr422 NODATA and uses the wrong gamut. If you force BT.2020 in the HDMI Signalling Override menu, the colours are correct.
 
Last night, I did a comparison between my CX and Q90R using some high contrast videos on YouTube.

The Q90R gets blindingly bright, but if you set the brightness equal between the CX and Q90R, the CX stomps all over the Q90R. While I think the FALD implementation on the Q90R is still one of the best in the industry, it just doesn't compare to the pixel-accurate lighting of OLED and the CX. If you look at it straight on, it's very difficult to see the individual backlight zones on the Q90R. However, if you get to a scene that has a lot of light points on a black background and look at it off center, it becomes extremely obvious where the backlights are and what they're doing. For gaming, the difference is crazy as well. Everything just punches on the OLED, but it all seems muted on the Q90R. Motion is fine on the Q90R, but for high framerate content, the CX is definitely cleaner.

I honestly think that the ONLY reason people should consider QD-LED tech now is that they want something super bright, or they're just scared of burn-in. I've been beating the snot out of my CX since I got it a few months ago, and so far it's been fine. However, I am concerned that the panel being on as much as it has will lead to premature degradation of the organic substrate in some of the pixels. I continue to keep the OLED Light between 0-30 except for HDR content on YouTube/movies, where I allow it to be 100. SDR is the norm for my content, though.

Otherwise, OLED is the better technology. Full stop.

I made a similar comparison between my 55" GX and the Q95T I had at the time, with the same conclusions. With one exception: I didn't really find the Q95T that much brighter. In absolute terms, I am sure it was, but in actual "perception", I just found them surprisingly equal. I guess it might be the better contrast and black levels that sort of evened out the playing field. To be sure, I tried out different sources: connected to the same PC, the built-in Netflix apps, etc., but the results were the same.
 
If it eases your mind at all, I owned (and still do) my B7 for over two and a half years. For the last year and a half, before I got my CX, I worked from home and used it as my primary monitor for all things and it would usually be on for 8-10 hours a day, if not more (if I decided to game on it after work). When the CX arrived and I took the B7 down, it was still just as bright and beautiful as ever. Honestly, outside of some crazy scenario like running it 20 hours a day on an OLED light setting of 100, I wouldn’t worry about it. By the time you notice any degradation, there will be something better available (even if it’s a future iteration of OLED) and I would have no problem buying another one at that time because I am NOT going back to LCD/LED.

Same experience, both with usage and with results. To be honest, there was some slight burn-in residue, but then I had not really taken any precautions to prevent it (hiding the taskbar, dark theme, etc.) and had been running it at max brightness.
 
Here are my suggestions for using the CAC-1085 with the CX.
  1. Even with a short cable like the CAC-1371 / CAC-1372, the adapter sometimes loses signal when changing display modes. Unplugging the USB cable fixes this.
  2. The CAC-1085 frequently doesn't report the BT.2020 colour space to the TV when running HDR at 4K 60 Hz 10-bit and above, resulting in washed out colours. Use the HDMI Signalling Override menu on the CX to force BT.2020. If you have a C9, the only reliable mode is 4K 120 Hz 10-bit YCbCr422.
  3. The CAC-1085 sometimes drops to YCbCr420 on the HDMI side when changing display modes, though the GPU reports RGB. Unplugging the USB cable fixes this.
  4. The CX renders in 4:4:4 up to 4K 60 Hz 12-bit RGB with PC icon. At 120 Hz, the CX renders RGB at 4:2:2 even with PC icon, so it's the same as Game mode without PC icon.
  5. Since 4K 120 Hz RGB is always rendered at 4:2:2, just leave PC icon off for easier access to TruMotion when switching between Game & Cinema mode.
  6. Always run 4K 120 Hz 8-bit / 10-bit RGB to prevent double subsampling by the GPU and the CX.
 
Why would screen shift mess with text?

Eh I wouldn't know, I don't work for LG ;) but just check out feedback from other users who reported that turning screen shift off helped.

 

Why would screen shift mess with text? Doesn't that just move the image a bit back and forth?

Because you can't shift a screen (and have it still completely fill the display) without zooming in on it ever so slightly first, which means text is no longer 1:1 pixel perfect; it's ever so slightly overscanned.
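For a sense of how small that zoom would need to be (and why it would still break pixel-perfect text), here's a rough back-of-the-envelope sketch; the shift distance is a made-up number, since LG doesn't publish it:

```python
# Toy numbers: shift_px is a guess purely for illustration; LG doesn't
# document how far screen shift actually moves the image.
width_px = 3840
shift_px = 5

# To shift by up to shift_px in either direction and still fill the panel,
# the image would have to be scaled up by roughly this factor:
zoom = (width_px + 2 * shift_px) / width_px
print(f"zoom factor: {zoom:.4f}")   # ~1.0026, i.e. about a quarter of a percent

# Even that tiny factor breaks 1:1 mapping: 3840 source pixels now cover
# 3850 panel pixels, so 1-pixel-wide text strokes get resampled.
```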
 
8-bit RGB and PC mode is the only combination that does not subsample, resulting in sharp text. You may be able to do 10-bit RGB on HDMI 2.1. There will still be some subpixel rendering issues because of the RGBW layout. Reset ClearType to default if you've changed anything.

Unfortunately, that's what I already have. And I have found that greyscale rendering seems to be better than RGB rendering.
 
Because you can't shift a screen (and have it still completely fill the display) without zooming in on it ever so slightly first, which means text is no longer 1:1 pixel perfect; it's ever so slightly overscanned.

While I can accept the theory here, I tried it out and couldn't see any difference whatsoever regarding text rendering, even from really close. Is this a known fact that everyone agrees on, i.e. that it works like this? Or does it only happen with specific settings (I use PC mode with Original screen size)?
 
While I can accept the theory here, I tried it out and couldn't see any difference whatsoever regarding text rendering, even from really close. Is this a known fact that everyone agrees on, i.e. that it works like this? Or does it only happen with specific settings (I use PC mode with Original screen size)?

You can see the slight zoom when you turn screen shift on/off in the settings when you have stuff on the screen. You can't zoom and have something that's supposed to be 1 pixel wide always be 1 pixel wide.
 
You can see the slight zoom when you turn screen shift on/off in the settings when you have stuff on the screen. You can't zoom and have something that's supposed to be 1 pixel wide always be 1 pixel wide.

I have tried it both on my C8 and my GX now and I see no difference whatsoever. That does not mean you are wrong, but perhaps there are settings etc. that affect it? What you describe sounds almost like overscan.

My understanding of the pixel shift was that it periodically just moved the image back and forth, but only by a tiny amount, like one pixel or so.
 
I have tried it both on my C8 and my GX now and I see no difference whatsoever. That does not mean you are wrong, but perhaps there are settings etc. that affect it? What you describe sounds almost like overscan.

I'm on a CX outputting 4K/60Hz/RGB in Game mode, input labeled as PC, aspect ratio set to Original, and I can see it happen at 4K/120Hz/4:2:0 too. Firmware 03.10.20 from about two weeks ago. I heard from people who have had CXs for a while that this update caused pixel shift to turn on automatically regardless of the setting in some situations, so it's possible they changed it in other ways in this firmware and it either hasn't made it to the GX yet or was fixed/never broken there.
 
I found a workaround for the washed out HDR. The TV reports RGB / YCbCr422 NODATA and uses the wrong gamut. If you force BT.2020 in the HDMI Signalling Override menu, the colours are correct.

Can you do this with the C9? Or how do you do it on the C9?
 
By the way, you don't need to power cycle the adapter; you can just press Win+Ctrl+Shift+B to restart the graphics driver.
 
I was curious if anyone had thoughts on using this as a work monitor as well. I work in finance (and am now forced to work from home due to COVID), so I have tons of spreadsheets, PowerPoint / Word documents, etc. open. But I do keep the OLED Light down to 10% (max 15%) when I work. My desk space is limited, so I am too lazy to keep switching monitors between work and everything else. Do you think burn-in is a possibility at such a low OLED Light setting?
 
You can see the slight zoom when you turn screen shift on/off in the settings when you have stuff on the screen. You can't zoom and have something that's supposed to be 1 pixel wide always be 1 pixel wide.
It doesn't zoom or overscan - it just shifts the pixels. When you do this with a double subsampled image, the GPU subsampling matrix is no longer aligned with the display's, resulting in further loss.
Even if the matrices are aligned but not identical, there is further loss. That's why outputting RGB from the GPU looks better than YCbCr422 without PC icon, even though the CX renders both at 4:2:2.
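A minimal toy sketch of that alignment argument on a single chroma row (numpy assumed; the 2-pixel averaging is only a crude stand-in for real 4:2:2 filtering):

```python
import numpy as np

def subsample_422(row):
    """Crude 4:2:2 model on one chroma row: average each 2-pixel group,
    then repeat the sample so the row keeps its original length."""
    return np.repeat(row.reshape(-1, 2).mean(axis=1), 2)

# Sharp colour edge, like the border of coloured text on a background.
chroma = np.array([0, 0, 0, 0, 255, 255, 255, 255], dtype=float)

gpu_out    = subsample_422(chroma)                   # 1st pass (GPU): edge sits on a group boundary, no loss
aligned    = subsample_422(gpu_out)                  # 2nd pass (display), same alignment: still no loss
shifted_in = np.concatenate(([0.0], gpu_out[:-1]))   # 1-pixel shift before the display's pass
misaligned = subsample_422(shifted_in)               # the display's groups now straddle the edge

print(aligned)      # [  0.    0.    0.    0.  255.  255.  255.  255.]  -> unchanged
print(misaligned)   # [  0.    0.    0.    0.  127.5 127.5 255.  255.]  -> extra chroma smearing
```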
 
Can you do this with the C9?
No. Only 10-bit YCbCr422 activates BT.2020 reliably on the adapter.

There is a flaky workaround to get the colour working. First select 4K 60 Hz 8-bit RGB. HDR should work properly with BT.2020. Then switch to 4K 60 Hz 10-bit / 12-bit RGB or 4K 120 Hz 8-bit / 10-bit RGB. If you're lucky, it will retain BT.2020.

By the way, you don't need to power cycle the adapter; you can just press Win+Ctrl+Shift+B to restart the graphics driver.
It doesn't fix it most of the time.

Where is this menu mate?
Press 1113111 while Picture > Picture Mode Settings is highlighted.
 
No. Only 10-bit YCbCr422 activates BT.2020 reliably on the adapter.

There is a flaky workaround to get the colour working. First select 4K 60 Hz 8-bit RGB. HDR should work properly with BT.2020. Then switch to 4K 60 Hz 10-bit RGB or 4K 120 Hz 8-bit / 10-bit RGB. If you're lucky, it will retain BT.2020.

OK, I did the switch to 4K 60Hz 8-bit RGB, then went back to 4K 120Hz 12-bit RGB, and it kept BT.2020 for HDR. So now I'm running at 4K 120Hz 12-bit RGB full dynamic range with HDR on and no washed out color. Just not sure how long this will last until the adapter resets itself.
 
It doesn't zoom or overscan - it just shifts the pixels. When you do this with a double subsampled image, the GPU subsampling matrix is no longer aligned with the display's, resulting in further loss.
Even if the matrices are aligned but not identical, there is further loss. That's why outputting RGB from the GPU looks better than YCbCr422 without PC icon, even though the CX renders both at 4:2:2.

I'm definitely seeing it zoom slightly when enabling/disabling, and definitely seeing no black border on any side when a full-screen image is displayed (there would have to be one for it to just shift the image without overscanning).
 
Anyone using a 5700 XT with one of these? I just ordered one; it should be here next Tuesday, fingers crossed. Wondering how it works with AMD cards. I'm fine with being stuck at 4K@60 till the new cards drop if need be. I know the picture will still be amazing...
 
Anyone using a 5700 XT with one of these? I just ordered one; it should be here next Tuesday, fingers crossed. Wondering how it works with AMD cards. I'm fine with being stuck at 4K@60 till the new cards drop if need be. I know the picture will still be amazing...

Sorry... never had any AMD cards in my entire life. Only Intel CPUs and Nvidia cards.
 
Anyone using a 5700 XT with one of these? I just ordered one; it should be here next Tuesday, fingers crossed. Wondering how it works with AMD cards. I'm fine with being stuck at 4K@60 till the new cards drop if need be. I know the picture will still be amazing...

Since the 5700 XT has HDMI 2.0, I don't see why you wouldn't be able to just do 1440p@120Hz instead, since that's a resolution officially supported by the TV. 4K@120Hz 4:2:0 is also highly likely to work. The TV also now supports AMD FreeSync Premium, so there shouldn't be any problems getting that to work. Enjoy your CX; I've had mine for 3 weeks now and have played more games in these past 3 weeks than I have all throughout 2020 so far. It's that amazing.
 
Some more dithering info from a YouTube source 10 months ago... I haven't researched it to verify or to see if Nvidia changed it...

4K 60 Hz 8-bit RGB is not subsampled on the CX. You get full colour resolution. Any combination of bit-depth / refresh rate above this is subsampled, at least with the adapter.

Full chroma color with no subsampling (chroma = color separate from luminance info) for sure, and I guess with the adapter at 60Hz you can get a "full" 8bit + dithering for near 10bit.

... but like you said later, the panel itself is capable (from a capable GPU) of receiving a true 10bit signal over HDMI, and perhaps even able to dither on top of that 10bit.
With the 40Gbps limit I don't think it can "downsample" 12bit on the 10bit panel itself over HDMI 2.1, unless that were possible on the GPU end before the signal was sent to the TV as 10bit dithered somehow. I think that is what the links above are talking about: enabling dithering on the GPU (with 4:4:4 chroma), then sending the dithered signal over to the display. I know there is HDMI 2.1 DSC too, but I haven't seen any info yet about how good the end result of that compression is, or how it relates to a 12bit signal on a 10bit 40Gbps HDMI port display, but that could be another avenue later.

by Chief Blur Buster » 08 May 2020, 15:06

Banding is visible even at 10-bits on a 10,000nit HDR monitor.

FRC is still advantageous even at 8bits (to generate 10bits) and 10bits (to generate 12bits).

It sounds to me like this is the hierarchy for 10bit panels in general at 4:4:4 chroma, if I'm not reading it wrong:

12bit "downsampled" to 10bit (on 48gps capable hdmi port displays, uncertain if DSC could be used to send a 12bit signal over 10bit. and whether it would lose any of the gains or have other issues from compressing it down within 40gbps hdmi if possible).
or
10bit + dithering (pseudo 12bit which could potentially mask any banding better than 12bit native~downsampled)

8bit dithered (pseudo 10bit that masks banding better than 10bit native)
or 10bit native sans dithering (with a much finer color resolution vs banding than 8bit native sans dithering)

That doesn't mean other settings aren't usable, or that the technically "negligibly" less accurate dithered versions aren't desirable (as long as there aren't any artifacts during fast motion)... I mean, anti-aliasing blends/masks away harsh pixel-level accuracy far more, and it looks better in most cases too.
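If it helps, here's a rough numpy sketch of that banding/dithering idea with toy grey levels and a simple uniform-noise dither (real GPU dithering uses fancier spatial/temporal patterns): plain truncation collapses adjacent 10bit levels into one flat 8bit band, while dithering mixes neighbouring codes so the average output still tracks the original value.

```python
import numpy as np

rng = np.random.default_rng(0)

# Four adjacent 10-bit grey levels that all land inside one 8-bit code.
levels_10bit = np.array([268, 269, 270, 271])

# Plain truncation to 8-bit: all four collapse to the same code -> a flat band on screen.
print(levels_10bit // 4)                     # [67 67 67 67]

# 8-bit + dithering: each level becomes a mix of codes 67 and 68 across many
# pixels/frames, so the *average* output still tracks the 10-bit value.
n = 100_000
for lv in levels_10bit:
    codes = np.floor(lv / 4 + rng.uniform(0, 1, n))
    print(lv, "->", round(codes.mean() * 4, 2))   # ~268, ~269, ~270, ~271
```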

From RTINGS' review of the LG CX (not using the adapter, of course):
-----------------------------------------------

Like the LG C9 OLED, the CX displays all common resolutions. For it to display proper chroma 4:4:4, which is important for reading text, it must be on 'PC' mode. However, 4:4:4 doesn't work on 1080p @ 120Hz.

According to the TV's specs, it supports 4k @ 120Hz with proper chroma 4:4:4, but it only works with HDMI 2.1 sources, which we can't test at the moment. We'll retest this when we have an HDMI 2.1 source. However, we were able to get it to 4k @ 120Hz with chroma 4:2:0.

Update 05/24/2020: 4k @ 120Hz is only displayed properly in Game mode. Outside of game mode, it skips frames.

So I think the full color capability of the CX should be among these, at least on HDMI 2.1:
4:4:4 10bit native; or you might argue 4:4:4 10bit dithered to approach 12bit, with the benefit of masking banding; or perhaps, if possible, 12bit via DSC "downsampled" to 10bit.
 
We are talking about two different things, each of which can reduce the color information in the final signal.

.....One is chroma subsampling, which cuts down the color resolution of the pixels.

From the Chroma wiki article:
---------------------------------------

4:4:4
Each of the three Y'CbCr components have the same sample rate, thus there is no chroma subsampling. This scheme is sometimes used in high-end film scanners and cinematic post production.

Note that "4:4:4" may instead be referring to R'G'B' color space, which implicitly also does not have any chroma subsampling.

4:2:2
The two chroma components are sampled at half the horizontal sample rate of luma: the horizontal chroma resolution is halved. This reduces the bandwidth of an uncompressed video signal by one-third with little to no visual difference.

4:2:0
In 4:2:0, the horizontal sampling is doubled compared to 4:1:1, but as the Cb and Cr channels are only sampled on each alternate line in this scheme, the vertical resolution is halved. The data rate is thus the same. This fits reasonably well with the PAL color encoding system since this has only half the vertical chrominance resolution of NTSC.

Chroma subsampling suffers from two main types of artifacts, causing degradation more noticeable than intended where colors change abruptly: gamma error and out-of-gamut colors.

--------------------------------------------------
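As a quick sanity check of the "one-third" figure quoted above (toy numpy sketch, counting raw samples only and ignoring compression):

```python
import numpy as np

# Count raw samples per 4K frame: luma at every pixel, each chroma plane
# at half the horizontal rate for 4:2:2.
h, w = 2160, 3840
samples_444 = 3 * h * w                     # Y + Cb + Cr all at full resolution
samples_422 = h * w + 2 * (h * (w // 2))    # Y full, Cb/Cr on every other column
print(samples_422 / samples_444)            # 0.666... -> one-third of the bandwidth saved

# The subsampling itself on one chroma row: average each horizontal pair.
cb_row = np.array([10, 30, 200, 220], dtype=float)
print(cb_row.reshape(-1, 2).mean(axis=1))   # [ 20. 210.] -> half the horizontal chroma resolution
```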

......The other is the bit depth, which is something like a color resolution in itself:

[image: bit-depth comparison chart]
 
So what is the model number of this magical 10,000 nit HDR monitor that Chief was able to spot banding on? :LOL: Anyway, given that OLEDs cap out around 800 nits, is there really going to be a massive reduction in banding when using this 12bit downsample method over just using 10bit?
 
So what is the model number of this magical 10,000 nit HDR monitor that Chief was able to spot banding on? :LOL: Anyway, given that OLEDs cap out around 800 nits, is there really going to be a massive reduction in banding when using this 12bit downsample method over just using 10bit?

Hah I know that sounds crazy. 10k nit color volume.

I think most people would be happy running 10bit dithered, which results in a pseudo 12bit signal, rather than 12bit native, since dithering masks banding - even though a 10bit native signal's banding is already much finer than 8bit native's thanks to the much higher color "resolution" to begin with. "Downsampling" 12bit native to 10bit might have less banding than 10bit native too, but dithering has almost an anti-aliasing type effect and seems to mask banding completely, even if it's no longer technically 100% pixel accurate to the decimal.

I don't know what the difference would be in bandwidth between
(444chroma 120hz 4k) 10bit native
and
(444chroma 120hz 4k) 10bit dithered "pseudo 12bit"

..in relation to the 40gbps limit. 10bit dithered should be less than 12bit native size wise but idk by how much. Hopefully that would work fitting in 40gbps but DSC could be an option potentially if necessary.

It's probably splitting hairs at that point. Mainly I'm agreeing with Monstieur now that dithered 8bit ( ~"pseudo 10bit" ) and dithered 10bit ( ~ "pseudo 12bit" if possible like downsampling) would both mask banding so would be preferable, at least for gaming - as long as no artifacts or glowy flashing were visible during high speed motion.
 
..in relation to the 40gbps limit. 10bit dithered should be less than 12bit native size wise but idk by how much. Hopefully that would work fitting in 40gbps but DSC could be an option potentially if necessary.

Dithering doesn't add a bandwidth requirement; 10bit is 10bit whether the source sends rapidly changing colors or not. If 4k/120hz/12bit 4:4:4 fits in 48Gbps, then 4k/120hz/10bit 4:4:4 fits in 40Gbps: 12bit is 20% more data than 10bit, and 48Gbps is 20% more than 40Gbps.
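Rough numbers to back that up (active pixel data only; this ignores blanking intervals and HDMI 2.1 FRL coding overhead, so the real link requirements are somewhat higher):

```python
def data_rate_gbps(width, height, hz, bits_per_channel, channels=3):
    """Uncompressed pixel data rate, ignoring blanking and link-coding overhead."""
    return width * height * hz * bits_per_channel * channels / 1e9

for bpc in (8, 10, 12):
    print(f"4k/120hz/{bpc}bit 4:4:4 ~ {data_rate_gbps(3840, 2160, 120, bpc):.1f} Gbps")

# 8bit ~23.9 Gbps, 10bit ~29.9 Gbps, 12bit ~35.8 Gbps.
# 12bit is exactly 20% more data than 10bit - the same ratio as 48Gbps vs 40Gbps -
# and dithering only changes pixel values, not how many bits each pixel takes.
```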
 
OK, I did the switch to 4K 60Hz 8-bit RGB, then went back to 4K 120Hz 12-bit RGB, and it kept BT.2020 for HDR. So now I'm running at 4K 120Hz 12-bit RGB full dynamic range with HDR on and no washed out color. Just not sure how long this will last until the adapter resets itself.
120 Hz 12-bit RGB may work on the C9 but not on the CX. Check that the adapter is not falling back to YCbCr420 in the HDMI info screen on the TV. The C9 will probably also downsample it to 4:2:2.

Full chroma color with no subsampling (chroma = color separate from luminance info) for sure, and I guess with the adapter at 60Hz you can get a "full" 8bit + dithering for near 10bit.
The CX renders 4K 60 Hz 10-bit RGB as 4:4:4 with the adapter. However, Windows won't enable dithering at 10-bit; you'll have to resort to a registry hack to force dithering on. There may be additional issues where applications detect the signal bit-depth and only output 10-bit, making the dithering useless - it would only work on certain surfaces and APIs.

as long as no artifacts or glowy flashing were visible during high speed motion.
I've been playing CS:GO and other FPS games on the Predator X27 at 120 Hz 8-bit RGB with dithering for over a year. I've never noticed any artifacts.
 
I'm definitely seeing it zoom slightly when enabling/disabling, and definitely seeing no black border on any side when a full-screen image is displayed (there would have to be one for it to just shift the image without overscanning).
One edge of the image is clipped off-screen and the image remains pixel-perfect at 1:1. It's been like this since my old B7, at least with my PC as the source.
 
120 Hz 12-bit RGB may work on the C9 but not on the CX. Check that the adapter is not falling back to YCbCr420 in the HDMI info screen on the TV.
The C9 will probably also downsample it to 4:2:2. You may want to check the maximum resolution and refresh rate at which the TV does 4:4:4 for an RGB input. On the CX, the maximum seems to be 4K 60 Hz 10-bit RGB before it downsamples.

How do I check the HDMI info screen on my E9?
 