LG 48CX

Whoa, hold on a minute. The black level bug happens with VRR regardless of HDR or SDR? At first I thought this issue was limited to HDR, but if it happens in SDR as well then this is finally the dealbreaker that stops me from getting a CX.

Why wouldn’t rtings mention that in their review? Can someone who owns a C9 or CX speak more to this issue? I want to know how big of a deal this is before I make a judgment.
 
Why wouldn’t rtings mention that in their review? Can someone who owns a C9 or CX speak more to this issue? I want to know how big of a deal this is before I make a judgment.



This does not look good. If someone confirms the black level VRR bug happens in SDR then I will not be getting a CX. OLED isn't OLED without true blacks. I could forgive everything else from the uncompressed audio to the 40Gbps to even the lack of LFC below 40Hz but this I cannot accept.
 
This does not look good. If someone confirms the black level VRR bug happens in SDR then I will not be getting a CX. OLED isn't OLED without true blacks. I could forgive everything else from the uncompressed audio to the 40Gbps to even the lack of LFC below 40Hz but this I cannot accept.

I've played multiple games, and I never see black levels elevated. I've turned up brightness to the max, and in all cases, blacks are inky black. I just tested Modern Warfare, and it looks insanely good, especially on night levels.

I will test HDR to see if I get the same results. For reference, I never play games on my PC in HDR as I find myself having to adjust the picture too much. I'd rather just set it and forget it, which is why I game exclusively in SDR.
 
I've played multiple games, and I never see black levels elevated. I've turned up brightness to the max, and in all cases, blacks are inky black. I just tested Modern Warfare, and it looks insanely good, especially on night levels.

Just making sure, but are you playing with G-Sync enabled?
 


This does not look good. If someone confirms the black level VRR bug happens in SDR then I will not be getting a CX. OLED isn't OLED without true blacks. I could forgive everything else from the uncompressed audio to the 40Gbps to even the lack of LFC below 40Hz but this I cannot accept.


Thanks man. I admittedly hadn't watched all of the videos that were posted. So, VRR basically turns his set into an IPS monitor. :)

Am definitely interested in madpistol's continued testing and input. Surely if this bug exists for everyone, LG will fix it. That's crazy.
 
What's weird is that if this known VRR bug has been around since last year on the C9... how come there has been no mention of it on here until recently? Out of all the earlier discussion about the CX, nobody ever once mentioned that the C9 has a VRR black level bug until like 2 days ago. This makes me question whether the issue is as obvious as it seems.
 
What's weird is that if this known VRR bug has been around since last year on the C9... how come there has been no mention of it on here until recently? Out of all the earlier discussion about the CX, nobody ever once mentioned that the C9 has a VRR black level bug until like 2 days ago. This makes me question whether the issue is as obvious as it seems.

Just to note, there is a "black level" setting buried in the menus. "Auto" seems to default to the "High" setting, which seemingly elevates the brightness level of near-black parts of the picture. "Low" is more like OLED black, in that it creates a super dark image.

My guess is that "High" is for everyone that was crying about black crush, and "Low" is for everyone that wants the deep blacks OLED is known for.

Gotta give props to LG; very creative way to appeal to everyone.
 
So the short version is that LG and Samsung for 2020 managed to release new TVs that are almost as good as the 2019 ones? Seems like 2020 won't be considered the best year ever for TVs...

Like others have mentioned already, only the 48" seems to be kind of relevant this year. Not that the 2020 LGs are bad, just not better than previous versions.
 
This "VRR black level" thing seems like one of those near-black level issues that people who spend too much time looking at test patterns obsess over, and not a real problem.

See this tweet. OLED has always had some near-black issues and apparently LG applied some intentional black crush to hide them, but you can calibrate it out if you want. In no case does this involve black levels comparable to an LCD, and it's probably highly content dependent as well.

I don't see any evidence that this is any kind of major real-world issue.
 
Just to be sure, you should probably do some A/B testing with VRR on and off, like what was shown in the video. I did some further digging, and it seems like this issue is indeed real and LG is now aware of it. But maybe you're right and it's just down to people using incorrect settings and then blaming the TV for it.
https://twitter.com/Flatpanels/status/1260889781896183808?s=19

Look at my most recent post. There's a "Black Level" setting. "High" elevates blacks. "Low" creates an OLED black (what some people might call "black crush")

This "VRR black level" thing seems like one of those near-black level issues that people who spend too much time looking at test patterns obsess over, and not a real problem.

See this tweet. OLED has always had some near-black issues and apparently LG applied some intentional black crush to hide them, but you can calibrate it out if you want. In no case does this involve black levels comparable to an LCD, and it's probably highly content dependent as well.

I don't see any evidence that this is any kind of major real-world issue.

People will nitpick anything. I get that OLED TVs are expensive and everybody wants them to be perfect, but until LG has full control over the environment the TV is placed in (brightness levels, size, etc.), people will always find something to complain about.

Myself, using the CX as a PC monitor, I'm loving every moment of it. It is far and away the best PC monitor I've ever owned. Period.
 
Look at my most recent post. There's a "Black Level" setting. "High" elevates blacks. "Low" creates an OLED black (what some people might call "black crush")



People will nitpick anything. I get that OLED TVs are expensive and everybody wants them to be perfect, but until LG has full control over the environment the TV is placed in (brightness levels, size, etc.), people will always find something to complain about.

Myself, using the CX as a PC monitor, I'm loving every moment of it. It is far and away the best PC monitor I've ever owned. Period.

Nice. Guess that dispels the black level VRR bug which was my one and only concern.
 
OK, resume the hype train. :)

It's pretty mind-boggling that the supposed "videophiles" over at AVSForum didn't pick up on this and see it for what it is. Or maybe they're just extreme nitpickers as mentioned.
 
The black level setting is for whether your source device is sending limited or full RGB, btw. If you're using it with a PC, there should be a setting to change it, but I think Nvidia defaults to Limited (which pairs correctly with the "Low" black level setting) because that's what most TVs expect. In theory, using full RGB output along with the "High" black level setting would produce the best contrast, though. It doesn't matter too much in the end as long as you're using the right pair of settings.
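On the limited-vs-full point: the math behind a range mismatch is simple enough to sketch. These are the standard 8-bit video-range formulas, not anything LG-specific:

```python
def limited_to_full(v):
    """Expand limited-range (16-235) 8-bit video to full range (0-255)."""
    return round(max(0, min(255, (v - 16) * 255 / 219)))

def full_to_limited(v):
    """Compress full-range (0-255) 8-bit video into limited range (16-235)."""
    return round(16 + v * 219 / 255)

# With the correct pairing, black stays black and white stays white:
print(limited_to_full(16))   # -> 0
print(limited_to_full(235))  # -> 255
```

If the TV expects full range but receives limited, black arrives as code 16 and displays as dark grey; the reverse pairing clips everything below code 16 and above 235, crushing shadow and highlight detail.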
 
OK, resume the hype train. :)

It's pretty mind-boggling that the supposed "videophiles" over at AVSForum didn't pick up on this and see it for what it is. Or maybe they're just extreme nitpickers as mentioned.

This makes me wonder if the whole Dolby Vision/HDR10 bug is actually a bug too or just people using the wrong settings lmao.
 
This makes me wonder if the whole Dolby Vision/HDR10 bug is actually a bug too or just people using the wrong settings lmao.

Here's the post from a calibrator mentioning the issue. Does not explain the cause, but it sounds like it is a bug with DV content specifically, and all the issues are solvable by calibrating. It also sounds like they don't show up on all panels, so I'd guess this is a combination of panel variance and again some kind of firmware compensation for near-black issues.
 
I was just going off what has been posted and what was in that video, but I think there is a difference between G-Sync VRR, HDMI VRR, and FreeSync, so if the VRR black level bug is actually a thing, it might not affect all of them. I'm definitely as interested in confirmation of that issue as anyone, since it was said to affect the C9, and I'd assume the E9 as well as the CX. As in, it was said there has been a black level problem ever since the VRR firmware was released in the first place.

People like to exaggerate minor things.

Compromised black levels are not a minor thing if that turns out to be an issue.
- If confirmed: claims that Dolby Vision black levels are broken on the CX but not on the C9/E9, so no properly working Dolby Vision support. HDR10 black levels reportedly work correctly, though.
- If confirmed: VRR black levels are broken and shown brighter than the deep OLED blacks you see in videos.

- 10-bit native 4K 120Hz 4:4:4 over HDMI will probably get supported by Nvidia, but if it doesn't, that means banding, or 8-bit dithering that loses source detail while still banding a little. That is, not supporting unaltered and uncompressed source video.

- Lack of uncompressed surround passthrough on an HDR multimedia powerhouse OLED is an issue for people with eARC receivers who, again, don't want to compress source material needlessly. RTINGS did say TrueHD and Atmos work, but any DTS-HD Master material you have is out of luck.

- No LFC could be crippling for the next generation of (low frame rate) consoles that will enjoy VRR on other HDMI 2.1 TVs going forward. On PC, if you keep, for example, a 70fps average (100fps or better is really best for 120Hz), you shouldn't drop below 40fps in your frame-rate graph, so you could avoid issues from dropping below the VRR Hz range.

- The input lag increase at 4K VRR, 23.3ms as published by the RTINGS review, is a little disappointing but not a total dealbreaker. Hopefully they can cut that down in firmware updates, and especially off of future HDMI 2.1 GPUs.


Really, I hope they eventually fix most of these issues. I'm not trying to hate on this monitor or spread FUD; I want to know the truth on each issue either way. Hopefully they can tune it up with some firmware fixes by the time the 3080 Ti drops, or by the November deals or whatever. Otherwise I'll probably get a 55" C9 or E9 on sale.
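On the LFC point above: low framerate compensation is just integer frame multiplication, so the idea fits in a short sketch. The 40-120Hz window matches the VRR range discussed in this thread; the function itself is illustrative, not any vendor's actual implementation:

```python
def lfc_refresh_rate(fps, vrr_min=40, vrr_max=120):
    """Pick a panel refresh rate (rate, frames_shown) for a given frame rate.

    Inside the VRR window the panel simply follows the frame rate. Below it,
    LFC repeats each frame an integer number of times so the effective
    refresh lands back inside the window. Returns None when no integer
    multiple fits (i.e. tearing/stutter territory).
    """
    if fps > vrr_max:
        return vrr_max, 1          # panel caps at its maximum refresh
    if fps >= vrr_min:
        return fps, 1              # inside the VRR window: follow the frame rate
    multiple = 2
    while fps * multiple <= vrr_max:
        if fps * multiple >= vrr_min:
            return fps * multiple, multiple
        multiple += 1
    return None

print(lfc_refresh_rate(30))  # -> (60, 2): each frame shown twice
print(lfc_refresh_rate(25))  # -> (50, 2)
```

Note that a window whose maximum is less than twice its minimum (say 48-60Hz) has no valid multiple for many frame rates, which is why LFC needs a wide VRR range to work at all.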
 
I was just going off what has been posted and what was in that video, but I think there is a difference between G-Sync VRR, HDMI VRR, and FreeSync, so if the VRR black level bug is actually a thing, it might not affect all of them. I'm definitely as interested in confirmation of that issue as anyone, since it was said to affect the C9, and I'd assume the E9 as well as the CX. As in, it was said there has been a black level problem ever since the VRR firmware was released in the first place.

I don't actually think there is a black level issue with VRR/game mode. I actually think this was an engineering/design decision. VRR is very much a gaming tech, and the reason most people enable it is because they want smooth, fluid motion and clarity of picture. Going along with that reasoning, absolute blacks afforded by OLED tech would smother details that could possibly be seen by other displays, thereby actually putting users of LG's OLED TVs at a disadvantage; not good when you consider how much these things cost.

The solution?


Slightly elevate blacks in VRR/game mode.


This does several things.

  1. Increase visibility in dark areas
  2. Send more information to a person's eyes (humans can't see dark very well)
  3. Lower pixel response times (possibly)
The best part is that it probably costs nothing for the onboard video processor in the TV because LG's OLEDs have a WRGB subpixel structure; just turn the White subpixel up a few levels from "off" and you're done.

Honestly, I think it's an amazingly simple solution to a very complex demographic of consumers: gamers.
 
HDR is supposed to be based on a set of absolute values. HDR is supposed to allow the deepest blacks while still showing a much higher color volume ceiling, one that not only shows more realistic peak brightness but accurate detail in color throughout. Those color levels are supposed to be absolute on a scale, with that scale's palette used as the author of the material intended it to be shown. HDR is not varied or traded off +/- globally the way an SDR screen's brightness slider works. I refuse to accept that the reported worse black levels with VRR active, if true, are a "feature".
 
HDR is supposed to be based on a set of absolute values. HDR is supposed to allow the deepest blacks while still showing a much higher color volume ceiling, one that not only shows more realistic peak brightness but accurate detail in color throughout. Those color levels are supposed to be absolute on a scale, with that scale's palette used as the author of the material intended it to be shown. HDR is not varied or traded off +/- globally the way an SDR screen's brightness slider works. I refuse to accept that the reported worse black levels with VRR active, if true, are a "feature".

https://twitter.com/EvilBoris/status/1260336886138142723
That's actually how it's supposed to look, though, according to this guy.
 
From the comments... "Looks like shit" :eek:

(paraphrasing... he said: "looks just horrible!!!!!")

9v7w7by.png
 
*shrug* I've seen this guy appear on HDTV Test youtube channel and he seems to know his stuff so if he says that's the proper image then I would say there's a good chance he's right.
 
HDR is supposed to be based on a set of absolute values. HDR is supposed to allow the deepest blacks while still showing a much higher color volume ceiling, one that not only shows more realistic peak brightness but accurate detail in color throughout. Those color levels are supposed to be absolute on a scale, with that scale's palette used as the author of the material intended it to be shown. HDR is not varied or traded off +/- globally the way an SDR screen's brightness slider works. I refuse to accept that the reported worse black levels with VRR active, if true, are a "feature".

Yes... but no.

HDR has a set of absolute values. True. However, unlike SDR, which is calibrated exclusively to 100 nits of brightness, HDR is calibrated to (insert arbitrary number) nits of brightness. That is to say, TVs vary WILDLY on max brightness.

The CX gets up to around 700 nits of brightness in HDR... except when it doesn't.
The Samsung Q90R gets up to around 1500 nits of brightness in HDR... except when it doesn't.

You see, unlike SDR where virtually every TV currently on the market can hit and maintain 100 nits of brightness, virtually every TV on the market calculates HDR values differently, even in 100% standardized content like Dolby Vision; HDR hasn't matured enough to create a 100% standardized way of creating TVs.
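For reference, the "absolute values" both posts are arguing about come from the SMPTE ST 2084 (PQ) transfer function that HDR10 and Dolby Vision are built on: a given code value always means the same luminance in nits, and a display that can't reach it has to tone-map. A sketch of the EOTF, with the constants taken from the spec:

```python
def pq_eotf(signal):
    """SMPTE ST 2084 (PQ) EOTF: normalized code value [0, 1] -> luminance in nits.

    The mapping is absolute: 0.5 is roughly 92 nits and 1.0 is 10,000 nits
    on *every* display, which is why panels that can't reach a given level
    must tone-map rather than simply rescale like an SDR brightness slider.
    """
    m1 = 2610 / 16384
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32
    p = signal ** (1 / m2)
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)
```

The disagreement in this thread is then really about what a display does with values it *can't* reproduce, since everything it can reproduce has one defined meaning.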
 
I dunno about it being intentional because like Samsung, LG could spend 5 minutes and implement a "black stabilizer" or "game shadow boost" setting to give people a choice.

I think it's some unavoidable hardware characteristic of the panels similar to how most mobile OLED screens suffer from green tint or gamma issues at 90/120hz but are perfectly fine at 60hz.

Also, per AVS, LG engineers were just recently brought aware of the issue so chances of it being intentional are slim.

Both the Q90R and CX have a "black stabilizer" or "game shadow boost" setting on their TVs.

CX:
CX black level.jpg


Q90R:
Q90r gamemode.jpg


I'm sure other manufacturers also have similar tech on their TVs.

As for LG engineers just being made aware of the issue... lol. Not a chance. Corporate speak 101: Appease your customers by telling them what they want to hear. The people that programmed the algorithms that determine picture settings are wicked smart and probably have more seat time in front of these TVs than all of us combined. No way they were just made aware of this item.
 
Yes... but no.

HDR has a set of absolute values. True. However, unlike SDR, which is calibrated exclusively to 100 nits of brightness, HDR is calibrated to (insert arbitrary number) nits of brightness. That is to say, TVs vary WILDLY on max brightness.

The CX gets up to around 700 nits of brightness in HDR... except when it doesn't.
The Samsung Q90R gets up to around 1500 nits of brightness in HDR... except when it doesn't.

You see, unlike SDR where virtually every TV currently on the market can hit and maintain 100 nits of brightness, virtually every TV on the market calculates HDR values differently, even in 100% standardized content like Dolby Vision; HDR hasn't matured enough to create a 100% standardized way of creating TVs.


Yes, they have to tone map down due to the limits. HDR is mastered to 1000 (or 4000, or 10,000, depending). That doesn't explain why a movie tone mapped to ~750-nit peaks on a small % of the screen has ultra-deep black levels and detail in blacks, while still having detail in colors up to those as-high-as-able nit ceilings, yet a game can't.

GdpJb0T.png

Normalized Rec 2020 Coverage ITP: 61.0%
10,000 cd/m² Rec 2020 Coverage ITP: 33.5%

Despite having an excellent color gamut, the CX's color volume is only decent. It can't produce extremely bright colors, but with a perfect contrast ratio, it can display dark, saturated colors.
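The distinction being argued here (highlight tone mapping vs. black levels) can be sketched with a toy roll-off curve: mapping a 10,000-nit scale down to a ~750-nit panel only compresses the top of the range, while blacks and shadows can pass through untouched. This is purely illustrative, not LG's actual tone mapper:

```python
def tone_map(nits, display_peak=750, knee=0.75):
    """Illustrative HDR tone map: track the source 1:1 up to a knee, then
    roll off smoothly so everything up to 10,000 nits still fits under the
    display's peak. Blacks and midtones are never altered."""
    knee_nits = display_peak * knee
    if nits <= knee_nits:
        return nits  # shadows and midtones pass through unchanged
    # Compress the (potentially huge) range above the knee into the headroom.
    excess = nits - knee_nits
    headroom = display_peak - knee_nits
    return knee_nits + headroom * excess / (excess + headroom)

print(tone_map(100))               # -> 100 (below the knee: blacks stay black)
print(round(tone_map(4000), 1))    # -> 740.3 (compressed under the 750-nit peak)
```

Which is the point being made: having to tone map highlights down to the panel's peak doesn't force elevated blacks, so the VRR behavior needs a different explanation.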

----------------------------------------

VRR on these TVs has only been available since October-November 2019.
Both the Q90R and CX have a "black stabilizer" or "game shadow boost" setting on their TVs.

CX:
View attachment 246957


Q90R:
View attachment 246958


I'm sure other manufacturers also have similar tech on their TVs.

As for LG engineers just being made aware of the issue... lol. Not a chance. Corporate speak 101: Appease your customers by telling them what they want to hear. The people that programmed the algorithms that determine picture settings are wicked smart and probably have more seat time in front of these TVs than all of us combined. No way they were just made aware of this item.

I don't know if them knowing about it ahead of time would be a compliment to them or an insult considering how it looks.
 
I dunno about it being intentional because like Samsung, LG could spend 5 minutes and implement a "black stabilizer" or "game shadow boost" setting to give people a choice.

I think it's some unavoidable hardware characteristic of the panels similar to how most mobile OLED screens suffer from green tint or gamma issues at 90/120hz but are perfectly fine at 60hz.

Also, per AVS, LG engineers were just recently brought aware of the issue so chances of it being intentional are slim.

It reminds me of how the OLED panels in VR headsets either have not so deep blacks (so called "mura effect" because it looks a bit noisy when the panel is right up your face) or horrible black smearing. Maybe it's related? But what about the overpriced Alienware monitor? How did that one perform?

This is a big bummer for me though, not gonna lie. I wasn't gonna be using VRR on this TV for some time though since I'm on a 1080 ti and waiting for Ampere. But between this and the reportedly non existent LFC eh... I might not purchase it as soon as it's available, as I originally intended. I'd like to know a bit more about what can be fixed through software and what is really something we'll just have to live with.
 
It reminds me of how the OLED panels in VR headsets either have not so deep blacks (so called "mura effect" because it looks a bit noisy when the panel is right up your face) or horrible black smearing. Maybe it's related? But what about the overpriced Alienware monitor? How did that one perform?

This is a big bummer for me though, not gonna lie. I wasn't gonna be using VRR on this TV for some time though since I'm on a 1080 ti and waiting for Ampere. But between this and the reportedly non existent LFC eh... I might not purchase it as soon as it's available, as I originally intended. I'd like to know a bit more about what can be fixed through software and what is really something we'll just have to live with.

If you weren't going to be using VRR, then how does the lack of LFC even affect you in the first place? And when you do get Ampere, I'm pretty sure you won't be dropping below 40fps that often unless you were planning to get an RTX 3060 and not a 3080 Ti. As for the black level issue, eh, I'll see if I can back up madpistol's claims once I get one.
 
How does RTINGS do their input lag tests?

It seems weird that the higher the resolution, the higher the input lag, and that VRR has higher input lag. It makes me think it has more to do with the equipment they used than the input lag of the TV itself.
VRR has variable input lag depending on the framerate, and if they're using a PC, it matters whether vsync was enabled, etc.

Without knowing how they're doing their tests, I suspect there isn't actually more lag added by the TV.
 
If you weren't going to be using VRR, then how does the lack of LFC even affect you in the first place? And when you do get Ampere, I'm pretty sure you won't be dropping below 40fps that often unless you were planning to get an RTX 3060 and not a 3080 Ti. As for the black level issue, eh, I'll see if I can back up madpistol's claims once I get one.

It will affect me in the future, is what I mean. I play plenty of older/poorly coded games that will simply never run at a great framerate even with an RTX 3080 Ti and whatever CPU I pair it with, so it matters to me.
 
It will affect me in the future, is what I mean. I play plenty of older/poorly coded games that will simply never run at a great framerate even with an RTX 3080 Ti and whatever CPU I pair it with, so it matters to me.

Yikes sub 40fps with or without VRR is still a terrible experience IMO. But if you plan to play tons of games at that frame rate then I guess it's best to look elsewhere.
 
Yikes sub 40fps with or without VRR is still a terrible experience IMO. But if you plan to play tons of games at that frame rate then I guess it's best to look elsewhere.

Yeah sub 40 is terrible but proper VRR makes it bearable to some extent (also depends greatly on the type of game - I can't even stand a stable 60fps in a shooter for example!). Just not looking forward to the tearing and/or stuttering when I drop below 40 which I know will sometimes happen. But yeah I can of course use a different screen (such as my current one) if it comes down to it. Don't get me wrong, this is far from being the #1 dealbreaker to me. But I'd really like that "perfect" screen that works for everything I do.

So I'm in the same boat as elvn and have my reservations regarding this TV based on what I am reading. We shall see how things pan out!
 
Yikes sub 40fps with or without VRR is still a terrible experience IMO. But if you plan to play tons of games at that frame rate then I guess it's best to look elsewhere.

I don't agree with this at all. I've played tons of games that run at 30 fps on PS4 Pro or Nintendo Switch with my LG C9 and it's honestly fine. Of course I would rather have them be 60+ fps with VRR etc but that's just not the reality with current consoles. You quickly get used to it. I have thoroughly enjoyed playing Luigi's Mansion 3 on Switch with the C9, despite it running at sub-1080p at 30 fps.

As for black stabilizer, my Samsung CRG9 has one and with higher settings it mostly crushes blacks compared to my LG C9. I tried this in Shadow of the Tomb Raider and the C9 was able to show a lot more detail in dark areas. I would have to test again if that was a bug caused by using VRR on the C9 but in any case I preferred how that looked. Personally I have never been a stickler for having the absolute best black level and I would rather take raised black levels over crushed black detail.

LG OLEDs have a bunch of issues of their own besides burn-in and size. Personally anything on the LG C9 has just not bothered me at all during the time I've owned one. It still has great contrast, response time, resolution, HDR, supports G-Sync etc.
 
It sounds like you love your C9. The C9 and E9 supporting more than the CX currently is my main concern. I'm leaning toward getting a 55" C9 or E9 in november deals if they still check more of the boxes. Will see how the CX support looks by the time the 3080ti drops otherwise.

https://www.rtings.com/tv/tools/compare/lg-c9-vs-lg-cx/802/10619?usage=10135&threshold=0.1

https://www.rtings.com/tv/tools/compare/lg-e9-vs-lg-cx/907/10619?usage=10135&threshold=0.1



48" would be nice, but I can move my desk back farther if I'm already setting up for a long view. Used singly as a monitor, a 48" TV is about a 40" (~3.3') minimum viewing distance to me; a 55" TV as a monitor is ~46", or 3.8' to 4'. Six inches isn't going to make that big of a difference <insert bawdy joke here>. It would make my monitor array a little different but still doable.

Using more than one monitor, I'd bump that viewing distance up a little in both scenarios so 48" TV would be more like 48" (4') viewing distance and 55" TV would be 55" (4.5') away. I'll probably have to bump the scaling up one on the side monitor(s) losing a little real-estate but it should still work really well.

48" TV ~~> 40" to 48" away

55" TV ~~> 46" to 55" away .. depending what I'm doing, since my island desk is on caster wheels.
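The distance figures above scale linearly with the diagonal, since both sizes are 4K panels; a quick sketch of the arithmetic (the 40" baseline is just the minimum mentioned above, not any standard):

```python
import math

def ppi(diag_inches, w_px=3840, h_px=2160):
    """Pixels per inch of a 4K panel with the given diagonal."""
    return math.hypot(w_px, h_px) / diag_inches

def equivalent_distance(base_diag, base_dist, new_diag):
    """Scale viewing distance in proportion to the diagonal, so the panel
    subtends the same visual angle (same apparent size and pixel density)."""
    return base_dist * new_diag / base_diag

print(round(ppi(48), 1))                         # -> 91.8 ppi at 48"
print(round(ppi(55), 1))                         # -> 80.1 ppi at 55"
print(round(equivalent_distance(48, 40, 55), 1)) # -> 45.8": matches the ~46" above
```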
---

A7gEgdY.png
 
If the black level is correct in HDR, then for PC use we can just leave HDR on permanently *shrug*. That has worked well for me on the PG27UQ; the HDR-SDR mapping seems to work fine lately.
(I ordered a 48CX, arriving late May or early June. I can test input lag relative to the PG27UQ when it arrives, since I have proper numbers for that from a 960fps camera.)
 
If the black level is correct in HDR, then for PC use we can just leave HDR on permanently *shrug*. That has worked well for me on the PG27UQ; the HDR-SDR mapping seems to work fine lately.
(I ordered a 48CX, arriving late May or early June. I can test input lag relative to the PG27UQ when it arrives, since I have proper numbers for that from a 960fps camera.)

I would not recommend this, even though visually it looks fine. It means you will most likely accelerate burn-in, as the display will constantly be at a high brightness level. It's still better to switch to SDR when you are not viewing HDR content and reduce the brightness of the display.

It sounds like you love your C9. The C9 and E9 supporting more than the CX currently is my main concern. I'm leaning toward getting a 55" C9 or E9 in november deals if they still check more of the boxes. Will see how the CX support looks by the time the 3080ti drops otherwise.

I think the smaller 48" model would still be a lot easier to manage. My 49" super ultrawide is about the same width as a 55" 16:9 and having something that is the same width but twice as tall would be quite overwhelming even from further away.
 
I would not recommend this, even though visually it looks fine. It means you will most likely accelerate burn-in, as the display will constantly be at a high brightness level. It's still better to switch to SDR when you are not viewing HDR content and reduce the brightness of the display.
Windows has a slider for HDR-SDR brightness adjustment, so you will have the exact same brightness as you would in normal SDR.
 
Wish the RTINGS review was a bit more detailed and pointed out the issues more, but oh well. I guess I have no reason to "upgrade" my C9 to the CX. I would have preferred the 48" over the 55" size for slightly better PPI, but it doesn't really affect me since I'm couch-PC'ing it with a wireless keyboard and mouse and a nice recliner (only as far away as where my feet won't touch the screen while reclined, so I'm fairly close to the screen). No point paying more for a gimped HDMI 2.1 port, no DTS encoder (doesn't really affect me), zero picture quality improvements outside of panel lottery, BFI at 120Hz not applicable on the CX anyway, and all the same issues the C9 has. The C9 even scores the same or 0.1 point higher in all categories vs. the CX per the RTINGS review:

C9 -> CX
1590086427894.png
 