LG 48CX

I've seen some posts recently about lowering the OLED Light or Brightness setting to mitigate the gamma shift issue with VRR. This will obviously make the overall picture dimmer, but I'm in a pitch-black room so I don't mind that.

Can anyone confirm which of those two settings actually mitigates it to a negligible level, and at what value?
 
I don't see how OLED Light would help with the gamma shift. But lowering it does help with burn-in and eye fatigue in a dark room.

The separate setting called "Brightness" can be used to hide the gamma shift, but you are effectively crushing the black details when you do that. It also seems to make the whole picture a bit darker and throw off the calibration, though maybe at -1 or -2 it's not too bad; I haven't measured it. You can also use a gamma setting of 2.4 and probably never see the gamma shift, but again you are losing a lot of dark detail when the majority of content is mastered for 2.2.
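
To put rough numbers on that dark-detail loss (my own back-of-the-envelope math, using a simple power-law gamma; real TV EOTFs like BT.1886 behave a bit differently near black):

Code:
# Relative output luminance at low signal levels, gamma 2.2 vs 2.4.
# Simple power-law model only -- real EOTFs differ near black.
for signal in (0.05, 0.10, 0.20):        # normalized input signal
    l22 = signal ** 2.2                  # luminance at gamma 2.2
    l24 = signal ** 2.4                  # luminance at gamma 2.4
    print(f"signal {signal:.2f}: 2.2 -> {l22:.4f}, 2.4 -> {l24:.4f} "
          f"({l24 / l22:.0%} of the 2.2 level)")

At a 10% input signal, gamma 2.4 puts out only about 63% of the light 2.2 does, which is why shadow detail mastered for 2.2 gets swallowed.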
 
I've been trying to determine how serious the gamma shift with VRR is and whether it means I should take the TV back. In Dead by Daylight, as I move between sections of the menus, the fps instantly drops to 100 for a brief moment and then back to 120. In the gif below you can see that this causes a flash of sorts. I don't remember this happening with my last monitor.

CX Gamma Shift?

So it looks like this might be an exaggerated example of the gamma shift, since the fps drop is so fast and significant. When I play Destiny the fps bounces all over the place, but the movement is more gradual than big spikes like this, which is why I'm guessing I don't notice it in that game.

Can someone confirm, or correct me if I'm wrong, that this is a result of the gamma shift issue?
 
IMO, there have been way too many inconsistent reports of the VRR gamma bug. It ranges from "near blacks get raised and stay raised" to "near blacks flashing back and forth from one value to the next"... so which is it? That flashing you see, I actually see all the time in loading screens when my fps is around 20, lol. During normal gameplay I don't notice any flashing.
 
I don't care much about what happens in menus or loading screens... I've often seen flickering in those situations even with LCD monitors and a genuine G-Sync module. The only thing I care about is gameplay.

The only issues I can report with gameplay so far are the noticeably raised near-blacks in some games (like Control, which runs around 60fps and looks really bad with VRR, IMO) and the stuttering around 100-120fps that will presumably be fixed eventually (I haven't bothered trying all the workarounds mentioned yet). I think this will still remain the best gaming monitor for almost everyone out there, especially once they fix the high-fps stuttering. The raised near-blacks are not noticeable everywhere all the time, and it still looks way better than any LCD could ever dream of. You can also just turn off VRR, I suppose; a lot of people don't even mind the mismatched fps/Hz feeling, and many are using VRR monitors without VRR actually working because of settings or software issues.
 
So I have an odd observation here.

I've been noticing some judder and stuttering with G-SYNC on my RTX 3080 and CX. I tried dropping the settings to 4K120 4:2:0 8-bit, which falls within HDMI 2.0b bandwidth, so it "should" resolve the stuttering issue. It still feels... off. It's working, but it doesn't feel smooth compared to what I'm used to.

So I decided to put my RTX 2070 Super back in and test it at the same settings (4K120 4:2:0 8-bit) in Euro Truck Simulator (constant motion, easy to detect stuttering), and it is 100% smoother on the RTX 2070 Super. I thought this might come down to resolution, so I set it to 1440p120 RGB 8-bit, and again, it's 100% smooth on the 2070 Super.

I'm starting to believe that LG's HDMI 2.1 VRR implementation on the CX is not ready for prime time. G-SYNC/VRR on the 2070 Super is literally perfect, but on the 3080 it does not feel smooth at all. This isn't good, as I am now contemplating selling the CX in favor of a 4K monitor with a G-Sync module and DisplayPort. I paid $1800 for the CX 55. There's zero reason a 2070 Super should feel smoother than an RTX 3080.

I really hate beta testing bleeding edge tech... seriously, that's what it feels like.
 
Here's my guess: with the 3080 you may have NVCP set to 4:2:0 8-bit, but your game is going fullscreen and disregarding those settings. The 2070 is doing the same thing, but not hitting the fps range that has stutter issues (111-119). So try this first: see if the 3080 is hitting the high, bad range while the 2070 is staying in the low, smooth range. Then make sure the game is running in borderless window so that it is definitely using the NVCP settings, and see if the 3080 is good at all fps ranges.

If neither of those work, I'm selling a like-new Predator X27. lol
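
If you want to verify the fps-range theory rather than eyeball it, a quick script over a frametime log could work. A rough sketch, assuming a PresentMon-style CSV with an MsBetweenPresents column (the column name is an assumption; adjust it to whatever your capture tool writes):

Code:
import csv

# Count how much of a capture falls in the reported bad G-Sync
# range on the CX (111-119 fps).
STUTTER_LO, STUTTER_HI = 111.0, 119.0

def share_in_range(path, column="MsBetweenPresents"):
    hits = total = 0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            ms = float(row[column])
            if ms <= 0:
                continue                  # skip bogus samples
            fps = 1000.0 / ms             # instantaneous fps of this frame
            total += 1
            if STUTTER_LO <= fps <= STUTTER_HI:
                hits += 1
    return hits, total

hits, total = share_in_range("capture.csv")
print(f"{hits}/{total} frames ({hits / total:.1%}) in the 111-119 fps range")

Run one capture per card at the same settings and compare the percentages.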
 
If there isn't some additional variable you aren't aware of, then same TV, same cable type, different GPUs points to something wrong in the GPUs or Nvidia drivers, not the TV. Both GPUs are doing HDMI VRR.
 
Has anyone else tried disabling auto-dimming like I have?

I'm genuinely curious, as it was driving me crazy and I thought more people would want to do it. I haven't looked back since doing it!
I'd consider it essential if you want to use the CX 48 as a monitor. It's a shame you can only do it from the service menu.

I used an old Samsung Galaxy S4 and the LG Service Menu app for this; it worked just fine. Exactly the same thing as the service remote, just no physical buttons.
 
Do you guys really notice a difference? I've seen it kick in maybe once or twice on extremely bright scenes, but not to an extent that would ever bother me. (Talking SDR gaming content; I'm waiting on availability of a next-gen GPU for modern AAA titles.)
 
Please share how you get to the service menu to disable this. I was only aware of the Energy Saving setting.
 
In my case, unchecking that box makes the Hz value stable with G-SYNC at any locked frame rate below 120 Hz, and the stuttering stops.
Logically speaking, unchecking it just disables the G-Sync setting, which makes things smooth. I don't see how this is a fix for stutter with G-Sync; it just takes G-Sync out of the picture.
 
It somehow glitches so that G-Sync is still running. I can confirm it worked for me too: the TV reports VRR on, BFI (TruMotion) is grayed out, etc.
However, without any workarounds, 4K120 10-bit 4:4:4 Limited simply works for me, no glitching needed (OLED55CX6LA, .26 firmware). RGB Full doesn't and stutters a lot. As described above, there's also less banding in 4:4:4 Limited than RGB Full on my unit/firmware.
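
For anyone wondering why Limited vs. Full could matter for banding: 8-bit limited video range only uses codes 16-235 (220 steps) against full range's 0-255 (256 steps), so any conversion between the two somewhere in the chain has to remap and round values. A toy sketch of that round-trip loss (just illustrating the quantization math, not a claim about what the CX or the driver actually does internally):

Code:
# 8-bit full range uses codes 0-255; limited video range maps
# black..white to 16-235. A round trip through the narrower range
# merges neighboring codes, which can show up as banding.
def full_to_limited(v):
    return round(16 + v * 219 / 255)

def limited_to_full(v):
    return round((v - 16) * 255 / 219)

survivors = {limited_to_full(full_to_limited(v)) for v in range(256)}
print(f"{len(survivors)} of 256 full-range codes survive the round trip")

That prints 220, i.e. 36 gradations are lost if a full-range signal gets squeezed through limited somewhere in the chain.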
 
I don't know if this has been mentioned yet, but I'm sure many are aware that the PS5 currently does not support VRR at all. It's listed in the spec sheets though, so it might be enabled eventually.
 
Using settings like the ones RTings offered in their C9 review (quoted below) to avoid ABL should probably work on the CX while keeping your screen out of the higher burn-in-risk ranges. These settings wouldn't work for HDR though, and would be cutting out brighter colors in general.

I don't know if it would turn off auto dimming entirely; it might or might not, depending on how the bottom value of the dimming steps compares to these settings. Either way, it would avoid the ranges that kick ABL on by keeping your settings in a range that's safer vs. burn-in.

The C9 has a new Peak Brightness setting, which adjusts how the ABL performs. Setting this to 'Off' results in most scenes being displayed at around 303 cd/m², unless the entire screen is bright, in which case the luminosity drops to around 139 cd/m². Increasing this setting to 'Low', 'Med', or 'High' increases the peak brightness of small highlights.

If ABL bothers you, setting the contrast to '80' and setting Peak Brightness to 'Off' essentially disables ABL, but the peak brightness is quite a bit lower (246-258 cd/m² in all scenes).
 
Can you set peak brightness in game mode?
 
From what I know about TVs in general, you can usually set up OSD settings under unique names and switch between them with the remote, but I'll wait for someone who owns an LG CX to reply. I'm still waiting at least a few weeks for November deals and hopefully the next firmware update. At this rate I won't have a 3000 series GPU until January, based on the release timing of the model I want and the availability of stock vs. scalpers.
 
ASBL and ABL are separate things. ASBL kicks in even at low brightness because the screen thinks there is static content.
 
But how low does ASBL go compared to the ABL-avoiding settings RTings posted? Is it several steps lower than those settings, almost like a dim screensaver overlay? Honestly asking. My laptop is set to go pretty dim when it isn't on wall power, so I imagine it's something like that, which can be quite dim if set up to be.
 
It gets quite dim, and it does this constantly in desktop use. For example, while I'm typing this message it will start to dim the display, and the only way to reset it is to show a bright window on screen so it detects that the content has changed.
 
So there are issues with 4K 120 4:4:4 and G-Sync?

I was considering this TV as an upgrade, only for my single-player games, from my LG 38GL950. But maybe I should wait now?
 
Yes. If I were you, I would wait for reviews of the next firmware update. If it isn't fixed by then, it would probably be best to avoid the CX if you plan to game a lot. Since I already have one, my fingers are surely crossed.
 
I'm waiting on the next firmware, and since I won't be able to get the 3000 series GPU I want for a while, I feel less of a rush anyway. However, I can't see myself getting an IPS or any other LCD over one of these if they fix the VRR stuttering. That is, I'll probably still buy one begrudgingly even if there are still raised blacks in VRR, though flickering would bother me too much and I'd have to return it if it flickered on me. The overall quality (stuttering and flickering aside) is too good on OLED compared to LCD. I also don't want a short panel anymore. I followed the 43" LCDs for a while, but they aren't worth the money, so choices are limited. Most ultrawides are around the same ~13" height as a 27" 16:9; the 38" ultrawides are around 14.4" tall.

For reference, ordered by height (roughly, based on raw sizes):

----------------------------------------------------------------
22.5" diagonal 16:10 .. 19.1" w x 11.9" h (1920x1200 ~ 100.60 ppi) FW900 CRT
27.0" diagonal 16:9 ... 23.5" w x 13.2" h (2560x1440 ~ 108.79 ppi)
34.0" diagonal 21:9 ... 31.4" w x 13.1" h (3440x1440 ~ 109.68 ppi)
37.5" diagonal 21:10 .. 34.6" w x 14.4" h (3840x1600 ~ 110.93 ppi)
31.5" diagonal 16:9 ... 27.5" w x 15.4" h (2560x1440 ~ 93.24 ppi) .. (3840x2160 ~ 137.68 ppi)
40.0" diagonal 16:9 ... 34.9" w x 19.6" h (3840x2160 ~ 110.15 ppi)
43.0" diagonal 16:9 ... 37.5" w x 21.1" h (3840x2160 ~ 102.46 ppi)
48.0" diagonal 16:9 ... 41.8" w x 23.5" h (3840x2160 ~ 91.79 ppi)
55.0" diagonal 16:9 ... 47.9" w x 27.0" h (3840x2160 ~ 80.11 ppi)
----------------------------------------------------------------
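
(For anyone who wants the numbers for a size that isn't listed, the width/height/ppi above all follow from the diagonal and the resolution; a quick sketch of the math:)

Code:
import math

# Width, height, and PPI from a diagonal size and resolution
# (aspect ratio is taken from the pixel grid itself).
def panel_dims(diagonal_in, res_w, res_h):
    ratio = res_w / res_h
    height = diagonal_in / math.sqrt(ratio ** 2 + 1)
    width = height * ratio
    ppi = math.sqrt(res_w ** 2 + res_h ** 2) / diagonal_in
    return width, height, ppi

for diag, w, h in ((48.0, 3840, 2160), (34.0, 3440, 1440)):
    width, height, ppi = panel_dims(diag, w, h)
    print(f'{diag}" {w}x{h} -> {width:.1f}" w x {height:.1f}" h ~ {ppi:.2f} ppi')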
 
How are you all connecting your 3080s (or any cards) to receivers for audio? Since I got my TV and 3080, I've had my GPU connected to the TV via HDMI and to my receiver via a DisplayPort-to-HDMI cable. Last night my card would only switch to 4K/60Hz, and after rebooting the PC I now get no HDMI signal as long as the DisplayPort cable is plugged in. Not sure what happened. The receiver is a Denon 4520CI. Any thoughts?
 
I'm really on the fence with this. Is this TV still worth it even with the unfixable VRR raised-gamma issue and the G-Sync issue (possibly fixable via firmware)? What's the general consensus?
 
We're really nitpicking. I for one can't see any raised-gamma issue, but then I have no idea what I'm looking for, nor am I going to go hunt for it. The G-Sync stutter only happens for me in COD: MW, and I just use a fixed refresh rate setting instead of G-Sync and can't really tell the difference. The clarity and color pop are completely unparalleled (I just tested it against the 34GN850, which I had for 2 weeks). And I use it just as a desktop monitor, so for those using it for the apps, that's just gravy on top.
 
Depends. The IQ is vastly better than any other monitor's. If having the best display device, with a few kinks that may or may not get fixed, bothers you, then you're always free to get any of the other monitors with worse IQ and no G-Sync issues. I've only had a chance to use VRR on my XB1, and it's not something I notice.
 
I haven't had any issues with VRR with my 5700 XT. I'm going to try some more games today, but everything seems smooth with it. It also seems that I'm the only one here with an AMD card; I don't believe that will be the case for long with the new cards coming out in a week. The only issue I have had is that since the last firmware update, I launch a game and get a black screen. I have to toggle game response (or whatever it's called) off and on, and then everything is fine until I turn the computer off. After the first time it doesn't do it any more. Still super annoying.
 

You'd still be subject to raised blacks when VRR is enabled while running a frame rate average whose graph dips well below 120fpsHz.

The VRR stuttering issue is on HDMI 2.1 GPUs, and so far the only HDMI 2.1 GPUs available are the Nvidia 3000 series. The 2000 series Nvidia GPUs don't have the stuttering either, since they are HDMI 2.0b like your card and so are limited in what they can output at 4K.

The 3000 series on HDMI 2.1 doesn't stutter at 8-bit or at the reduced 4:2:0 chroma resolution either, I think, so it's almost as if it's limited to running somewhat lower rates, the way a PG27UQ on DP 1.4 is. Most reports of stuttering are at 100fpsHz and higher, but that includes any average frame rate that spans above 100fpsHz (unless you cap it at 100 or lower, or set your monitor to 100Hz in display settings, I guess).

--------------------------------------------

So again, from what I understand, the VRR stuttering happens in this scenario:
an HDMI 2.1 GPU running 4K 10-bit 4:4:4 where the actual frame rate anywhere in your frame rate graph goes above 100fpsHz. That could include a 75fps-average game that swings 30fps or more.

The raised blacks issue comes from the OLED emitters being over-energized at lower fpsHz rates relative to the gamma curve calibrated at 120fpsHz, which sounds like the farther you drop below 120fpsHz, the brighter and more grey the blacks become.


An example of what the actual frame rates can look like in a game running ~75fpsHz average is in the quote below on Metro Exodus. As you can see, the actual rates hit way higher than 100 (into the 4K 10-bit 4:4:4 stutter range) and also span much lower, through 60fpsHz and some much lower "potholes" (very distant from the 120Hz gamma).

Metro Exodus 4k Ultra frame rate AVERAGE graph (no DLSS, no RTX), showing what 75.5 fps average can look like in actual frame rates.
https://hothardware.com/reviews/nvidia-geforce-rtx-3090-bfgpu-review?page=4


So it would make sense that you might be able to see the difference, especially if starting from a lower (and apparently brighter) frame rate average to begin with.
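
To make the averages-hide-the-range point concrete, here's a toy example with made-up frame times (hypothetical numbers, just for illustration): a trace that averages about 75fps can still have individual frames above 100fps (the reported stutter range) and below 60fps.

Code:
# Made-up frame times (ms) for a short capture -- illustrative only.
frametimes_ms = [8.5, 9.2, 11.3, 14.1, 16.5, 18.2, 12.0, 9.8, 20.0, 13.0]

fps = [1000.0 / ms for ms in frametimes_ms]
avg_fps = len(frametimes_ms) / (sum(frametimes_ms) / 1000.0)  # true average

print(f"average: {avg_fps:.1f} fps")                         # ~75.4 fps
print(f"frames above 100 fps: {sum(f > 100 for f in fps)}")  # 3 of 10
print(f"frames below 60 fps:  {sum(f < 60 for f in fps)}")   # 2 of 10

So a "75fps" game can be dipping into both the stutter range and the brighter low-fpsHz gamma territory on the same run.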
 
I just noticed the coolest thing. I have had Uncharted 4 for years and never played it. I figured I would give it a go on the CX.

About half an hour into the game I went to see what the picture settings were and noticed motion smoothing was on. I turned it off, and it felt like the nasty 30fps I was expecting from the PS4 Pro. I turned it back on, added some de-judder, and the game feels great.

Obviously you wouldn't want to play a game like COD that way, but for a game like Uncharted... I don't really notice the latency, and the smoothness is great.
 
Well, that makes sense. I haven't seen any gamma shifting because all the games I've played so far are games I can easily get 90+ fps in on a 2080 Ti. Given that Control is quite demanding on even the latest Ampere cards, it's easy to see why that game shows the gamma shifting so readily. So you either calibrate the TV, turn the brightness down 2 clicks from 50 to 48, or simply target 100+ fps at all times. At least there are solutions to this issue if it really bothers you that much.
 
Yeah I'll make one (or several) calibration profile of some sort for the games I won't be playing at 100+fps then. I think that's acceptable because even with VRR I frequently cap my frames way below 117 to get a more consistent experience, esp in the more immersive single player titles.
 
So RTings retested the CX with an HDMI 2.1 signal recently.

The input lag for 4K VRR is 13.9ms?

That's a bit higher than I was expecting.
 