LG 48CX

I think I figured out that the major issues I was having with eARC had to do with the Dolby Access / Dolby Atmos for home theater Windows app. Once I uninstalled it and set my sound device to 7.1, things seemed pretty stable... not perfect, but usable. With eARC switched on and the Dolby Atmos for home theater spatial setup configured, things were so bad they required a factory reset on the TV (or unplugging and reconfiguring all the HDMI ports, because they'd all go non-functional and the config would be lost).

I also got my DisplayPort to HDMI adapter this morning, and it's working perfectly connecting directly into my receiver... this is what I'll be sticking with, and hopefully eARC dies a quick death before I consider upgrading things again.
 
So BFI is better than G-Sync?

BFI is meant to eliminate eye-tracking motion blur, while G-Sync eliminates stutters and screen tears. I'd say BFI is only better if you have so much headroom that you are running games well above the G-Sync window. The G-Sync window caps out at 120Hz on the CX, so if you are running games at 200 or 300 fps, you aren't even running G-Sync anymore. That's when you can cap the fps to 120 to match 120Hz and use BFI. Some people will choose to cap their fps and still use G-Sync, though, so it's a matter of preference. I personally love BFI and will always use it when I can. Here's some slow-motion footage of camera panning with BFI on, captured on my Galaxy S20+; you can see everything is crisp and clear in motion, just like a CRT. The flicker is only visible on camera; it's not perceptible in real life.



 
Just turned on and tuned Doom Eternal for HDR on my CX 55 for the first time, and HOLY BEJEBUS, I had to turn down the OLED Light in order to get this to where I want it. At 2560x1440 120Hz HDR, it's just... glorious.
 
Am I off base in thinking that BFI should help with image retention on an OLED? Isn't it technically always changing pixels if you have that enabled?
 
Am I off base in thinking that BFI should help with image retention on an OLED? Isn't it technically always changing pixels if you have that enabled?


Yes and no. Image retention isn't a thing on OLED the same way it is on plasma. It doesn't matter how much time at once an image is on the screen. It's cumulative wearing of the OLEDs that make up the pixels, so what matters is how long an image (or part of one) has been displayed on the screen over its entire lifetime, and it's exacerbated by heat. BFI helps with this, yes, because on high it's effectively displaying the image half as much of the time. But when it is displaying it, it's brighter, causing more heat wear. I'm not sure there's any appreciable difference in wear between BFI on high and OLED Light at half the value.
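To put a number on that last comparison, here's a toy model (my own sketch, not anything from LG: it assumes cumulative wear scales linearly with drive level times time lit, while real OLED aging is non-linear with current and heat):

```python
# Toy wear model: wear ~ drive_level * fraction_of_time_lit.
# ASSUMPTION: linear wear. Real OLED aging depends non-linearly
# on current and temperature, so treat this as illustration only.

def relative_wear(duty_cycle, drive_level):
    return duty_cycle * drive_level

no_bfi   = relative_wear(1.0, 0.5)  # OLED Light at half value, lit 100% of the time
bfi_high = relative_wear(0.5, 1.0)  # BFI on high: lit ~half the time, driven ~2x harder

print(no_bfi, bfi_high)  # 0.5 0.5 -> identical under this crude model
```

Under that (admittedly crude) model the two come out even, which lines up with the "no appreciable difference" guess.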
 
1440p really looks nice on this TV. And it gives you full chroma at 120Hz (for some reason that doesn't work at 1080p unless you go down to 60Hz).

With my 1080 Ti that's a good compromise until Ampere comes out, since I have neither VRR nor integer scaling (well, I found some tricks to do that in windowed mode, actually, but it's a bit clunky).
 
I've tried 1440p but just can't tolerate the softness, aliasing, and loss of detail in the distance in games when sitting a bit over 1m away.

I do agree, though, that it's a bit more usable than other TVs I've tried.
 
I've tried 1440p but just can't tolerate the softness, aliasing, and loss of detail in the distance in games when sitting a bit over 1m away.

I do agree, though, that it's a bit more usable than other TVs I've tried.
If this TV were capable of 4K120 + HDR over HDMI 2.0b, I'd be using that. Since it isn't, I'll be waiting with bated breath for the HDMI 2.1 video cards.

As good as HDR at 120Hz looks, I ain't even mad.
 
Just turned on and tuned Doom Eternal for HDR on my CX 55 for the first time, and HOLY BEJEBUS, I had to turn down the OLED Light in order to get this to where I want it. At 2560x1440 120Hz HDR, it's just... glorious.
Try Gears 5. That has the best HDR implementation I've seen, even looks good on HDR 400.
 
Ok, so I got the Club 3D HDMI cable today, and the black screen / no signal issues with the CAC-1085 adapter are gone. So if you use a different HDMI cable that's not from Club 3D, you will get the no-signal issues whenever you change resolution.
But I still get the washed-out color when I turn on HDR in Windows, though.
 
Nice to hear; I wonder what magic they used on those HDMI cables. Makes no sense, but it is what it is.

Assuming it works fine within Windows with the desktop resolution set to 4K @ 120Hz, HDR on, and full RGB... if you then proceed to game at 1080p, 1440p, 720p, or 2K, is there no signal loss? No blank screens?

My adapter is long gone; next up is the RTX 3xxx or whatever the marketers call it.
 
I was having issues with changing resolutions with the CAC-1085; erasing all instances of 4096x2160 in the EDID with CRU and enabling GPU scaling in the NVIDIA control panel solved it (otherwise it gets stuck at 4096x2160 even with 3840x2160 selected). No change in HDMI cable, which is a 16' HDMI 2.1 cable.

2160p120 RGB at 12-bit is working fine with a 2080 Ti and C9 (set back to 10-bit afterwards, since like the CX it's a 10-bit panel). Something to try before buying a new cable if you're having issues with resolution changes.
 
The suggestion was to decrease the Windows brightness slider in the HDR settings, not the brightness on the display.

You don't want to turn ABL off completely. It protects the panel from all sorts of damage, including burn-in. I read somewhere that someone talked to an LG engineer about it and was told that ABL is a core feature of the OLED panel, and that disabling it is highly discouraged. Without it, the panel will deteriorate very quickly.
So, disable it from the menu, but don't disable it from the factory menu - as in, completely.
So, OLED = ABL. Currently, at least.
Well, despite all your advice - I'm going to try disabling the ASBL. I'm somewhat aware of the risks, and realise this might cause problems in the long term, but I'm willing to be a casualty in the cause of trying to disable it.

I've ordered the MKJ39170828 remote, as mentioned on forums for the LG C9. Will let you know how it works.
 
BFI is meant to eliminate eye-tracking motion blur, while G-Sync eliminates stutters and screen tears. I'd say BFI is only better if you have so much headroom that you are running games well above the G-Sync window. The G-Sync window caps out at 120Hz on the CX, so if you are running games at 200 or 300 fps, you aren't even running G-Sync anymore. That's when you can cap the fps to 120 to match 120Hz and use BFI. Some people will choose to cap their fps and still use G-Sync, though, so it's a matter of preference. I personally love BFI and will always use it when I can. Here's some slow-motion footage of camera panning with BFI on, captured on my Galaxy S20+; you can see everything is crisp and clear in motion, just like a CRT. The flicker is only visible on camera; it's not perceptible in real life.





Pretty neat. I've used strobing before on LCDs and I'll probably mess with it for kicks months from now on an OLED, but like you said, anyone who wants any kind of graphics eye candy in any kind of demanding game at 4K probably isn't going to be doing over 120fps average, let alone minimum. DLSS (and hopefully some interpolation tech later) could help, but graphics ceilings on demanding games aren't going to go any lower in future releases, and features like RTX might become more widely adopted, which causes a big frame rate hit.

....BFI, even if you don't register the strobing consciously, is strobing your eyes, so it can be eye-fatiguing over time like PWM (even if it's more like a static high-brightness scene where the PWM rate would not be varying; varying PWM would cause even more aggressive eye strain). The rule of thumb is that your brightness and color brightness range are reduced by the same % as the blur reduction you are using, so that can be a big issue for some and pretty much rules it out for any real HDR capability, e.g. 75% blur reduction = 75% reduced luminance. When a screen has a lower color brightness ceiling, it will clip or show white at that peak; it basically hits the ceiling as a dead end one way or another. That can cause lost detail in fields of color (detail-in-colors that cross into the high end), sort of like how screens with poor contrast and black depth lose detail-in-blacks on the bottom end. You end up with a blob of peak color instead of greater detail throughout the colored area, or, on a poor black depth + contrast screen, with a black blob in the dark areas of a scene.

...It's also worth mentioning that at 120fps solid (minimum) on 120Hz or higher monitors, you are getting 50% blur reduction compared to 60fps at 60Hz without BFI, which is something (down to a softer blur instead of smearing when moving the FoV at speed). 100fps ~> 40%, and a 100fps average means something like 85 <<--- 100 -->> 115 fps, which is still an appreciable improvement in both blur reduction and motion definition, though of course nothing like "zero" on the motion clarity aspect.
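As a quick sanity check on those percentages, here's the arithmetic spelled out (a sketch of the rule of thumb above, under the usual assumptions that perceived sample-and-hold blur scales with how long each frame is held, and that BFI trades luminance roughly 1:1 with its on-time):

```python
# Sample-and-hold blur scales with frame persistence (1/fps), so blur
# reduction vs a 60fps/60Hz baseline is 1 - 60/fps. BFI's rule of thumb:
# blur reduction ~= luminance reduction ~= 1 - duty_cycle.

def hold_blur_reduction(fps, baseline_fps=60):
    return 1 - baseline_fps / fps

for fps in (100, 120):
    print(f"{fps}fps: {hold_blur_reduction(fps):.0%} less blur than 60fps")
# 100fps: 40% less blur than 60fps
# 120fps: 50% less blur than 60fps

duty = 0.25  # hypothetical strobe with 25% on-time
print(f"BFI at {duty:.0%} duty: {1 - duty:.0%} blur reduction, ~{1 - duty:.0%} less light")
```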

I like how your video actually shows the difference, as opposed to people posting images saying "look how much better this game looks on my OLED, the blacks are so dark and the contrast and colors!" when a lot of us are viewing the posts on edge-lit LCDs lol. Even my VA TVs won't do the pictures justice. I do have an OLED tablet I can use, but I'm not always on it. The range of the photos, and any changes made in uploading, would have to match the OLED too, I think.
 
Well, despite all your advice - I'm going to try disabling the ABL. I'm somewhat aware of the risks, and realise this might cause problems in the long term, but I'm willing to be a casualty in the cause of trying to disable it.

I've ordered the MKJ39170828 remote, as mentioned on forums for the LG C9. Will let you know how it works.


I thought you could turn off ABL now. That's a deal breaker for me. I can't stand the display jumping around, I need a static configuration. Damn it LG.

According to RTings' CX review, the CX has aggressive ABL like the C9 and E9.
"The CX has decent HDR peak brightness, enough to bring out highlights in HDR. There's quite a bit of variation when displaying different content, and it gets the least bright with large areas, which is caused by the aggressive ABL. "
That is how it is with HDR.
With SDR, there is a Peak Brightness setting. Since it limits the peak brightness, it doesn't seem compatible with HDR.
From the Rtings C9 Review, regarding SDR settings concerning ABL:
"If ABL bothers you, setting the contrast to '80' and setting Peak Brightness to 'Off' essentially disables ABL, but the peak brightness is quite a bit lower (246-258 cd/m² in all scenes)."
I use dark themes and backgrounds in Windows 10, my 3rd-party file browser, 3rd-party notepad, Discord, IRC, Twitch, SMS texting app, foobar, etc. Anything I can. Even on my LCDs.

I use" turn off the lights" auto dimming light bulb web browser addon that remembers settings per site and has a white list you can add to for already dark sites like hardforum. It can also optionally show a page brightness/dimming slider that on a click toggles to a small out of the way square instead. It's dimming overlay can be set to click through so that you can click on links which is recommended.

Another browser addon I use is called "Color Changer", which shows a drop-down wheel to change the background and text colors very easily, with a slider and a ring of pebble-looking color gradients. The colors you pick are remembered per site. It also has quick on/always/never buttons. One other addon I use tries to make sure that the browser doesn't switch to a white loading background on momentary page loads, which is shockingly bright when it happens.

I have a few apps that won't change their backgrounds from white no matter what, which is very annoying. I haven't found replacements for them yet, but at least some of them don't have to be shown at all times.
 
Pretty neat. I've used strobing before on LCDs and I'll probably mess with it for kicks months from now on an OLED, but like you said, anyone who wants any kind of graphics eye candy in any kind of demanding game at 4K probably isn't going to be doing over 120fps average, let alone minimum. DLSS (and hopefully some interpolation tech later) could help, but graphics ceilings on demanding games aren't going to go any lower in future releases, and features like RTX might become more widely adopted, which causes a big frame rate hit.

....BFI, even if you don't register the strobing consciously, is strobing your eyes, so it can be eye-fatiguing over time like PWM (even if it's more like a static high-brightness scene where the PWM rate would not be varying; varying PWM would cause even more aggressive eye strain). The rule of thumb is that your brightness and color brightness range are reduced by the same % as the blur reduction you are using, so that can be a big issue for some and pretty much rules it out for any real HDR capability, e.g. 75% blur reduction = 75% reduced luminance. When a screen has a lower color brightness ceiling, it will clip or show white at that peak; it basically hits the ceiling as a dead end one way or another. That can cause lost detail in fields of color (detail-in-colors that cross into the high end), sort of like how screens with poor contrast and black depth lose detail-in-blacks on the bottom end. You end up with a blob of peak color instead of greater detail throughout the colored area, or, on a poor black depth + contrast screen, with a black blob in the dark areas of a scene.

...It's also worth mentioning that at 120fps solid (minimum) on 120Hz or higher monitors, you are getting 50% blur reduction compared to 60fps at 60Hz without BFI, which is something (down to a softer blur instead of smearing when moving the FoV at speed). 100fps ~> 40%, and a 100fps average means something like 85 <<--- 100 -->> 115 fps, which is still an appreciable improvement in both blur reduction and motion definition, though of course nothing like "zero" on the motion clarity aspect.

I like how your video actually shows the difference, as opposed to people posting images saying "look how much better this game looks on my OLED, the blacks are so dark and the contrast and colors!" when a lot of us are viewing the posts on edge-lit LCDs lol. Even my VA TVs won't do the pictures justice. I do have an OLED tablet I can use, but I'm not always on it. The range of the photos, and any changes made in uploading, would have to match the OLED too, I think.

All very valid points. It might be possible for LG to mitigate some of the brightness issues with HDR, like how Asus managed to make a monitor capable of 400 nits while strobing, when previous LCDs used to be no brighter than 150-200 nits. I don't expect them to ever fully solve the problem, though. Games becoming too demanding for 120fps is definitely the real concern, as I've already experienced it once. Back when the first LightBoost monitor came out, the Asus VG248QE, I immediately jumped on it and was easily enjoying games at 100/120fps at 1080p on a GTX 780. Then the new consoles launched (PS4/XBONE), games became increasingly demanding, and my once-mighty 780 could no longer hold 100fps at 1080p. I expect the same to happen to my 3080 Ti. Regardless, I will still use BFI whenever it is viable. It's been so long since I've experienced CRT motion.
 
All we need is an option to disable ABL; maybe they could limit it to a certain low OLED Light level, and they could even implement a warning that it voids your warranty or something. I like my displays to be around 120 nits, and I doubt the ABL goes much lower than that. I don't care about HDR either.
 
Well, despite all your advice - I'm going to try disabling the ABL. I'm somewhat aware of the risks, and realise this might cause problems in the long term, but I'm willing to be a casualty in the cause of trying to disable it.

I've ordered the MKJ39170828 remote, as mentioned on forums for the LG C9. Will let you know how it works.

If it breaks, just get another one... OLEDs are disposable, like toilet paper!
In fact, when are we getting OLED toilet paper???
 
All we need is an option to disable ABL; maybe they could limit it to a certain low OLED Light level, and they could even implement a warning that it voids your warranty or something. I like my displays to be around 120 nits, and I doubt the ABL goes much lower than that. I don't care about HDR either.


idk if you keep missing this in my posts :confused:

I keep re-posting it for all you ABL-off folks anyway, regardless of how many times you skip it.

These settings do exactly what you are talking about, and at an even higher brightness than what you are asking for. They're from the C9 review, but it should work the same.

From the Rtings C9 Review, regarding SDR settings concerning ABL:
"If ABL bothers you, setting the contrast to '80' and setting Peak Brightness to 'Off' essentially disables ABL, but the peak brightness is quite a bit lower (246-258 cd/m² in all scenes)."

Don't confuse ABL with peak-brightness % windows... all monitors have those. 2%, 5%, 25%, 50%, and 100% windows show progressively lower luminances; even the Samsung FALD LCDs ratchet down relative to the % of the screen lit. ABL is a hard drop to ~140 nits as a safety reflex against burn-in risk.

OLEDs have organics in them that are subject to heat aging them prematurely. I think of an analogy where the OLED is some kind of glassware that can take flares of stovetop burner heat but not sustained flame. The idea is to keep it on simmer, with some crackling flame flares on the HDR highlights, without blasting the whole bottom of the pan with flame for too long.

The heat issue makes me consider that I'd probably have to re-arrange my whole living room setup if I ever replace my FALD LCD TV in the living room with a big OLED TV. Currently the back of the big TV is to the big picture window in the room, so it gets direct sunlight if the vertical blinds are open. I'd also consider what running an OLED in un-air-conditioned environments in the summer might do. My living room is 76F right now, but my PC room is 83F. The PC room is on the sunny side in the morning, and the living room gets hit later in the day. I have some fans cycling the PC room heat out as of just now, but throughout the summer it will be air conditioned pretty much 24/7.

Like others have said, some hot-running displays have thicker chassis with fans inside. The Panasonic OLED TVs available in the UK put a whole metal backplane on their OLEDs internally as a heatsink. This reportedly allows them to run cooler, and thus brighter, against burn-in limitations.
 
Ok, so I got the Club 3D HDMI cable today, and the black screen / no signal issues with the CAC-1085 adapter are gone. So if you use a different HDMI cable that's not from Club 3D, you will get the no-signal issues whenever you change resolution.
But I still get the washed-out color when I turn on HDR in Windows, though.

You just cost me some $$$. Bastard!
 
I don't think you can disable ABL even in the service menu. It's the ASBL that can be disabled.
 
I don't think you can disable ABL even in the service menu. It's the ASBL that can be disabled.

Exactly. The Automatic Static Brightness Limiter is the only thing I notice in desktop use, and it's a bit annoying, so I will probably try to disable it eventually.

I still haven't run into ABL, and I run mine with 90 contrast. Peak Brightness cannot be changed in Game mode (it's locked to Off), so this might only apply to HDR.
 
What Wallpaper Engine wallpapers are you all using?

I want one that is basically the warp field from Star Trek in the background. I'm currently using this



but I wish there was a version of one I remember from over 15 years ago, from a game mod for Star Trek: Bridge Commander; they created a lovely warp effect where you could pan around the exterior of your ship or have a set camera pathway for beauty shots against the warp field backdrop.
 
Ok, so I got the Club 3D HDMI cable today, and the black screen / no signal issues with the CAC-1085 adapter are gone. So if you use a different HDMI cable that's not from Club 3D, you will get the no-signal issues whenever you change resolution.
But I still get the washed-out color when I turn on HDR in Windows, though.

That's a very surprising turn of events. Does the washed-out color in HDR happen in actual HDR content or just on the desktop? Does dropping to 4K 60 Hz change the situation? It might just be running out of bandwidth or something at 4K 120 Hz.
 
That's a very surprising turn of events. Does the washed-out color in HDR happen in actual HDR content or just on the desktop? Does dropping to 4K 60 Hz change the situation? It might just be running out of bandwidth or something at 4K 120 Hz.

Desktop; not sure about HDR content besides HDR games like CoD MW.
 
Desktop; not sure about HDR content besides HDR games like CoD MW.

You could check how it looks in YouTube HDR videos. You'll probably need to use Google Chrome or a Chromium-based browser to get them to work right.

I put the Club3D adapter and the Club3D 2m "10K" HDMI cable on order; hopefully it will work alright.

Did this solve all your black screen issues? E.g. booting the computer, switching resolutions, toggling HDR on/off?
 
You could check how it looks in YouTube HDR videos. You'll probably need to use Google Chrome or a Chromium-based browser to get them to work right.

I put the Club3D adapter and the Club3D 2m "10K" HDMI cable on order; hopefully it will work alright.

Did this solve all your black screen issues? E.g. booting the computer, switching resolutions, toggling HDR on/off?

For me, yes, but don't count on it too much. It might not work for you.
 
For me, yes, but don't count on it too much. It might not work for you.

In that case I will just return them. But I have both a CX and a C9, so I can test whether it performs differently on them for some reason. It's good to know, though, that the adapter just doesn't behave right with some cables.

In the past I had issues getting 4K 60 Hz HDR over long 8-10m HDMI cables and found that the cable alone could cause all kinds of weird issues: randomly missing resolutions/refresh rates/color spaces, blank screens, flickering, etc. So I hope this is a similar case. The cables I tried were all supposedly HDMI High Speed capable but in reality had various problems, until I got an expensive AudioQuest cable that still works like a charm running the signal from my computer in the next room to my living room 65" C9.
 
Ok, really dumb question: I just got the CX OLED and have the following settings in the NVIDIA control panel:

3840x2160 native resolution
60Hz as refresh rate
RGB as color output format
Output dynamic range as full

Along with enabling HDR in the Windows settings.

I assume these are the settings we should be using, and I assumed that at 60Hz I would be able to select 12-bit color depth? However, I can only select 8-bit color depth unless I bring the refresh rate down to 30Hz. I am using a 1080 Ti at the moment.
 
Best Buy is showing as "sold out" for all CX models :( I'm assuming this doesn't mean permanently sold out? I would have expected language like "temporarily out of stock" if they had intended to restock...

https://www.bestbuy.com/site/lg-48-...v-smart-oled-with-hdr/6416732.p?skuId=6416732

Standard Best Buy language for when they are out of stock. There would be red lettering under the description if they were discontinuing the sale of the item.

On another note...as soon as I decided it was time to pull the trigger, Best Buy sold out...lol
 
Ok, really dumb question: I just got the CX OLED and have the following settings in the NVIDIA control panel:

3840x2160 native resolution
60Hz as refresh rate
RGB as color output format
Output dynamic range as full

Along with enabling HDR in the Windows settings.

I assume these are the settings we should be using, and I assumed that at 60Hz I would be able to select 12-bit color depth? However, I can only select 8-bit color depth unless I bring the refresh rate down to 30Hz. I am using a 1080 Ti at the moment.

I have a C8, so your TV may be different, but is there an 'HDMI Deep Color' or similarly named setting in the TV picture options? If I remember right I had to turn that on to unlock the full bandwidth of the cable I'm using.
 
I have a C8, so your TV may be different, but is there an 'HDMI Deep Color' or similarly named setting in the TV picture options? If I remember right I had to turn that on to unlock the full bandwidth of the cable I'm using.

Thank you for the response; it is turned on. I wondered if I should change my HDMI cable. So bizarre: I changed my cable, same thing. I was under the impression I could do 12-bit at 60Hz; not sure what I am doing wrong. I guess I will have to research this a bit.
 
Ok, really dumb question: I just got the CX OLED and have the following settings in the NVIDIA control panel:

3840x2160 native resolution
60Hz as refresh rate
RGB as color output format
Output dynamic range as full

Along with enabling HDR in the Windows settings.

I assume these are the settings we should be using, and I assumed that at 60Hz I would be able to select 12-bit color depth? However, I can only select 8-bit color depth unless I bring the refresh rate down to 30Hz. I am using a 1080 Ti at the moment.

That's the same for me. I think 10- and 12-bit color are only supported with 4:2:2 output over HDMI 2.0.
 
Ok, really dumb question: I just got the CX OLED and have the following settings in the NVIDIA control panel:

3840x2160 native resolution
60Hz as refresh rate
RGB as color output format
Output dynamic range as full

Along with enabling HDR in the Windows settings.

I assume these are the settings we should be using, and I assumed that at 60Hz I would be able to select 12-bit color depth? However, I can only select 8-bit color depth unless I bring the refresh rate down to 30Hz. I am using a 1080 Ti at the moment.

4K 60Hz 4:4:4 RGB 12-bit doesn't fit within HDMI 2.0 bandwidth; only 8-bit does. BTW, you can still run HDR with those settings, or you can use 10-bit 4:2:2 - your choice.
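For anyone curious why 8-bit is the cutoff, here's the back-of-the-envelope math (a sketch, not anything official: the 4400x2250 total raster is the standard CTA-861 4K60 timing, and HDMI 2.0's TMDS link tops out at 18 Gbps including 8b/10b coding overhead):

```python
# HDMI 2.0 TMDS check: for RGB/4:4:4 the pixel clock scales with
# bits-per-component, and each of the 3 channels carries 10 bits per
# clock due to 8b/10b encoding.
HDMI20_MAX_TMDS_GBPS = 18.0

def tmds_gbps(h_total, v_total, refresh_hz, bpc):
    pixel_clock = h_total * v_total * refresh_hz * (bpc / 8)  # Hz
    return pixel_clock * 3 * 10 / 1e9

for bpc in (8, 10, 12):
    rate = tmds_gbps(4400, 2250, 60, bpc)
    fits = "fits" if rate <= HDMI20_MAX_TMDS_GBPS else "does NOT fit"
    print(f"4K60 RGB {bpc}-bit: {rate:.2f} Gbps -> {fits}")
# 4K60 RGB 8-bit:  ~17.8 Gbps -> fits
# 4K60 RGB 10-bit: ~22.3 Gbps -> does NOT fit
# 4K60 RGB 12-bit: ~26.7 Gbps -> does NOT fit
```

10/12-bit 4:2:2 works because HDMI packs 4:2:2 into the same clock as 8-bit RGB, so it doesn't need the extra bandwidth.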
 
Got the Club 3D adapter today and it has been a nightmare with my janky-ass, old-ass PS4-era HDMI cable... got one of those fancy 48Gbps ULTRA PRON edition HDMI cables coming from Club 3D, and we'll see on Thursday.
 
4K 60Hz 4:4:4 RGB 12-bit doesn't fit within HDMI 2.0 bandwidth; only 8-bit does. BTW, you can still run HDR with those settings, or you can use 10-bit 4:2:2 - your choice.


Ah, thank you - so I am not crazy. This saved me a lot of futile research. Thank you!
 
Ah, thank you - so I am not crazy. This saved me a lot of futile research. Thank you!

No problem. One more thing to note is that the CX itself cannot do 12-bit 4:4:4 at 120Hz, only 10-bit, since its HDMI bandwidth has been capped at 40Gbps instead of the maximum 48Gbps. In reality this will make zero difference, since the CX is a 10-bit panel and not a 12-bit one. Of course, right now no GPU exists that can even output 4K 120Hz 10-bit 4:4:4, but it's something to keep in mind for future GPU releases.
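The same napkin math shows why the 40Gbps cap rules out 12-bit but not 10-bit (again a sketch: HDMI 2.1 FRL uses 16b/18b line coding, and I'm ignoring blanking since FRL packs active video much more efficiently than TMDS):

```python
# FRL link budget vs. active-video payload for 4K120 RGB/4:4:4.
def usable_gbps(link_gbps):
    return link_gbps * 16 / 18          # 16b/18b coding overhead

def payload_gbps(w, h, hz, bpc):
    return w * h * hz * 3 * bpc / 1e9   # 3 color channels

for link in (48, 40):
    budget = usable_gbps(link)
    for bpc in (10, 12):
        need = payload_gbps(3840, 2160, 120, bpc)
        ok = "OK" if need <= budget else "too much"
        print(f"{link}Gbps link (~{budget:.1f} usable): 4K120 {bpc}-bit needs ~{need:.1f} -> {ok}")
# 48Gbps (~42.7 usable): 10-bit ~29.9 OK, 12-bit ~35.8 OK
# 40Gbps (~35.6 usable): 10-bit ~29.9 OK, 12-bit ~35.8 too much
```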
 
Just a reminder to remove the protective film from the back of the TV if you haven't yet. Keeping it on seems to reduce the panel's ability to dissipate heat, which is clearly important given that Panasonic's OLEDs use a thick metal heatsink on the rear and offer higher peak brightness and far less image retention.
 