LG 48CX

Actually elvn, Rtings did not see peak brightness drop on their units, even the ones that suffered burn-in. But that was on older models; maybe they do let you eat away at peak brightness to increase lifespan on recent sets. I'd prefer that personally, as I'm mostly consuming SDR content anyway. Ideally, let the user decide? IDK.
Yeah, that's what I was getting at, but I don't think that is the case, at least not much past the reserved 25%, if at all.
I think once you "melt" past your buffer you will get permanently burned-in ghost images. If you're only watching movies and media and gaming with the burn-in limiting features enabled, that should last the lifetime of the set though, maybe 5 years, though I've heard of people at 6 with no burn-in.

Needless to say, I'm leaving all the protections on the 77" I ordered for the living room and primarily using it for movies and YouTube, maybe a few console games. Too pricey for me to be a short-term throwaway.

5 years from now people will be upgrading to 8K TVs and will probably also be adopting lightweight microOLED 4K-and-greater glasses to watch virtual screens in real space / mixed reality. And years after that, microLED panels.
 
I have a 5 year old LG B6 and it's flawless. No burn-in. I don't baby it either; all my settings are for greater brightness. Recently got a 77" Sony A80J and again, I'm running it with settings for higher brightness.
 
Hello,

I recently got an LG C1. Most guides recommend setting the color format to RGB and color depth to 10 bpc in NVCP, but with these settings I ran into some issues with color banding, for example on this test pattern: http://www.lagom.nl/lcd-test/gradient.php

However, when I set RGB 8 bpc in NVCP (displayed as "8-bit with dithering" in Windows advanced display settings), the color banding is visibly reduced.

Even better, if I use YCC 444 AND 8 bpc, I can't see any color banding at all in this test pattern.

Is there any downside to using these settings instead of RGB 10 bpc?
 
In SDR I can't see this issue. Are you talking about HDR? This is exactly what I discovered last week. Color banding is greatly reduced with YCbCr444 limited + 8-bit in HDR on my C1.
It is very obvious in AC: Valhalla for example, when there is a blue sky without clouds. But it is also visible on test patterns or even YouTube videos like this one:
[embedded YouTube video showing a grey wall scene]
The grey wall shows way less posterization/banding with 8-bit + YCbCr444 limited vs 10-bit.
And since I cannot see any difference at all between 10-bit and 8-bit (+FRC), limited vs full, or RGB vs YCbCr444 in terms of picture quality in any content or games, I will just stick to YCbCr444 limited + 8-bit.
 
If anyone is on the fence, I just saw a Costco ad at my parents' place today; they had 48" C1s for $1,099.99. Didn't catch the prices on the other sizes.
 
Lower bit depth with FRC will give a compression effect almost like anti-aliasing, so yeah, it would soften the color transitions/gradations if there were any. Nothing really wrong with doing that if it looks better to you.
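If you want to see the mechanism for yourself, here's a rough sketch in Python/NumPy (my own illustration, nothing LG- or Nvidia-specific; the TV's real FRC dithers over time rather than with spatial noise, but the effect on banding is the same idea):

```python
import numpy as np

ramp = np.linspace(0.0, 1.0, 3840)  # an ideal horizontal gradient, one pixel row

def quantize(signal, bits, dither=False):
    levels = 2 ** bits - 1
    if dither:
        # crude stand-in for FRC: add +/- half a quantization step of noise
        # before rounding, which breaks long flat runs into fine grain
        signal = signal + (np.random.rand(signal.size) - 0.5) / levels
    return np.round(np.clip(signal, 0.0, 1.0) * levels) / levels

for name, img in [("8-bit", quantize(ramp, 8)),
                  ("8-bit + dither", quantize(ramp, 8, dither=True)),
                  ("10-bit", quantize(ramp, 10))]:
    changes = np.flatnonzero(np.diff(img) != 0)
    longest_flat = int(np.max(np.diff(changes)))
    print(f"{name:15s} longest flat run: {longest_flat} px")
```

The plain 8-bit ramp holds each value for ~15 px at this width (visible bands), while the dithered 8-bit ramp has no long flat runs, which is why it can look smoother than an undithered higher-bit-depth path.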

However, YouTube can also cause problems due to its own compression, even if you are watching what I assume is the max quality available for that vid. You were watching YouTube on PC, with which browser or app? Was it full screen? Out of curiosity, did you try watching the YouTube videos that bothered you on the WebOS version of YouTube instead, or off a different device (so that it wouldn't be on the PC-named input, to see how it compares to your PC RGB)?

Were you playing AC Valhalla on PC rather than console? Do you have anti-aliasing applied in AC: Valhalla? If so, what kind and how much, and are you at native 4K resolution in the game (consoles use dynamic resolution and checkerboarding)? Full screen exclusive mode HDR of course on PC. Are you using 120Hz VRR off of HDMI 2.1 for that game? If so, what were your frame rate averages while you saw the issue (assuming the game doesn't have a 60fps cap)? The gamma on the TV is set for 120Hz, so you could try playing with VRR disabled just to test whether there is any difference there. Did you have dynamic HDR/tone-mapping or HGiG enabled or disabled on the TV for the game mode? (Dynamic HDR is inaccurate and off from the absolute HDR metadata values.) Did you use any white point/gamma adjustment in the game's own HDR settings? What are your TV settings, including general and especially the brightness, color, and sharpness (and BFI) extended settings? Do you have the HDMI input set to deep color?

You can check and try these settings for PC:

https://www.reddit.com/r/OLED_Gaming/comments/mbpiwy/lg_oled_gamingpc_monitor_recommended_settings/

Also, how far away are you sitting from the screen?

There are a lot of variables potentially.
 
Thanks a lot for your responses, glad to know I am not alone on this.

Yes, I am talking about HDR; in SDR I can't see any problem either.
I checked lors' YouTube video and a slight banding on the wall is present in YCC444 8-bit, but it is much more visible in RGB 10-bit.

Concerning settings, I followed the guide elvn linked and I am running the TV connected to an HDMI 2.1 port, with an HDMI 2.1 certified cable, at 4K@120, with HGiG on and deep color.
In RGB 10-bit mode, the TV's VRR information screen reports:
118.XXHz
VRR
3840 x 2160P@120
RGB 10b 4L10
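As an aside, if I'm reading that "4L10" right it means an FRL link of 4 lanes at 10 Gbps (40 Gbps total), and some quick back-of-the-envelope math (my own approximate numbers, not from the TV docs) shows why 4K120 RGB 10-bit fits on it uncompressed:

```python
# Rough HDMI 2.1 link-budget check, assuming "4L10" = FRL, 4 lanes x 10 Gbps
active = 3840 * 2160 * 120 * 30     # active pixel data: 30 bits/px at 10 bpc RGB
usable = 4 * 10e9 * 16 / 18         # FRL's 16b/18b encoding overhead
print(f"active video: {active / 1e9:.1f} Gbps")   # ~29.9 Gbps
print(f"usable link : {usable / 1e9:.1f} Gbps")   # ~35.6 Gbps
# Blanking intervals add some overhead on top of the active rate, but with
# reduced-blanking timings it still fits, hence the TV accepting RGB 10b 4L10.
```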
 
Good points, but I know how to validate things like that; I've been using OLED for two years. I know YouTube compression can show banding, but it is clearly less visible with YCbCr444 limited + 8-bit. It's also visible in the Eizo monitor test program and the gradient test patterns. My C9 behaves exactly the same, btw. Also, 60Hz vs 120Hz or VRR on or off makes no difference.

I could swear I stumbled across it on the C9 some time ago. Sometimes it seems to me that things go away and then suddenly come back, whether due to firmware updates, GPU driver updates or whatever. Or maybe not, and it was always the case lol.

At the moment, PC mode and YCbCr444 limited + 8-bit is definitely my choice for HDR. I leave HDR enabled btw, also for desktop usage, because I don't want to switch back and forth, and I want to watch HDR-supported videos on YouTube via the Chrome browser. The HDR/SDR brightness balance slider is set to "0"; in the TV settings, black stabilizer is at 7, fine tune dark areas is at -10, and HGiG is on. These settings exactly match SDR at OLED Light 25 / contrast 85 with gamma 2.4. The picture looks identical to me.
 
Need another TV for PS5 and Windows usage. It seems impossible to do better than this 48".
 
$1099 ?!?

Ok, seriously now. Anyone reading this thread who even THINKS that they may want to buy a truly incredible monitor someday... make the move now. At that price, it absolutely obliterates whatever other gaming display you've ever considered buying.

The G-SYNC... just works. The HDR... just works. The 120Hz... just works. And they all work together at the same time with no drama. And it's OLED. You really aren't going to do any better in the next couple of years. Or ever, maybe.

What a deal.

Even more so considering how many other things are overpriced and in short supply.

I agree, and just to give some pricing perspective: right now here in Denmark, the few CXes still in stock are priced at $940 and the C1 is at $1,120, VAT/sales tax included.
 
Holy cow things are changing fast out there.

I'm working toward moving off Windows to Linux, and I did a Manjaro (KDE) install last night. While the included VLC player didn't work with HDR on my RTX 3080 (I never have any luck with it on Windows either, frankly)... I installed MLC and it played an H265 HDR video just fine.

So that means G-SYNC, 120Hz, 8-bit, HDR, with dual monitors (secondary 1440p IPS), actually works with the CX as the primary monitor under Linux now... even with Nvidia! I'm still kind of shocked.

My requirement for switching my main system was being able to use almost all of the advanced features of the OLED, or it just doesn't make sense. Well, it works.

The remaining 2 "advanced" features I really want to see are:

Full true 10 bit desktop support for everything. (probably going to happen soon with the rate of updates going on in the community)
7.1 channel audio support. (I really can live without this, 5.1 works fine.)

That's about it. Thanks to Valve and Wine, pretty much all games either are working or are about to, as game devs with anti-cheat turn on support ahead of the Steam Deck.

Exciting times.
 
Guys, I just noticed this setting in Windows 10. Right click on desktop > Display settings > Graphics settings:

[screenshot of the "Variable refresh rate" toggle in Windows graphics settings]
Mine was set to "Off" previously. Will this setting interfere in any way with having VRR/FreeSync/G-Sync/etc. enabled anywhere else? I just don't want to break any functionality. It sounds to me like games would use this setting (if enabled) if they were unable to use native G-Sync/FreeSync/VRR. But then, I'm also unaware of games that wouldn't use those technologies. I thought that as long as your hardware supported VRR, it would work in all titles. Do they have to be coded to support it or something?
 
Last I heard, this setting was a band-aid fix for Windows Store games. I don't even remember if I have it on or off right now, lol. Most games will definitely use VRR just fine without it (at least in my experience with Nvidia), since G-SYNC mostly "just works".

Meanwhile, I just hit 4000 hours on my unit last night and it still looks perfect. Which is expected, based on Rtings' test and the OLED light setting I'm running.
 
Yet the Nvidia driver still crashes on Linux if your TV is off when it tries to resume the display signal, requiring a hard reboot. Really annoying; there are lots of threads on the Nvidia forums, and it has yet to be fixed. I have to be really careful and make sure I turn the TV on before moving my mouse or doing anything that would wake the display normally. Otherwise, the CX has been great for me on Linux, using it every day since release.
 
Wouldn't it be nice if LG, as a parting gift, put in a "PC sleep mode" checkbox so we could have the set come back up with the PC from sleep?

I can dream...

But it does really help prevent accidents where the screen comes on when you aren't around and sits static all day.
 
I just got a C1 for the living room recently. On that one I set up my Nvidia Shield with a screensaver that cycles through a gallery of full-black and other dark images randomly after 5 minutes idle. I also set it to turn off the screen after 20 or 30 minutes of not running anything. I might set the power-off longer than that, but that's how I have it right now. However, devices like a Shield or a PC can crash or freeze, hang on reboot, get stuck on a floating error message layered above the screensaver, have the screensaver itself get stuck, etc., thereby leaving static material on the screen, even if it's rare. So a device screensaver is not entirely safe.

When I'm going AFK I use the "turn off the screen" trick whenever I can manage to. That just turns off the OLED emitters without putting the screen into standby. When I return to give the screen face time again, I hit any button on the remote and it wakes the emitters instantly; I usually just hit the right side of the navigation ring to turn them back on. I only hit the power button to standby when I'm going to sleep for the night, leaving the house for a while, or if I'll be busy doing something else for hours.

The remote could get bumped somehow and turn the screen emitters back on, I guess (e.g. cats), but that hasn't been a problem so far. I think a screensaver on the default device/PC combined with the "turn off the screen" emitter trick is a pretty solid combo. I do wish the LG TV had its own screensaver on a timer and a more configurable "turn off" timer; 4 hours is way too long. Other than the 4 hr idle timer, as far as I'm aware the only timer available in the TV's OSD is not a usage/idle timer but a countdown/scheduler-style timer.

I turn all three of my TV screens on the first time I use the PC setup that day. It's not an issue for me. The side screens are both the same model Samsung, so I can use a single remote to turn both of them on.

-------------------------------------------------

Now that I think of it, I can potentially turn everything on from Google Assistant/Home via voice or by clicking the saved room, if I set it up that way. E.g. "Ok Google, turn on PC room" or "Ok Google, turn on PC screens", and vice-versa with the "off" versions of those commands. I'll have to try that out. I'm not certain that I can get the Samsung screens onboard (since they don't have Nvidia Shields on them to wake them via HDMI and don't have built-in Google Assistants), but I'll google it to see if I can give it a shot somehow. I have Google Assistant on my phone, and I already had a speaker version in my living room, so I don't have to rely on the TV's built-in one while it's off.

Edit: according to this device Q&A answer on whether it is possible to power my Samsung TVs on and off via Google Assistant, I should be able to do it:
"Yes it is, you attach your smart TV to the smartthings app then go into google home add new device and add your smartthings account to your home and it will show up"


I know. I hear ya. It's just too darn big. I love my 65" C9 for movies and anime though. No one ever talks about how great OLED is at displaying anime.

I set up a named setting just for anime/cartoon stuff. I turned the motion settings up to make the lower-framerate animations work better, and I tweaked the color/saturation, brightness, etc. to better suit cel-shaded and hard-lined cartooning. It looks really nice. My Shield 2019 also upconverts things really well, especially paired with an OLED. It's really pretty amazing looking. Even old lower-rez live action stuff without new masters, like "Farscape" in 720p 4:3, looks better than 1080p, practically 4K fidelity compared to what it would look like otherwise. I'm very impressed.


I don't think the 48" is too big, it's just too big for a "terminal" setup. It's only a good size if you can do a command center style setup at 38" - 48" viewing distance from the screen to your eyeballs. That rules out the stereotypical chair-desk-monitor up against the wall like a bookshelf / upright piano style setup.

That said, personally I don't use my 48CX for static desktop/apps. It's a media and gaming "stage" or theater, with 4K VA screens on the sides for the static stuff... but those are big too; both are 43" 4K VA screens.
 
The 48" C1 is the same price at Target now too, in case that matters to anyone: ~$1,100 (w/o warranty).
 
What is the max luminance of the LG C1 in SDR in strobing (motion blur reduction) mode?
 
Yet the Nvidia driver still crashes on Linux if your TV is off when it tries to resume the display signal, requiring a hard reboot. Really annoying; there are lots of threads on the Nvidia forums, and it has yet to be fixed. I have to be really careful and make sure I turn the TV on before moving my mouse or doing anything that would wake the display normally. Otherwise, the CX has been great for me on Linux, using it every day since release.
I kinda have the same issue under Windows. If I turn off my CX without my other monitor on, neither display will turn on till I reboot. It is not a crash, since I can hear everything going on through the speakers and it doesn't stop mining.
 
I apologize if this has been asked before in this thread, but I only scrolled through like 20 pages.

I love my CX48, but the auto-dimming feature is killing me. I can deal with it on the desktop by quickly min/maxing everything; it gets old, but it's a quick fix every few minutes. But it is completely unacceptable when playing a slow-paced, very darkly lit game, where every few minutes it makes the game unplayable. It's a deal breaker honestly. I saw there is a remote that lets you open settings to change this? It doesn't seem like you can disable it, but you can at least change the timeout to something nowhere near as aggressive as it is. Was hoping for advice from someone that has the remote on what is possible.

This monitor is past the return window so I can't take it back. If I have to decrease the lifespan of the panel to 1/4 of normal to be able to use the monitor, then so be it.
 
Meanwhile, I just hit 4000 hours on my unit last night and it still looks perfect. Which is expected, based on Rtings' test and the OLED light setting I'm running.

My 55C7 was perfect until about the 8,000 hour mark. By 14,000 hours it was noticeably burned in, and I replaced it with a 48C1. I used 100% OLED light and 100% contrast (max settings) throughout its life. I had pixel shift and both ABLs always turned on.

For best results, only light up the pixels that you are actually using (a scripted sketch of a couple of these follows the list):

Remove desktop icons or move them to a side monitor (I used a 2nd small monitor)
Auto-hide the taskbar or move it to the second monitor
Black desktop background in Windows
Dark theme in Windows
Black screensaver on a short timeout (the TV will display images for an extended time if you just put the monitor into power saving)
Dark Reader in your browser
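If you'd rather script the Windows bits of that list, here's a rough Python sketch using the well-known Personalize registry keys and the SystemParametersInfo wallpaper call (illustrative only; double-check the keys on your Windows build):

```python
import ctypes
import winreg

# Dark theme for apps and the system shell (0 = dark, 1 = light)
KEY = r"Software\Microsoft\Windows\CurrentVersion\Themes\Personalize"
with winreg.CreateKey(winreg.HKEY_CURRENT_USER, KEY) as key:
    winreg.SetValueEx(key, "AppsUseLightTheme", 0, winreg.REG_DWORD, 0)
    winreg.SetValueEx(key, "SystemUsesLightTheme", 0, winreg.REG_DWORD, 0)

# Clear the wallpaper so the desktop falls back to the solid background
# color (set that color to black under Settings > Personalization > Background)
SPI_SETDESKWALLPAPER = 0x0014
SPIF_UPDATEINIFILE, SPIF_SENDWININICHANGE = 0x01, 0x02
ctypes.windll.user32.SystemParametersInfoW(
    SPI_SETDESKWALLPAPER, 0, "",
    SPIF_UPDATEINIFILE | SPIF_SENDWININICHANGE)
```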
 
Probably better off doing all of that, plus using another non-OLED screen on the side for static stuff. I use a few in portrait mode, as well as a 10" tablet on a metal easel-style stand along with a 10" keyboard. The tablet is also often in portrait mode. It comes in handy when I'm doing full-screen exclusive (HDR) gaming and want to type to people or look something up without clunkily dropping out of full screen exclusive HDR mode (lots of blinking and clicking over, like a sloppy resolution change).



I love my CX48, but the auto-dimming feature is killing me. I can deal with it on the desktop by quickly min/maxing everything; it gets old, but it's a quick fix every few minutes. But it is completely unacceptable when playing a slow-paced, very darkly lit game, where every few minutes it makes the game unplayable. It's a deal breaker honestly

I didn't turn off any ASBL or ABL, and I played:

- Jedi: Fallen Order (in HDR mode)
- Nioh 2 (HDR mode)
- Darksiders 3 (with ReShade applying a fakeHDR filter and a lightroom filter for a quasi-HDR look)
- some Immortals Fenyx Rising (in HDR mode)
- some Assassin's Creed Odyssey (in HDR mode)

I'm also looking forward to "SDR+" ~ partial-HDR-color-volume games using Windows 11's Auto HDR in the long run. (They might update Win10 to use it eventually too; it was rumored, but I didn't expect them to.)

The brightness limiters didn't bother me at all in those games (on the 48CX), and those games are well known to have a lot of dark fantasy areas and areas where you are walking or traversing heights or pitfalls, "tightrope walking", avoiding water hazards, etc., so you aren't always dashing around. Brightness limiters haven't bothered me in dark fantasy movies either, for that matter, on both the CX and the C1. I'm not saying I've never seen it happen on occasion, but it's nothing that's constantly slapping me in the eyeballs to the point where I'd consider it overt and annoying.

I've not gamed on the C1, only the 48CX, so there could be a difference between them. There are also a lot of settings that can vary between people's setups:

https://www.reddit.com/r/OLED_Gaming/comments/mbpiwy/lg_oled_gamingpc_monitor_recommended_settings/

.. and games that allow gamma/white point/brightness adjustments in HDR, which people could set up in a way that exacerbates ABL triggering. Some people might also use overly bright peaks and brightness settings in SDR on their OLED from the TV's OSD, which would trigger it more often. You could also potentially suffer more ABL by using dynamic HDR and/or dynamic color scaling settings, which use relative/interpolated~guessed values (sort of like using a dynamic vivid mode), instead of letting the HDR metadata be tone mapped directly by the TV alone to absolute values ~ HGiG, or leaving SDR settings more natural.

For SDR: if you set your peak brightness low enough, it really shouldn't trigger ABL anymore, but ASBL would still happen occasionally on a fully static screen.
The Rtings C9 review mentions how the peak brightness setting, which still exists in the CX and C1, etc., can aggravate ABL triggering:
The C9 has a new Peak Brightness setting, which adjusts how the ABL performs. Setting this to 'Off' results in most scenes being displayed at around 303 cd/m², unless the entire screen is bright, in which case the luminosity drops to around 139 cd/m². Increasing this setting to 'Low', 'Med', or 'High' increases the peak brightness of small highlights.
If ABL bothers you, setting the contrast to '80' and setting Peak Brightness to 'Off' essentially disables ABL, but the peak brightness is quite a bit lower (246-258 cd/m² in all scenes).

That would cover you for ABL (not ASBL) for SDR usage. SDR is typically in those ranges anyway; reference is 100-nit white, and some people use 120 or 200 based on how bright their room is. With an OLED, or any real theater media viewing, you shouldn't be in a powerfully bright lighting environment anyway, imo.
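To put rough numbers on the gamma side of that (my own quick math with a simple power-law EOTF, just to show the direction of the effect):

```python
# mid-gray luminance under L = L_peak * V**gamma at SDR reference white
L_peak = 100.0  # nits
for gamma in (2.2, 2.4):
    print(f"gamma {gamma}: mid-gray ~= {L_peak * 0.5 ** gamma:.1f} nits")
# gamma 2.2: ~21.8 nits; gamma 2.4: ~18.9 nits -- the higher gamma pushes
# shadows darker, which is why 2.4 suits dim rooms while brighter rooms
# call for a lower effective gamma / raised EOTF
```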

https://www.lightillusion.com/viewing_environment.html

If you were to calibrate a home TV within a Home Viewing Environment to the above Reference Standards for SDR displays the on-screen image would be far too dull and washed-out with the imagery nothing like that intended by the program maker. To arrive at a viewing image that is as similar as possible to look & feel of the graded intent means increasing the peak brightness, as well as reducing the EOTF value, so the on-screen imagery can better counter the bright viewing environment.


Obviously, if you are serious about watching accurate imagery at home you will control your viewing environment. But even then, there may be compromises that have to be accepted, and understanding that will enable the best alternative calibration to be defined.


The issue is that there are no standard approaches as to how to adjust TV display calibration to accurately counter the variable environmental settings within a standard home viewing environment. This is an area in desperate need of research, and is something the BBC have attempted to look at with their HLG HDR standard's System Gamma value, as HDR viewing environments are far more critical than SDR.

https://www.lightspace.lightillusion.com/uhdtv.html



I also use a browser addon called "Turn Off the Lights" that auto-dims web pages by a percentage I set. That, and/or the "Color Changer" addon that can change background and text color on a per-site basis on the fly. They both have blacklists/whitelists, so you can exclude sites that don't need those addons applied (like hardforum, for example, with its dark theme). I use those addons on all of my PCs' browsers. The Turn Off the Lights addon acts like ABL, but the transition is much more noticeable due to varying web page load times; if set on auto, it doesn't dim the web page until it's fully loaded. Once I apply the Color Changer addon manually per site, I don't have to dim them anymore though. If ABL were that slow and triggered on every page load / scene change the way the Turn Off the Lights addon does, it would be super annoying, but that is not the case with my TV and my settings (which more or less match those in the reddit link above). I've primarily been watching and playing HDR content and I have no complaints about the dimming safety features overall.
 
I love my CX48, but the auto-dimming feature is killing me. I can deal with it on the desktop by quickly min/maxing everything; it gets old, but it's a quick fix every few minutes. But it is completely unacceptable when playing a slow-paced, very darkly lit game, where every few minutes it makes the game unplayable. It's a deal breaker honestly. I saw there is a remote that lets you open settings to change this? It doesn't seem like you can disable it, but you can at least change the timeout to something nowhere near as aggressive as it is. Was hoping for advice from someone that has the remote on what is possible.

Then just disable ASBL...

https://play.google.com/store/apps/details?id=by.makarov.smarttvlgrc

Hit IN START, enter code 0413, go to the OLED section, disable TPC, and hit EXIT or IN START again.

Your phone must have an IR blaster, afaik.
 
The LG C1 48" appears to be $1099 everywhere in the US. This is the time to pick up this display. If you're on the fence and have just over $1000 to drop on a monitor, do it.
 
Anyone able to take a pic of the voltage input specs on their 48C1 or 48CX (or a pic of the PSU board inside the TV)? I am trying to confirm whether they are dual voltage (100-240V, 50/60Hz) or locked to US specs (120V, 60Hz), since I frequently move internationally.

Thanks in advance!
 
My CX says it's 120V 50/60Hz, so it should be usable with a relatively cheap transformer.
 
That's interesting (and it lines up with what I have read at AVS or Reddit); the combination of those power frequencies with a specific voltage is pretty non-standard. According to some of those threads, the actual PSU inside the TV is 100-240V, but the exterior of the TV is marked as 120V.

May just have to snag one on BF and pop it open :)

Thanks!
 
Yeah, I would have expected any internally-DC electronics these days to be 100-240V 50/60Hz, especially when sold globally... but who knows. Like I said, worst case it's nothing a $50 transformer can't solve. Requiring 60Hz would have been a much bigger problem.
 