LG 48CX

For anyone in America interested, the 48" C1 is down to $932 with code NY15OFF.

https://www.ebay.com/itm/184898835257

42" $750'ish next Holiday Season? Count me in.
how much of a problem would it be buying this from an ebay seller (reputable as they may be) vs from a local brick and mortar a la best buy? like if there was some sort of defect or something that makes it not workable in my setup for whatever reason.. i'm keen to save money but dunno if it's worth it in this case
 
As an eBay Associate, HardForum may earn from qualifying purchases.
Keep in mind that when I'm talking about PQ, I'm talking about movies.

Does anyone know if LG improved motion handling with the C1, or if they plan on improving it in the next generation of TVs?
PQ is absolute. The only HDR displays that follow PQ are the LG in HGIG mode, the G-SYNC Ultimate monitors, and $30k reference monitors. Everything else is non-standard tone mapping and is inaccurate. You actually prefer the inaccurate oversaturated colours of the Sony OLED over accurate PQ. The Sony uses the same LG panel and cannot magically display brighter colours while still following PQ.

Accurate "motion handling" (whatever that means - it's a bullshit term) is actually the absence of any motion interpolation with just 5:5 pull down to map 24 fps content to a 120 Hz screen. This is the most accurate image as far as motion is concerned. You actually prefer the motion interpolation / smoothing / smearing on the Sony / plasma. LG isn't going to drastically rewrite their motion interpolation algorithm at this point so if you don't like how it looks compared to the Sony don't expect any drastic changes.
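The pulldown arithmetic above can be sketched in a few lines (my own illustration, nothing LG ships): a refresh rate that's an exact multiple of the content frame rate gives every frame the same hold time, while a non-multiple (like 24 fps on 60 Hz) forces the uneven 3:2 cadence that causes judder.

```python
def pulldown_pattern(content_fps, refresh_hz):
    """Per-source-frame repeat counts needed to map content_fps onto refresh_hz."""
    base = refresh_hz // content_fps   # every frame is held at least this many refreshes
    extra = refresh_hz % content_fps   # frames that must be held one refresh longer
    # Leftover refreshes get spread across some frames; an uneven spread is judder.
    return [base + (1 if i < extra else 0) for i in range(content_fps)]

print(set(pulldown_pattern(24, 120)))  # {5} -> uniform 5:5 pulldown, no judder
print(set(pulldown_pattern(24, 60)))   # {2, 3} -> uneven 3:2 pulldown, judder
```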
 
You actually prefer the motion interpolation / smoothing / smearing on the Sony / plasma. LG isn't going to drastically rewrite their motion interpolation algorithm at this point so if you don't like how it looks compared to the Sony don't expect any drastic changes.
No.

The way plasma phosphors are driven results in better, more natural motion handling than the "sample and hold" way of OLEDs. And less perceived blur, due to resolving more lines of resolution during benchmark motion tests.

There's something very natural about watching a plasma. And it can really make movies shot on film look fantastic.
 
how much of a problem would it be buying this from an ebay seller (reputable as they may be) vs from a local brick and mortar a la best buy? like if there was some sort of defect or something that makes it not workable in my setup for whatever reason.. i'm keen to save money but dunno if it's worth it in this case
It's eBay. They force sellers to accept returns even if listed as "no returns accepted," especially for a legitimate reason like a defective product. I've bought a lot of stuff from Electronic Express and have had zero issues. They ship TVs double-boxed.

You're the only one who can judge whether the $170 savings is worth it vs. the convenience of a local Best Buy. Sounds like you want to buy and try, and if that's the case, get it locally.
 
anyone see the news? 30% more brightness. they're calling it OLED EX for Evo eXperience. LG Display stock up 5% today (ticker LPL)
A welcome improvement, if not particularly exciting.

They also seemed to have some weird OLED throne thing with a curved 1500R 55" model. It would be interesting to get the smaller 42" and 48" models as curved but unlikely to happen.

These LG OLEDs are already so good that I can't really think of actually exciting improvements beyond smaller models, RGB subpixels and higher refresh rates.
 
Any word if the new models will eliminate near black VRR flicker? That would be the only reason worth upgrading for me.

Nothing official AFAIK, but given that LG is always trying hard to please gamers, I would bet money on them fixing it in the 2022 models.
 
Accurate "motion handling" (whatever that means - it's a bullshit term) is actually the absence of any motion interpolation with just 5:5 pull down to map 24 fps content to a 120 Hz screen. This is the most accurate image as far as motion is concerned. You actually prefer the motion interpolation / smoothing / smearing on the Sony / plasma. LG isn't going to drastically rewrite their motion interpolation algorithm at this point so if you don't like how it looks compared to the Sony don't expect any drastic changes.


Any word if the new models will eliminate near black VRR flicker? That would be the only reason worth upgrading for me.

It's been a while since I've read up on it, but the near-black flicker could be due to the white subpixel in the WRGB/WOLED array at near blacks. It might also be exacerbated by the fact that the panel's gamma is tuned for 120Hz. While using VRR at the demanding 4K resolution (or near it in ultrawide), a lot of people will be running far beneath 100fps+ average frame rate ranges/graphs (70-85 <<-- 100 -->> 115-117 cap). The frame rate and Hz in those scenarios will be low relative to 120, especially at the low end of the frame rate fluctuations. Even a 100fpsHz average usually sinks to 85fps and can hit some 70fps potholes throughout that kind of graph.

When watching movies rather than games, you can enable "cinema screen," which multiplies 24fps movie content x5 to maintain 120fpsHz, avoiding anything sub-120fpsHz and staying at 120Hz gamma. That removes judder in low-frame-rate video entirely, but it doesn't remove *stutter* in fast motion and even moderate-speed panning scenes, due to OLED response times being so fast.

The only way to reduce that is to use interpolation, but some people don't like the "soap opera effect," which is actually more realistic looking, depending. It doesn't look like soft 35mm film, though, and the "realism" can expose makeup, backgrounds, scene props, and other things that are sub-par because they'd normally be masked by the opposite of the soap opera effect: the "cinema" or "filmic" look, I guess.

As far as I know, the only way to reduce the bad stutter OLEDs exhibit in panning shots and some high-motion cinematography is to use interpolation, so you have to pick your poison one way or the other, or buy a FALD LCD, which will have response times slow enough to mask the stuttering. For me personally, I can't stand the stuttering when watching videos/movies.

Note that YouTube and Netflix use compression, so they're not good tests for picture quality issues. That near-black precipice might be affected by your named-mode settings too (brightness, contrast, gamma, etc.).
Netflix. The low-quality streams cause the low-light near blacks to literally be flickering boxes. If you pump up the black levels, you can see that it's huge macroblocking artifacts from awful compression. It makes the screen overreact to compensate.
Throw in a 4K Blu Ray of the exact same content and you won’t see it at all.

That is a quote from a reddit thread. I don't know that you'd never see it at all, but it sounds like that kind of content would make it much more prevalent.
 
No.

The way plasma phosphors are driven results in better, more natural motion handling than the "sample and hold" way of OLEDs. And less perceived blur, due to resolving more lines of resolution during benchmark motion tests.

There's something very natural about watching a plasma. And it can really make movies shot on film look fantastic.
That's smearing. It's less accurate even if you prefer it. The correct solution to sample and hold blur is to increase the frame rate of the content. Sample and hold is neither good nor bad - it's an exact representation of the source frames over time. Non-sample and hold displays smear frames over time.
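To put a rough number on the sample-and-hold blur being described (a back-of-envelope sketch, not a measurement): while the eye tracks a moving object, each frame gets smeared across the retina for as long as it stays on screen, so perceived blur shrinks in direct proportion to frame rate.

```python
def perceived_blur_px(pan_speed_px_per_s, frame_rate_hz, persistence=1.0):
    """Blur width in pixels for eye-tracked motion on a sample-and-hold display.

    persistence is the fraction of each refresh the frame stays lit
    (1.0 = full sample-and-hold; strobing/BFI displays are lower).
    """
    return pan_speed_px_per_s * persistence / frame_rate_hz

print(perceived_blur_px(960, 24))   # 40.0 px of smear tracking a 24 fps pan
print(perceived_blur_px(960, 120))  # 8.0 px at 120 fps on the same display
```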
 
Firmware ver.04.30.55 dropped today.

Typical useless release notes

(t) improvement of minor issues related to software (04.30.40)
(u) improvement of minor issues related to software (04.30.55)

And to top it off, as a NYE present, they botched the archive contents by forgetting to keep the file extension *.epk on the updated firmware image... so download away, unzip to a flash drive, and rename the extension-less file to have the required extension for the TV to recognize it and apply the update. Otherwise it will just sit there.
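For anyone who'd rather script the workaround, a sketch of the rename step (the folder path and firmware filename below are hypothetical examples; point it at your own flash drive):

```python
from pathlib import Path

def fix_epk_extension(folder):
    """Append the .epk suffix the TV's updater requires to any file lacking it."""
    renamed = []
    for f in Path(folder).iterdir():
        if f.is_file() and not f.name.endswith(".epk"):
            target = f.with_name(f.name + ".epk")  # e.g. "fw-04.30.55" -> "fw-04.30.55.epk"
            f.rename(target)
            renamed.append(target.name)
    return sorted(renamed)
```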
 
That's smearing. It's less accurate even if you prefer it. The correct solution to sample and hold blur is to increase the frame rate of the content. Sample and hold is neither good nor bad - it's an exact representation of the source frames over time. Non-sample and hold displays smear frames over time.
Plasmas resolve more lines of resolution during motion. That's not smearing. It's generally regarded that plasma is excellent for sports and for low-framerate stuff like film. The motion is smooth and you still get the detail. Actual detail.

OLED is fantastic for higher-framerate content. But the fact is, it can and often does feel stuttery with low-framerate content like film and broadcast TV. And OLEDs don't resolve as many lines of resolution during motion benchmark tests, which further drops perceivable detail, especially with that low-framerate content. Sample and hold also results in a perceived blur, even if it technically isn't there.

With higher-framerate content, it's not as noticeable, because you get an inherently perceived temporal detail quality from having more frames to look at.
 
Apologies if this has been discussed here already, but is there some way to fix the terrible banding in darker scenes on this display? Using a 3080 at 4K/120Hz/VRR/10-bit. Doesn't seem to make a difference going to 60Hz, or turning off VRR, or SDR vs. AutoHDR (this scene is Witcher 3, but I remember the same thing in Cyberpunk 2077 native HDR). I have another CX 65 in my basement that never has obvious PQ/banding issues like this.

(screenshot attached)
 
Right, I am talking about stuttering in older video content. I was under the impression the Sony handled that better, and thus the Sony is the better OLED for cinema.
 
deal is dead apparently :/
Better deals on C1s will be here soon when (a) retailers get wind of the EX-series devices, (b) the pre-Super Bowl consumer craze starts again, and (c) we get closer and closer to the next-gen announcement and subsequent release in the spring.

At least while supplies last....
 
Right, I am talking about stuttering in older video content. I was under the impression the Sony handled that better, and thus the Sony is the better OLED for cinema.
The Sony has a better motion interpolation algorithm. It generates in-between frames instead of holding a single 24fps frame for 5 refreshes. That's all it is. If you crank the slider up to 10, you get the so-called soap opera effect with maximum smoothness.
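As a toy illustration of the difference (nothing like Sony's actual motion-vector algorithm, just the simplest possible stand-in): interpolation synthesizes frames between each source pair instead of repeating one frame five times.

```python
def interpolate_frames(frame_a, frame_b, steps):
    """Generate `steps` intermediate frames between two frames (flat lists of
    pixel values) by linear blending. Real TVs estimate motion vectors instead,
    which is why their output looks smooth rather than like a crossfade."""
    out = []
    for s in range(1, steps + 1):
        t = s / (steps + 1)  # blend weight: 0 = frame_a, 1 = frame_b
        out.append([round(a * (1 - t) + b * t) for a, b in zip(frame_a, frame_b)])
    return out

# 24 fps -> 120 Hz leaves room for 4 generated frames between each source pair
print(interpolate_frames([0, 100], [100, 0], 4))
```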
 
Better deals on C1s will be here soon when (a) retailers get wind of the EX-series devices, (b) the pre-Super Bowl consumer craze starts again, and (c) we get closer and closer to the next-gen announcement and subsequent release in the spring.

At least while supplies last....
i need one asap though
 
Apologies if this has been discussed here already, but is there some way to fix the terrible banding in darker scenes on this display? Using a 3080 at 4K/120Hz/VRR/10-bit. Doesn't seem to make a difference going to 60Hz, or turning off VRR, or SDR vs. AutoHDR (this scene is Witcher 3, but I remember the same thing in Cyberpunk 2077 native HDR). I have another CX 65 in my basement that never has obvious PQ/banding issues like this.

View attachment 427268

Some people in the thread opted for 8-bit + dithered/FRC (see the quote below). That's a quick and easy way to avoid it if you are noticing it on some content.


In SDR I can't see this issue. Are you talking about HDR? This is exactly what I discovered last week. Color banding is greatly reduced with ycbcr444 limited + 8bit in HDR on my C1.
It is very obvious in AC: Valhalla for example, when there is a blue sky without clouds. But it is also visible on test patterns or even youtube videos like this one:



The grey wall shows way less posterization/banding with 8bit + ycbcr444 limited VS 10 bit.
And since I cannot see any difference at all between 10bit and 8bit (+FRC), limited vs full, RGB vs ycbcr444 in terms of picture quality in any content or games, I will just stick to ycbcr444 limited + 8bit.



-----------------------------
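A toy model of what 8-bit + FRC is doing (my own illustration of temporal dithering in general, not LG's or NVIDIA's actual implementation): the panel flips between the two nearest 8-bit levels so the time-average lands on the 10-bit target, trading a bit of temporal noise for smoother gradients.

```python
def frc_sequence(level_10bit, frames=4):
    """Approximate a 10-bit level (0-1023) as a cycle of `frames` 8-bit values (0-255)."""
    low = level_10bit // 4        # nearest 8-bit level at or below the target
    remainder = level_10bit % 4   # how far the target sits between two 8-bit steps
    return [low + (1 if i < remainder else 0) for i in range(frames)]

seq = frc_sequence(514)          # a 10-bit level between 8-bit 128 and 129
print(seq, sum(seq) / len(seq))  # averages 128.5, i.e. 514 / 4
```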


Things like YouTube use compression, which can exacerbate banding and near-black flashing. I'm not sure if non-native resolution would too. OSD settings could also make a difference (e.g., with your TVs in different rooms, one might be set brighter to compensate for ambient lighting). I'm not sure why you'd have it on one modern LG OLED and not another, outside of different OSD settings or different source material. You'd probably see it more in named PC mode ~ RGB/444. Other than that, have you tried different cables or a different output on the GPU? 10-bit (or 8-bit + FRC) at 444 chroma, 4K, and 120Hz obviously requires a lot of bandwidth.

Our new GeForce RTX 30 Series graphics cards support key HDMI 2.1 technologies, including:
  • Fixed Rate Link (FRL): A new signaling technology that’s necessary to achieve higher uncompressed resolutions, such as 8K, and to enable 48Gbps ultra high speed bandwidth speeds

You'd have to clone all of your settings exactly, use the same gpu and cables, same game and settings in OSD, windows, and game settings etc in order to really test each screen against each other apples to apples.

This guide is a pretty good reference for settings which could help eliminate anything that's not optimal settings wise:

https://www.reddit.com/r/OLED_Gaming/comments/mbpiwy/lg_oled_gamingpc_monitor_recommended_settings/
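For a rough sense of why those bandwidth numbers matter, here's a back-of-envelope estimate (my own arithmetic with an assumed blanking overhead, not an official HDMI spec table):

```python
def video_bandwidth_gbps(h_active, v_active, hz, bits_per_channel,
                         blanking_overhead=1.19, encoding_overhead=18 / 16):
    """Estimate the uncompressed HDMI link rate for an RGB / 4:4:4 signal.

    blanking_overhead approximates total-vs-active pixels in the video timing
    (an assumption; the real figure depends on the exact CTA-861 timing), and
    encoding_overhead is FRL's 16b/18b line coding.
    """
    pixels_per_sec = h_active * v_active * hz * blanking_overhead
    bits_per_pixel = bits_per_channel * 3  # one value each for R, G, B
    return pixels_per_sec * bits_per_pixel * encoding_overhead / 1e9

print(round(video_bandwidth_gbps(3840, 2160, 120, 10)))  # ~40 Gbps: needs FRL, fits in 48
print(round(video_bandwidth_gbps(3840, 2160, 120, 8)))   # ~32 Gbps: why 8-bit is the easy out
```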
 
Some people in the thread opted for 8-bit + dithered/FRC (see the quote below). That's a quick and easy way to avoid it if you are noticing it on some content.





-----------------------------


Things like YouTube use compression, which can exacerbate banding and near-black flashing. I'm not sure if non-native resolution would too. OSD settings could also make a difference (e.g., with your TVs in different rooms, one might be set brighter to compensate for ambient lighting). I'm not sure why you'd have it on one modern LG OLED and not another, outside of different OSD settings or different source material. You'd probably see it more in named PC mode ~ RGB/444. Other than that, have you tried different cables or a different output on the GPU? 10-bit (or 8-bit + FRC) at 444 chroma, 4K, and 120Hz obviously requires a lot of bandwidth.



You'd have to clone all of your settings exactly, use the same gpu and cables, same game and settings in OSD, windows, and game settings etc in order to really test each screen against each other apples to apples.

This guide is a pretty good reference for settings which could help eliminate anything that's not optimal settings wise:

https://www.reddit.com/r/OLED_Gaming/comments/mbpiwy/lg_oled_gamingpc_monitor_recommended_settings/

Thanks for this — really helpful. Lots of things to try today. Definitely have different source material into the two TVs .. downstairs one is fed by an Apple TV, PS5 and switch only. Picture is flawless on that one but obviously not using named PC mode. Haven’t tried it with my PC yet. Upstairs TV is my PC only. Settings on both should be optimized and firmware up to date but I will check again with this guide. Happy new year!
 
This looks like the game itself and not banding as a result of the display. Mind sharing the game so I can test?

Yeah, you are totally correct. I just looked at my graphics settings again and turned off the vignette post-processing option .. ugly artifacts are gone. It's The Witcher 3 by the way, my second playthrough. Never remembered anything like that throughout the 170 hours of my first time through it on my last monitor but most likely I just had the option turned off. False alarm! Still, the recommended settings guide posted above proved helpful, seems a couple recommendations have changed since the last time I configured the monitor.

This is also my first time using the Auto HDR option in Windows 11 and I have to say it's very impressive. No downsides really as far as I can tell.
 
i got my 48" c1 hooked up tonight and it's amazeballs, holy shit, even at my hdmi 2.0-limited 2560x1440 @ 120hz 8 bit.

i was amazed how easy it was to set up for pc and windows hdr. the only thing that was not intuitive was changing the hdmi input i'm using to "PC." other than that it was just plug and play and flip the hdr switch in windows. and gsync works perfectly too. doesn't even show up as a "gsync compatible"/freesync display, just straight up gsync like my old monitor. the input lag is indistinguishable from my 165hz predator xb271.

and yes i know i'm taking the burn in risk, but because of the room i'm in (can be completely dark, and i arranged everything so there's minimum reflection to deal with) i can keep the oled brightness all the way down to 30, run solid black background, auto hide taskbar, keep the logo dimming setting on high, yadda yadda. we shall see.
 
i got my 48" c1 hooked up tonight and it's amazeballs, holy shit, even at my hdmi 2.0-limited 2560x1440 @ 120hz 8 bit.

i was amazed how easy it was to set up for pc and windows hdr. the only thing that was not intuitive was changing the hdmi input i'm using to "PC." other than that it was just plug and play and flip the hdr switch in windows. and gsync works perfectly too. doesn't even show up as a "gsync compatible"/freesync display, just straight up gsync like my old monitor. the input lag is indistinguishable from my 165hz predator xb271.

Lol, yup. I felt the same way even being limited to HDMI 2.0 (4K, 120Hz, no VRR etc.). As you’ve noticed, the responsiveness is crazy good. Once I got my 3090 and was able to take full advantage of all features…hoo boy, definitely dialed it up to an 11.

I wouldn’t worry too much about burn-in, especially at that OLED light level. You’re already taking a lot of precautions. This is a huge thread and it seems like if it was an issue, we’d have some reports of it by now. I’m at roughly 1.6 years on my CX, used daily as a monitor for both work and play. Still bright, uniform, and beautiful. Never had any burn-in on my old B7 that was used in the exact same manner, either.

Enjoy and welcome to the club!
 
Just so you know, you can toggle the taskbar show/hide with a hotkey using the Taskbar Hider app. I prefer the Taskbar Hider method to the mouse-over method. The regular hidden taskbar in Windows can be shown using Win+T, but with Taskbar Hider I can lock it away until I need to see it, which is usually only when I need the system tray for some reason. It also lets me set a custom hotkey.

http://www.itsamples.com/taskbar-hider.html



That works great, at least in Windows 10. I'm not switching to 11 for a while yet, but I know you can make it revert back to some of the Windows 10 interfaces. If anyone wants to try it out on Win11, let me know if it works.

With multiple monitors I can also just unlock the taskbar and drag it to a side screen, then hotkey lock it away. I do that and drag it to the top of one of my portrait mode screens even though I leave the central OLED as my primary monitor.

================================================

Taskbar Hider is a simple way to do it.
There is also this Taskbar Hide 3.1.1 app that works with 9x up to windows 10 (no word on win11 and I don't have that installed).
This one has a very rudimentary interface but it seems to work well. I just tried it on my other single monitor machine.

https://www.eusing.com/hidewindows/bosskey.htm

You can use a hotkey like the other app to show/hide the taskbar with this. You can just use it for that, but if you want, you can do a few more things with this app that the previous one doesn't. Some of the extended features are available in the DisplayFusion app, which I already use though. If you do try this out, make sure you hit the "Hide Any Part Of The Taskbar" icon (second from the end) and enable the "All Taskbar" checkbox so that your running apps will show on the taskbar to start with. If you can't find a window you had open, like a browser, hit the second icon in the "ribbon" at the top of the app to show/hide individual selected apps. I accidentally hid my browser the first time I hid the taskbar.
Hide windows program or Close it
Hide icons from the system tray
Modify application windows order on taskbar
Set any program window stay on top
Minimize applications to tray
Hide taskbar or any part of the taskbar
Change the title and icon of any window
Quiet all sounds when hide program windows
Show window property of application, such as window handle, class name, process id etc.
Maximize or minimize all windows or only IE windows.
Automatically maximize all new windows or only new IE windows.

In addition to the show/hide lockaway toggle, you can strip the taskbar down by unchecking the boxes.
(screenshot of the taskbar options)


You can edit the hotkeys to whatever you like:

(screenshot of the hotkey settings)


====================================

Either of those apps seem to work fine.

I don't really use the taskbar anymore other than the system tray, so it usually remains locked away hidden. I have a set of buttons on my stream deck that launch and min/restore my most-used apps individually or all at once. I also use Win+Tab for the task view occasionally. DisplayFusion optionally (a checkbox in the options) displays only the apps active on that specific monitor in the alt-tab or ctrl+alt+tab task view/switcher, and you can customize the size of the thumbnails. Other than that, I hit Win+S and type a few letters to search any app or Windows settings panel I need to open that isn't already mapped to my stream deck.

I'd still like to find a way to break out of or duplicate the system tray outside of the taskbar someday. I haven't had any luck finding a way to do that yet. Stripping the taskbar down to the system tray and locking it away with a show/hide toggle hotkey comes pretty close though.



edit: I figured out a passable way to do it using streamdeck multi functions. I am going to set up the stream deck to do this sequence below when hitting a single 'system tray" button.

activate hotkey to show the taskbar .. <very short timer>.. activate window's own Win+B shortcut to highlight the taskbar menu arrow.. <very short timer>.. then <space> or <enter> to open it.

I realized that setting the multi-action sequence to hit the taskbar toggle hotkey again at the end of that series will hide the taskbar but leave the system tray open and floating in its popup position (dependent on where your taskbar is located). The tray will remain popped up indefinitely, or until a few seconds after you click another window active or click on the desktop. While it's popped up you can operate on its icons normally with right-clicks, and it will persist while operating on other icons in the tray. Otherwise it will only stay popped up until you click somewhere else or hit a key on the keyboard, as normal. So I still have to invoke the taskbar for a fraction of a second to pop up the system tray, but otherwise it can be locked away practically forever as far as I'm concerned. :geek:


============================

There is also this app to make the taskbar transparent in case your usage scenario is showing that thin line when the taskbar is hidden/minimized. The hider apps hide it completely though.

https://github.com/TranslucentTB/TranslucentTB/releases
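If you'd rather script it than install an app, the same show/hide idea can be sketched with a few Win32 calls (Windows-only, and my assumption about how these hider apps work internally; "Shell_TrayWnd" is the primary taskbar's well-known window class):

```python
import ctypes

SW_HIDE, SW_SHOW = 0, 5  # standard Win32 ShowWindow command values

def set_taskbar_visible(visible):
    """Hide or show the primary Windows taskbar via user32.dll."""
    user32 = ctypes.windll.user32  # ctypes.windll only exists on Windows
    taskbar = user32.FindWindowW("Shell_TrayWnd", None)
    if taskbar:
        user32.ShowWindow(taskbar, SW_SHOW if visible else SW_HIDE)
```

Bind that to a hotkey with whatever launcher you like (a stream deck button included) and you get the same lock-away toggle.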
 
i got my 48" c1 hooked up tonight and it's amazeballs, holy shit, even at my hdmi 2.0-limited 2560x1440 @ 120hz 8 bit.

i was amazed how easy it was to set up for pc and windows hdr. the only thing that was not intuitive was changing the hdmi input i'm using to "PC." other than that it was just plug and play and flip the hdr switch in windows. and gsync works perfectly too. doesn't even show up as a "gsync compatible"/freesync display, just straight up gsync like my old monitor. the input lag is indistinguishable from my 165hz predator xb271.

and yes i know i'm taking the burn in risk, but because of the room i'm in (can be completely dark, and i arranged everything so there's minimum reflection to deal with) i can keep the oled brightness all the way down to 30, run solid black background, auto hide taskbar, keep the logo dimming setting on high, yadda yadda. we shall see.
Should be fine. The taskbar plus OLED light at 100 is what got my B6; since I resolved those two (5 years ago) I haven't had any major issues (though nearly 7 years of constant use including two of WFH has clearly caused degradation). And my B6 lacks many of the modern protections OLEDs have. You're taking all the reasonable steps and I wouldn't expect any major problems.

And yes, switching TVs to "PC" mode is a thing, since it disables most of the extra post-processing and ensures correct handling of any incoming data.
 
I'm almost certainly retiring my B6P once the C2 comes out; aiming for 48" this time (don't think I can go down to 42" at once lol).

Heavy use since it came out; taskbar burned in after a year due to stupidity (not minimizing it, plus OLED Light at 100). Outside of that, no major burn-in despite extremely heavy use (12+ hours/day average) as a monitor. Some wear is evident due to WFH use, but it's not typically noticeable in games (really only during all-white screens, where you can see the areas that have worn out and can't be driven as bright anymore).

Overall, it held up extremely well for how heavily it's been used.

Now, to obtain a RTX 3080...
 
From what I've read, LG reserves the top 25% of the brightness range, outside of user brightness, for the wear-evening routine. So if you use it as a static desktop monitor, you are more likely to burn through that buffer sooner. It's been compared to a candle burning down its wax, or millions of candles, I guess.

That buffer wear-down can be mitigated by taking some settings precautions. For example, Linus not only drove his OLED for desktop/apps at high brightness, but from his burn-in video it looked like he kept apps open constantly with a giant cross of brightly colored window frames from quad-window usage. It also looked like he ran large bright-white app backgrounds instead of forcing dark themes, and probably wasn't using browser plugins (like "color changer", "turn off the lights", etc.) to save adjustments for commonly visited, aggressively bright pages. There are some other things you can do as precautions or wear alleviation in Windows and the TV OSD too, across different named settings.

Outside of those, there's one mechanic that I don't think everyone (outside of regulars in threads like these) is aware of or bothers to use: the "Turn Off the Screen" trick when AFK. The 'turn off the screen' command, via microphone or the quick menu, leaves the screen active on the computer/in the monitor array with everything running, but turns off the OLED emitters (without going into standby) until you are back at the screen giving it face time. That saves a lot of wasted and static (e.g. paused or idle game or video) burn time when you aren't actively looking at the screen. It conveniently leaves your paused game or running app active, including sound unless you mute it, and doesn't drop the monitor out of Windows or a multi-monitor array, which would be a pita.
 
question... the c1 has an optical audio out. currently i'm using an old 1080p-era yamaha receiver with a 5.1 speaker setup, hooked up to the optical out from my soundblaster z because it supports dolby dts. it's been a really solid audio setup but i'm wondering...

could i theoretically run the optical from the tv to the receiver, connect my other devices like consoles to the tv directly, and essentially use the tv as the "receiver" with my yamaha avr simply acting as the optical decoder + speakers?
 
question... the c1 has an optical audio out. currently i'm using an old 1080p-era yamaha receiver with a 5.1 speaker setup, hooked up to the optical out from my soundblaster z because it supports dolby dts. it's been a really solid audio setup but i'm wondering...

could i theoretically run the optical from the tv to the receiver, connect my other devices like consoles to the tv directly, and essentially use the tv as the "receiver" with my yamaha avr simply acting as the optical decoder + speakers?

It can do Dolby, but it can't do DTS.
 
I've been using a 48" C1 as my primary monitor for about a month and I absolutely love it. I'm used to turning off my monitor when I leave or go to bed, and I've noticed I'm now having trouble getting a signal when I turn the monitor back on (never had this issue with my previous monitors). I typically have to turn the computer off and on again to get the signal back, which is a huge annoyance. Anyone else having this issue, or thoughts on what might resolve it?
 
42 inch C2 is now o ooo oooofficial!
  • 4:4:4 12-bit 120Hz. Maybe not...I may have misread this part.
  • Full 48Gbps HDMI 2.1 bandwidth.
  • Evo panel.
  • No heatsink though. :( (The G2 has a heatsink, but there are no options for 42" or 48".)
I don't know about you guys but I am happy to go back to OLED!
 
42 inch C2 is now o ooo oooofficial!
  • 4:4:4 12-bit 120Hz.
  • Full 48Gbps HDMI 2.1 bandwidth.
  • Evo panel.
  • No heatsink though. :( (The G2 has a heatsink, but there are no options for 42" or 48".)
I don't know about you guys but I am happy to go back to OLED!
If only it were 144Hz, or better yet 160Hz, it would be a total monitor killer. I'm not mentioning the 38" form factor because I don't see it happening in the OLED world in any kind of foreseeable future. 4:4:4 4K 12-bit 120Hz with Nvidia VRR is crazy though. It's like an M4 with an Akrapovič exhaust.
 
From what I've read, LG reserves the top 25% of the brightness range, outside of user brightness, for the wear-evening routine. So if you use it as a static desktop monitor, you are more likely to burn through that buffer sooner. It's been compared to a candle burning down its wax, or millions of candles, I guess.

That buffer wear-down can be mitigated by taking some settings precautions. For example, Linus not only drove his OLED for desktop/apps at high brightness, but from his burn-in video it looked like he kept apps open constantly with a giant cross of brightly colored window frames from quad-window usage. It also looked like he ran large bright-white app backgrounds instead of forcing dark themes, and probably wasn't using browser plugins (like "color changer", "turn off the lights", etc.) to save adjustments for commonly visited, aggressively bright pages. There are some other things you can do as precautions or wear alleviation in Windows and the TV OSD too, across different named settings.

Outside of those, there's one mechanic that I don't think everyone (outside of regulars in threads like these) is aware of or bothers to use: the "Turn Off the Screen" trick when AFK. The 'turn off the screen' command, via microphone or the quick menu, leaves the screen active on the computer/in the monitor array with everything running, but turns off the OLED emitters (without going into standby) until you are back at the screen giving it face time. That saves a lot of wasted and static (e.g. paused or idle game or video) burn time when you aren't actively looking at the screen. It conveniently leaves your paused game or running app active, including sound unless you mute it, and doesn't drop the monitor out of Windows or a multi-monitor array, which would be a pita.
If I were Linus... I would run it just like that and have extras in my warehouse to swap in whenever I noticed an issue.
 