LG 48CX

Yeah, definitely removed. But don't worry, it was a huge pile of crap even on the C9. It would show you two 16:9 inputs, of which one had to be the TV signal. Then those were shrunk down to the middle of the screen with some constantly visible UI elements.

It was pretty unusable. Which is a real shame, because a proper PbP mode that split even just two inputs horizontally or vertically would be just awesome.
Negative, I don't think it was a pile of crap at all. Removing it was a stupid decision, and I hope they re-add the feature to the C1 lineup. I was just watching today: NFL on one screen, YouTube on the left, and I can switch audio. You can watch either HDMI or cable TV or whatever. This is on a 2018 B8 55" too. Sucks they removed it on the CX lineup.
 
Negative, I don't think it was a pile of crap at all. Removing it was a stupid decision, and I hope they re-add the feature to the C1 lineup. I was just watching today: NFL on one screen, YouTube on the left, and I can switch audio. You can watch either HDMI or cable TV or whatever. This is on a 2018 B8 55" too. Sucks they removed it on the CX lineup.
My understanding is that on the B8 it also allowed two HDMI inputs. Not so on the C9. The one on the C9 was an awful user experience altogether, and just discovering the feature was difficult. For some reason they first made the feature worse and then just removed it.

Adding a proper PbP mode on the C11 series would actually be a big enough feature that I might sell my CX for one. I've been missing it on my current setup.
 
I use multiple screens and sometimes have three windows stacked on one of my 43" screens that are in portrait mode. One or all three can be video windows (like running my HDHomeRun OTA antenna in an app window for local news). So I can do that, or some other app window tiled to 2/3 of the height of the screen (e.g. web browser, file browser, or email)... with a video frame or other window on top or bottom.

Now that I have an LG CX I'll use it as a full-screen media playback and gaming stage/"window". Even if I run some smaller video window(s) on a portion of the side portrait monitor, I'll swap whatever I want full screen to "the main ring" on the fly using DisplayFusion hotkeys tied to streamer buttons.

If I really wanted to, on the CX, I could set up DisplayFusion window hotkeys to place video windows into quads, or "half screen" on each side (letterboxed), or "half" on one side with two stacked one above the other on the other side (see the sketch after the diagram below):
= = or | = or = |
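For anyone who wants the same tiling behavior without DisplayFusion, here's a minimal DIY sketch in Python using plain Win32 calls via ctypes. It's only illustrative: the window titles and the portrait 4K geometry below are assumptions you'd swap for your own monitor layout and apps.

```python
# Minimal window-snapping sketch using plain Win32 calls (Windows only).
# Assumed geometry: a 4K monitor rotated to portrait = 2160 x 3840 at origin (0, 0).
import ctypes

user32 = ctypes.windll.user32
MON_X, MON_Y, MON_W, MON_H = 0, 0, 2160, 3840  # assumption: adjust to your setup

def snap(title, x, y, w, h):
    """Find a window by its exact title and move/resize it."""
    hwnd = user32.FindWindowW(None, title)
    if not hwnd:
        raise RuntimeError(f"No window titled {title!r}")
    user32.MoveWindow(hwnd, x, y, w, h, True)  # True = repaint after moving

# Stack a video window on the top half and a browser on the bottom half
# ("=" over "=" from the diagram above). The titles are hypothetical examples.
snap("Media Player", MON_X, MON_Y, MON_W, MON_H // 2)
snap("News - Mozilla Firefox", MON_X, MON_Y + MON_H // 2, MON_W, MON_H // 2)
```

Quads are the same idea with the monitor rect divided four ways; DisplayFusion just lets you bind each placement to a hotkey instead of running a script.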

If you have external feeds, you just have to figure out what hardware (e.g. capture card or networked tuners) you'd need to get to import the signal to a player on your PC, directly or via stream, instead of using the ports on the TV itself. Using the Power Mixer app for Windows you can easily adjust the audio of each app individually too. I set up Power Mixer to enable MIDI input and use a cheap small MIDI board's faders to make it easier, but you can also use Power Mixer's on-screen per-app volume sliders with your mouse.
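If you'd rather roll that per-app MIDI volume control yourself instead of using Power Mixer, here's a rough sketch of the idea in Python using the third-party pycaw (Windows Core Audio) and mido (MIDI) packages. The target process name and the CC number are placeholders; your fader's actual CC number will vary.

```python
# Rough sketch: drive one application's volume from a MIDI fader.
# pip install pycaw mido python-rtmidi
import mido
from pycaw.pycaw import AudioUtilities, ISimpleAudioVolume

TARGET_EXE = "vlc.exe"  # placeholder: app whose audio session the fader controls
FADER_CC = 7            # placeholder: whatever CC number your fader sends

def set_app_volume(exe_name, level):
    """Set one app's audio-session volume, where level is 0.0-1.0."""
    for session in AudioUtilities.GetAllSessions():
        if session.Process and session.Process.name().lower() == exe_name:
            volume = session._ctl.QueryInterface(ISimpleAudioVolume)
            volume.SetMasterVolume(level, None)

with mido.open_input() as port:  # first available MIDI input port
    for msg in port:
        if msg.type == "control_change" and msg.control == FADER_CC:
            set_app_volume(TARGET_EXE, msg.value / 127.0)  # MIDI 0-127 -> 0.0-1.0
```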
 
On my RTX 3080 and CX 55.

Capped my frame rate @ 115, turned on gsync, set resolution to 4K120 4:4:4 10-bit, and fired up Unreal Tournament 3 (an old game I knew would stay pegged at 115 fps @ 4K).

Boy... that sucker was smooooooooooooooth.

LG... you done good. Real good. 👍

EDIT: confirmed with the Freesync Information page.
 
I just played a round of Overwatch locked @ 115 FPS 4K 4:4:4 10-bit.

Does it get any better than this? This is stupendous!!!! Gaming has never looked this good, IMO. Everything is sharp, clear, colorful, fast, and not a frame tear in sight.

If you're a PC gamer, the LG CX should be your goal for a display. This thing is on a whole new level now that LG has fixed the HDMI 2.1 gsync issues. Wow....
 
I just played a round of Overwatch locked @ 115 FPS 4K 4:4:4 10-bit.

Does it get any better than this? This is stupendous!!!! Gaming has never looked this good, IMO. Everything is sharp, clear, colorful, fast, and not a frame tear in sight.

If you're a PC gamer, the LG CX should be your goal for a display. This thing is on a whole new level now that LG has fixed the HDMI 2.1 gsync issues. Wow....

Not JUST PC gaming, but ALL gaming. I played 10 hours of Demon's Souls on PS5 yesterday and I feel like HDR performance on the CX is improved compared to my old B7. BTW I found this video pretty helpful for any other PS5 owners here:

 
On my RTX 3080 and CX 55.

Capped my frame rate @ 115, turned on gsync, set resolution to 4K120 4:4:4 10-bit, and fired up Unreal Tournament 3 (an old game I knew would stay pegged at 115 fps @ 4K).

Boy... that sucker was smooooooooooooooth.

LG... you done good. Real good. 👍

EDIT: confirmed with the Freesync Information page.
What was the biggest change for you with the new update?
 
I just played a round of Overwatch locked @ 115 FPS 4K 4:4:4 10-bit.

Does it get any better than this? This is stupendous!!!! Gaming has never looked this good, IMO. Everything is sharp, clear, colorful, fast, and not a frame tear in sight.

If you're a PC gamer, the LG CX should be your goal for a display. This thing is on a whole new level now that LG has fixed the HDMI 2.1 gsync issues. Wow....

In your NV Panel, under resolution settings, are you using the PC resolutions or the Ultra HD, HD, SD resolutions? I'm asking because I'm using the PC resolutions and have just now experienced randomly losing signal to the TV. I'm also using Zeskit 8K cables. I have the same settings as you except my refresh is set to 120Hz.
 
In your NV Panel, under resolution settings, are you using the PC resolutions or the Ultra HD, HD, SD resolutions? I'm asking because I'm using the PC resolutions and have just now experienced randomly losing signal to the TV. I'm also using Zeskit 8K cables. I have the same settings as you except my refresh is set to 120Hz.
Has to be the PC one, because the TV one is limited to a 60Hz refresh rate.
 
On my RTX 3080 and CX 55.

Capped my frame rate @ 115, turned on gsync, set resolution to 4K120 4:4:4 10-bit, and fired up Unreal Tournament 3 (an old game I knew would stay pegged at 115 fps @ 4K).

Boy... that sucker was smooooooooooooooth.

Since UT3 should be able to easily run locked at 115fps 100% of the time, here's something else you can try:

  • Set the framerate cap to 120fps
  • Turn off gsync.
  • Turn on "Motion Pro" (black frame insertion).

Now it should be really smooth... though I kind of wonder if LG's implementation of black frame insertion at 120Hz impacts the input lag (I'd love to check this myself, but I only have access to an E9, which doesn't support BFI on 120Hz inputs). It is, however, worth noting that Rtings' review of the CX states that VRR @ 4K has around double the input lag of 4K 120Hz without VRR (yet, interestingly, 1080p + VRR is listed with the lowest input lag of all).
 
(Attached: two photos of the setup.)

All set up and I'm really liking it. One thing that's bothering me, using my 2080 Ti: the image improvement going from 60Hz to 100+Hz is huge (I felt the same way back when I had the first-gen ASUS ROG Swift). So that means I have to run in YCbCr 4:2:0. I only care about gaming, and while gaming I don't notice a difference. But I do notice little things, like the RTSS overlay isn't quite as saturated a yellow compared to RGB 4:4:4. And the Lagom LCD test black level and white saturation images clip at the low/high end when using YCbCr 4:2:0. So this leads me to wonder what image quality sacrifices I'm making while gaming, if any. I'm sure there are others here struggling between the lesser of two evils; what are people choosing?
 
(Attached: two photos of the setup.)
All set up and I'm really liking it. One thing that's bothering me, using my 2080 Ti: the image improvement going from 60Hz to 100+Hz is huge (I felt the same way back when I had the first-gen ASUS ROG Swift). So that means I have to run in YCbCr 4:2:0. I only care about gaming, and while gaming I don't notice a difference. But I do notice little things, like the RTSS overlay isn't quite as saturated a yellow compared to RGB 4:4:4. And the Lagom LCD test black level and white saturation images clip at the low/high end when using YCbCr 4:2:0. So this leads me to wonder what image quality sacrifices I'm making while gaming, if any. I'm sure there are others here struggling between the lesser of two evils; what are people choosing?

First off, amazing setup! Personally I've been mainly using 120Hz 4:2:0 for the past 5 months, and I would only change back to 60Hz 4:4:4 for HDR gaming, which I haven't had much of on PC. 8-bit 4:2:0 is perfectly fine for SDR gaming.
 
I have an Ampere card now, but I gamed with 4:2:0 120Hz for a while and yes, it looks totally fine in games; my only real issue was the lack of VRR due to my 1080 Ti. But no such problem with a 2080 Ti; I would 100% not have jumped on Ampere if I had one.
 
Since UT3 should be able to easily run locked at 115fps 100% of the time, here's something else you can try:

  • Set the framerate cap to 120fps
  • Turn off gsync.
  • Turn on "Motion Pro" (black frame insertion).

Now it should be really smooth... though I kind of wonder if LG's implementation of black frame insertion at 120Hz impacts the input lag (I'd love to check this myself, but I only have access to an E9, which doesn't support BFI on 120Hz inputs). It is, however, worth noting that Rtings' review of the CX states that VRR @ 4K has around double the input lag of 4K 120Hz without VRR (yet, interestingly, 1080p + VRR is listed with the lowest input lag of all).
I get about 1/60 of a second extra lag when enabling 120Hz BFI. (about 4 extra frames with a 240 FPS phone camera compared to with BFI off, using a keyboard LED as reference)

120 Hz BFI is one of the reasons I held out until this year to buy an OLED.
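For anyone repeating that measurement: the conversion from counted camera frames to added latency is just extra frames divided by the camera's frame rate. A quick sketch of the arithmetic:

```python
# Convert extra frames counted on a high-speed phone camera into added latency.
def added_latency_ms(extra_frames, camera_fps):
    return extra_frames / camera_fps * 1000.0

# 4 extra frames at 240 FPS ~= 16.7 ms, i.e. about 1/60 of a second,
# which matches the BFI figure above.
print(added_latency_ms(4, 240))  # 16.666...
```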
 
I have an Ampere card now, but I gamed with 4:2:0 120Hz for a while and yes, it looks totally fine in games; my only real issue was the lack of VRR due to my 1080 Ti. But no such problem with a 2080 Ti; I would 100% not have jumped on Ampere if I had one.
HDR wouldn't work at 8-bit 4:2:0 120 Hz, which is a deal breaker for most.
 
I get about 1/60 of a second extra lag when enabling 120Hz BFI. (about 4 extra frames with a 240 FPS phone camera compared to with BFI off, using a keyboard LED as reference)

120 Hz BFI is one of the reasons I held out until this year to buy an OLED.
Are you running in game mode? I find that any other mode will have slightly higher input lag even with instant game response enabled. Not enough to be a major issue, but enough to be noticeable.

I haven't noticed a perceptible difference between BFI on vs. off.
 
(Attached: two photos of the setup.)
All set up and I'm really liking it. One thing that's bothering me, using my 2080 Ti: the image improvement going from 60Hz to 100+Hz is huge (I felt the same way back when I had the first-gen ASUS ROG Swift). So that means I have to run in YCbCr 4:2:0. I only care about gaming, and while gaming I don't notice a difference. But I do notice little things, like the RTSS overlay isn't quite as saturated a yellow compared to RGB 4:4:4. And the Lagom LCD test black level and white saturation images clip at the low/high end when using YCbCr 4:2:0. So this leads me to wonder what image quality sacrifices I'm making while gaming, if any. I'm sure there are others here struggling between the lesser of two evils; what are people choosing?
What's that desk you have behind your desk to support the OLED? Is that available to buy somewhere?
 
What's that desk you have behind your desk to support the OLED? Is that available to buy somewhere?
It's two IKEA nightstands that we bought but ended up not having a use for. I started using them about a year ago with my 43" Samsung as a temporary solution that ended up being more permanent than I expected. It's a bit of a janky setup, but it works.
 
I've not had a ton of time at the panel yet, but I did put on a 4K HDR documentary and a few movies (Guardians of the Galaxy and a few others), a few episodes of The Mandalorian, etc. to test it out. The first thing I watched was a documentary whose opening scene had some people in native dress in some country playing hot potato with a ball on fire in the dark. It was VERY impressive looking. All of the content after that was too. I tweaked some of the basic settings in a few profiles and set it up to switch profiles by voice in order to switch between movie and game settings more easily (I'll have to go over all of my settings on the TV and GPU again later to make sure I have everything set up best). Then I bought Jedi: Fallen Order because I heard the HDR was exceptional in that game and it happens to be on sale. I haven't played too much yet, but it looked very good. I'm going to have to turn some settings down from extreme, though, since it doesn't support SLI and so I'm down to a single 1080 Ti for the moment. I'll keep checking stock for the 3000-series GPU I have my eye on, but I wouldn't be surprised if I can't get one until sometime Q1 2021 for HDMI 2.1 VRR 120Hz 4:4:4 and a lot more graphics power.

My phone and tablet have been OLED for years, so I have some experience with small-scale OLED at least. My new phone is also OLED. I've touted HDR from what I've read about it and what I've seen momentarily on other sets, but this is my first real HDR-capable TV in my home, and it's stunning. Things like shafts of light beaming down from overhead in a darker area, rendered in per-pixel HDR, are nothing like what you can get on a regular screen. There are a lot of other examples, like sci-fi ships, lighting, stars in the sky, laser beams, spell effects, dragon's fire, etc., and a lot of more natural ones. To anyone who has one of these and heard anyone in these threads saying HDR is meaningless: you have to know those words were not true, now that the truth is staring you in the face. I realize HDR gaming is hit or miss, but if you can find a decent HDR implementation it is very nice to have there too.

Now I'll have to prioritize buying/supporting titles with worthwhile HDR and good DLSS 2.0 implementations, just like I used to lean toward SLI-supported titles when possible in the past. For example, I chose Jedi: Fallen Order instead of two other action/adventure titles on sale at the moment because it was the only one of the three with quality HDR verified in threads I had read.
 
elvn I thought my Q90R was "decent" at HDR content, but the truth is... it flat out sucks at HDR content. It does have FALD, which does great in high bit-rate content such as 4K Blu-ray. On compressed and low bit-rate HDR, though, it just flat out sucks. Lots of light smearing and FALD bloom, along with lagging FALD updates on high contrast scenes.

A lot of people see the CX's limited peak brightness and think "oh, that sucks for HDR". On the contrary... HDR looks, in my opinion, the best on OLED. It has massive punch from the fact that one pixel can be at full brightness while the one next to it is completely dark. You also don't have the lagging FALD of LCD TVs. Most people who say HDR sucks on OLED probably haven't experienced it. It's amazing.
 
(Attached: two photos of the setup.)
All set up and I'm really liking it. One thing that's bothering me, using my 2080 Ti: the image improvement going from 60Hz to 100+Hz is huge (I felt the same way back when I had the first-gen ASUS ROG Swift). So that means I have to run in YCbCr 4:2:0. I only care about gaming, and while gaming I don't notice a difference. But I do notice little things, like the RTSS overlay isn't quite as saturated a yellow compared to RGB 4:4:4. And the Lagom LCD test black level and white saturation images clip at the low/high end when using YCbCr 4:2:0. So this leads me to wonder what image quality sacrifices I'm making while gaming, if any. I'm sure there are others here struggling between the lesser of two evils; what are people choosing?
Not to go off topic, but what is the second pic?
(Attached: two photos of the setup.)
All set up and I'm really liking it. One thing that's bothering me, using my 2080 Ti: the image improvement going from 60Hz to 100+Hz is huge (I felt the same way back when I had the first-gen ASUS ROG Swift). So that means I have to run in YCbCr 4:2:0. I only care about gaming, and while gaming I don't notice a difference. But I do notice little things, like the RTSS overlay isn't quite as saturated a yellow compared to RGB 4:4:4. And the Lagom LCD test black level and white saturation images clip at the low/high end when using YCbCr 4:2:0. So this leads me to wonder what image quality sacrifices I'm making while gaming, if any. I'm sure there are others here struggling between the lesser of two evils; what are people choosing?
Hey,

Not to derail, but what's the second game?
 
elvn I thought my Q90R was "decent" at HDR content, but the truth is... it flat out sucks at HDR content. It does have FALD, which does great in high bit-rate content such as 4K Blu-ray. On compressed and low bit-rate HDR, though, it just flat out sucks. Lots of light smearing and FALD bloom, along with lagging FALD updates on high contrast scenes.

A lot of people see the CX's limited peak brightness and think "oh, that sucks for HDR". On the contrary... HDR looks, in my opinion, the best on OLED. It has massive punch from the fact that one pixel can be at full brightness while the one next to it is completely dark. You also don't have the lagging FALD of LCD TVs. Most people who say HDR sucks on OLED probably haven't experienced it. It's amazing.

I think another thing is that the HDR experience itself has been refined over the years. I bought my first OLED back in 2016, and that was when the first HDR games were starting to roll out on PS4 Pro. The experience back then was definitely more of a "miss" than a "hit", with many games either looking no better or possibly even WORSE when played in HDR. Also, anyone remember just how bad HDR on Windows 10 used to be? Fast forward to today and now game developers have gotten better at implementing HDR into their games, and Windows itself is so much better at handling HDR as well. Combine that with the improved HDR performance on the CX vs. older OLEDs and now I can confidently say that HDR is an absolute must for me, whereas before, due to bad early impressions, I was totally fine with skipping out on it.
 
elvn I thought my Q90R was "decent" at HDR content, but the truth is... it flat out sucks at HDR content. It does have FALD, which does great in high bit-rate content such as 4K Blu-ray. On compressed and low bit-rate HDR, though, it just flat out sucks. Lots of light smearing and FALD bloom, along with lagging FALD updates on high contrast scenes.

A lot of people see the CX's limited peak brightness and think "oh, that sucks for HDR". On the contrary... HDR looks, in my opinion, the best on OLED. It has massive punch from the fact that one pixel can be at full brightness while the one next to it is completely dark. You also don't have the lagging FALD of LCD TVs. Most people who say HDR sucks on OLED probably haven't experienced it. It's amazing.

Some people just look at the specs without understanding the limitations of the technology. On paper you might think an LCD is better for HDR, but in reality it's only better in rare edge cases.

When they measure contrast, color volume, and pretty much any picture quality specification, they use a static image.
But when you watch a movie or play a game there is a lot of motion. Because LCDs take so long to transition, you aren't getting full color volume or color accuracy while a pixel is transitioning. Not even close.
You may think, "oh, but my LCD has a 1ms response time, it's only bad for a split second." Well, that's also a misleading specification, because when they measure LCD response times they only count the time it takes to do 90% of the transition. The last 10% of the transition can take tens of milliseconds and typically doesn't even finish within the same frame.
An OLED has near-instantaneous, 100%-complete transitions, so you get 100% of the specified color volume, color accuracy, and contrast 100% of the time, no matter how fast the motion is.
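To put rough numbers on that last point: at 120Hz a frame lasts only about 8.3 ms, so a transition tail in the tens of milliseconds necessarily spills across multiple frames. A quick illustration (the response-time figures here are hypothetical, purely to show the arithmetic):

```python
# How many frames does a pixel transition span at a given refresh rate?
def frames_spanned(transition_ms, refresh_hz):
    frame_ms = 1000.0 / refresh_hz  # 120 Hz -> ~8.33 ms per frame
    return transition_ms / frame_ms

# Hypothetical LCD: 1 ms to reach 90% of the target, another 15 ms for the last 10%.
print(frames_spanned(1 + 15, 120))  # ~1.9 frames before the color is fully correct
# OLED transitions are on the order of 0.1 ms -- a tiny fraction of one frame.
print(frames_spanned(0.1, 120))     # ~0.01 frames
```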
 
Hi guys,

I am considering getting a 4k LG OLED for 70% productivity, 15% gaming, 15% movies/series, and my desk is 90cm deep.

The productivity part is mainly office/admin related.

* How sharp is the text? Is it comparable to, let's say, Dell UltraSharp office monitors?
* How calm is the picture?
* I am running an ASUS ROG Strix 2060 O6G; will I have to get an HDMI 2.1-capable card?
* Will the cheaper LG 48CX3LB be equal to the LG 48CX9LB in desktop/productivity?

LG 48CX9LB ~1550 Euro
LG 48CX3LB ~1350 Euro

Thanks
Marco
 
* Will the cheaper LG 48CX3LB be equal to the LG 48CX9LB in desktop/productivity?

As I understand it, the 9LB has a dual DVB tuner, which the 6LB and 3LB don't have. I can't find good English information on the 3LB... But when using it as a monitor it won't matter.
 
Hi guys,

I am considering getting a 4k LG OLED for 70% productivity, 15% gaming, 15% movies/series, and my desk is 90cm deep.

The productivity part is mainly office/admin related.

* How sharp is the text? Is it comparable to, let's say, Dell UltraSharp office monitors?
* How calm is the picture?
* I am running an ASUS ROG Strix 2060 O6G; will I have to get an HDMI 2.1-capable card?
* Will the cheaper LG 48CX3LB be equal to the LG 48CX9LB in desktop/productivity?

LG 48CX9LB ~1550 Euro
LG 48CX3LB ~1350 Euro

Thanks
Marco
1. Plenty sharp to me, but this is all subjective.
2. Not sure what this means...
3. If you're cool with 60Hz for the productivity work, you're fine.
4. No idea about the differences.
 
I didn't like the giant top-layer volume overlay Windows uses for the master volume. I found this little tray (or no-tray) app to hide that top-layer volume window.

http://wordpress.venturi.de/?p=1#more-1

Use at your own risk, as always. It seems to be working fine for me, though I haven't rebooted yet, so I can't comment on whether it auto-starts.

Btw, Jedi: Fallen Order looks amazing once I got some time to actually play it and enabled HDR in the game menus. :LOL:

A friend is telling me AC: Odyssey has great HDR on his LG CX. I've also heard HDR works in Sekiro, though I don't know anyone with a CX who owns it. (It works, but you used to have to alt-tab out and back in, apparently... idk if that has been patched.) If anyone knows about Sekiro's HDR please lmk, b/c I'm considering buying it if/when it goes on sale again for the Xmas sales. I was playing Shadow of War with settings cranked last month, and I think even that game has HDR even though it came out in 2017. A lot of games from the last 5 years look great with graphics cranked up.
 
Playing through SOTR now... almost done with it. Holy cow what a pretty game on the CX in HDR.
Some people just look at the specs without understanding the limitations of the technology. On paper you might think an LCD is better for HDR, but in reality it's only better in rare edge cases.

When they measure contrast, color volume, and pretty much any picture quality specification, they use a static image.
But when you watch a movie or play a game there is a lot of motion. Because LCDs take so long to transition, you aren't getting full color volume or color accuracy while a pixel is transitioning. Not even close.
You may think, "oh, but my LCD has a 1ms response time, it's only bad for a split second." Well, that's also a misleading specification, because when they measure LCD response times they only count the time it takes to do 90% of the transition. The last 10% of the transition can take tens of milliseconds and typically doesn't even finish within the same frame.
An OLED has near-instantaneous, 100%-complete transitions, so you get 100% of the specified color volume, color accuracy, and contrast 100% of the time, no matter how fast the motion is.
This. The "quality" of the HDR, because it can illuminate the brighter pixels individually, is absolutely incredible, especially when you consider that the black starting point is "perfect black." You simply don't need as much overall brightness differential for an incredible experience.

I KNOW that for HDR the OLED light level should be set to 100, BUT because I use my CX48 for desktop use I tuned it to an acceptable point at OLED light 75 for HDR content. This is simply to provide some protection against burn-in if I forget and leave HDR on, or leave an HDR game with a static HUD up for too long.

In the dim room I use the CX48 in, the HDR experience is still VASTLY SUPERIOR to any other HDR experience I've seen on any other screen up to now. It's still remarkable.

Sure, in a bright room, superbright items on screen like lasers and fire won't be amazing, but as we already know, you should not be using an OLED in a bright room for desktop use at all, and in general not if you can help it.

If you must run the screen in torch mode in a bright environment, an OLED isn't your best choice anyway. Used properly, it is unbeatable.
 
Playing through SOTR now... almost done with it. Holy cow what a pretty game on the CX in HDR.

This. The "quality" of the HDR, because it can illuminate the brighter pixels individually, is absolutely incredible, especially when you consider that the black starting point is "perfect black." You simply don't need as much overall brightness differential for an incredible experience.

I KNOW that for HDR the OLED light level should be set to 100, BUT because I use my CX48 for desktop use I tuned it to an acceptable point at OLED light 75 for HDR content. This is simply to provide some protection against burn-in if I forget and leave HDR on, or leave an HDR game with a static HUD up for too long.

In the dim room I use the CX48 in, the HDR experience is still VASTLY SUPERIOR to any other HDR experience I've seen on any other screen up to now. It's still remarkable.

Sure, in a bright room, superbright items on screen like lasers and fire won't be amazing, but as we already know, you should not be using an OLED in a bright room for desktop use at all, and in general not if you can help it.

If you must run the screen in torch mode in a bright environment, an OLED isn't your best choice anyway. Used properly, it is unbeatable.
Leave OLED light at 100, set the HDR/SDR brightness balance slider in Windows to 7%. SDR content will be the brightness of OLED light ~30.
 
I KNOW that for HDR the OLED light level should be set to 100, BUT because I use my CX48 for desktop use I tuned it to an acceptable point at OLED light 75 for HDR content. This is simply to provide some protection against burn-in if I forget and leave HDR on, or leave an HDR game with a static HUD up for too long.

In the dim room I use the CX48 in, the HDR experience is still VASTLY SUPERIOR to any other HDR experience I've seen on any other screen up to now. It's still remarkable.
Leave OLED light at 100, set the HDR/SDR brightness balance slider in Windows to 7%. SDR content will be the brightness of OLED light ~30.

I intuitively turned the Windows HDR balance slider down at some point when setting everything up, until it looked good on regular desktop stuff. Not sure how far / what percent.

I don't mind leaving my HDR at full with regard to burn-in concerns, though, b/c I use a full-black desktop wallpaper and no taskbar or icons. The only things I'm running on the OLED are movies/videos and games, as a media stage. I have a 43" 4K 60Hz VA (~6100:1) on each side for everything else. There is a 43" Samsung 8 series VA on sale from some vendor referenced on Slickdeals at the moment if anyone is looking. At least it was last time I checked.
 
Playing through SOTR now... almost done with it. Holy cow what a pretty game on the CX in HDR.

This. The "quality" of the HDR, because it can illuminate the brighter pixels individually, is absolutely incredible, especially when you consider that the black starting point is "perfect black." You simply don't need as much overall brightness differential for an incredible experience.

I KNOW that for HDR the OLED light level should be set to 100, BUT because I use my CX48 for desktop use I tuned it to an acceptable point at OLED light 75 for HDR content. This is simply to provide some protection against burn-in if I forget and leave HDR on, or leave an HDR game with a static HUD up for too long.

In the dim room I use the CX48 in, the HDR experience is still VASTLY SUPERIOR to any other HDR experience I've seen on any other screen up to now. It's still remarkable.

Sure, in a bright room, superbright items on screen like lasers and fire won't be amazing, but as we already know, you should not be using an OLED in a bright room for desktop use at all, and in general not if you can help it.

If you must run the screen in torch mode in a bright environment, an OLED isn't your best choice anyway. Used properly, it is unbeatable.

YouTuber HDTVTest said that HDR is supposed to be viewed in a pitch-black room anyway, or at most with a very, very small amount of ambient light. Anyone who is viewing HDR in a super-bright room is already doing it wrong ;)



(Relevant timestamps in the video: 12:30 and 13:05.)
 
No, I'm not talking about the Windows HDR balance slider. I have that set properly. There are other ways you can royally screw up:

I'm talking about leaving an HDR image, from an HDR source, up on the screen. I don't want to accidentally pause an HDR movie, or leave a HUD up in an HDR game for a long period of time, and have there be any possibility that the pixels are static at 100.

You pick what you are comfortable with. I'm using this screen for nearly all kinds of use and I don't want to have to be THAT careful, so I have accepted limiting the max pixel brightness. For this kind of situation, the effect should be exponential, so even a small restriction of the maximum should provide quite a lot of safety margin for keeping the screen in good shape through 5+ years of heavy desktop use, which is my goal.
 
No, I'm not talking about the Windows HDR balance slider. I have that set properly. There are other ways you can royally screw up:

I'm talking about leaving an HDR image, from an HDR source, up on the screen. I don't want to accidentally pause an HDR movie, or leave a HUD up in an HDR game for a long period of time, and have there be any possibility that the pixels are static at 100.

You pick what you are comfortable with. I'm using this screen for nearly all kinds of use and I don't want to have to be THAT careful, so I have accepted limiting the max pixel brightness. For this kind of situation, the effect should be exponential, so even a small restriction of the maximum should provide quite a lot of safety margin for keeping the screen in good shape through 5+ years of heavy desktop use, which is my goal.

You're fine, because the TV has ASBL, which will automatically dim the screen if you leave a static logo on for a set period of time, unless that function is disabled when the TV is in HDR mode or something. Even the PS5 has an auto-dimming feature that you can set; I was playing Demon's Souls last night, then walked away from the TV to eat, and came back to find the entire screen had been dimmed down a good amount.
 
You're fine, because the TV has ASBL, which will automatically dim the screen if you leave a static logo on for a set period of time, unless that function is disabled when the TV is in HDR mode or something. Even the PS5 has an auto-dimming feature that you can set; I was playing Demon's Souls last night, then walked away from the TV to eat, and came back to find the entire screen had been dimmed down a good amount.
Yeah, assume the TV is made for the average max-brightness Joe to walk into a max-brightness Best Buy, get sold the TV, and replicate max-brightness Best Buy at home with no clue. LG wouldn't be selling the TV if this behavior ACTUALLY messed it up quickly in general, because there are a lot of max-brightness Joes out there who would leave bad reviews and poison the technology. There's enough max-brightness-Joe protection built in that an informed [H] user will be 10000000% fine.
 