LG 48CX

If you think the ISF Expert mode looks piss yellow, that is because you are used to a very blue-tinted image and see that as white. Our eyes can be easily deceived like this, and many people think that a blue-tinted screen looks sharper and has "whiter" whites. Rtings uses display calibration hardware to measure against a 6500K white point, which is roughly the daylight white you would see in real life, and that is usually warmer rather than bluish. When I calibrated my CRG9 to 120 nits and 6500K, at first I felt it was way too dark and looked too warm, but after getting used to it, it now feels comfortable and accurate. Color accuracy also depends on ambient lighting conditions as well as individual panel variance, which is why you can't just plop in calibration values made on monitor A in environment X and expect them to be perfect on monitor B in environment Y. But they might be a decent starting point nonetheless.
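
To put a rough number on what "6500K white point" means, here is a minimal Python sketch using McCamy's approximation; it assumes you already have a CIE 1931 xy chromaticity reading (e.g. from a colorimeter), and it is only a ballpark estimate, not what calibration software actually does internally:

Code:
def cct_from_xy(x: float, y: float) -> float:
    """Estimate correlated color temperature (Kelvin) from CIE 1931 xy
    chromaticity using McCamy's approximation (roughly valid 2000-12500 K)."""
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

# D65 (the ~6500K daylight white point) is about x=0.3127, y=0.3290
print(round(cct_from_xy(0.3127, 0.3290)))  # ~6500
# A noticeably blue-tinted "white" sits much higher, e.g. around 9300K
print(round(cct_from_xy(0.2831, 0.2971)))  # ~9200-9300

The higher the number, the bluer the white; a display tuned toward 9000K+ out of the box is exactly the "blue tint reads as whiter" situation described above.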

Often display presets are tuned for what the manufacturer thinks looks most impressive in a store at max brightness under fluorescent lights. That is a completely different situation from running the display at home at far lower brightness in a much darker room, or using it during the daytime in bright sunlight. People usually respond to color settings that produce the most vivid-looking colors, even though those settings are usually very inaccurate. Wide-gamut LCDs and the preset modes on displays clearly show a prevalence of these settings over more accurate ones.

I don't think many people even have a colorimeter to get some actual readings, and are instead just going by their own eyes to determine what looks proper and what doesn't. Getting a colorimeter was probably one of the best investments I've ever made and I would recommend anyone picking up a $500+ display to invest in one.
 
I don't think many people even have a colorimeter to get some actual readings, and are instead just going by their own eyes to determine what looks proper and what doesn't. Getting a colorimeter was probably one of the best investments I've ever made and I would recommend anyone picking up a $500+ display to invest in one.

I can agree with that. I borrowed one from work (just a pretty cheap Spyder 5 Pro) right before coronavirus hit and it has been useful for setting up displays. It's very difficult to accurately gauge color hues by eye; the best I've managed is using a grayscale image and trying to remove all tint from it.
 
LG has started up their first OLED factory in China, which produces 48" panels at large volumes. I expect to see 48" in the G series next year, potentially with a lower price tag to boot.
 
I got my 55CX, ran a firmware upgrade (03.00.60), and both the pendulum demo (g-sync was checked by default when started, and still worked windowed) and my eyes in games say g-sync works at 120Hz 4k 4:2:0, windowed and fullscreen (I also did a full clean video driver installation).
55" is borderline too tall, but I'm OK with it. I wouldn't recommend the 55" unless you're really sure it'll fit and it won't be too tall.

Game mode looks like garbage, no exaggeration, but it is "picture mode - game". In "picture mode - normal" it looks fine and still allows instant game response etc., so could it just be a misunderstanding, and "picture mode - game" is not actually necessary? It is just called "picture mode" after all. I can't imagine LG forcing people to use that mode (it really looks as bad as some monitor game modes).

I do not feel any extra input lag with picture mode - normal and instant game response - on, but that doesn't mean it isn't there. I can test the input lag later or tomorrow. But most likely, game mode is just a trashy, unnecessary picture mode. The description in the picture sounds like "instant game response" includes all the normal game mode functions.

I "think" gsync on 4k120 4:2:0 works, but from my experience, it also adds a significant amount of input lag compared to 1440p120 RGB gsync. I've kept it @ 1440p for this reason. Maybe HDMI 2.1 GPUs will fix this.
 
I'm sure that bypassing the near-black flashing fix when VRR is active is not what LG would like to happen. It might be the only way it can work, or it might get fixed in a later firmware update, who knows. As it has been explained to me, LG originally had near-black dithering in game mode on the C9 just like outside of game mode, but apparently it screwed up the resolution/detail too much in game mode for some reason. So they later did a firmware update that instead flattened the blacks in game mode. This still works with VRR disabled on the C9 and CX, but it is apparently bypassed with VRR activated. So no, I don't think it is always just "they know way better how it looks best so just accept it" - it is ongoing engineering of these units, at least hopefully it is. Plainly some things are workarounds, and those methods and workarounds can be improved, or later scrapped and replaced by a different method in some cases. This also goes for "just use 4:2:2", "8bit dithered is fine", etc. 1:1 source fidelity - sending native resolution, native color resolution, uncompressed audio, etc. unaltered and uncompressed - is best and should be available whenever possible, even if "ok enough", "you won't notice most of the time" signals and processing exist.

While there have been specific higher-brightness modes added to newer TV lines just for bright viewing environments, outside of that HDR's default hard-coded values are absolute values of their own. HDR's defaults, without workarounds via OSD settings, are designed for a dim to dark theater environment, not a bright store. In fact many people (in average to bright viewing environments) complain that the bulk of the scenes, which sit in SDR brightness ranges, are too dim in HDR, which is probably why they added the brighter modes as workarounds you can enable via the OSD. I'll also add that the C9 and E9 both have auto hardware calibration features, and the CX will probably support that as well eventually, if you have a user license for the software and have calibration hardware.

However, like HDR authoring, calibration is usually done in a dim to dark room (with the hardware right up against the panel). Once you change the viewing environment to a higher brightness, all the values including contrast and saturation are not going to look the same anymore to your eyes and brain. So unless you have a bunch of saved settings and you are switching between them along with your room lighting, your calibrated values are way off to your eyes. Kind of like how a flashlight is blindingly bright in the dark at night but looks dim and useless in the daytime. I keep 5 or 6 sets of named settings on my living room TV for this reason.
 
Sorry, not trying to spam videos, but I have one more in regards to size/viewing distance. This video makes me chuckle and question my "Is 48 inches still too big?" doubts :)

 
Just look at VR for a huge-FoV screen. But the game dynamics, interfacing, HUDs, etc. are designed around it. Also, if you are using a virtual screen, it is typically a window within the FoV, not something using the whole wall.
 
Sorry, not trying to spam videos, but I have one more in regards to size/viewing distance. This video makes me chuckle and question my "Is 48 inches still too big?" doubts :)



I like this as a concept, but having tried my 65" LG C9 living room TV like this, it just isn't feasible until we have reasonably priced 8K in this size. Sitting that close, the resolution gets somewhat low and, while usable, looking at different sides of the screen becomes awkward as you are turning your head to see the corners of the screen where a lot of UI elements are. Games also tend to go fullscreen unless you run at a lower resolution with no scaling or in borderless windowed mode, so that is an issue on a monitor this huge.

I do like the idea of using a black background with the taskbar and desktop icons hidden and a smaller window in the middle, so you have a more comfortable viewing experience. I will probably try this on the 48" when I get it, maybe even set up DisplayFusion splits to put the Windows taskbar there if it can do that, so you can effectively scale your viewing area as needed, with the real resolution being the main limitation. The OLED blacks mean there are no visible bezels, so it looks like a window floating in blackness.
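
For what it's worth, here is a rough sketch of the "window floating in blackness" idea scripted without DisplayFusion - purely illustrative, assuming Windows, and using a made-up window title and stage size via the Win32 API through ctypes:

Code:
import ctypes

user32 = ctypes.windll.user32
user32.SetProcessDPIAware()  # so GetSystemMetrics reports real pixels at 4K

def center_window(title: str, width: int, height: int) -> bool:
    """Find a top-level window by exact title and center it on the primary
    display, leaving OLED-black borders all around it."""
    hwnd = user32.FindWindowW(None, title)
    if not hwnd:
        return False
    screen_w = user32.GetSystemMetrics(0)  # SM_CXSCREEN
    screen_h = user32.GetSystemMetrics(1)  # SM_CYSCREEN
    x = (screen_w - width) // 2
    y = (screen_h - height) // 2
    return bool(user32.MoveWindow(hwnd, x, y, width, height, True))

# Hypothetical example: park a Notepad window as a 2560x1440 "stage"
# in the middle of the 3840x2160 panel.
center_window("Untitled - Notepad", 2560, 1440)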
 
I like this as a concept, but having tried my 65" LG C9 living room TV like this, it just isn't feasible until we have reasonably priced 8K in this size. Sitting that close, the resolution gets somewhat low and, while usable, looking at different sides of the screen becomes awkward as you are turning your head to see the corners of the screen where a lot of UI elements are. Games also tend to go fullscreen unless you run at a lower resolution with no scaling or in borderless windowed mode, so that is an issue on a monitor this huge.

I do like the idea of using a black background with the taskbar and desktop icons hidden and a smaller window in the middle, so you have a more comfortable viewing experience. I will probably try this on the 48" when I get it, maybe even set up DisplayFusion splits to put the Windows taskbar there if it can do that, so you can effectively scale your viewing area as needed, with the real resolution being the main limitation. The OLED blacks mean there are no visible bezels, so it looks like a window floating in blackness.

Having tried everything from 27" 4K to 65" OLED on my desk over the last couple of years, I would agree that 65" is just too big. 55" is OK if you have a somewhat deeper desk; I am using my 55" GX about 70-80 cm away and that is fine. It should be said that I almost never run anything in fullscreen (I didn't on the 27" 4K either) as I find it not very efficient for my tasks (programming and similar). Also, having proper text settings for ClearType etc. makes a huge difference. It is easy to think that it's the resolution that is the problem otherwise, but it really isn't.
 
Personally I'm keeping this monitor as a media "stage". Using this as a text based work screen (programming, authoring, even primarily text-based web page digestion) is not what I would buy something like this for.

Regarding sizes... Technically, any screen size would work if you sat back far enough, for example on a living room couch or chair with a good lap desk. Most people wouldn't want to put an actual desk that far away. For a single 65" screen, a comfortable viewing distance to me would be 4.5' to 5' minimum to my eyeballs. A 48" or 55" LG OLED with screen(s) on the side in my setup will similarly sit a fair distance back for me. My peripheral desk and chair are on rollerblade-style wheels though, so I can move them closer as desired for things like 21:10 aspect immersive gameplay in a racing game or similar (assuming/hoping that a custom rez of 21:10 ~ 3840x1600 with 120Hz VRR will work with HDMI 2.1 GPUs on these). I sit about 6.5' and mainly 8' away from the screen to my eyeballs, leaning back on the couch with a lap desk, from a 70" Vizio FALD 4K in my living room when I occasionally plug my 4K laptop into it. It works well enough for general PC web and media use. I wouldn't be doing coding or spreadsheets on it though.
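
For anyone who wants to put rough numbers on the size/distance question, this is my own back-of-the-envelope geometry (16:9 panels and 3840 horizontal pixels assumed), not anything from the posts above:

Code:
import math

def viewing_geometry(diag_in: float, distance_in: float,
                     h_res: int = 3840, aspect: float = 16 / 9):
    """Return (horizontal field of view in degrees, pixels per degree)."""
    width_in = diag_in * aspect / math.sqrt(1 + aspect ** 2)
    fov = 2 * math.degrees(math.atan(width_in / (2 * distance_in)))
    return fov, h_res / fov

for diag, dist in [(48, 36), (55, 30), (65, 25)]:
    fov, ppd = viewing_geometry(diag, dist)
    print(f'{diag}" at {dist}" away: ~{fov:.0f} degrees wide, ~{ppd:.0f} px/deg')

By this math a 65" panel at ~25" fills close to 100 degrees of your horizontal vision at under 40 pixels per degree, which lines up with the "too big / too low resolution up close" experiences in this thread.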



-------------------------------------------------------

This sounds interesting, from an avsforum member, regarding bit depth, color format, and dithering settings (use at your own risk if considering it):

https://www.avsforum.com/forum/40-o...aming-thread-consoles-pc-14.html#post59699820

 
Having tried everything from 27" 4K to 65" OLED on my desk over the last couple of years, I would agree that 65" is just too big. 55" is OK if you have a somewhat deeper desk; I am using my 55" GX about 70-80 cm away and that is fine. It should be said that I almost never run anything in fullscreen (I didn't on the 27" 4K either) as I find it not very efficient for my tasks (programming and similar). Also, having proper text settings for ClearType etc. makes a huge difference. It is easy to think that it's the resolution that is the problem otherwise, but it really isn't.

Yeah, I don't remember running anything but games in fullscreen in a long time. There's just no point to it when most applications don't benefit from it once the resolution and screen size get high enough.
 


I tested the 55 CX, full input lag chain. 117fps cap in the nvidia panel, vsync off, faster response - ultra.
Ignore the millisecond counter; it turned out to be low precision (I always try new ones, none so far have held up). G-sync is on, and clearly visibly functioning.
120Hz 4k with 4:2:0 chroma subsampling. Glorious Model O - 4ms debounce. When counting frames locally in the file, there are 25 frames at 960fps from finger impact to cs:go movement - roughly 25ms, which is nothing abnormal.

As a baseline, I have previously done the same test on an asus pg27UQ with an uncapped framerate and hit ~20ms. The uncapped framerate of roughly 450fps accounts for most of the difference between the two runs, so the LG 55CX at 120Hz VRR g-sync is on par with the pg27uq, or at most a couple of milliseconds slower.

The precision of the camera (S9+ super slow-mo) is probably not perfect, but with the same test system and camera over multiple runs, it should be accurate enough relative to the two monitors. LTT did a similar test and the better mice hit roughly 15ms total rendering chain input lag on a faster 240Hz TN, which pretty much accounts for the 15 -> 20ms difference between their run and mine, confirming that the baseline is solid. https://youtu.be/orhb7Njj3h8?t=277

LTT 240Hz uncapped fps test ~15ms
My pg27uq 120Hz uncapped fps 450ish ~20ms
My LG 55 CX - VRR on 117fps cap ~25ms

So input lag issues seem non-existent.
This is NOT with game mode, so as suspected, that's just a cosmetic picture mode. "Instant game response" is on, all random processing off.
Subjectively it looks and feels snappier than the pg27uq, probably because of the near-instant pixel response. I have really missed the clarity of CRT/OLED/ULMB during my two years with the pg27uq, even though it has really fast response for an IPS.
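
Just to make the frame-counting arithmetic above explicit (nothing more than division, assuming the 960fps capture rate of the S9+):

Code:
CAPTURE_FPS = 960  # S9+ super slow-mo capture rate

def frames_to_ms(frames: int, capture_fps: int = CAPTURE_FPS) -> float:
    """Convert a count of high-speed camera frames to milliseconds."""
    return frames / capture_fps * 1000

print(f"{frames_to_ms(25):.1f} ms")  # 25 frames -> ~26 ms, i.e. "roughly 25ms"
print(f"{frames_to_ms(19):.1f} ms")  # ~19-20 frames would match the ~20ms pg27uq run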
 
Sorry, not trying to spam videos, but I have one more in regards to size/viewing distance. This video makes me chuckle and question my "Is 48 inches still too big?" doubts :)


65" inch monitor only makes sense when you want to replace multiple monitors with one screen. This means instead of using fullscreen like you used to do with a small monitor, you would game only in windowed mode and have the rest of the screen estate for other apps. I've actually tried this before but 4k resolution is too small for this purpose and VRR windowed mode didn't work at that time (or still doesn't).
 


I tested the 55 CX, full input lag chain. 117fps cap in the nvidia panel, vsync off, faster response - ultra.
Ignore the millisecond counter; it turned out to be low precision (I always try new ones, none so far have held up). G-sync is on, and clearly visibly functioning.
120Hz 4k with 4:2:0 chroma subsampling. Glorious Model O - 4ms debounce. When counting frames locally in the file, there are 25 frames at 960fps from finger impact to cs:go movement - roughly 25ms, which is nothing abnormal.

As a baseline, I have previously done the same test on an asus pg27UQ with an uncapped framerate and hit ~20ms. The uncapped framerate of roughly 450fps accounts for most of the difference between the two runs, so the LG 55CX at 120Hz VRR g-sync is on par with the pg27uq, or at most a couple of milliseconds slower.

The precision of the camera (S9+ super slow-mo) is probably not perfect, but with the same test system and camera over multiple runs, it should be accurate enough relative to the two monitors. LTT did a similar test and the better mice hit roughly 15ms total rendering chain input lag on a faster 240Hz TN, which pretty much accounts for the 15 -> 20ms difference between their run and mine, confirming that the baseline is solid. https://youtu.be/orhb7Njj3h8?t=277

LTT 240Hz uncapped fps test ~15ms
My pg27uq 120Hz uncapped fps 450ish ~20ms
My LG 55 CX - VRR on 117fps cap ~25ms

So input lag issues seem non-existent.
This is NOT with game mode, so as suspected, that's just a cosmetic picture mode. "Instant game response" is on, all random processing off.
Subjectively it looks and feels snappier than the pg27uq, probably because of the near-instant pixel response. I have really missed the clarity of CRT/OLED/ULMB during my two years with the pg27uq, even though it has really fast response for an IPS.


Thanks for testing. I had my suspicions that gsync should work at 4k120 420, no reason why it shouldn't.
 
Thanks for testing. I had my suspicions that gsync should work at 4k120 420, no reason why it shouldn't.
It works sometimes, and input lag on 4k/120 gsync is noticeably worse compared to 1440p120 gsync. Just keep this in mind.

I can always tell if i'm playing @ 4k120 compared to 1440p120 on my CX. The difference in input lag is very noticeable, even without any measuring tools.
 
It works sometimes, and input lag on 4k/120 gsync is noticeably worse compared to 1440p120 gsync. Just keep this in mind.

I can always tell if i'm playing @ 4k120 compared to 1440p120 on my CX. The difference in input lag is very noticeable, even without any measuring tools.
Not for me (g-sync always works in 4k 120Hz: fullscreen, windowed, games + pendulum demo). It's also very unreasonable that hardware-scaling 1440p would reduce input lag; if anything, it would add some. Maybe you are just running lower fps at 4k so it feels laggier?
 
Regarding the guy with the 65 inch TV, this question was posed on the YT channel:

Did you consider the 55"? Just wondering why you went with one that's SOOO big for your viewing distance when the 55" is already big. How far away from the screen are you, 30" or so?

He responded:

I sit about 25" (64cm) away from the screen. I did consider getting the 55" model, but this 65" model had a better discount and cost just a little more than the 55" model.


To each his own, of course, but that seems like an outrageous viewing distance.
 
Not for me (g-sync always works in 4k 120Hz: fullscreen, windowed, games + pendulum demo). It's also very unreasonable that hardware-scaling 1440p would reduce input lag; if anything, it would add some. Maybe you are just running lower fps at 4k so it feels laggier?
I'd be curious to know how you got the Pendulum demo working on 4k/120. That has never worked for me.
 
I think Denon's 2020 A/V receivers are the first to be announced/have specs revealed? It looks like they'll only have one hdmi port supporting 40Gb/s.
 
I'd be curious to know how you got the Pendulum demo working on 4k/120. That has never worked for me.
I didn't do anything special. I simply did a clean driver install when I switched "monitor", as I know from previous monitor switches that g-sync can break if you don't (win10 pro, build 2004, 2080ti FE, 9900ks). Shouldn't matter, but I used the lonely hdmi on the back, hdmi4. In the nvidia panel I have 3840x2160 selected under the "PC" section, "use default color settings".
 

I think Denon's 2020 A/V receivers are the first to be announced/have specs revealed? It looks like they'll only have one hdmi port supporting 40Gb/s.

https://www.crutchfield.com/S-54nx2IhTbaH/learn/what-you-need-to-know-about-hdmi-arc-and-earc.html


https://www.cnet.com/news/hdmi-audio-return-channel-and-earc-for-beginners/



https://www.whathifi.com/us/advice/hdmi-arc-and-hdmi-earc-everything-you-need-to-know

----------------------------------------------------------------------------

If you have working eARC on a receiver, you shouldn't need to pass video through the receiver at all. Since ARC and eARC were released, they let you connect all of your devices directly to the TV and use the TV's eARC output to pass audio to the receiver, almost as if the receiver were a set of speakers hooked up to the TV. This is way better because you don't need all that video bandwidth on the receiver, just enough for uncompressed audio formats. Passing video through a receiver tends to add input lag too, sometimes badly enough to cause lip-sync issues. ARC/eARC also lets you swap between the TV's inputs, each with its own stored settings, using a single TV remote, instead of having to re-adjust the settings on the TV's single input every time you swapped between the multiple inputs on the receiver.

FYI: Unlike the C9, E9, etc., the CX doesn't support pass-through of DTS audio formats via eARC (DTS-HD, DTS-X, DTS-MA). However, if you are running software or a device that can convert DTS-HD/DTS-X to PCM 5.1/7.1, it will work fine. If you are running a Blu-ray player that can't convert it, then on older Blu-rays and other DTS content you would be out of luck, but if you are using a PC media player like MPC-HC, or a smart device like an nvidia Shield or similar, you should be able to set the app or device up to convert the audio on the fly. All of the 2019 and 2020 LG OLEDs support Dolby TrueHD (lossless) and Dolby Atmos (which is Dolby TrueHD with Atmos encoding piggybacking on it). Most modern UHD discs support Atmos, so even if you don't have Atmos speakers you'll be getting Dolby TrueHD on modern titles. (If you are doing rips or whatever, it is up to the ripper whether they want to spend the file space and time on uncompressed audio tracks... you could also rip DTS audio Blu-rays and convert the DTS tracks to PCM using ripping software if necessary.) Netflix also has Dolby Atmos-enabled titles, but even though they have multi-speaker encoding, the audio will be much more compressed via streaming.
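
As an illustration of the "convert DTS to PCM ahead of time" route for a rip, here is a minimal sketch that shells out to ffmpeg (assuming ffmpeg is installed and on the PATH; the filenames are made up, and real-time conversion in MPC-HC or on a Shield is just a player setting, not this):

Code:
import subprocess

def dts_to_pcm(src: str, dst: str) -> None:
    """Remux a file, copying video/subtitles untouched and re-encoding the
    audio track(s) to multichannel 24-bit PCM, so a TV without DTS
    pass-through can still hand lossless audio to the receiver over eARC."""
    subprocess.run([
        "ffmpeg", "-i", src,
        "-map", "0",          # keep all streams
        "-c", "copy",         # copy everything as-is...
        "-c:a", "pcm_s24le",  # ...except audio, which becomes 24-bit PCM
        dst,
    ], check=True)

# Hypothetical filenames, just for illustration
dts_to_pcm("movie_dts_hd.mkv", "movie_pcm.mkv")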

from Rtings.com LG CX review

https://decider.com/2019/05/08/netflix-audio-quality/
" Was it as good as the disc? No, but it was better than I expected. The Netflix audio wasn’t quite as sharp or defined as the Blu-ray disc. While the new Netflix audio can match DVDs in terms of audio quality, Blu-ray discs use better audio—like DTS Surround Sound and Dolby TrueHD—which lose a lot less sound information than regular DVDs. "


Also note that Denon receivers in particular have had HDMI ARC issues with LG TVs. That sucks because I like Denon receivers and have a 7.1 setup with ARC (not eARC) in my living room. It probably won't be an issue on newer Denon receivers that have HDMI 2.1 eARC support, but I would definitely look into it before buying a Denon for an LG TV, just to make sure. Newer receiver models should fully support the audio return channel IN from a 4k HDMI 2.1 TV even if the receiver can't pass 4k 120Hz 4:4:4 10-bit OUT. You don't have to waste money on a receiver that has 4k 120Hz 4:4:4 10-bit video passthrough/output; as long as it supports the HDMI 2.1 features you need for eARC audio IN, you should be good.
 
Going to share a post I made at AVSForum as it might be interesting to some people in here as well:

"I compared the 55" Q95T against the 55" OLED GX at home and had expected the Q95T to be much brighter but to my great surprise it wasn't. I compared side by side with built in YouTube and NetFlix apps on both TVs as well as with a PC connected (HDR and non HDR). Did the comparison in both a bright room and a pitch black one but still was not able to see an obvious difference in that the Samsung was obviously brighter. Even when putting it in Dynamic mode. Of course I triple checked to see that there where no auto brightness, eco solution or similar running. Also noticed that the Samsung had really aggressive ABL sometimes, like when displaying a single bright object on a black background which made it look quite dim compared to the OLED. Would assume that is to preserve black levels but still. I should add that I only had my eyes to measure things, and that the Samsung probably was a bit brighter sometimes, but I really had to look for it to see the difference and I doubt a more casual viewer would have noticed it."

There seem to be some reviews which mention the same things, like these:

 
Going to share a post I made at AVSForum as it might be interesting to some people in here as well:

"I compared the 55" Q95T against the 55" OLED GX at home and had expected the Q95T to be much brighter but to my great surprise it wasn't. I compared side by side with built in YouTube and NetFlix apps on both TVs as well as with a PC connected (HDR and non HDR). Did the comparison in both a bright room and a pitch black one but still was not able to see an obvious difference in that the Samsung was obviously brighter. Even when putting it in Dynamic mode. Of course I triple checked to see that there where no auto brightness, eco solution or similar running. Also noticed that the Samsung had really aggressive ABL sometimes, like when displaying a single bright object on a black background which made it look quite dim compared to the OLED. Would assume that is to preserve black levels but still. I should add that I only had my eyes to measure things, and that the Samsung probably was a bit brighter sometimes, but I really had to look for it to see the difference and I doubt a more casual viewer would have noticed it."

There seem to be some reviews which mention the same things, like these:



I would suspect that without measuring equipment, being able to tell what the brightness difference is might be more difficult at the very brightest end of the scale. Whenever I have looked at QLED vs OLED in stores here, the main difference to me has been the lack of any halos on the OLED in HDR demo videos that specifically show high-contrast content on dark backgrounds, like this one. Now, these are of course TVs set to store demonstration modes under bright ceiling lighting.

Here at home, looking at my Samsung KS8000 vs LG C9, the C9 just has a lot more depth to its image, regardless of HDR vs SDR.
 
I would suspect that without measuring equipment, being able to tell what the brightness difference is might be more difficult at the very brightest end of the scale. Whenever I have looked at QLED vs OLED in stores here, the main difference to me has been the lack of any halos on the OLED in HDR demo videos that specifically show high-contrast content on dark backgrounds, like this one. Now, these are of course TVs set to store demonstration modes under bright ceiling lighting.

Here at home, looking at my Samsung KS8000 vs LG C9, the C9 just has a lot more depth to its image, regardless of HDR vs SDR.

Yes, I agree with that. But on the other hand, if you as a viewer can't see the difference, you could argue whether it really exists. The interesting thing is that there used to be a clear difference a few years ago, where the QLEDs (LCDs) were actually noticeably brighter, but today that does not seem to be as obvious anymore. If QLED loses that advantage, what advantages remain over OLED besides burn-in potential, which seems to be mostly a non-issue these days?
 
Yes, I agree with that. But on the other hand, if you as a viewer can't see the difference, you could argue whether it really exists. The interesting thing is that there used to be a clear difference a few years ago, where the QLEDs (LCDs) were actually noticeably brighter, but today that does not seem to be as obvious anymore. If QLED loses that advantage, what advantages remain over OLED besides burn-in potential, which seems to be mostly a non-issue these days?

I would say a lot of things with audio and visuals require experience looking at that sort of thing to notice. Your eyes and ears become more attuned to those things when you know what to look or listen for. But if you can't tell a difference, then you are right, it might as well not exist.

If you use an OLED like a TV, meaning it's maybe on in the evening for X hours displaying varying content ranging from movies and TV to games, then you will most likely not have any burn-in issues to deal with. Desktop display use is a bit different, as most of us aren't constantly moving windows around, but I imagine if you just used a couple of virtual desktops for different content, that would cause enough variation not to worry about it too much. Desktop use will generally run at far dimmer settings, which is why I expect that, for example, the Alienware 55" OLED is fine for that use.
 
LG has started up their first OLED factory in China, which produces 48" panels at large volumes. I expect to see 48" in the G series next year, potentially with a lower price tag to boot.

Thanks for the info, sir. Have you heard anything about smaller sizes? I heard something about being able to use "half a diagonal" for manufacturing, which would mean that 48" TVs are produced from leftover 96" 8K panels? If that's the case, what's to stop LG from halving an 88" to get to 44", or a 77" to get to 38.5", etc.? I still don't get the reluctance. I understand monitor-loving people are still a smaller market, but if I were the company I'd want to test the waters, as you could possibly dominate the market with those kinds of sizes (particularly with gaming continuing to grow, and with the Acer/ASUS alternatives being too expensive for what they offer on their non-OLED panels).
 
Going to share a post I made at AVSForum as it might be interesting to some people in here as well:

"I compared the 55" Q95T against the 55" OLED GX at home and had expected the Q95T to be much brighter but to my great surprise it wasn't. I compared side by side with built in YouTube and NetFlix apps on both TVs as well as with a PC connected (HDR and non HDR). Did the comparison in both a bright room and a pitch black one but still was not able to see an obvious difference in that the Samsung was obviously brighter. Even when putting it in Dynamic mode. Of course I triple checked to see that there where no auto brightness, eco solution or similar running. Also noticed that the Samsung had really aggressive ABL sometimes, like when displaying a single bright object on a black background which made it look quite dim compared to the OLED. Would assume that is to preserve black levels but still. I should add that I only had my eyes to measure things, and that the Samsung probably was a bit brighter sometimes, but I really had to look for it to see the difference and I doubt a more casual viewer would have noticed it."

There seem to be some reviews which mention the same things, like these:



If both of the TVs are calibrated correctly, the LCD is only going to be brighter in very bright HDR scenes mastered beyond the peak brightness OLED is capable of. And it's only going to be brighter if the entire scene is brighter. Most of the very bright stuff in HDR takes up a tiny portion of the screen, and because of LCD contrast limitations it's going to look better on OLED.
 
Until LCDs start using micro/nano emitters or start developing consumer oriented dual layer LCD tvs, the quote below will still apply.


Quote from the above HDTVTest C9 OLED review video:
"The peak brightness on paper of these FALD TVs operate in zones, rather than per pixel. So let's say if you ask an FALD LED LCD to light up a very small area with specular highlights, for example, reflections on a vehicle or fireworks against a night sky - these small bright elements are not going to come close to the measured peak brightness of 1000nits or 1500 nits simply because the FALD LED LCD TV has to balance backlight illumination between adjacent zones or else there will be too much blooming which will wash out the picture even further. This the reason why in many side by side comparisons that I, and maybe some of you, have seen - specular highlight detail on a 700nit OLED can often look brighter than even a 1600nit FALD LED LCD. When you add the total absence of blooming and glowing in dark scenes, again because of pixel level illumination, it's not surprise
that it is an OLED and not an LED LCD that has been voted as the best HDR TV
by the public in our HDR shootout for two years in a row. Nevertheless I still have a soft spot for high performing LED LCDs that can present high APLscenes with more HDR impact than OLEDs which are restricted by ABL or auto brightness limiter which caps full field brightness at 150 nits for heat and power management. "

That doesn't mean that much more realistic, brighter colored highlights aren't better (they certainly are) - it means that due to the limited nature of large-zone FALD arrays, they aren't really capable of displaying extremely bright colors side by side with contrasted darker areas. An extremely expensive dual layer LCD reference monitor, however, could do at least 1000 nits side by side with contrasted areas, without worry about burn-in, and probably with larger % windows and longer brightness durations. That kind of tech hasn't made it to consumers yet for the most part (there was one TV in China, I think), and it has room to improve on thickness and cooling methods vs heat, power usage, etc.

I'll also mention, in regard to the calibration comment, that due to the nature of the white subpixel, colors probably aren't 100% accurate anymore at extreme brightness on OLEDs. I've heard it mentioned that they are in fact impossible to calibrate accurately in HDR for this reason (along with the different % window drop-offs and of course having to deal with ABL).

LCDs, if it weren't for the low-density FALD limitation on side-by-side contrast, would in fact be brighter than OLEDs in highlights even when the entire scene isn't brighter... but the FALD offsetting the brightness toward the nearest contrasted/darker area tempers that LCD brightness a lot, as mentioned in the quote. However, I still have to clarify that if you compared an OLED with 800-nit peaks at small % windows to an older OLED with, say, a 500-nit peak at the same %, you would indeed see much brighter colors in the highlights even in scenes where the entire screen isn't brighter. Just look at some of the YouTube videos or screenshots of the color-mapped HDR movies and games.

Yes, OLED can't hit the 1000 to 1500 nits of FALD LCDs, but I've learned more about OLED over the years from experts:


Quote from the above HDTVTest C9 OLED review video:
"The peak brightness on paper of these FALD TVs operate in zones, rather than per pixel. So let's say if you ask an FALD LED LCD to light up a very small area with specular highlights, for example, reflections on a vehicle or fireworks against a night sky - these small bright elements are not going to come close to the measured peak brightness of 1000nits or 1500 nits simply because the FALD LED LCD TV has to balance backlight illumination between adjacent zones or else there will be too much blooming which will wash out the picture even further. This the reason why in many side by side comparisons that I, and maybe some of you, have seen - specular highlight detail on a 700nit OLED can often look brighter than even a 1600nit FALD LED LCD. When you add the total absence of blooming and glowing in dark scenes, again because of pixel level illumination, it's not surprise
that it is an OLED and not an LED LCD that has been voted as the best HDR TV by the public in our HDR shootout for two years in a row. Nevertheless I still have a soft spot for high performing LED LCDs that can present high APLscenes with more HDR impact than OLEDs which are restricted by ABL or auto brightness limiter which caps full field brightness at 150 nits for heat and power management. "

So you pick your tradeoffs.

HDR on PC sounds like a mess and hit-and-miss at the moment, but HDR itself is definitely the future and well worth it if you can get it on some material. Like I said, I'll be using my OLED as a media stage for both PC games and movies, and also PS4/PS5, so wherever I can get HDR I'll use it. There will always be plenty of SDR material, just like there is still a ton of 1080p material.

----

Yes, movies can be mastered at up to 10,000 nits, but UHD discs are mostly mastered at HDR 1000, with some HDR 4000 and only a few HDR 10,000 so far, since there isn't any consumer hardware capable of showing it yet.

I'd think mapping HDR 10,000 to 1,000 would be easier. Mapping 10,000 or 1,000 to 750 - 800 nit sounds like it would be trickier. I'll have to read up on that more.

Going from 150 - 200 - 300 nit of color highlights, detail in color in bright areas, and from colored light sources in scenes to 500, 600, 700, and 800 nit on an OLED would be well worth it to me.

HDR is attempting to show more realism in scenes. We are still nowhere near reality at 10,000nit HDR but it is much more realistic than SDR. The Dolby public test I quoted was using 20,000nits.

https://www.theverge.com/2014/1/6/5276934/dolby-vision-the-future-of-tv-is-really-really-bright
"The problem is that the human eye is used to seeing a much wider range in real life. The sun at noon is about 1.6 billion nits, for example, while starlight comes in at a mere .0001 nits; the highlights of sun reflecting off a car can be hundreds of times brighter than the vehicle’s hood. The human eye can see it all, but when using contemporary technology that same range of brightness can’t be accurately reproduced. You can have rich details in the blacks or the highlights, but not both.

So Dolby basically threw current reference standards away. "Our scientists and engineers said, ‘Okay, what if we don’t have the shackles of technology that’s not going to be here [in the future],’" Griffis says, "and we could design for the real target — which is the human eye?" To start the process, the company took a theatrical digital cinema projector and focused the entire image down onto a 21-inch LCD panel, turning it into a jury-rigged display made up of 20,000 nit pixels. Subjects were shown a series of pictures with highlights like the sun, and then given the option to toggle between varying levels of brightness. Dolby found that users wanted those highlights to be many hundreds of times brighter than what normal TVs can offer: the more like real life, the better."


------------------------------------

HDR Color temperature map based on HDR 10,000 in a game.
You can see that the brighter parts of the scene are mostly 100nit, 150nit until you get to the sun (10,000nit), the sun's corona and the glint of the sun off of the rifle's scope and barrel which are in the 1,000nit range and a little higher on the brightest part of the scope's glint to 4000nit. I'd expect everything above the SDR range in the scene would be tone mapped down in relation to the peak colors (peak color brightnesses or luminances) the OLED or FALD LCD are capable of. That is, rather than scaling the whole scene down in direct ratio and making the SDR range darker which would look horrible.


--------------------------------------
 
If both of the TVs are calibrated correctly, the LCD is only going to be brighter in very bright HDR scenes mastered beyond the peak brightness OLED is capable of. And it's only going to be brighter if the entire scene is brighter. Most of the very bright stuff in HDR takes up a tiny portion of the screen, and because of LCD contrast limitations it's going to look better on OLED.

The weird part is that even when used as a PC monitor for work (though not with all-white background windows, as that would trigger ABL), I found the OLED to still be almost as bright. And I am quite certain that I did compare with the correct settings, i.e. no automatically adjusted brightness etc. But I guess OLEDs have become a bit brighter every year while many LCDs seem to have gone the other way for some reason; maybe it has to do with the ultrawide viewing angle filter.
 
https://www.rtings.com/tv/tools/compare/samsung-q90r-vs-lg-cx/779/10619?usage=11114&threshold=0.1


Just keep in mind that a FALD density nowhere near the pixel level is not going to be able to show those kinds of color brightnesses at full blast side by side with darkly contrasted areas. It would look horrible if it even tried to. So the gap isn't quite as large as those numbers make it seem, due to the offsetting of FALD zone brightnesses.

There are also tradeoffs on both sides, like bloom/dim "halos" with FALD that lose detail-in-color or detail-in-blacks and are obvious to the eye, while OLED has much lower % windows of color brightness as more of the screen hits higher color brightnesses, so it will lose some detail in colors in HDR - as well as the ABL reflex kicking brightness all the way down to 140 nits at times (e.g. a 100% window).

Most of the highest-brightness HDR content is highlights in dynamic scenes, but there can also be very bright light sources in a scene (the sun, the energy from a sci-fi ship's propulsion system or a reactor, etc.). Those are usually moving as well, though. ABL is worst when you are moving closer to a light source so that its bright area gradually fills more of the screen, as if you are zooming in on it, to the point where the ABL kicks in (and off, depending on how you or the camera are moving). From what I've seen it's a worthwhile trade-off... almost like you are wearing instantaneous photochromic prescription lenses (aka "transition" lenses, the kind that darken). It's not optimal, but if it prevents permanent burn-in it's a welcome feature and trade-off to me until something better comes along.
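
To make the ABL behaviour concrete, here is a toy curve interpolated from the round numbers floating around this thread (~800 nits at a small window, ~150 nits full field); real firmware behaviour is measured per model by sites like Rtings, so treat this as illustration only:

Code:
import math

def oled_peak_nits(window_pct: float,
                   small=(10.0, 800.0), full=(100.0, 150.0)) -> float:
    """Very rough ABL model: log-interpolate peak brightness between a
    small-window point and the full-field point."""
    if window_pct <= small[0]:
        return small[1]
    t = math.log(window_pct / small[0]) / math.log(full[0] / small[0])
    return small[1] + t * (full[1] - small[1])

for w in (2, 10, 25, 50, 100):
    print(f"{w:>3}% window -> ~{oled_peak_nits(w):.0f} nits")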

I'm set on getting a 48" or 55" OLED for my PC room by the end of the year, but I'll probably not graduate to an OLED in my living room due to direct sunlight/heat and static imagery concerns. Hopefully, by the time I want to upgrade my living room TV in the next few years, there will be some better LCD tech.
 
I don't know what it is with me, but I found my 27uq to be vastly superior to my C9 or recent Q90R when it comes to HDR. To me the HDR impact wasn't even close. I'm wondering if it has something to do with the smaller screen. It just seems much brighter with HDR highlights.
 
I don't know what it is with me, but I found my 27uq to be vastly superior to my C9 or recent Q90R when it comes to HDR. To me the HDR impact wasn't even close. I'm wondering if it has something to do with the smaller screen. It just seems much brighter with HDR highlights.

The pg27uq has higher peak brightness in an even smaller screen with 384 dimming zones. The only area where the CX 48 will definitively win is on black levels. The pg27uq is one of the few monitors out there that can really go toe-to-toe with the CX 48". Considering that it cost $2000 at launch (and $1200 now), I would expect nothing less.
 
I don't know what it is with me, but I found my 27uq to be vastly superior to my C9 or recent Q90R when it comes to HDR. To me the HDR impact wasn't even close. I'm wondering if it has something to do with the smaller screen. It just seems much brighter with HDR highlights.

Only in bright scenes. I have the Acer X27, basically Acer's version of a PG27UQ, and it's absolutely terrible in dark scenes due to FALD blooming.
 
I don't know what it is with me, but I found my 27uq to be vastly superior to my C9 or recent Q90R when it comes to HDR. To me the HDR impact wasn't even close. I'm wondering if it has something to do with the smaller screen. It just seems much brighter with HDR highlights.
OLEDs give pixel-level highlights, meaning the small bright details in a scene are far more accurate than on a FALD, which is lighting up an area that can be up to 100 times bigger than the area that is actually supposed to be bright. That bloom makes you think it's better because of the higher nit level being emitted, while it's actually far less accurate. That is not to say anything about the difference in max brightness, but the pixel accuracy on the OLED provides a higher level of contrast and a better overall picture.
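
To put a rough number on how coarse the zones are (my own arithmetic, assuming a 384-zone panel like the PG27UQ/X27 class mentioned above and a 4K pixel grid):

Code:
H_RES, V_RES = 3840, 2160
ZONES = 384  # dimming zone count of the PG27UQ / Acer X27 class of FALD panels

pixels_per_zone = (H_RES * V_RES) // ZONES
print(f"{H_RES * V_RES:,} pixels / {ZONES} zones = {pixels_per_zone:,} pixels per zone")
# A one-pixel specular highlight still forces a zone of ~21,600 pixels to
# brighten, which is where the blooming around small highlights comes from.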
 
I think the PG27UQ and its equivalents really are better for HDR in a bright daylight room; not only are they brighter, but moderately bright ambient light also hides the halos really well (and there are no worries about burn-in, pixel-shifting annoyances, etc.). But the OLED in a room with no direct light sources is so good. You just don't need more brightness for HDR than the OLED can produce in a light-controlled room.
 