LG 48CX

HDTVtest review of the CX.



Edit: I thought the RTINGS review would have already been posted here; guess not.




We have the same tv. Sony x800d 43" here, using it as my current monitor. What's it like to make such great choices in life and have such good taste?
 
-----------------------------------

VRR Black Levels - still lose ultra blacks
===================================

It seems to be a problem with the TV changing gamma when enabling VRR: https://www.notebookcheck.net/LG-s-...-worldwide-availability-in-June.466905.0.html

"Additionally, it looks like the current firmware has a few bugs such as VRR mode causing default gamma to be brighter above black "

And it's already been mentioned in a twitter post that it can be calibrated out. So either you'd have to fiddle with the TV settings to get the gamma correct again with VRR enabled or simply do a calibration to correct the gamma level. Either way it's not a big deal and can be fixed. Don't see why anyone would choose to turn Gsync off rather than just calibrating the TV. Yeah I get that LG should fix it themselves, not saying they shouldn't. But until they do, the USER can fix it themselves at least.
 
Informative videos, thanks.

Long wait for me but time flies
==========================
I'll be waiting for the November deals and the 3000 series GPUs to see if there is HDMI 2.1 and 10-bit 120Hz 4K 444 before I get a 55" C9 or 48" CX, but it's good to hear that more questions and issues are getting raised and answered, and that at least some of them are more or less resolved. I'm looking forward to more feedback from owners in this and other threads.

I was thinking along a similar wavelength. I REALLY want this now. But wouldn't the GX series be a better choice for gaming, considering it's a slimmer design? That way it takes up less space. It would be nice if all of these models could have those reverse cameras so it doesn't look like a screen is there.

I want G-Sync at 120 fps at 4K. My problem with my Samsung JS 55/65" is that the input lag is not as good for games.

It would be great to have an option where you don't need to choose between a 55" all-in-one 4K and a 27" 1440p/1080p 240Hz panel.

I found myself seriously considering a new Dell for FPS games and thought to myself, "I am not a pro gamer... never will be. However... do I really want this experience of kicking ass more?" The answer was yes, but who wants to give up screen real estate?

 


GX is the same screen for a lot more money, just a different form factor focused on mounting. They did add one of those new digital tuners, but you could easily get an outside device to offer that.

IMO the only reason to wait is not to get a GX, it's to get a CX at bigger discounts later on.
 

And to make sure that Nvidia's 7nm 3000 series both has HDMI 2.1 and supports 10-bit 120Hz VRR HDR 4K 444 chroma, since the CX can't do 12-bit at 4K120 as it has 40Gbps HDMI 2.1 instead of 48Gbps. Most are assuming both HDMI 2.1 and 10-bit, but until we see it, it's an assumption. Better deals too though, yes.
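For anyone curious where those numbers come from, here's a rough back-of-envelope sketch. It assumes the standard CTA-861 4K120 timing (4400x2250 total pixels including blanking) and ignores FRL encoding and audio overhead, so treat the figures as approximate rather than exact link budgets:

```python
# Rough check of why a 40 Gbps link is enough for 10-bit 4K120 4:4:4 but not
# for 12-bit. Assumes the CTA-861 4K120 timing (4400 x 2250 total including
# blanking, ~1.188 GHz pixel clock) and ignores FRL/audio overhead.

PIXEL_CLOCK_HZ = 4400 * 2250 * 120   # 3840x2160 @ 120 Hz with blanking

def required_gbps(bits_per_component: int, components: int = 3) -> float:
    """Uncompressed video data rate in Gbit/s for RGB / 4:4:4."""
    return PIXEL_CLOCK_HZ * bits_per_component * components / 1e9

for depth in (8, 10, 12):
    print(f"{depth:2d}-bit 4K120 4:4:4 ~ {required_gbps(depth):.1f} Gbps")

# ~28.5 Gbps at 8-bit, ~35.6 Gbps at 10-bit (fits a 40 Gbps port),
# ~42.8 Gbps at 12-bit (needs the full 48 Gbps port).
```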
 
We have the same tv. Sony x800d 43" here, using it as my current monitor. What's it like to make such great choices in life and have such good taste?

A C9 or E9, or the way it's looking, depending on Nvidia's hand shown later, a 48" CX for me... but in the following year(s), 2021+, whenever I get to that point anyway, I'm considering a FALD LCD or miniLED LCD, etc. for my living room instead of an OLED, because the back of my TV faces a picture window which gets intense sunlight. My current very low density FALD VA TV gets so hot I can hear the plastic housing shift in the evening. That would be a bad scenario for the organics in an OLED. Being a household TV, I also can't rely on the content shifting enough, since I'm not the only one using it.

An HDR OLED would be great for night viewing of movies or with the blinds pulled, but in a full-daylight room with the aforementioned risk factors I don't think I could risk a $3000+ TV. I could flip the TV and the couch to different sides, but then the window would be a direct light source bouncing off the face of the TV, which would look terrible.

I'm sort of being forced into 75" too, since they don't make any 70" ones and I don't want to downgrade from my 70" VA to 65", so it will have to be the right time before I drop that kind of money again. The 70" price range would be a lot more digestible; it's a big leap to 75" price-wise. I'm still pretty happy with the 70" low density FALD VA for now, and I'm directing my electronics budget for the next year+ more toward an LG OLED for my PC and a likely expensive 3080 Ti SC (possibly a hybrid). I just thought I'd mention the living room TV and my considering not getting an OLED in that room, since people are talking about their Samsungs and Sonys.
 
GX is the same screen for a lot more money, just a different form factor focused on mounting. They did add one of those new digital tuners, but you could easily get an outside device to offer that.

IMO the only reason to wait is not to get a GX, it's to get a CX at bigger discounts later on.

Can recommend these two videos on the GX:




That said, I have the 55" GX to be used as a PC monitor, and I can't really think of another reason to get it besides the design (which on the other hand is why I got it over the CX but got a quite good price on it as well).
 
It seems to be a problem with the TV changing gamma when enabling VRR: https://www.notebookcheck.net/LG-s-...-worldwide-availability-in-June.466905.0.html

"Additionally, it looks like the current firmware has a few bugs such as VRR mode causing default gamma to be brighter above black "

And it's already been mentioned in a twitter post that it can be calibrated out. So either you'd have to fiddle with the TV settings to get the gamma correct again with VRR enabled or simply do a calibration to correct the gamma level. Either way it's not a big deal and can be fixed. Don't see why anyone would choose to turn Gsync off rather than just calibrating the TV. Yeah I get that LG should fix it themselves, not saying they shouldn't. But until they do, the USER can fix it themselves at least.

Hopefully you can edit the settings for VRR game mode and it will remember it without altering the gamma/settings on any other modes. That is, once you edit it, it will drop the edit when leaving VRR Game mode and pick it back up when you enter VRR Game mode again every time. Otherwise that is pretty annoying and not very useful to me... not acceptable as a fix.
 

I noticed another thing.

[attached screenshot: VRR gamma measurement]

2.4 gamma? Isn't this the wrong target for a PC monitor? IIRC 2.2 gamma is the correct target for monitors. That "bug" actually makes the near-black gamma closer to 2.2, which is supposed to be the correct gamma, so I want to see how the TV behaves when it's set to target 2.2 gamma off the bat and not 2.4. Will it lower the near-black gamma to somewhere around 2.0, or will it not affect it at all? Also, based on my B7 OLED, you can change whatever settings you want in Game Mode and it won't affect other modes like Cinema... so yes, just correct your gamma in game mode with VRR on and it should leave everything else like cinema mode alone. If you want separate gamma for game mode with VRR on and game mode with VRR off then you may be out of luck; I'll have to double-check whether you can save custom presets, but I don't see why you would need that when next-gen consoles will also support VRR anyway, so you can just have a single game mode with VRR always on with corrected gamma and call it a day. And since this affects the C9 too, buying a C9 won't save you from this either. Just fix it yourself until LG fixes it officially.
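To put a number on why a 2.4 target renders shadows darker than 2.2, here's a quick power-law comparison. This is just the idealized gamma math, not anything specific to LG's firmware or to the VRR behavior:

```python
# Idealized power-law gamma: how much darker a given signal level renders at
# a 2.4 target versus 2.2. Near black the relative gap is largest, which is
# why a gamma shift that only shows up just above black is so visible.

def relative_luminance(signal: float, gamma: float) -> float:
    """Map a normalized video level (0..1) to relative display luminance."""
    return signal ** gamma

for level in (0.05, 0.10, 0.20, 0.50):
    g22 = relative_luminance(level, 2.2)
    g24 = relative_luminance(level, 2.4)
    print(f"signal {level:.2f}: gamma 2.2 -> {g22:.4f}, gamma 2.4 -> {g24:.4f} "
          f"({100 * (1 - g24 / g22):.0f}% dimmer)")

# A 5% grey ends up roughly 45% dimmer at gamma 2.4 than at 2.2, while a 50%
# grey is only about 13% dimmer.
```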
 
GX is the same screen for a lot more money, just a different form factor focused on mounting. They did add one of those new digital tuners, but you could easily get an outside device to offer that.

IMO the only reason to wait is not to get a GX, it's to get a CX at bigger discounts later on.
Thank you so much for letting me know that! I personally mount the TV with a flexible mount that allows me to move it, so the stand was not important for me.

I guess I just wanted the smallest profile and looked at the GX / WX as smaller, but those models also seem to weigh more and come with speakers, which... who needs those if you are PC gaming with a Creative Sonic Carrier, or have a huge entertainment system?

Too bad they don't make the WX in 48/55"!


And to make sure that Nvidia's 7nm 3000 series both has HDMI 2.1 and supports 10-bit 120Hz VRR HDR 4K 444 chroma, since the CX can't do 12-bit at 4K120 as it has 40Gbps HDMI 2.1 instead of 48Gbps. Most are assuming both HDMI 2.1 and 10-bit, but until we see it, it's an assumption. Better deals too though, yes.
Very valid point, I did not even consider that! Does this mean we cannot take advantage of G-sync on our 2080Ti's as well? This makes me realize the upgrade path is real:
3080/ti + CX easy 3k spent without even batting an eye. Thanks for putting that into perspective.


Can recommend these two videos on the GX:
That said, I have the 55" GX to be used as a PC monitor, and I can't really think of another reason to get it besides the design (which on the other hand is why I got it over the CX but got a quite good price on it as well).

I will check those out! How are you liking your 55" GX? Mind sharing your deal here or privately? I was originally leaning GX due to how little space it takes, yet I see it weighs a lot more, and my OmniMount / Ergotron may not be able to handle any more weight with the Sonic Carrier attached. I'd have considered the WX, but $5k... eh?
 

Well, run a CX over with a steamroller and add a no-gap wall mount and you have the GX. That's about it :)
 
I noticed another thing.

View attachment 248615

2.4 gamma? Isn't this the wrong target for a PC monitor? IIRC 2.2 gamma is the correct target for monitors. That "bug" actually makes the near-black gamma closer to 2.2, which is supposed to be the correct gamma, so I want to see how the TV behaves when it's set to target 2.2 gamma off the bat and not 2.4. Will it lower the near-black gamma to somewhere around 2.0, or will it not affect it at all? Also, based on my B7 OLED, you can change whatever settings you want in Game Mode and it won't affect other modes like Cinema... so yes, just correct your gamma in game mode with VRR on and it should leave everything else like cinema mode alone. If you want separate gamma for game mode with VRR on and game mode with VRR off then you may be out of luck; I'll have to double-check whether you can save custom presets, but I don't see why you would need that when next-gen consoles will also support VRR anyway, so you can just have a single game mode with VRR always on with corrected gamma and call it a day. And since this affects the C9 too, buying a C9 won't save you from this either. Just fix it yourself until LG fixes it officially.
Seems that with HDR the recommendation has moved to a target of 2.4, particularly with OLED displays. 2.2 is still recommended with LCD displays.

Interesting article on the subject:
https://www.jigsaw24.com/articles/whats-the-best-setting-for-gamma-with-rec709-video
 

Interesting information. Again, I believe we can just correct the gamma through calibration, so this still shouldn't be an issue. Doesn't the CX now support hardware calibration with CalMAN? I have an X-Rite i1 Display Pro, so hopefully that'll work; if not, I guess I can always resort to making an ICC profile in Windows, but that's not as optimal.
 
When it comes to games, most still ignore your software ICC profile, so hardware calibration is still the way to go.
 
On the console front, hopefully ALLM automatically switching to game mode doesn't alter any other settings after you've already set them up manually. The info I've seen seems to say it just automatically switches the TV (and in some cases a receiver too, if the video is passed through one) to Game mode when it detects the game console input as active, and back out of Game mode once you switch to a non-console device on another input.

-----

The RTings reviews of the C9, the E9, and the CX quote these values:

C9
[C9 gamma plot]
"Overall, it follows the gamma target well, but some near-black details are crushed. This can be seen in the spike at the beginning of this higher resolution gamma plot. Increasing the Brightness setting does help compensate for this a bit, but doesn't completely correct it. "

"After calibration, this TV has nearly perfect accuracy. The white balance dE and color accuracy are both extremely good, and any remaining inaccuracies are completely unnoticeable.

The TV features an auto-calibration feature. This feature still requires a licensed copy of CalMAN, and a colorimeter."



-------------------------------------------

E9
[E9 gamma plot]

"...inaccuracies in the pure whites as the color temperature is warm, giving off a yellowish tint. The gamma follows the target well, except in some very bright scenes that appear brighter than they should "

"After calibration, the LG OLED E9 has almost perfect accuracy. Any remaining inaccuracies are very hard to notice without the use of a colorimeter.

Just like the C9, this TV features an auto-calibration feature. This feature still requires a licensed copy of CalMAN and a colorimeter."



--------------------------------------

CX
[CX gamma plot]

"Overall, the gamma follows the target fairly well, but dark scenes are too dark and some bright scenes are too bright."

"After calibration, the color accuracy is nearly perfect. The white balance dE is nearly zero, so shades of gray are displayed almost perfectly. Any color inaccuracies aren't visible without the aid of a colorimeter.

The C9 has an auto-calibration feature, but the CX doesn't have one yet.
 
Seems that with HDR the recommendation has moved to a target of 2.4, particularly with OLED displays. 2.2 is still recommended with LCD displays.

Interesting article on the subject:
https://www.jigsaw24.com/articles/whats-the-best-setting-for-gamma-with-rec709-video


Very interesting, thanks.

Some excerpts:

"However, a straight power curve of 2.4 is correct only if the display has a zero black level and an infinite contrast ratio, which no real-world display has. "

"So, it seems the choice is this –
2.4 gamma in well-controlled grading/mastering conditions (particularly if your monitor has good dynamic range; OLED or Dolby reference monitor) and 2.2 in brighter editing rooms with lower-end LCDs. "
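That first excerpt is essentially describing BT.1886, the Rec. 709 EOTF that bends the 2.4 power curve around the display's actual black level instead of assuming a perfect zero black. A minimal sketch of it, with illustrative example luminance values rather than measurements of any particular set:

```python
# ITU-R BT.1886 EOTF: a 2.4 power law anchored to the display's measured white
# and black luminance. The L_white/L_black defaults below are example numbers,
# not measurements of any specific TV.

def bt1886(V: float, L_white: float = 100.0, L_black: float = 0.1) -> float:
    """Map a normalized video level V (0..1) to luminance in cd/m^2."""
    gamma = 2.4
    a = (L_white ** (1 / gamma) - L_black ** (1 / gamma)) ** gamma
    b = L_black ** (1 / gamma) / (L_white ** (1 / gamma) - L_black ** (1 / gamma))
    return a * max(V + b, 0.0) ** gamma

# With a near-zero black (OLED) this collapses to a straight 2.4 power curve;
# with a raised black level (typical LCD) the shadow range gets lifted, which
# is why the article lands on ~2.2 for lower-end LCDs in brighter rooms.
for level in (0.0, 0.1, 0.5, 1.0):
    print(f"V={level:.1f}: OLED-ish {bt1886(level, L_black=0.0005):8.4f} cd/m^2, "
          f"LCD-ish {bt1886(level, L_black=0.1):8.4f} cd/m^2")
```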
 
I have a HUGE criticism of the CX: The ABL is WAAAAYYYYYYY too aggressive.

I just played the last 2 chapters of FF7 Remake, and every time it went to a bright or all-white screen, the image darkened considerably compared to normal highlights. In a vacuum I wouldn't care, but it's not in one; I have a Samsung Q90R that I've used throughout most of this game, and let me tell you, the brights/highlights are blindingly bright on that TV.

In most scenarios this issue does not present itself, so most people will not care. I will maintain that content with super fast motion looks considerably better on OLED. But for heavy HDR content? I think I like the Q90R better, not because of the darks, but because of the brights. The Q90R just punches so hard when it comes to highlights and super bright screens.
 
Yeah, brightness has always been a weakness of OLED. Personally I use my display in a really dark room, so I actually prefer the lower brightness of OLED compared to my Acer X27, which is capable of 1000-nit highlights. In a bright room no doubt the Acer will have that extra HDR punch, but in a really dark room it's just too much for me and the OLED ends up being easier on my eyes. Ambient lighting conditions really play a key role here.
 

Same here. If I crank my OLED up, it’s painful on the eyes...but I’m not in a super bright room, so it's a perfect setup really. The additional brightness of an LCD set would not really benefit me and...well there would be all of the downsides.
 
On the console front, hopefully ALLM automatically switching to game mode doesn't alter any other settings after you've already set them up manually. The info I've seen seems to say it just automatically switches the TV (and in some cases a receiver too, if the video is passed through one) to Game mode when it detects the game console input as active, and back out of Game mode once you switch to a non-console device on another input.

My understanding is that the Instant Game Response (ALLM) just turns it to low lag mode automatically on whatever input or picture preset you want rather than switching to the game mode preset. That's at least how it has been on my C9, e.g. I still have it on the ISF dark room preset on the input used by my PS4 Pro.

As for ABL, haven't noticed anything weird on C9 but personally I have not played FF7 Remake, just watched a lot when my girlfriend was playing it so I may have missed the scenes where ABL kicks in.
 
I have a HUGE criticism of the CX: The ABL is WAAAAYYYYYYY too aggressive.

I just played the last 2 chapters of FF7 Remake, and every time it went to a bright or all-white screen, the image darkened considerably compared to normal highlights. In a vacuum I wouldn't care, but it's not in one; I have a Samsung Q90R that I've used throughout most of this game, and let me tell you, the brights/highlights are blindingly bright on that TV.

It's either too minor for my eyes to see ABL/ASBL in games and movies, or it's just too minor on my C9 in general. I'm playing the FF7 Remake at the moment too, but I have never seen the picture darken considerably there, nor in any other game. For me the brightness stays the same all the time.
Even with 100 OLED light and contrast in SDR, I can't see any ABL.

I have some experience with QLED too. I owned a Samsung Q9FN in 2018 for a week and it was my most disappointing TV purchase ever.
The same issues I had with that TV are still present on the 2019 models, except that the 2018 had better local dimming in game mode:
1. PWM at 120Hz in game mode: slightly better motion clarity, but worse motion duplications (esp. in 30fps games), stroboscope-like effects occur, and the image in general felt "unsettled" to me.
2. Samsung's arbitrary subpixel control (at 20:19, go fullscreen): QLEDs lack clarity and definition. The WRGB subpixel structure is above everything in this category IMO.
3. HDR tends to be overbright: (5:35)

As for 1 and 2, these issues may be nearly invisible to someone else though, esp. at further viewing distances.
 
My understanding is that the Instant Game Response (ALLM) just turns it to low lag mode automatically on whatever input or picture preset you want rather than switching to the game mode preset. That's at least how it has been on my C9, e.g. I still have it on the ISF dark room preset on the input used by my PS4 Pro.

As for ABL, haven't noticed anything weird on C9 but personally I have not played FF7 Remake, just watched a lot when my girlfriend was playing it so I may have missed the scenes where ABL kicks in.

VRR Black Level Issue - Posterization surrounding white objects, HGiG questions vs Gamma adjustments
=====================================================================================

Sorry, what I meant when I said ALLM was really the HGiG (HDR Gaming Interest Group) feature, a setting under "Dynamic Tone Mapping" that has its own tone mapping so that tone mapping won't be done twice, on both the console and then the TV (as per the flatpanelshd.com review of the CX).

"Another development is HGiG Mode, which was actually added to 2019 LG OLED TVs via a firmware after launch. HGiG (HDR Gaming Interest Group) can be found as a setting option under 'Dynamic Tone Mapping'.

Enabling HGiG will ensure that tone-mapping adheres to the HGiG's specification for console games in HDR, meaning that tone-mapping will not be done twice (first on console and then on the TV).

HGiG is a way to make sure that you experience HDR console games from PlayStation and Xbox the way that the game creators had intended."

[screenshot: HGiG option under Dynamic Tone Mapping]

So I am hoping that adjusting the gamma in Game mode to compensate for the brighter/broken black depths when VRR is on won't get "overwritten" if HGiG is turned on. HGiG can be turned off, but I'd rather the black level issue were able to be turned off instead.

--------------------------------------------------------------------------------

I'm still not convinced that adjusting the gamma will fix the issue since in some reports the issue presents itself as gradients around white areas, almost like FALD blooming.

https://www.avsforum.com/forum/40-o...9-dedicated-gaming-thread-consoles-pc-31.html

Yesterday:
"I've actually just noticed it for the first time over the weekend, and it really stuck out and smacked me in the face. It really bugged me. So much so, that I disabled G-sync for the time being. The game was Ori and the Will of the Wisps. The gamma is all jacked up when G-S ync is on, highly visible in the map screen as posterization in the gradients surrounding anything white (like an icon, or the cursor) on the otherwise pure black screen. The same thing happens during gameplay in the Mouldwood Depths area in game that takes place in almost pure darkness; nasty gradients surrounding the illuminated areas. Pure black is pure black (no glow) but it's obvious there's something wonky going on.

--------------------------------------------------------------------------------

ABL, HDR screen brightness
=========================

Just a reminder that the majority of a scene stays in the 100-nit or lower range in HDR for most content. The super-bright HDR color ranges are reserved for highlights, detail in bright areas, and bright light sources, etc., dynamically within scenes.

https://www.resetera.com/threads/hdr-games-analysed.23587/
Quotes about Assassin's Creed Odyssey in particular:
"Ubisoft's recommendation for the paper white slider "adjust the value until the paper and hanging cloth in the image almost saturates to white". However like the brightness slider, this is here to allow you to adjust the output of the game to match your viewing conditions, if you are in a cinema like controlled light source environment technically you would set to 80nit, however as your surrounding light increases you will prefer higher values".

"Setting the games to the technical correct settings also highlights how
HDR10/Dolby vision is too dim for many consumers. You'll also see how developers are still getting to grips with these new technologies, you'll see that the HUD in AC becomes a little too dark as the main game image becomes calibrated correctly."

Final Fantasy XV from the resetera HDR games analysed link:
[Final Fantasy XV HDR analysis chart]



According to RTings CX review, the CX has aggressive ABL like the C9, E9.
"The CX has decent HDR peak brightness, enough to bring out highlights in HDR. There's quite a bit of variation when displaying different content, and it gets the least bright with large areas, which is caused by the aggressive ABL. "
That is how it is with HDR.
With SDR, there is a Peak Brightness Setting. Since it limits the peak brightness it doesn't seem compatible with HDR.
[screenshot: SDR Peak Brightness setting]

From the Rtings C9 Review, regarding SDR settings concerning ABL:
"If ABL bothers you, setting the contrast to '80' and setting Peak Brightness to 'Off' essentially disables ABL, but the peak brightness is quite a bit lower (246-258 cd/m² in all scenes)."

--------------------------------

Some HDR scenes could be all white, like the Red Dead snow scenes. Full-screen whites will probably look more dim ("greyish").
I don't know which scenes you were talking about in Final Fantasy VII Remake, but the % windows of the screen have limits:

100% window would kick down to 146nit (140nit sustained).
50% window 302nit (287nit)
25% window 456nit (433nit)
10% window 813nit (775nit)
2% window 799nit (752nit)

ABL is the major reason we aren't going to have a high risk of burn-in on these OLEDs, so it is the trade-off for HDR content.
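As a side note on how those nit levels map to the actual HDR10 signal, here's a minimal sketch of the SMPTE ST 2084 (PQ) inverse EOTF. This is just the published PQ math (full-range code values), not anything specific to the CX or to FF7 Remake:

```python
# SMPTE ST 2084 (PQ) inverse EOTF: nits -> normalized signal -> code value.
# Shows how little of the 10-bit range everyday scene levels actually use.

m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_code_value(nits: float, bit_depth: int = 10) -> int:
    """Nits -> nearest full-range PQ code value at the given bit depth."""
    y = (nits / 10000.0) ** m1
    v = ((c1 + c2 * y) / (1 + c3 * y)) ** m2   # normalized 0..1 signal
    return round(v * (2 ** bit_depth - 1))

for nits in (1, 100, 400, 800, 1000, 4000, 10000):
    print(f"{nits:5d} nits -> 10-bit code {pq_code_value(nits)}")

# Roughly: 100 nits lands a bit over half way up the 10-bit code range, so the
# top half of the signal is reserved for the highlight range that ABL and the
# window-size limits above are fighting over.
```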
--------------------------------------

This video shows some of FF7 Remake's HDR encoding levels using color mapping:


A few shots from the video where there are somewhat large bright areas:
[two color-mapped HDR screenshots from the video]
 
I have a HUGE criticism of the CX: The ABL is WAAAAYYYYYYY too aggressive.

I just played the last 2 chapters of FF7 Remake, and every time it went to a bright or all-white screen, the image darkened considerably compared to normal highlights. In a vacuum I wouldn't care, but it's not in one; I have a Samsung Q90R that I've used throughout most of this game, and let me tell you, the brights/highlights are blindingly bright on that TV.

In most scenarios this issue does not present itself, so most people will not care. I will maintain that content with super fast motion looks considerably better on OLED. But for heavy HDR content? I think I like the Q90R better, not because of the darks, but because of the brights. The Q90R just punches so hard when it comes to highlights and super bright screens.

Is that in "PC mode"? I recall there is a new setting this year that controls this, Peak Brightness or something similar, but I mostly use PC mode and it's not available there. I actually have a Samsung Q95T on order as well, and the only reason for that is brightness/ABL (I mainly use my 55" GX as a PC monitor and also for work, so white backgrounds are quite common). I don't care too much about burn-in though; if it happens it will happen, but by then the TV will have paid for itself anyway. The problem is that I also hope to use the same monitor for PC gaming, and that's where it gets more complicated.
 

He is talking about HDR content. If you view my reply just before yours, I quoted the settings to avoid ABL in SDR mode for SDR content. Using your monitor in SDR mode, you could certainly avoid ABL with those settings.
According to RTings CX review, the CX has aggressive ABL like the C9, E9.
"The CX has decent HDR peak brightness, enough to bring out highlights in HDR. There's quite a bit of variation when displaying different content, and it gets the least bright with large areas, which is caused by the aggressive ABL. "
That is how it is with HDR.
With SDR, there is a Peak Brightness Setting. Since it limits the peak brightness it doesn't seem compatible with HDR.
View attachment 248742
From the Rtings C9 Review, regarding SDR settings concerning ABL:
"If ABL bothers you, setting the contrast to '80' and setting Peak Brightness to 'Off' essentially disables ABL, but the peak brightness is quite a bit lower (246-258 cd/m² in all scenes)."
 
The ABL hate and venomous rhetoric is pretty heavy in this video, but he probably gets a lot of views from reactions:



--------------------------------

A test apparently designed to exacerbate the issue, in SDR if not using the settings mentioned in the previous reply:

 
One more reply since I've been doing a lot of digging at the moment....

Out of curiosity I looked at a few of the flagship LED/QLED FALD LCD TV reviews, and they have horrible contrast, even for a FALD VA.

The Samsung Q900R's native contrast is only 1630:1, and it only gets 6905:1 with FALD, which is horrible. My 43" edge-lit Samsung and TCL VA screens at my computer get 4800:1 and 6150:1 native, respectively. The Q900R is one of the few other TVs with HDMI 2.1.

The Samsung Q80T has a native contrast of 3042:1 and gets 4225:1 with FALD active, which is still very weak for a high-density FALD HDR VA. It has HDMI 2.1.

For comparison, from the PG27UQ FALD IPS gaming review on TFTCentral:
[PG27UQ contrast measurements table]
Note that RTings' contrast values use a checkerboard pattern, so they are around a 50% window, which equates to 5,925:1 on the PG27UQ with FALD active.


The LG OLEDs of course, have "infinite" to 1 contrast.

https://www.rtings.com/tv/tests/picture-quality/contrast-ratio
 

That's because of the "wide viewing angle filter".

It also looks pretty shitty in comparison with WRGB OLEDs at 17:35:

The ABL hate and venomous rhetoric is pretty heavy in this video, but he probably gets a lot of views from reactions:



--------------------------------

A test apparently designed to exacerbate the issue, in SDR if not using the settings mentioned in the previous reply:



This guy is a known idiot.
 
He is talking about HDR content. If you view my reply just before yours I quoted the settings to avoid ABL in SDR mode for SDR content. You using your monitor in SDR mode could certainly avoid ABL using those settings.

I believe we are talking about two different things; I'm mainly talking about "lack of brightness" here rather than the actual fluctuations. For static content like work stuff, fluctuation isn't really a problem, but low brightness is.
 
Looks like the CX models are getting stellar reviews.

Being that I'm recovering from back surgery and am doing most of my gaming through my Nvidia Shield, game-streaming to my bedroom Sony 65XBR900E, I might as well wait until Black Friday for lower prices and see how all the issues pan out. I really hope the popularity of these TVs gets Nvidia to reimplement 10-bit support in their consumer GPUs. They must already be aware of the demand. Has anyone posted this to their user forum?
 

Yes, which seems to be reflected in the interest. The CX/GX owners thread on AVSForum has 163 pages; the one for the Q90T/Q95T has 22...

I have the Q95T on order though, so I guess I will find out; the main reason is that I use it 95% as a PC monitor. But to be honest, I have a hard time seeing it beat my 55" GX. For a more normal usage scenario, OLED seems like the obvious choice these days.
 
Hello, I never see a single test that tells me:
- whether G-Sync works at 120Hz, and the input lag in this mode.
- the impact on input lag when HDR is on.
- whether we can have both HDR and G-Sync on together.

I have a 2080 Ti and I need a summary of what this TV is capable of for PC enthusiasts.

All the coverage of the 55 CX doesn't give the reality of what the TV can do.

As a PC gamer I am afraid to see that there are zero good 4K monitors. That's why all my hopes are on this TV.

Also, the C9 vs the CX: what are the big differences for a PC gamer?
 
Well, run a CX over with a steamroller and add a no-gap wall mount and you have the GX. That's about it :)

I might add that the sound from the GX is really not bad for a TV. I have heard that the same goes for the CX, but at least according to the Digital Trends review, the GX is a step up. I've actually been using it all day long to play some electronic music and it's not too shabby, even though it's no match for a proper sound system of course.
 
Hello, I never see a single test that tells me:
- whether G-Sync works at 120Hz, and the input lag in this mode.
- the impact on input lag when HDR is on.
- whether we can have both HDR and G-Sync on together.

I have a 2080 Ti and I need a summary of what this TV is capable of for PC enthusiasts.

All the coverage of the 55 CX doesn't give the reality of what the TV can do.

As a PC gamer I am afraid to see that there are zero good 4K monitors. That's why all my hopes are on this TV.

Also, the C9 vs the CX: what are the big differences for a PC gamer?

- G-Sync works at 120Hz at 1440p or less. Input lag is roughly 6-10ms. LFC is supported (even though it's not advertised).
- HDR appears to have little to no impact on input lag as long as game mode is enabled.
- HDR is confirmed working at 1440p/120Hz with G-Sync.

The CX and C9 are almost identical. The only major difference is that the CX supports 4K/120 (no G-Sync) and 120Hz BFI. Outside of that, they're very close to the same.

No issues using my 55" CX as a PC monitor so far. Works great and looks great.
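On the LFC point, the mechanism is just frame multiplication once the frame rate drops below the panel's VRR floor. A small sketch of the idea; the 40-120Hz window used here is an assumption about the CX over HDMI VRR, not a confirmed spec:

```python
# A rough sketch of the frame-multiplication idea behind LFC (Low Framerate
# Compensation). The 40-120 Hz VRR window is an assumption, not a measured
# spec; the point is the mechanism: below the VRR floor each frame is
# repeated enough times to keep the refresh rate inside the range.

VRR_MIN_HZ = 40
VRR_MAX_HZ = 120

def lfc_refresh(fps: float) -> tuple[int, float]:
    """Return (frame repeat multiplier, resulting refresh rate) for a given fps."""
    if fps >= VRR_MIN_HZ:
        return 1, min(fps, VRR_MAX_HZ)        # inside the window: 1:1
    multiplier = 2
    while fps * multiplier < VRR_MIN_HZ:      # repeat frames until back in range
        multiplier += 1
    return multiplier, fps * multiplier

for fps in (120, 60, 45, 30, 24, 15):
    mult, hz = lfc_refresh(fps)
    print(f"{fps:3d} fps -> each frame shown {mult}x, panel runs at {hz:.0f} Hz")
```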
 
Hello, I never see a single test that tells me:
- whether G-Sync works at 120Hz, and the input lag in this mode.
- the impact on input lag when HDR is on.
- whether we can have both HDR and G-Sync on together.

I have a 2080 Ti and I need a summary of what this TV is capable of for PC enthusiasts.

All the coverage of the 55 CX doesn't give the reality of what the TV can do.

As a PC gamer I am afraid to see that there are zero good 4K monitors. That's why all my hopes are on this TV.

Also, the C9 vs the CX: what are the big differences for a PC gamer?
IMHO, as a PC gamer, the 48CX solely as a PC gaming monitor will shine best with a 3080 Ti *IF* the 3080 Ti has full HDMI 2.1 on it. Right now nobody knows whether Nvidia will put HDMI 2.1 in their next-gen GPUs; Nvidia refuses to comment one way or another. I would say wait and see what the 3080 Ti brings to the table before buying the 48CX. That's just my own opinion; everyone has to decide for themselves.
 
For anyone using an OLED as a PC monitor (and not just for gaming), I really recommend tinkering with the text settings a bit. I have replaced the system font (Segoe UI) with MS Sans Serif and also switched text rendering to greyscale instead of ClearType RGB, and it makes a noticeable difference. Of course, the screen size still makes characters a bit jagged, but at least they are sharp and without fringes, fuzziness, etc.
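If anyone wants to script the greyscale part, here's a hedged sketch using only the documented FontSmoothing registry values (a sign-out/in is typically needed before it takes effect everywhere). Swapping the system font itself is a separate, messier registry hack that I'm leaving out:

```python
# Switch Windows font smoothing to standard greyscale anti-aliasing instead of
# ClearType. FontSmoothing "2" = smoothing on; FontSmoothingType 1 = standard
# (greyscale), 2 = ClearType. Run as the current user; sign out and back in
# (or broadcast a settings change) for the change to apply.

import winreg

def set_greyscale_font_smoothing() -> None:
    key_path = r"Control Panel\Desktop"
    with winreg.OpenKey(winreg.HKEY_CURRENT_USER, key_path, 0,
                        winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, "FontSmoothing", 0, winreg.REG_SZ, "2")
        winreg.SetValueEx(key, "FontSmoothingType", 0, winreg.REG_DWORD, 1)

if __name__ == "__main__":
    set_greyscale_font_smoothing()
    print("Greyscale font smoothing set; sign out and back in to apply.")
```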
 
- G-Sync works at 120Hz at 1440p or less. Input lag is roughly 6-10ms. LFC is supported (even though it's not advertised).
- HDR appears to have little to no impact on input lag as long as game mode is enabled.
- HDR is confirmed working at 1440p/120Hz with G-Sync.

The CX and C9 are almost identical. The only major difference is that the CX supports 4K/120 (no G-Sync) and 120Hz BFI. Outside of that, they're very close to the same.

No issues using my 55" CX as a PC monitor so far. Works great and looks great.

Might I ask what settings you use to make it most suitable for PC usage? I have found that ISF Bright Room with some changes, and in PC mode, gives a good result. Perhaps I am imagining things, but for some reason ABL does not seem to be as aggressive in ISF Bright Room. I've also given the colors a boost in the Intel control panel (laptop) as I find the colors strangely muted (even when it's not an HDR-related problem).
 
IMHO, as a PC gamer, the 48CX solely as a PC gaming monitor will shine best with a 3080 Ti *IF* the 3080 Ti has full HDMI 2.1 on it. Right now nobody knows whether Nvidia will put HDMI 2.1 in their next-gen GPUs; Nvidia refuses to comment one way or another. I would say wait and see what the 3080 Ti brings to the table before buying the 48CX. That's just my own opinion; everyone has to decide for themselves.

Yep. I think I'm going to go ahead and purchase the 48CX just because it will be nice to get back down to that size. The 55" is completely usable, but the 48" is better for me based on my experience with the Samsung 48JS9000 that I had previously. Best case, Nvidia will add HDMI 2.1 and the CX will be the pinnacle of gaming displays. Worst case, Nvidia won't add HDMI 2.1 and I'll have the features of the C9 in a 48" panel size, which will still be a solid upgrade from my 55B7.
 
That's because of the "wide viewing angle filter".

It also looks pretty shitty in comparison with WRGB OLEDs at 17:35:



This guy is a known idiot.


Yes, I realize, but either way the contrast is very poor. Considering the very poor FALD contrast in the 2020 sets with one HDMI 2.1 port, and the glow (or dim corona) halos in FALD zone balancing along with the other weaknesses shown, OLED ABL is a fair tradeoff. I didn't even think it was that bad in the hate video personally, and he was playing an SDR game off of a console or emulator, which means he could have easily set SDR mode to never have ABL. A more relevant example would be ABL kicking in on HDR content side by side with the color-temperature-mapped version.

I thought there might be a 2020 version of the Q9 but with HDMI 2.1, so I was looking out of curiosity for a future living room purchase. The Q9FN has native contrast of 6055:1, but more importantly, with FALD active it has a 19,018:1 contrast ratio. The 2020 QLED/LED LCD sets with one HDMI 2.1 port that I mentioned are horrible by comparison, and their game mode makes them even worse.

Edit:
The Q90/Q90R RTings review quotes 11,200:1 contrast ratio with FALD active so there is that one at least.
"The Q90 has a great local dimming feature. There's very little blooming, but it tends to dim the edges of bright objects, causing a vignetting effect, and small highlights like stars are crushed. In Game Mode, the local dimming doesn't react as quickly to changes in a scene, leading to more visible blooming. "

" Unfortunately the TV's 'Ultra Viewing Angle' optical layer makes the pixels hard to see clearly. We observed the same issue on the Q900R pixel photo. "

"HDMI 2.1 : UNKNOWN"


-------------------------------------------------------------

ABL, screen brightness - a few addons
=================================

Right now on my big 43" VA screens I use a software ABL-like effect in my browsers called "Turn Off the Lights". ToTL is available on both Firefox and Chrome. It's a global setting with a slider rather than kicking in at a threshold like ABL, but sometimes a web page loads for a second before the "Turn Off the Lights" addon kicks in and cuts the brightness down to my default setting. I much prefer the dimmed ToTL when on white web pages. I also use the "Color Changer" addon for sites I frequent, which will change the color of the background and other elements, using a color wheel if you want to micro-manage it or set the defaults differently. Personally I will not be using my OLED for static app windows, but I thought I'd mention these addons, and ToTL might work well for running a YouTube tab on the OLED besides. Otherwise I run the AwesomeTube app on Windows 10.


https://www.turnoffthelights.com/
I set Turn Off the Lights' default setting to a very modest amount of dimming at 20%, rather than the dark default settings. It works on all web pages, not just YouTube, but on YouTube and several other sites it preserves the video frame without dimming it. It has other YouTube features like disabling autoplay, filling the whole tab with the video frame automatically, volume control with mouse/wheel movement, etc. While it can leave the video frame alone on many sites, it does however dim images along with the rest of the web page, which can be a good thing depending. You can always open a picture in a new tab and click the ToTL icon to toggle it off if you want to. You can also set links to be click-through so links still work. Another nice feature is that you can toggle a dimming slider to show on the web page if you want to adjust it on the fly. If you click the dimming slider, it toggles to a small dim square instead of the long slider so it stays out of the way, or you can hide it completely in the settings if you prefer, which is the default.

I realize there are dark modes for these sites, plus I could use the Color Changer addon shown below, but I'm just using these two as examples of how Turn Off the Lights works:
[screenshots: Turn Off the Lights dimming Slickdeals and YouTube in Firefox]


https://addons.mozilla.org/en-US/firefox/addon/site-color-changer/
It's very easy to swap the colors around on the fly or turn the Color Changer off/on per site with this drop-down settings window.
(example picture, not the colors/settings I use personally)
[Site Color Changer settings example]
 
- gsync works at 120hz @ 1440p or less. input lag is roughly 6-10ms. LFC is supported (even though it's not advertised).
- hdr appears to have little to no impact on input lag as long as game mode is enabled.
- hdr is confirmed working for 1440p/120hz/gsync.

CX vs C9 are almost identical. The only major difference is that the CX supports 4k/120 (no gsync) and BFI 120hz. Outside of that, they're very close to the same.

No issues using my 55" CX as a PC monitor so far. Works great and looks great.
I can't decide between the C9 and the CX. I'd love to use the OLED as an ultrawide, so for example use a custom resolution of 3840x1600 to get a 21:9-ish experience with black bars on top and bottom. It would be awesome if you could confirm 3840x1600 @ 120Hz works on your 55" CX. :D
 