LG 48CX

Returned it, end of the OLED journey for me. It's been a fun three weeks, but I'm not going to bother with a replacement given the time it takes to set up. Probably going to get the 38" Acer from Costco that has HDR and FreeSync (but no KVM :( ) to replace my Dell. Plus 3 months to try out/return, and maybe I'll get another 48CX around Thanksgiving time... GLTA
 
Thanks, may get CalMan Home (assuming that's what you got) soon, but I'm happy with these settings for now. One quirky thing I noticed is that sometimes the i1D Pro would pick up a black level of 0.03, but that may have been some ambient light.

Yeah, I got CalMan Home for LG. Did you hack in the FSI EDR from LightSpace (the one for WRGB OLED)? Before I did that, I got constant "low light level" errors on true black with an i1D Pro.
 
Is anyone using their LG CX 48 in a multi-monitor setup with other HDMI monitors? I am having issues with windows being shuffled around when the monitors go to sleep. It's the same hot-plug detect problem DisplayPort has, and it's a huge reason I have avoided DisplayPort monitors; now my CX 48 is acting like a DisplayPort monitor in that regard.

I have been using these monitors for years without this ever happening, but now that I added the CX 48 to my setup, about half the time when I come back after being away long enough for the displays to turn off, my open windows have been resized and shuffled around. Even if I have nothing on the CX 48 at all and everything is on the other monitors, it happens anyway. It's annoying when you have lots of stuff open and like to keep things in certain places, and it all just moves.

Has anyone else experienced this? Is there a way to disable the behavior in the settings somehow? I know some DP monitors actually have a setting to disable it, but HDMI isn't supposed to act this way to begin with.

Try turning on Quick Start+, as it seems to be better at keeping the display connected. What probably happens is that Windows sees the CX 48" getting disconnected and, as it is supposed to, moves the windows on it to another display. Neither Windows nor macOS is particularly good about this, so using a third-party tool to reposition windows after a display reconnects is the way to go. DisplayFusion window position profiles are a good option on Windows; on macOS the Stay app works pretty well.

With DisplayFusion it's important to pay attention to how the window titles are defined if you want to position, for example, browser windows, since those are unfortunately only distinguishable from each other by their titles. For my main browser window I set the title to "Google Chrome" because that string is always in the title.
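
If you'd rather script it than buy something, here is a minimal sketch of the same save/restore idea in Python with pywin32. This is just an illustration, not how DisplayFusion actually does it: it keys windows purely by title (so the same caveat about stable titles applies), and the snapshot file name is made up:

Code:
# save/restore top-level window positions, keyed by window title
import json
import win32gui

SNAPSHOT = "window_positions.json"  # example path

def snapshot_windows():
    """Record the rect of every visible, titled top-level window."""
    positions = {}
    def record(hwnd, _):
        title = win32gui.GetWindowText(hwnd)
        if win32gui.IsWindowVisible(hwnd) and title:
            positions[title] = win32gui.GetWindowRect(hwnd)  # (l, t, r, b)
    win32gui.EnumWindows(record, None)
    with open(SNAPSHOT, "w") as f:
        json.dump(positions, f)

def restore_windows():
    """Move any window whose title matches a saved entry back into place."""
    with open(SNAPSHOT) as f:
        positions = json.load(f)
    def restore(hwnd, _):
        title = win32gui.GetWindowText(hwnd)
        if title in positions:
            l, t, r, b = positions[title]
            win32gui.MoveWindow(hwnd, l, t, r - l, b - t, True)
    win32gui.EnumWindows(restore, None)

if __name__ == "__main__":
    snapshot_windows()  # later, run restore_windows() after the displays wake

Run the snapshot on a schedule and the restore after wake, and you get the bare bones of what the window position profiles do.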
 
My CX 55" updated to firmware version 03.10.41 last night, not sure whats changed. Seems to be behaving the same as before the update
 
Fairly useless updates and fixes in recent builds... hope they get around to the 100/120Hz chroma downsampling, screen shift, and HDMI handshake bugs, proper HDMI 2.1 support, and similar core items uncovered by users here and elsewhere.

You can always find the release notes and change log on the LG support site.
Expand 'References'.

https://www.lg.com/us/support/softw...US&category=CT10000018&subcategory=CT30017665

This firmware hit on 27-Jul-2020, but a number of users here reported getting it a while earlier, well before it was posted for download on the LG site.

3. Release History:

[03.10.41]
1. To improve RF channel subtitles of Sling TV application


[03.10.20]
1. Apple AirPlay improvements
2. Support AMD FreeSync
3. Sport Alert improvements
4. Support ATSC 3.0 (WX/GX Model only)

[03.00.70]
1. Fix the case where certain eARC is running as ARC

[03.00.60]
1. Fix the noise which has been occurred when sending video from iPhone to TV with Airplay app
2. Fix volume fluctuation when setting STB volume to low level (1~2)
3. Fix Bluetooth initialization popup and the issue which magic remote control sometimes does not work

[03.00.45]
1. Fixed HDMI No signal issue in eARC On state
 
What to play on a 55" OLED? Star Trek: A Final Unity, of course - a 1995 DOS game running in DOSBox...

 
Promoting DisplayFusion functionality again, even though I have no affiliation with them. I just really like the app.
The reason I'm detailing this below is to show that if you are using multiple monitors, with the other monitor(s) handling your desktop apps, there is practically zero reason to ever have the taskbar show on the primary gaming/media OLED display.

DisplayFusion taskbars can be enabled on your secondary monitors.
=================
The DisplayFusion taskbars can:
..have app icons with live mini-window popups, just like the regular Windows taskbar.
..have a popup tray (and the clock/chronograph) with tray icons you can interact with, just like the regular Windows taskbar.
..have a Windows icon with a start menu popup, just like the regular Windows taskbar.
..pop the start menu up in the lower left corner of whatever monitor your mouse is on when you hit the Windows key.
..force the application shortcuts you make on each taskbar to launch on the same monitor as the taskbar the shortcut is on (recommended).
..force an app to run as admin via its shortcut within the DisplayFusion shortcut editing options instead of the Windows shortcut menus.
..adjust the size of the taskbar.
..move the Windows button to the left or right side, or disable it.
..move the clock and tray to the left or right side, or set either of them to be hidden.

Alt+Tab menu (or Ctrl+Alt+Tab, which makes the popup persistent until you click on something)
=====================================================================
DisplayFusion can change the behavior of your Alt+Tab menu:
..you can change the width and height of the thumbnails.
..you can choose whether the thumbnails appear on all monitors, only on the current monitor, or for the current monitor's windows only.
.."current windows only" is very useful, so I'm not seeing thumbnails for everything running on the other monitors.

Hiding the Windows taskbar on the primary monitor
==========================================
There is another app I use called Taskbar Hider, along with the Translucent Taskbar app. I set a shortcut key in Taskbar Hider to lock my primary monitor's Windows taskbar away entirely. I can always hit the shortcut key again to show it, but I have little reason to now (a scripted version of the same trick is sketched after this list):
.. All of my tray icons are available on my other monitors, and all of those apps run on their own monitor's taskbar.
.. If I need to access a media or game app on my primary monitor, I can just Alt+Tab or Ctrl+Alt+Tab there, and it will only show the apps running on that monitor to choose from.
.. If I need to launch a game or media app on my primary monitor, I have DisplayFusion shortcut keys (tied to Stream Deck buttons) to launch and place them with a keypress or button press. It would also be easy to just hit the Windows key while the mouse is on the primary monitor and keep those primary-monitor app shortcuts pinned at the top of the start menu popup. DisplayFusion should remember where each app was located the last time it was run.
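
For what it's worth, the core of what Taskbar Hider does can be approximated in a few lines of Python with pywin32. A minimal sketch, assuming the primary taskbar's window class is Shell_TrayWnd (which has been stable across Windows versions):

Code:
# toggle the primary Windows taskbar, roughly what Taskbar Hider does
import win32con
import win32gui

def toggle_taskbar():
    hwnd = win32gui.FindWindow("Shell_TrayWnd", None)  # primary taskbar
    if win32gui.IsWindowVisible(hwnd):
        win32gui.ShowWindow(hwnd, win32con.SW_HIDE)
    else:
        win32gui.ShowWindow(hwnd, win32con.SW_SHOW)

if __name__ == "__main__":
    toggle_taskbar()  # bind this to a hotkey to mimic the app

Secondary-monitor taskbars use the Shell_SecondaryTrayWnd class if you ever want to hide those too.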

Notification area and Search
=======================
So far, the only things the DisplayFusion taskbars don't have, that I am aware of, are the Windows notification area and the search icon (or field, if you've left it as a text entry field). However, the slide-out notification panel is easily accessible with the Win+A shortcut (A for "actions"), and the search popup is available with Win+S, even with the taskbar locked away hidden. Incidentally, A and S are right next to each other on a regular keyboard.
 
Promoting DisplayFusion functionality again, even though I have no affiliation with them. I just really like the app.
The reason I'm detailing this below is to show that if you are using multiple monitors, with the other monitor(s) handling your desktop apps, there is practically zero reason to ever have the taskbar show on the primary gaming/media OLED display.

Looks like DisplayFusion is the way to go. The other program mentioned earlier, MonitorKeeper, doesn't work for me. It just doesn't do anything and there are no options; my windows still shuffle when it's running. I hope DisplayFusion will work for this. I remember trying it a little while ago to get the taskbar to completely hide but couldn't get that to work either. It just seemed kind of buggy at the time, but I will read all your posts and give it another shot. Thank you very much for all the info.
 


Lol, funny how some claim that OLED isn't bright enough, while that plasma could only do 35 nits full field. Yes, that's right, 35 nits. Different technology and different times, I know, but still worth mentioning. BFI also increases motion clarity to match or even exceed plasma. BFI FTW! Just wish LG would address the black crush issue caused by BFI.
 
So the 48" won't be available in Australia ;(

Been tickling the idea of importing one from the states.

But US is 120V / 50-60Hz.

Will i have issues using this TV here in Australia 230V 50Hz even if I use an adaptor?

On 2nd though it probably is still not a good idea, because the Warranty will only be for inside US. :(

Looks like I won't get the glory of 48" Oled as a gaming monitor.

Virtually every TV and monitor accepts 110-240V 50-60Hz these days; it simplifies supply chains. You just need a plug adapter.
 


Lol, funny how some claim that OLED isn't bright enough, while that plasma could only do 35 nits full field. Yes, that's right, 35 nits. Different technology and different times, I know, but still worth mentioning. BFI also increases motion clarity to match or even exceed plasma. BFI FTW! Just wish LG would address the black crush issue caused by BFI.


Haven't seen the video, but the featured plasma was not the brightest plasma, and if I recall correctly that was the main reason Panasonic lost the last-ever shootout that included plasmas, to Samsung. At least that was the reason I bought one back in the day. And yes, I thought it was too dim also :D
 


Lol, funny how some claim that OLED isn't bright enough, while that plasma could only do 35 nits full field. Yes, that's right, 35 nits. Different technology and different times, I know, but still worth mentioning. BFI also increases motion clarity to match or even exceed plasma. BFI FTW! Just wish LG would address the black crush issue caused by BFI.


35 nits full screen brightness?

OLED is stupidly bright by comparison.
 
35 nits full screen brightness?

OLED is stupidly bright by comparison.

Yeah. I mean, I understand that we need the brightness for HDR and all, but I still don't really give a rat's ass about higher brightness. One of my buddies has a Vizio P-Series Quantum that's capable of 1800 nits on a 10% window, and I didn't think it looked any better than OLED. It just managed to deliver a lot more impact in a really bright room (which is where it was set up).
 
I can understand your comparisons for SDR, but brightness works very differently in HDR than in SDR. It's about the highlights and light sources; most or much of the scene remains in the SDR range on average.

Whatever your peak brightness ceiling is, that is your peak *color* brightness, or color volume. When a display or its settings hit the color ceiling, any detail-in-color beyond that point is lost - sort of like how detail-in-blacks is lost on the bottom end of a lot of screens with poor black levels. When a display hits its color brightness ceiling, it has to either roll down to the range of the best colors it is capable of, show a blob of the same peak color, or, as in the past, just clip to white.

Beyond that narrow SDR band you get more realistic highlights, reflections, and light sources in HDR material, on a more absolute and taller scale. With SDR you have a pretty narrow band to begin with and are just ramping up the brightness of the whole scene relatively, so it's an apples-and-oranges comparison. 1500 nit HDR does not make the bulk of any given scene shoot up to 1500 nits, and HDR 10,000 does not make the bulk of a scene shoot up to 10,000 nits. Maybe the sun could be upward of 10k nits, some tiny bright reflections or the glint off a metal gun barrel might be 4k nits, a laser blast might be 400-800 nits, etc. As you can see in the images below, the bulk of those scenes sits below 100 nits. In fact, when HDR is using absolute values, a lot of people have complained that HDR is too dim, since it was mastered to realistic values at levels for a dark theater or a dim-to-dark home theater environment. HDR is for a much taller range of colors (brighter color values and also detail-in-colors) and deep black depths all at once, not for scaling the main bulk of the scene way brighter.

https://www.theverge.com/2014/1/6/5276934/dolby-vision-the-future-of-tv-is-really-really-bright
"The problem is that the human eye is used to seeing a much wider range in real life. The sun at noon is about 1.6 billion nits, for example, while starlight comes in at a mere .0001 nits; the highlights of sun reflecting off a car can be hundreds of times brighter than the vehicle’s hood. The human eye can see it all, but when using contemporary technology that same range of brightness can’t be accurately reproduced. You can have rich details in the blacks or the highlights, but not both.

So Dolby basically threw current reference standards away. "Our scientists and engineers said, ‘Okay, what if we don’t have the shackles of technology that’s not going to be here [in the future],’" Griffis says, "and we could design for the real target — which is the human eye?" To start the process, the company took a theatrical digital cinema projector and focused the entire image down onto a 21-inch LCD panel, turning it into a jury-rigged display made up of 20,000 nit pixels. Subjects were shown a series of pictures with highlights like the sun, and then given the option to toggle between varying levels of brightness. Dolby found that users wanted those highlights to be many hundreds of times brighter than what normal TVs can offer: the more like real life, the better."

[Images: luminance heatmaps of several HDR movie scenes - the bulk of each frame sits below 100 nits, with only small highlights reaching into the hundreds or thousands of nits.]

I mean, I understand that we need the brightness for HDR and all, but I still don't really give a rat's ass about higher brightness. One of my buddies has a Vizio P-Series Quantum that's capable of 1800 nits on a 10% window, and I didn't think it looked any better than OLED. It just managed to deliver a lot more impact in a really bright room (which is where it was set up).

Well, comparing an 800 nit color at a small percentage of the screen on an OLED to a 1200-1400 nit FALD LCD is a lot different than comparing to a 350-400 nit SDR screen or a low-nit plasma. The FALD array has to offset the dimming or glow halos between zones so they don't overreach into each other, so it can't hit its quoted peak numbers all of the time. The OLED is per-pixel, so overall a lot of people consider it a more impressive HDR display compared to the FALD densities available currently.
 
I can understand your comparisons for SDR, but brightness works very differently in HDR than in SDR. It's about the highlights and light sources; most or much of the scene remains in the SDR range on average.

Whatever your peak brightness ceiling is, that is your peak *color* brightness, or color volume. When a display or its settings hit the color ceiling, any detail-in-color beyond that point is lost - sort of like how detail-in-blacks is lost on the bottom end of a lot of screens with poor black levels. When a display hits its color brightness ceiling, it has to either roll down to the range of the best colors it is capable of, show a blob of the same peak color, or, as in the past, just clip to white.

Beyond that narrow SDR band you get more realistic highlights, reflections, and light sources in HDR material, on a more absolute and taller scale. With SDR you have a pretty narrow band to begin with and are just ramping up the brightness of the whole scene relatively, so it's an apples-and-oranges comparison. 1500 nit HDR does not make the bulk of any given scene shoot up to 1500 nits, and HDR 10,000 does not make the bulk of a scene shoot up to 10,000 nits. Maybe the sun could be upward of 10k nits, some tiny bright reflections or the glint off a metal gun barrel might be 4k nits, a laser blast might be 400-800 nits, etc. As you can see in the images below, the bulk of those scenes sits below 100 nits. In fact, when HDR is using absolute values, a lot of people have complained that HDR is too dim, since it was mastered to realistic values at levels for a dark theater or a dim-to-dark home theater environment. HDR is for a much taller range of colors (brighter color values and also detail-in-colors) and deep black depths all at once, not for scaling the main bulk of the scene way brighter.



Well, comparing an 800 nit color at a small percentage of the screen on an OLED to a 1200-1400 nit FALD LCD is a lot different than comparing to a 350-400 nit SDR screen or a low-nit plasma. The FALD array has to offset the dimming or glow halos between zones so they don't overreach into each other, so it can't hit its quoted peak numbers all of the time. The OLED is per-pixel, so overall a lot of people consider it a more impressive HDR display compared to the FALD densities available currently.

I know this already... The entire scene obviously isn't going to be 4,000 or 10,000 nits; it's only the parts that NEED to be. I have both an OLED and a real HDR LCD (X27). If I'm already blinded by 1,000 nit highlights, I'm not sure I'd be able to appreciate 4k or 10k nit highlights, because I'd be squinting so much from the searing brightness.
 
Returned it, end of the OLED journey for me. It's been a fun three weeks, but I'm not going to bother with a replacement given the time it takes to set up. Probably going to get the 38" Acer from Costco that has HDR and FreeSync (but no KVM :( ) to replace my Dell. Plus 3 months to try out/return, and maybe I'll get another 48CX around Thanksgiving time... GLTA
Anything other than the X27 / X35 / PG27UQ / PG35VQ is trash for HDR.
 

:LOL:. Anyway, I'm not trying to say that I don't want HDR or anything; it's great and I enjoy having it. All I'm saying is I'm pretty much satisfied with the current 700-1000 nit ceilings that OLEDs can hit (the Panasonic OLED is capable of close to 1000 nits). I don't really care for 4000 or 10,000 nits, just like some people are perfectly happy with 120Hz and don't care for 240/480/1000Hz displays. Let's also be real here: it's highly possible that LG may never solve the brightness problem on OLED, and we may be stuck with this spec all the way until MicroLED becomes viable in small sizes. If that's the case, I'd personally be OK with it.
 
This might interest LG C9 and CX owners: Vinz80 on AVSForum has made an app that not only allows setting YUV444 10-bit with dithering (at least when using the Club3D adapter) but also lets you control the LG TV programmatically. See
https://www.avsforum.com/threads/2019-lg-c9–e9-dedicated-gaming-thread-consoles-and-pc.3123168/post-60011471

For adding a command to toggle BFI on or off, see my post here:
https://www.avsforum.com/threads/2020-lg-cx–gx-dedicated-gaming-thread-consoles-and-pc.3138274/post-60013766
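
For the curious, this kind of programmatic control rides on the TV's webOS WebSocket interface - the same SSAP protocol that open-source libraries like PyWebOSTV and LGWebOSRemote speak. Below is a rough Python sketch of the request side only, assuming you have already paired once and saved the client key; the first-time pairing handshake needs the full permission manifest those libraries ship with, and TV_IP / CLIENT_KEY are placeholders:

Code:
# minimal SSAP request sketch for an LG webOS TV (C9/CX)
# assumes prior pairing; real libraries handle the full handshake/manifest
import asyncio
import json
import websockets  # pip install websockets

TV_IP = "192.168.1.50"         # placeholder
CLIENT_KEY = "saved-key-here"  # placeholder, issued by the TV on first pairing

async def set_volume(level: int):
    async with websockets.connect(f"ws://{TV_IP}:3000") as ws:
        # re-register with the stored client key (manifest abbreviated here;
        # if the TV rejects this, reuse the full manifest from PyWebOSTV)
        await ws.send(json.dumps({
            "type": "register",
            "payload": {"client-key": CLIENT_KEY},
        }))
        await ws.recv()  # expect a "registered" response
        # then fire SSAP requests, e.g. setting the volume
        await ws.send(json.dumps({
            "type": "request",
            "id": "vol-1",
            "uri": "ssap://audio/setVolume",
            "payload": {"volume": level},
        }))
        print(await ws.recv())

asyncio.run(set_volume(10))

Anything an app like Vinz80's exposes is layered on requests like this one.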
 
Is anyone else having trouble playing 4K in the Windows 10 Netflix app, with HEVC Video Extensions from Device Manufacturer from the app store?
I thought the adapter was HDCP 2.2 compliant?
Link to the HEVC codec for the 4K Netflix Windows 10 app: Get HEVC Video Extensions from Device Manufacturer - Microsoft Store


I get Netflix error U7361-1255-C00D7159. As soon as I install HEVC to play 4K videos, Netflix doesn't like it. If I remove the adapter, Netflix plays 4K videos at 60Hz fine. But without the adapter, no 120Hz.

Did the adapter firmware break HDCP 2.2? I never tried Netflix 4K HEVC before I updated the adapter's firmware, so I don't know.

The pic shows HDCP 2.2 working.


Wonder why Netflix and HEVC don't like it?
 
Is anyone else having trouble playing 4K in the Windows 10 Netflix app, with HEVC Video Extensions from Device Manufacturer from the app store?
I thought the adapter was HDCP 2.2 compliant?
Link to the HEVC codec for the 4K Netflix Windows 10 app: Get HEVC Video Extensions from Device Manufacturer - Microsoft Store


I get Netflix error U7361-1255-C00D7159. As soon as I install HEVC to play 4K videos, Netflix doesn't like it. If I remove the adapter, Netflix plays 4K videos at 60Hz fine. But without the adapter, no 120Hz.

Did the adapter firmware break HDCP 2.2? I never tried Netflix 4K HEVC before I updated the adapter's firmware, so I don't know.

The pic shows HDCP 2.2 working.


Wonder why Netflix and HEVC don't like it?


I just tested this and it seems to work perfectly fine for me. I'm running the Club3D adapter at 4K 120Hz 10-bit HDR, installed the HEVC extensions, and I'm on Windows 10 version 2004; pressing Ctrl+Shift+Alt+D in the Netflix app shows it is outputting 3840x2160.
 
Oh really, wow? Did you update your adapter with the firmware that's been going around?

Maybe the CAC-1085 firmware update I installed broke HDCP... or the resolution changer thing that guy on the AVS forum posted, although I did reinstall the Nvidia driver...

What video card and TV are you running?
What Nvidia driver are you using?

Can you run the CyberLink UHD BD Advisor? https://www.cyberlink.com/prog/bd-support/diagnosis.do
It will list whether your setup is HDCP compliant.

I will try 10-bit, see if that works.
Check Item : Result : Info
Windows 10 or above : Yes : Yes
Intel® SGX Technology (CPU/Main Board) : SGX not supported : No
6 GB System Memory : 16320 MB : Yes
HEVC 10-bit (DXVA) : Yes : Yes
H264 (DXVA) : Yes : Yes
HEVC 10-bit/AVC Codec (GPU) : Yes : Yes
HDCP 2.2 (GPU/Display) : No : No
Advanced Protected Audio/Video Path (GPU) : No : No
UHD-BD Optical Disc Drive : No : No
Windows Media Player 9 or above : Yes : Yes
HDR - High Dynamic Range (GPU/Display) : Yes : Yes
8 or 10 bit : 10 bit : Yes
PowerDVD 17 Ultra (Retail version) or above : Not found : Upgrade
 
The Netflix app doesn't launch on my PC when HDR is enabled, with or without the adapter. It just crashes with this error in Event Viewer:
Code:
Faulting application name: wwahost.exe, version: 10.0.19041.1, time stamp: 0x484cb762
Faulting module name: MFMediaEngine.dll, version: 10.0.19041.423, time stamp: 0x473cc5ba
Exception code: 0xc0000005
Fault offset: 0x00000000000ec18b
Faulting process id: 0x1098
Faulting application start time: 0x01d6698e18d48a1f
Faulting application path: C:\WINDOWS\system32\wwahost.exe
Faulting module path: C:\Windows\System32\MFMediaEngine.dll
Report Id: 9754a397-c754-4236-b4c5-6cdcaf3aeee7
Faulting package full name: 4DF9E0F8.Netflix_6.97.752.0_x64__mcm4njqhnhss8
Faulting package-relative application ID: Netflix.App
 
Oh really, wow? Did you update your adapter with the firmware that's been going around?

Maybe the CAC-1085 firmware update I installed broke HDCP... or the resolution changer thing that guy on the AVS forum posted, although I did reinstall the Nvidia driver...

What video card and TV are you running?
What Nvidia driver are you using?

Can you run the CyberLink UHD BD Advisor? https://www.cyberlink.com/prog/bd-support/diagnosis.do
It will list whether your setup is HDCP compliant.

I will try 10-bit, see if that works.
Check Item : Result : Info
Windows 10 or above : Yes : Yes
Intel® SGX Technology (CPU/Main Board) : SGX not supported : No
6 GB System Memory : 16320 MB : Yes
HEVC 10-bit (DXVA) : Yes : Yes
H264 (DXVA) : Yes : Yes
HEVC 10-bit/AVC Codec (GPU) : Yes : Yes
HDCP 2.2 (GPU/Display) : No : No
Advanced Protected Audio/Video Path (GPU) : No : No
UHD-BD Optical Disc Drive : No : No
Windows Media Player 9 or above : Yes : Yes
HDR - High Dynamic Range (GPU/Display) : Yes : Yes
8 or 10 bit : 10 bit : Yes
PowerDVD 17 Ultra (Retail version) or above : Not found : Upgrade

Yes, I have the updated 3ff (or something like that) firmware that I got from Club3D - the same one that's going around.

The CyberLink UHD BD Advisor gives a pass for HDCP 2.2. I have a Palit 2080 Ti Gaming Pro.
 
Yes, I have the updated 3ff (or something like that) firmware that I got from Club3D - the same one that's going around.

The CyberLink UHD BD Advisor gives a pass for HDCP 2.2. I have a Palit 2080 Ti Gaming Pro.
OK, found the problem: rolling back the Nvidia driver to v446.14 fixed it.

As per this post: https://www.nvidia.com/en-us/geforce/forums/geforce-graphics-cards/5/391028/encountered-netflix-hdcp-error-with-2080ti-v4516/?topicPage=2&commentPage=2

Woo, that's great. Lol, all this time I've been paying for Netflix 4K, and I wasn't even using the HEVC codec - it was only running in 1080p :p

For anyone else who comes across this problem in the future... there ya go!
 
The Netflix app doesn't launch on my PC when HDR is enabled, with or without the adapter. It just crashes with this error in Event Viewer:

Faulting package-relative application ID: Netflix.App

That sounds like an error where a Windows reinstall might help, to me.
Or uninstall the Netflix app and reinstall?

Maybe try opening a command prompt and running sfc /scannow.
 
OK, found the problem: rolling back the Nvidia driver to v446.14 fixed it.

As per this post: https://www.nvidia.com/en-us/geforce/forums/geforce-graphics-cards/5/391028/encountered-netflix-hdcp-error-with-2080ti-v4516/?topicPage=2&commentPage=2

Woo, that's great. Lol, all this time I've been paying for Netflix 4K, and I wasn't even using the HEVC codec - it was only running in 1080p :p

For anyone else who comes across this problem in the future... there ya go!

Other than typing being less convenient for search, if you plan to watch fullscreen (which would make better use of the 4K res), you could just use the built-in Netflix app on the LG TV. That's what I use most of the time.
 
Virtually every TV and monitor accepts 110-240V 50-60Hz these days; it simplifies supply chains. You just need a plug adapter.

B&H Photo advised against using any adaptor with this TV because it can cause some serious damage to it.

This is all too worrying for something this expensive: no warranty for international customers and the possibility of damage from using an adaptor. I think I'm just gonna have to pass on this.

I'll save the money for the next best gaming monitor in the future.
 
B&H Photo advised against using any adaptor with this TV because it can cause some serious damage to it.

This is all too worrying for something this expensive: no warranty for international customers and the possibility of damage from using an adaptor. I think I'm just gonna have to pass on this.

I'll save the money for the next best gaming monitor in the future.

Europe uses 220V/50Hz, why not just order one from Europe? They had them available before the US.
 
B&H Photo advised against using any adaptor with this TV because it can cause some serious damage to it.

This is all too worrying for something this expensive: no warranty for international customers and the possibility of damage from using an adaptor. I think I'm just gonna have to pass on this.

I'll save the money for the next best gaming monitor in the future.

That does not make sense. Plenty of products I buy use different types of power adapters. There is no added risk unless you mess something up by somehow plugging the wrong type of plug into the mains, or you buy some really shoddy adapter (but why would you, when good ones are dirt cheap?).

Generally speaking, I would not want to import a TV or monitor from too far away either, though. I'm sure this model will come to the Oceania market eventually, but you might have to wait a few months.
 
I've set up my CX 48 a bit differently for PC use now. I am considering keeping the Club3D adapter because the firmware update at least reduced the black screen / no signal issue a good amount, so it does not occur all the time and you are not constantly unplugging the adapter's power. I'm hoping a future firmware fixes it further, as this would let me coast until later this year before buying a 3000-series GPU; my 2080 Ti is powerful enough.

Based on these AVSForum posts: 1, 2 I've set the TV up like this:
  • Windows 10 HDR on.
  • SDR content slider set to 10% (about the same brightness as OLED Light at 35 in SDR).
  • 4K 120 Hz via the Club3D adapter; if HDR is on, this seems to output 4:4:4 correctly. Who knows why it works in HDR but not SDR.
  • YUV 4:4:4 10-bit with dithering enabled. I set this mode via Vinz80's ColorControl app because NVCP does not expose the option. It's supposed to have less banding than RGB for some reason, but it looks the same to me.
  • Pixel shift off.
  • Quick Start+ off so pixel shift does not turn itself back on.
  • HDR Game mode picture preset, Warm 2 color temp, and otherwise default settings.
  • Dynamic tone mapping set to HGIG. This looks the same as Off, but if games come out that support HGIG, those will work right.
  • Mastering peak brightness set to 1000 in the HDMI override menu, since Windows might not detect and handle this correctly otherwise.
  • Color space set to BT.2020 in the HDMI override menu; the Club3D adapter seems to output this incorrectly otherwise. With this set, SDR content still looks correct as long as HDR mode is enabled.
  • Automatic Static Brightness Limiter disabled using my old Samsung Galaxy S4 (it has an IR blaster) and the LG Service Remote Control app. ASBL seemed more aggressive in HDR mode but was annoying even in SDR. With it disabled, the screen no longer dims, for example, while I'm writing this post.
With this setup there seem to be no issues with image quality in SDR content; it looks visually the same as if SDR mode were enabled, so there should be no burn-in issues either. This is probably the intended way to use the Win10 HDR mode, but they have done a really bad job of explaining it, and it might not work as well with LCDs. (The slider-to-nits math is sketched below.)
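
On that SDR content slider: the commonly reported mapping (I have not seen Microsoft document it formally, so treat this as an assumption) is that the slider sets SDR reference white to 80 nits plus 4 nits per step:

Code:
# community-reported mapping of the Windows 10 "SDR content brightness" slider
# to SDR reference white in HDR mode (an assumption, not a documented API)
def sdr_white_nits(slider: int) -> float:
    return 80 + 4 * slider  # 0 -> 80 nits, 100 -> 480 nits

print(sdr_white_nits(10))  # 120 nits, in the ballpark of OLED Light 35 in SDR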
 
I am blown away by the image quality of the new CX 48", but I am having a terrible time getting custom resolutions to work. It will be heartbreaking if I have to return it, because though I love movies at the full 48" size, that's outside my comfort zone for gaming (I'd pay more for a smaller OLED!). Have others been able to get 2560x1440 120Hz with NO SCALING, or 3840x1620 120Hz, working? It seems like 120Hz breaks scaling on the CX (60Hz works). 120Hz/144Hz doesn't break scaling on my LG 27" 27GL83A, for example; smaller window sizes are no problem.

These 60Hz resolutions work: 3840x1620 60Hz (ultrawide, black bars top and bottom), 2560x1440 60Hz NO SCALING (~32" window in the center of the screen, correctly), 1920x1080 60Hz NO SCALING (~24" window in the center of the screen, correctly).

These 120Hz resolutions fail: 3840x1620 120Hz (ultrawide, simply fails when trying to switch to this resolution), 2560x1440 120Hz NO SCALING (instead of the ~32" window it should show, the CX scales it anyway to the full 48"), 1920x1080 120Hz NO SCALING (instead of the ~24" window it should show, the CX scales it to a strange ~36").

For competitive FPS gaming, I'm looking for a smaller scaled window size at 120Hz with BFI. For casual RPG gaming, I'm looking for 120Hz ultrawide with VRR. Has anyone been able to figure some or all of this out? I'm lost and in need of help...

(TV settings: PC HDMI input set, Game mode, Just Scan on, deep color. Nvidia settings: custom resolution, or resize with no scaling / perform scaling on GPU.)
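
If it helps anyone debug, you can dump whichever modes actually got registered with the GPU (custom ones included) and check whether your 120Hz entries ever made it into the mode list. A quick pywin32 sketch (the 120Hz filter is just an example):

Code:
# list registered display modes at 120 Hz or above on the primary display
import win32api

modes, i = set(), 0
while True:
    try:
        dm = win32api.EnumDisplaySettings(None, i)  # None = primary display
    except win32api.error:
        break  # no more modes
    if dm.DisplayFrequency >= 120:
        modes.add((dm.PelsWidth, dm.PelsHeight, dm.DisplayFrequency))
    i += 1

for w, h, hz in sorted(modes):
    print(f"{w}x{h} @ {hz} Hz")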
 
I am blown away by the image quality of the new CX 48", but I am having a terrible time getting custom resolutions to work. It will be heartbreaking if I have to return it, because though I love movies at the full 48" size, that's outside my comfort zone for gaming (I'd pay more for a smaller OLED!). Have others been able to get 2560x1440 120Hz with NO SCALING, or 3840x1620 120Hz, working? It seems like 120Hz breaks scaling on the CX (60Hz works). 120Hz/144Hz doesn't break scaling on my LG 27" 27GL83A, for example; smaller window sizes are no problem.

These 60Hz resolutions work: 3840x1620 60Hz (ultrawide, black bars top and bottom), 2560x1440 60Hz NO SCALING (~32" window in the center of the screen, correctly), 1920x1080 60Hz NO SCALING (~24" window in the center of the screen, correctly).

These 120Hz resolutions fail: 3840x1620 120Hz (ultrawide, simply fails when trying to switch to this resolution), 2560x1440 120Hz NO SCALING (instead of the ~32" window it should show, the CX scales it anyway to the full 48"), 1920x1080 120Hz NO SCALING (instead of the ~24" window it should show, the CX scales it to a strange ~36").

For competitive FPS gaming, I'm looking for a smaller scaled window size at 120Hz with BFI. For casual RPG gaming, I'm looking for 120Hz ultrawide with VRR. Has anyone been able to figure some or all of this out? I'm lost and in need of help...

(TV settings: PC HDMI input set, Game mode, deep color. Nvidia settings: custom resolution, or resize with no scaling / perform scaling on GPU.)

You probably have to make sure the TV is also not scaling anything, in addition to NVCP being set to no scaling. I'm thinking NVCP has been configured for no scaling, but the TV itself is probably taking that 1080p/1440p signal and trying to scale it up. As for custom ultrawide resolutions, I believe it's been confirmed that they do not work above 60Hz.
 
You probably have to make sure the TV is also not scaling anything, in addition to NVCP being set to no scaling. I'm thinking NVCP has been configured for no scaling, but the TV itself is probably taking that 1080p/1440p signal and trying to scale it up. As for custom ultrawide resolutions, I believe it's been confirmed that they do not work above 60Hz.

Oh, good call - I do have Just Scan on in the TV settings (updated post). It works exactly as expected at 60Hz, but not at 120Hz.

Huh, I didn't realize ultrawide resolutions don't work above 60Hz. Do folks expect that to change with HDMI 2.1 graphics cards, or is that just a huge oversight in the CX firmware?
 
Oh, good call - I do have Just Scan on in the TV settings (updated post). It works exactly as expected at 60Hz, but not at 120Hz.

Huh, I didn't realize ultrawide resolutions don't work above 60Hz. Do folks expect that to change with HDMI 2.1 graphics cards, or is that just a huge oversight in the CX firmware?

Yes, ultrawide resolutions will work just fine at 120 Hz with an HDMI 2.1 GPU; I am able to get them with the Club3D adapter. The same should be the case for 1080p and 1440p with 1:1 scaling, as the display will still think it's getting 4K.
 
For some reason, the text in Game Console mode is sharper compared to PC mode (icon). What's the benefit of running in PC mode vs. Game Console?
 
For some reason, the text in Game Console mode is sharper compared to PC mode (icon). What's the benefit of running in PC mode vs. Game Console?

PC mode allows for 4:4:4 color; Game Console will output 4:2:2. In Game Console mode there is more fringing on text in my experience, and colors are less accurate. If text is sharper for you, maybe check whether it's using different picture settings in PC vs. Game Console mode.
 