LG 48CX

A little extra detail to add to what I mentioned before...

As some may know, when you hold the mic button down and say "turn the screen off" (or do it in the OSD), it turns the emitters off without dropping the display from the Windows + Nvidia array, so a quick button press on the remote instantly displays the content again. I'm mentioning it again because I confirmed last night that it works the same even while in an HDR game. I did it during Jedi: Fallen Order in HDR a few different times and it re-displayed without issue. I can still hear the game audio as well unless I mute it with my volume button. This is way more convenient than minimizing a full screen exclusive game to a black background, with how clunky it transitions out of HDR/full screen and back, and the fact that the game could spontaneously pop back up while I was away. I thought it might work this way, seamlessly, with a fullscreen exclusive HDR title running, but I wasn't sure until I tried it. I'm assuming it will keep the eARC pass-through audio going with the screen "off" this way too, but I haven't tried my shARC yet to find out.

It doesn't unlock your mouse from fullscreen exclusive for side screen(s), obviously, but it's a very safe and convenient way to "turn off" the display at any time without losing your game or dropping your display/"monitor" and window layouts/placements. Some walk-aways I tell myself are "only for a minute" turn out to be quite a while. No, not just bathroom visits :LOL: .

----------------------------------------------------------


Very cool for people who want smaller options. With my 2nd desk island setup I think 48" ended up just about perfect for me, however. In fact, for letterboxed content any smaller would be bad at my normal minimum available viewing distance in this configuration. That is, unless I were to adjust my mobile desk taller and overlap the long rectangular desk my monitors are on by more, and then I'd be sitting too close to view my side monitors comfortably.
 
This is just the panel though, so we don't know yet when or if it will be available in a consumer model. But there is hope now. I think 42 would be a great improvement for a PC monitor - the PPI would be vastly improved over the 48.
 
I get what you want one for, and while the ppi is increased technically, the perceived density in use is relative to your viewing distance. So it's more about how much room you have, and whether you want it on the same desk sitting very close rather than on a TV mount stand (flat footed or caster-wheeled), wall-mounted, or on a 2nd desk, bench, mini-hutch, etc. The optimal THX wide viewing angle is 45deg - 50deg, so if that remains the same, the perceived pixel density of a 4k 16:9 screen's contents will look the same no matter what size screen you are using.
 
Yeah you're right. I think for most people though 42" PPI will make sense given typical desk depths of 60-80cm.
 
For example, according to this calculator..

...a 48" 4k at 43" viewing distance (74ppd) <<------ 43" or greater (up to 48" ~> 85.1ppd if seat back relaxed slightly in chair) viewing distance is when I'm using the side monitors more

...a 48" 4k at 41" viewing distance (71.0ppd) <<--- ~41" is sometimes my gaming distance on it.

are higher ppd than

...a 32" 4k at 24" viewing distance (63.7ppd).

http://phrogz.net/tmp/ScreenDensity...sizeUnit:in,axis:diag,distance:41,distUnit:in

http://phrogz.net/tmp/ScreenDensity...sizeUnit:in,axis:diag,distance:24,distUnit:in

-----------------------------------------------------------------------------------

A 42" 4k would have the same ppd as my my 48" screen if:
... viewing the 42" screen at 36" - from screen to eyeballs compared to 41" view distance on the 48" screen
....viewing the 42" screen at 37.5" or greater from screen to eyeballs compared to 43" or greater view distance on the 48" screen.

so it's ~ +/- 5 inch difference in viewing distances between 42" screen and 48" screens.
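
If anyone wants to sanity-check these numbers without the calculator, here's a rough sketch of the math I believe it's doing - simple flat-screen trig, not phrogz's actual code:

```python
import math

# Rough pixels-per-degree for a flat 16:9 screen viewed head-on.
# My own approximation, not the phrogz calculator's exact method.
def ppd(diag_in, distance_in, horiz_px=3840, aspect=(16, 9)):
    w, h = aspect
    width_in = diag_in * w / math.hypot(w, h)  # screen width from the diagonal
    view_deg = 2 * math.degrees(math.atan((width_in / 2) / distance_in))  # horizontal viewing angle
    return horiz_px / view_deg

print(round(ppd(48, 41), 1))  # ~71 ppd - 48" at my ~41" gaming distance
print(round(ppd(48, 43), 1))  # ~74 ppd - 48" at 43"
print(round(ppd(32, 24), 1))  # ~64 ppd - 32" at 24" for comparison
print(round(ppd(42, 36), 1))  # ~71 ppd - the 42" pulled ~5" closer matches the 48"
```

Those come out within rounding of the calculator's figures, so the ~5" rule of thumb holds.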

(calculator screenshots; the last one shows a 32" 4k screen for comparison at the same ppd)

---------------------------------------------------------------------------------------------------------

Interestingly, in order to be in the THX standard optimal "sweet spot" of ~45 to 50 degree viewing angle (for films), at 50deg your viewing distances for a 48" and a 42" screen would be as shown below:

2.39:1 is a little wider, like an ultra-wide version of the same screen height, but the degree calculation is in relation to the screen width, so the recommended degrees should still be a good reference.


http://www.acousticfrontiers.com/2013314viewing-angles/


48" 4k screen = 44.4" minimum in order to have 50deg viewing angle

42" 4k screen = 38.9" minimum in order to have 50deg viewing angle


31.5" screen's minimum viewing distance for 50 deg THX = 29.2"
 
I managed to snag a 3080 FE this weekend. I have it hooked to my CX48 (4k 120Hz 10bpc RGB Full) and an Acer Predator XB271HU (1440p 144Hz). I set Gsync to be on for fullscreen and windowed, limit fps to 117 in NVCP, and Vsync to be based on ingame. When I tested a game (Genshin Impact) that's ingame capped at 60fps, I get massive stutters. When I turn Gsync off, I get smooth gameplay like my previous GTX 1080. I also tested having Gsync on BUT having the Acer monitor set at 60Hz (or unplugged) and I get no (or almost no) stutter. Has anyone experienced something like this? Could it be the GPU is not capable of providing Gsync at 120-144Hz for BOTH displays? Or are my settings wrong somewhere, causing the stutters? Or an Nvidia driver/LG firmware problem?
 
I like this trend. Maybe one day we'll finally see the end of rubbish LCD tech. Unlike the death of CRTs, I won't be sad at all. 1000 Hz 32" OLED gaming panels anyone? :)
 
I set Gsync to be on for fullscreen and windowed, limit fps to 117 in NVCP, and Vsync to be based on ingame. When I tested a game (Genshin Impact) that's ingame capped at 60fps, I get massive stutters. [...]
I would try enabling vsync in the driver and disabling it in-game; sometimes that helps.
 
I set Gsync to be on for fullscreen and windowed, limit fps to 117 in NVCP, and Vsync to be based on ingame. When I tested a game (Genshin Impact) that's ingame capped at 60fps, I get massive stutters. [...]

Did you try disabling the FPS limiter? That can cause stutter in some games in my experience (Hades being a notable example).
 
Did you try disabling the FPS limiter? That can cause stutter in some games in my experience (Hades being a notable example).
Also this, and disable all the overlays, even the Nvidia overlay. Some game engines don't like this stuff.
 
I would try enabling vsync in the driver and disabling it in-game; sometimes that helps.

I agree. Make sure you aren't duplicating anything, like running a frame limiter in the game plus RTSS, and check that dynamic resolution isn't enabled and tied to the frame rate somehow - I'd turn that off.

Did you try disabling the FPS limiter? That can cause stutter in some games in my experience (Hades being a notable example).

You can also try a different limit, like -5 or so. Might be cutting it close for that game.

I also tested having Gsync on BUT having the Acer monitor set at 60Hz (or unplugged) and I get no (or almost no) stutter. Has anyone experienced something like this? Could it be the GPU is not capable of providing Gsync at 120-144Hz for BOTH displays? Or are my settings wrong somewhere, causing the stutters? Or an Nvidia driver/LG firmware problem?

You can try turning hardware scaling on in the nvidia control panel for both displays for the hell of it. That used to work when 60hz and 120hz screens stuttered and didn't play nice together a long time ago.
 
4 years is pretty good though. What OLED light setting did you use before and what is it at now? And for how long was it running at the original setting?
It was set to 100; that resulted in the Windows taskbar burning in after about a year. I've since reduced it to 30. You actually do adjust to it after a day or two.

LG substantially improved burn-in issues with the 7 series and later by changing the subpixel structure to reduce the effect of reds on burn-in. I believe rtings showed this improved burn-in for bright static logos from 600 hours on the 6 series to 5000 hours on the 7 series. So you should get way better burn-in protection if you upgrade.
Which is what I've heard previously, and why I always note my experience is on a B6 and not a later model. I'll keep this in mind when I upgrade [as soon as I can get a GPU. I'm told they allegedly exist.]
 
b1gLoRD, hhkb, elvn
Thanks. It wasn't any of those. I found out the problem. It was HWInfo being displayed on the secondary Acer monitor. Whenever it's on the screen, it causes stutters (I guess from constantly updating the display with new numbers.) I now have it still running but minimized and have no more stutters. And I can run CX48 at 4k 120 and Acer at 1440p 144Hz. :)
 
I can't believe that we might see a 42" C1 OLED. RIP CX48... your days are numbered.

I think my CX55 also just shed a tear.
 
Who knows if the 32" monitor is even 120hz, since it's targeting photo/video editing.

Would be really stupid if this monitor is still using DP 1.2 and HDMI 2.0 in 2021. Although I guess it's possible it could use DP 1.4 and HDMI 2.1 yet somehow still be limited to 60Hz.
 
3.21.16 is available via engineering mode. Supposedly it fixes game mode peak brightness and doesn't revert any of the other recent fixes.

Just tested it for myself. It brought a completely reset HDR Game mode calibration, measured in Calman, from 625 nits to 750 nits.

I did a full Autocal calibration and it came out a lot better. Whatever was dimming game mode was throwing off the entire calibration in the process and it's noticeably better now.
 
Is it a good idea to run ULMB for games that are ingame capped at 60fps? I mean, if you already run the game consistently at 60fps, then Gsync isn't really used, so wouldn't ULMB be better for this case? (Or is there a downside to using ULMB at 60fps?)
 
Is it a good idea to run ULMB for games that are ingame capped at 60fps? I mean, if you already run the game consistently at 60fps, then Gsync isn't really used, so wouldn't ULMB be better for this case? (Or is there a downside to using ULMB at 60fps?)
When you say ULMB do you mean BFI? This TV doesn't have ULMB in the Nvidia sense. BFI is kinda rough at 60hz, I (and I think most people) can see the flicker. If you can play the game locked at 60FPS but at 120Hz with it just doubling every frame, that might look ok.
 
60hz requires the "high" BFI setting to look decent, which means brightness (even with OLED light 100) will only be adequate for a really dark room. And that is assuming the flickering doesn't bother you to begin with. Depending on the content and on your eyes, it could end up being rather unpleasant.
 
Is it a good idea to run ULMB for games that are ingame capped at 60fps? I mean, if you already run the game consistently at 60fps, then Gsync isn't really used, so wouldn't ULMB be better for this case? (Or is there a downside to using ULMB at 60fps?)

The downside to BFI (like ULMB on other screens) is an extra frame of input lag on the CX (rtings measured a ~7ms increase at 120Hz) and significantly reduced brightness. However, the motion clarity of BFI is pretty phenomenal in my opinion. Playing fighting games, for example, it feels very CRT-like.

But to use BFI at 60fps without tearing you will need to enable vsync most likely, which will add further delay.

Gsync is still worth it at 60 fps because you can get a tear-free experience with no additional input delay. So use gsync if you want minimal latency.

Also keep in mind, if you do use vsync (like any console would), the CX itself has top tier input lag at 60Hz. Most high refresh LCDs, due to the way scanout conversion works, have awful input lag at 60Hz - usually around 40-50ms. The CX is around 13ms. You will only do better with native 60Hz LCD screens or a CRT.
 
Even if you can't consciously register the flicker, it can still cause eye strain over time, even at 120fps @ 120Hz. It's also incompatible with HDR since it drops the peak color brightness so much. For example, I'm playing Jedi: Fallen Order in HDR (at 60hz locked at 57fps since I don't have a 3000 series gpu yet) and it looks gorgeous in HDR mode. It wouldn't look anything like that with BFI. Of course, the bulk of titles aren't HDR, or quality HDR, at this point either.

https://www.tftcentral.co.uk/reviews/lg_cx_oled.htm#mbr
At 60Hz the BFI behaves a bit differently. In the low and medium modes it is unfortunately not in sync with the refresh rate and you get two "strobes" per refresh rate cycle. It inserts a black frame every 8.33ms which has the negative impact of causing a doubled ghost image to moving content and doesn't look very good. You can see this clearly from the pursuit camera photos above. If you switch up to the high mode then the "strobing" is brought in sync with the refresh rate at 60Hz and you get a much clearer picture with great motion clarity again. It makes a huge difference relative to normal mode (BFI turned off) at 60Hz making the moving image much sharper, clearer and easier to track. The issue though here is that the BFI is then every 16.67ms which introduces a noticeable flicker to the screen (at 60Hz frequency). For motion and gaming this is harder to spot, but could still cause you issues with eye strain and discomfort. It's certainly very noticeable for static images when you first turn it on. Some people may like it still and at least at 60Hz it's a usable option that really helps with motion clarity. We would not recommend using the low or medium modes at 60Hz though.

..
You may want to consider lowering the game resolution to accommodate higher frame rates here we think if you don't have the system power to run 4K @ 120Hz reliably.

I've heard "high" has flickering and more overt eye strain, but their review claims it's fine at 120fps:
Turning BFI on brings about some decent improvements in motion clarity as well which is great. At 60Hz the only mode usable really is 'high' where the strobing is in sync with the refresh rate at 60Hz. This massively improves the motion clarity relative to normal 60Hz mode, but does introduce a very noticeable flickering. At 120Hz all the low/medium/high modes are usable and improve the image very nicely, just with differing max brightness levels. Go for the highest level you can while still having the screen at the right brightness for your preference.
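
The 8.33ms vs 16.67ms intervals in that excerpt are just the refresh period divided by the number of strobes per refresh; a quick bit of arithmetic (mine, not TFTCentral's) to make the low/medium vs high behavior concrete:

```python
# Interval between black-frame insertions = refresh period / strobes per refresh.
# Per the TFTCentral description above, low/medium at 60Hz strobe twice per
# refresh (out of sync with the frames), while high strobes once per refresh.
def strobe_interval_ms(refresh_hz, strobes_per_refresh):
    return 1000 / refresh_hz / strobes_per_refresh

print(round(strobe_interval_ms(60, 2), 2))   # 8.33 ms - low/medium at 60Hz, causes the doubled ghost image
print(round(strobe_interval_ms(60, 1), 2))   # 16.67 ms - high at 60Hz, in sync but with visible 60Hz flicker
print(round(strobe_interval_ms(120, 1), 2))  # 8.33 ms - one strobe per refresh at 120Hz
```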
 
V-sync on this is just as laggy as on LCDs; there is no difference other than pixel response times, obviously (which can make a significant difference on some panels, but still). You can do lagless v-sync if you have spare horsepower and the game isn't too demanding, or just cap 0.01 below the refresh rate for low-lag v-sync (about 1-2 frames, not bad). Fast sync or equivalent (such as Windows DWM when running in windowed or borderless) is also a decent solution, although it's not nearly as smooth if you don't have insanely high framerates.

jorimt (same guy behind the blurbusters g-sync guide) received hardware from nvidia and tested BFI as consistently having ~1 frame of lag at any refresh rate (so about 17ms at 60hz or 9ms at 120hz etc). Again, not bad at all and totally usable if you want/need that motion clarity instead of VRR.
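
To put both of those in frame-time terms (my own quick arithmetic, not jorimt's data):

```python
# A "frame" of lag is one refresh period, which is why the same ~1-frame BFI
# penalty reads as ~17ms at 60Hz but only ~8-9ms at 120Hz.
for hz in (60, 120):
    print(f"{hz}Hz: one frame = {1000 / hz:.1f} ms")

# Capping a little below the refresh rate (e.g. the 117fps cap on a 120Hz panel
# mentioned earlier in the thread) keeps each frame time slightly longer than
# the refresh period, so v-sync never builds up a queue of pre-rendered frames.
cap_fps, refresh_hz = 117, 120
print(f"frame time at cap: {1000 / cap_fps:.2f} ms vs refresh period: {1000 / refresh_hz:.2f} ms")
```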
 
jorimt (same guy behind the blurbusters g-sync guide) received hardware from nvidia and tested BFI as consistently having ~1 frame of lag at any refresh rate (so about 17ms at 60hz or 9ms at 120hz etc). Again, not bad at all and totally usable if you want/need that motion clarity instead of VRR.

Ah yeah, whoops. I misread the rtings chart. It clearly shows an additional frame of lag with BFI on (it's very confusing that they don't label the 120Hz explicitly...)

 
When you say ULMB do you mean BFI? This TV doesn't have ULMB in the Nvidia sense. BFI is kinda rough at 60hz, I (and I think most people) can see the flicker. If you can play the game locked at 60FPS but at 120Hz with it just doubling every frame, that might look ok.
I meant the ULMB option in NVCP (the option includes Gsync, ULMB, or Fixed). But now that everyone mentions how it introduces more input lag, flicker, and lower brightness, I guess I'll stick with Gsync or Fixed for consistent 60fps capped games.

One more question:
Is the CX48's Freesync range 48-120Hz?
 
jorimt (same guy behind the blurbusters g-sync guide) received hardware from nvidia and tested BFI as consistently having ~1 frame of lag at any refresh rate (so about 17ms at 60hz or 9ms at 120hz etc). Again, not bad at all and totally usable if you want/need that motion clarity instead of VRR.
In practical experience at least, I cannot notice any significant difference in lag with BFI on vs off. Personally, with my previous KS8000 TV I could notice the lag difference between its SDR and HDR modes (21ms vs 35ms), but that was using consoles where controller lag is also a factor.

On the LG CX with a 2080 Ti on PC, using Doom Eternal as the test game since it can actually run at 4K at 100-120 fps with V-Sync off, I did not notice BFI adding any noticeable input lag, at least on the medium/auto settings that IMO work best for brightness. I did notice a slight increase in input lag when using any mode other than Game mode, even with Instant Game Response enabled; you could feel a bit of lag when moving the mouse.

But overall all of these modes would be perfectly playable and only people extremely sensitive to input lag would notice.
 
I meant the ULMB option in NVCP (the option includes Gsync, ULMB, or Fixed). But now that everyone mentions how it introduces more input lag, flicker, and lower brightness, I guess I'll stick with Gsync or Fixed for consistent 60fps capped games.

One more question:
Is the CX48's Freesync range 48-120Hz?
ULMB does not work if your monitor does not support it and the LG CX obviously does not. ULMB is a feature of the G-Sync modules.

CX Freesync range is 40-120 Hz.
 
I can't believe that we might see a 42" C1 OLED. RIP CX48... your days are numbered.

I think my CX55 also just shed a tear.

Based on HDTVTest's video I do not think we will see a 42" C1; it will most likely start with the C2. For me personally, I'm not going to bother upgrading from my CX until the "OLED EVO" technology trickles down to the 42" panel, even if I have to buy a G series OLED to get it. The higher PPI of the 42" is nice, sure, but that's really all it offers and I want more of an upgrade than just a few PPI.
 
Guys, what OLED light setting do you use for Windows use, like browsing etc.?
I have it at 30 and it's pretty dark. I feel like 80 is like a normal LCD monitor. And this is in a totally dark room at night.
 