LG 48CX

Nope, not according to the testing Chris (the guy who makes the video above) has done. DX12 renders everything to fullscreen borderless. Edit: I suspect it's because there's a lot of pressure to support overlays. There's a whole community of input lag fanatics that follow him, and they spend hundreds of hours trying to get DPC latency, input lag, Windows settings and BIOS settings perfect for zero lag. These guys refuse to use on-board audio because it can cause DPC latency spikes. They even refuse to use more recent versions of Windows 10 because of DPC latency spikes. It's too much for me; it makes gaming feel like a job, and I'm an IT professional!

Here's Chris spending 30 minutes making Windows changes just for COD MW. At 2:50 he talks about FSE:

Yes, it is extremely maddening that Microsoft chose to do this. I don't understand why borderless fullscreen can't just be an option instead of mandatory. I never run any third-party apps or anything else that I need to alt+tab to while I'm playing games, and I don't use the Xbox app integration, so I don't need the tradeoffs of running a game in a window.
 
Well, June will be an interesting month then if that's when it actually releases. Looking forward to what the professionals and the users have to say about the LG 48CX.
 
The B/C/E7 and 8 series (presumably 9 as well?) have had multiple incremental improvements to cut IR and increase the life of the panel. It was one of the reasons 3D was cut (to increase brightness) IMHO, in addition to changes to the size of the least long-lasting subpixel to increase total lifespan. Even with occasional short-term retention that can be flattened out, the panels should last a long time in normal use outside of torch mode and torture tests.
 

Yep.

I briefly contemplated waiting for the late year price reductions (can you imagine getting one for under $1000?) but at $1,500 I think I'd rather secure one earlier and enjoy it for an extra several months. Now if we just knew the price of the 3080 Ti...I hope it doesn't cost as much as this thing.
 

I'm going with the 3080 Ti costing more, lol.
 

With how my 1080 Ti is aging in modern games I'll probably switch back to AMD. Before the 1080 Ti I had an R9 290, that thing got faster and faster every year! I hope big Navi is a beast.

From Techpowerup performance analysis of COD MW:
"We can clearly see that NVIDIA's Pascal generation of GPUs is at a large disadvantage here, while Turing does much better. It seems that NVIDIA focused their optimization efforts on improving performance for Turing, possibly through the use of Turing's concurrent FP+Int execution capabilities.
On the AMD side, the improvements look fairly constant across generations, showing that AMD took a more general approach. Only Radeon VII falls behind, no idea why."

Freesync will work on the CX with a firmware update later in the year according to the AVS owners thread.

Speaking of the owners thread, here are passballtotucker's comments:

Two things I noticed with my CX:

1. It seems like the LG CX can do BFI and G-SYNC at the same time. I got my 55" CX last Saturday. I have G-Sync on with OLED Motion Pro on high. I started up a game and it seems like they both were active. I'm curious if any other CX owners noticed this.

2. When in 4k 120hz, the bottom and right side of the screen are cut off. Turning off screen shift fixes this, but turning the TV off then back on cuts it off again. Screen shift remains off but it can be fixed by turning it on then off again. I'm also wondering if any other cx owners experienced this as well.

Normally BFI isn't selectable with G-Sync on. I turned it off, then turned BFI on, then turned G-Sync on, and they were both working.

When restarting my computer I'd have to redo it. Sometimes it takes a couple tries to work. Maybe they'll make it a real option in a firmware update.
 

I understand the low latency and better performance push (within reason), but people still dominate competitive games with fullscreen DX12 as it is. I'm not really a smash-up derby arena FPS gamer and prefer RPGs and adventure games (some difficult though, like Souls and Souls-likes), some challenging co-op action games (Vermintide 2 on Legend difficulty), etc., but if I started feeling it aesthetically and mechanically in my gameplay's look and handling it would bother me - including if it lowered the fps noticeably compared to DX11 exclusive fullscreen mode. I'm all for more options, so they really should allow DX12 true fullscreen mode as an option, but I wouldn't count on it. I already run a few older games in DX11 exclusive fullscreen (and turn TAA off in-game) for SLI compatibility because the frame rate gain is more than worth it to me, so DX12 hasn't been all pros for me ever since it came out. Once I get a 3080 Ti I'll probably just go with DX12 across the board with its "fullscreen" methods and move on.


That's interesting, but are you sure it's actually active, and if it is, that it's not interpolation + BFI both being turned on? Do you have game mode (low lag) activated too?

Black frame insertion is more or less like a strobe light, so if you are running a roller coaster of frame rates and it dips into the lower end it would probably be very eye-fatiguing.
For example:
a 75 fps average graph might be like 45 fps at the low end
a 90 to 100 fps average graph might be around 60 to 70 fps at the low end

Any BFI/strobing under a SOLID 100 fps is really bad imo to start with. Even 100 can be eye-fatiguing after a while. It's basically PWM. Most people avoid PWM outright in monitors for good reason. So BFI would benefit a lot from really high fps, or some future advanced low-input-lag, artifact-free interpolation to increase the frame rate 2x, 3x, or more - so that the lowest frame rate is still 120 Hz worth of strobes (more would be better, but that's the cap on the 4k TVs for now).
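
To put rough numbers on that, here is a back-of-the-envelope sketch of the Blur Busters-style persistence rule of thumb (roughly 1 ms of persistence smears about 1 pixel of blur per 1000 px/s of motion); the function names are just illustrative, nothing here is measured on the CX:

```python
# Rough persistence math for sample-and-hold vs BFI (illustrative only).
# Rule of thumb: ~1 ms of persistence smears ~1 pixel of blur per 1000 px/s of motion.

def persistence_ms(fps_hz: float, lit_fraction: float = 1.0) -> float:
    """How long each frame stays visible, in ms; lit_fraction < 1.0 models BFI."""
    return (1000.0 / fps_hz) * lit_fraction

def blur_px(persistence: float, motion_px_per_s: float = 1000.0) -> float:
    """Approximate blur trail length in pixels for a given panning speed."""
    return persistence * motion_px_per_s / 1000.0

for label, fps, lit in [
    ("60 Hz sample-and-hold",     60, 1.0),
    ("120 Hz sample-and-hold",   120, 1.0),
    ("120 Hz + BFI (25% lit)",   120, 0.25),  # aggressive strobe, big brightness hit
    ("1000 Hz sample-and-hold", 1000, 1.0),   # ~CRT-like 1 ms persistence, no strobe needed
]:
    p = persistence_ms(fps, lit)
    print(f"{label:26} persistence ~{p:5.2f} ms, blur ~{blur_px(p):5.2f} px at 1000 px/s")
```

Which is basically why raw 120 fps at 120 Hz already halves the blur of 60 Hz without any strobing, and why high enough refresh rates eventually make strobing unnecessary.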

BFI/strobing also lowers the effective brightness, which results in a duller image, losing color vibrancy range even in SDR - which means it's not really compatible with HDR material at all (especially on OLEDs, which already have lower % color brightness ceilings). So you get a duller image incapable of showing the most detail in what are supposed to be gradations of bright colors in objects/materials at the per-pixel level that OLED is capable of.

I'm sure there are a few easy-to-render SDR games that get massive frame rates whose lows are above 100 or 120 fps which could benefit from BFI, but in general it's not for me. Games with that high of a frame rate probably don't need VRR (variable refresh rate) anyway.

---------------------------------------------------
According to HDTVTest on the LG C9:
"The performance is similar to last year's. The BFI resulted in too much flicker to be used comfortably, especially in brighter scenes."

Rtings:
"For 60hz content this displays each frame for half the duration, with black frames between. It works well to clear up persistence blur and is useful for gamers. Those sensitive to flicker may find it distracting and it reduces the screen brightness significantly though so it isn't for everyone."

"At the time of testing, this isn't possible with one-hundred and twenty hz signals as it causes every second frame to be lost. " - not sure if that last part is the case now though. That no 120hz BFI rate comment was a yr ago. I haven't confirmed that... even if it was possible it would still be limited by hdmi 2.0 gpus' 120hz at 1440p and 60hz at 4k for now.
 
I have no clue how it works but there are definitely dx12 titles with a fullscreen exclusive mode that actually works contrary to what you can read on the Internet (it may require disabling fullscreen optimizations but I always do that anyway). It is easily noticeable if you try to alt tab out of the 2019 CoD MW or the latest Tomb Raider, for example (you'll see the flicker and everything). Also lets you use a different refresh rate from the desktop one (which makes alt tabbing clunky but is evidence that it actually works).

IIRC it's just an option on the developers' side like multi GPU so many of them may not bother implementing it, but it still exists.
 

The comments above about BFI + G-Sync aren't mine; feel free to follow up with the guy who posted them in the AVSforum thread.

Your points against BFI regarding brightness are unfair and a tad dated; any monitor made in the last two years has ample brightness to spare with BFI enabled. These 400 nit light cannons are searing my retinas! I have BFI on 24/7 on my Samsung and it's still way too bright, so I have to lower the brightness. Regarding HDR and BFI, I agree with you when talking about non-FPS games. Shadow of War (PC) at 4k HDR on my C7 OLED is one of the best gaming experiences I've ever had, and it would have been even better with G-Sync/FreeSync, no doubt. I had a lot of tearing in that game because the frames weren't there and I wanted the graphical fidelity. Vsync helps, but I hate input lag more than anyone I've ever met.

I'm the polar opposite gamer to you: I'll trade graphical fidelity for more frames most of the time. I play FPS games on all low settings to keep frames above my refresh rate. Then I cap the framerate at the refresh rate to get lower input lag (the intent is 90 to 95% GPU utilization). The Battle(non)sense video below explains this process in detail and why it's so important.

G-Sync is largely useless to me in FPS games, but BFI is absolutely critical for motion resolution. BFI is the best way to deal with sample-and-hold motion blur. COD Warzone is so incredible I'll probably put 1000+ hours into that game over the next year or two. I cannot play FPS games without BFI; it's just not the same. To get above 120 fps I'll be rendering that game at 1440p 120 Hz anyway unless the 3080 Ti / Big Navi blow everyone away. Generational GPU performance increases have really slowed down in the last few years and it's killing me!



Edit: The C9 cannot do BFI above 60Hz; that's why there is the flicker. At 120Hz that shouldn't be an issue with BFI on the CX.
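
As a rough illustration of the capping approach described above (just a minimal sketch, not the method from the video - the target value and the sleep-based timing are assumptions):

```python
import time

# Minimal frame-cap sketch: hold the render loop to a target fps so the GPU
# isn't running flat out - the idea behind capping at (or a bit below) the refresh rate.
TARGET_FPS = 120                 # cap at the refresh rate; some people go a few fps under
FRAME_BUDGET = 1.0 / TARGET_FPS  # seconds allowed per frame

def render_frame():
    time.sleep(0.004)            # placeholder for the actual game/render work

for _ in range(600):             # ~5 seconds at 120 fps
    start = time.perf_counter()
    render_frame()
    # Sleep off whatever is left of this frame's time slice.
    # (Real limiters often busy-wait the last fraction of a millisecond for accuracy.)
    leftover = FRAME_BUDGET - (time.perf_counter() - start)
    if leftover > 0:
        time.sleep(leftover)
```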
 
D3D12 programs can run in full screen exclusive mode. All that bullshit happened because Activision Blizzard lied about it regarding World of Warcraft. They famously lied and said it wasn't possible to do just because they wanted to cut features and simplify the maintenance of their game.

I've even made test D3D12 programs that run in full screen exclusive mode. Don't believe that fud.
 

Yeah, Blizzard definitely lied about it. It was an amazingly good DX12 implementation for a non-native game though, I'll give them that. But a lot of people take their word as gospel, it seems.
 
I'm the polar opposite gamer to you, I'll trade graphical fidelity for more frames most of the time.

Not quite polar opposite really. I aim for a balance: a 100 fps average in order to span around 70 - 100 - 120, so that much of the frame rate graph is in the higher Hz capability of my high-Hz monitor. That's meant running things at very high to very high+ rather than ultra a lot of the time. Even better if I can get 120 fps, which can result in something like 90 - 120 - 150 (capped 5 below my monitor's max refresh rate though).

BFI/strobing being tied to the frame rate needs an even higher VRR range than that though imo - 100 fps-Hz as the common minimum of the graph for the strobe rate, and even that gets eye-fatiguing after a while. The strobing still dulls the screen luminance and color vibrancy heights, and would be even worse for HDR content, if it even works at all with it enabled. The quotes about BFI being dim were from HDTVTest and Rtings.com in regard to the LG C9 OLED's BFI being flickering / PWM and lower brightness. I can't imagine running BFI at 60Hz.



If we get 240Hz, 360Hz and on up to 1000Hz monitors in the future with some futuristic high-quality interpolation to multiply frames several times each, the strobe per frame time would be much smaller. Once we get up to 480Hz (and later 1000Hz) filled with frames, the strobing will become unnecessary to remove sample-and-hold blur in the first place. By that time VR and AR with virtual screens will probably be the go-to display format though, at the rate things are going (no pun intended).

 

I can't imagine BFI at 60Hz either, but I could be verrrry happy with 1440p 120Hz, 5 ms input lag and BFI on the high setting I think. Regarding higher refresh rates, you'd need a GPU capable of those frames, which as far as I know doesn't exist. Even 360 fps is pretty much unattainable for most people.
 
Has anyone compared the various motion smoothing settings, like how does "low" compare to "high", etc.? Is BFI enabled on low? In a quick demo I saw the low setting didn't affect brightness much at all, so I was considering it as an "always on" setting, possibly also when watching TV.

That, and whether BFI has any positive effect against burn-in since it refreshes pixels way more often, are probably my greatest personal interests regarding the new CX series.
 
I can't imagine BFI at 60Hz either, but I could be verrrry happy with 1440p 120Hz, 5 ms input lag and BFI on the high setting I think. Regarding higher refresh rates, you'd need a GPU capable of those frames, which as far as I know doesn't exist. Even 360 fps is pretty much unattainable for most people.

Yes, for now (unless it's a very easy-to-render game like L4D2), which is why more advanced interpolation in the future to multiply frames 2x, 3x and up is probably one of the ways forward to fill higher Hz. There is machine-learning upscaling of lower resolutions to essentially full-fidelity native resolution in development right now by Nvidia, AMD, and Oculus though, which supposedly can get performance gains of over 60% (though that figure could be marketing).

Blur Busters - The Amazing Journey To Future 1000Hz Displays with blur free sample and hold
1000fps at 1000Hz would sufficiently approximately match 1ms CRT phosphor persistence, for a flicker-free sample-and-hold display. Technologically, this is achievable through interpolation or other frame rate amplification technologies on an ultra-high refresh rate display.

Frame Rate Amplification Tech (FRAT) — More Frame Rate With Better Graphics - (Blurbusters.com Published Apr 4, 2019)

Has anyone compared the various motion smoothing settings, like how does "low" compare to "high" etc. Is BFI enabled on low etc. In a quick demo I saw the low setting didn't affect brightness much at all, was considering it as a "always on" setting possibly also when watching TV.

I know you are talking about interpolation as well, but according to HDTVTest they removed the low/medium/high options for BFI. So the last I heard, that feature in particular is just on/off (always full, presumably) now.
 

Burn-in is basically "burn out". It's caused by wearing out the OLEDs on one part of the screen more than the others, so they become dimmer than the rest of the screen.
If you're using BFI the OLEDs aren't on as long, so maybe it could help. But if you turn the brightness up to compensate they'll be driven just as hard, and it might be even worse. I don't know how long an individual OLED is supposed to last at each brightness level, but I don't think it's linear; it probably decreases even more the closer to peak brightness you get. It could be something like 100% brightness shortening the life 5x as much as 50% brightness does.
 
BFI and strobing are often quoted as lowering the brightness by around the same percentage as they reduce the blur. For example, 20% blur reduction --> 20% reduction in brightness, 75% blur reduction --> 75% reduction in brightness. That is a huge hit on screen brightness capability and will not work with HDR color brightness ranges. OLEDs are already limited in brightness and sustained brightness vs. % window.

The BFI setting is either on or off according to the HDTVTest update on the CX TVs, so how aggressively it strobes and dims/dulls screen luminance and vibrancy can't be adjusted, other than perhaps by frame rate, if that.

I don't think you can do even moderately high brightness in SDR unless you disable all the protections, and HDR is based on a static set of values, so you shouldn't be altering the white point or gamma in HDR mode. HDR is absolute values, not a narrow band you tweak up and down like SDR. Also, both modes have ABL.

The C9 can do a fullscreen sustained brightness of 151 nits; SDR real-scene peak brightness is 335 nits. Most SDR LCD screens go up to 350 nits, so that is not a lot by comparison, and if 335 nits is reduced by as much as 75% it would be down to as low as 84 nits - and that's not even figuring in post-ABL considerations.


Rtings C9 OLED review (link)
The C9 OLED has good peak brightness with SDR content. Small highlights are brighter than on the C8 or B8, but this results in a more aggressive Automatic Brightness Limiter (ABL), which dims the screen significantly when larger areas of the screen get bright.

The C9 has a new Peak Brightness setting, which adjusts how the ABL performs. Setting this to 'Off' results in most scenes being displayed at around 303 cd/m², unless the entire screen is bright, in which case the luminosity drops to around 139 cd/m². Increasing this setting to 'Low', 'Med', or 'High' increases the peak brightness of small highlights. If ABL bothers you, setting the contrast to '80' and setting Peak Brightness to 'Off' essentially disables ABL, but the peak brightness is quite a bit lower (246-258 cd/m² in all scenes).



Since I don't know how "high" the on/off BFI setting is when on, I can only guess what % it might dim/dull the screen. If you assume one black frame per frame it could be 60 - 75%. ABL in SDR drops the screen to 140 nits, or around 246 - 258 nits if you avoid ABL. 75% off of 250 nits (ABL-avoiding configuration) is only 62.5 nits; 75% off of 140 nits after ABL kicks in is 35 nits. Maybe the "only on or off" BFI on the LG CX isn't that aggressive, but that is still cutting some already small values down quite a lot.

That is of course if you are keeping the brightness limits and ABL active to avoid burn in.
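
Putting that guesswork into a quick calculation (the baselines are the C9 numbers quoted above and the 60-75% reduction range is assumed, not measured on a CX):

```python
# Back-of-the-envelope: effective brightness with BFI, using the guessed numbers above.
# Baselines are C9-style values; the reduction range is assumed, not measured.
baselines_nits = {
    "ABL-avoided (contrast 80, peak brightness off)": 250,
    "after ABL kicks in": 140,
}
assumed_bfi_reduction = (0.60, 0.75)  # guessed range for an on/off-only BFI

for scenario, nits in baselines_nits.items():
    mild, aggressive = (nits * (1 - r) for r in assumed_bfi_reduction)
    print(f"{scenario}: {nits} nits -> roughly {aggressive:.0f}-{mild:.0f} nits with BFI")
```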
 
Yep.

I briefly contemplated waiting for the late year price reductions (can you imagine getting one for under $1000?) but at $1,500 I think I'd rather secure one earlier and enjoy it for an extra several months. Now if we just knew the price of the 3080 Ti...I hope it doesn't cost as much as this thing.
The 2080 Ti debuted at $1,200 MSRP. Nvidia says the next gen won't be as pricey, so my prediction is that it will be $1,000 USD. That way they can claim the price reduction like they said but still keep it at the most expensive end, and of course people will pay it. So that's my prediction: $1,000 USD for the 3080 Ti, assuming there is no price gouging at release. I would love to be wrong, however, and see it come in cheaper.
 

151 nits fullscreen sustained is totally fine with me; I could go lower. The only time I really desperately need BFI is in FPS games: I can't use HDR anyway because of the increased input lag, can't use high graphics settings (they increase input lag by nature of lower frames), etc. I'm playing COD Warzone with a mix of high and low/off settings. I'd lower every setting, but I need the clarity for distance visibility.

Where did you see HDTVTest state that BFI is either on or off? I can't find it.
 
My parents, who watch sports and TV shows 90% of the time, won't care much if it only goes to 150 nits with "lovely" 720p-1080p SDR content, but they want the best technology even if that content might not be the primary "intended" use. They've been wanting to replace the aging 42" Panasonic plasma with a 46-49" for ages, and finally the perfect upgrade arrives :p

Considering the content is of that quality and the sitting distance is rather close, I'm glad it's just 48". On top of that, I understand OLED will handle 720-1080p content very nicely compared to LCD, similar to plasmas, which also have a more "organic" / soft / natural look compared to the more pixelated look LCDs give lower-quality content, especially 720p.
 
Where did you see HDTVTest state that BFI is either on or off? I can't find it.
I'll have to see if I can find it again... maybe it made it in at the last minute? I saw one earlier sneak-peek vid where he said it had low/medium/high BFI settings, then a later vid where he said unfortunately they changed it back to either on or off due to problems with the implementation (probably artifacts or stutter or something). There should be website reviews out soon though, so I'll stay tuned to find out.

---------------------------------------

RPGWiz, they are brighter than that unless you use BFI. Ordinary people watching TV usually use interpolation, sometimes along with some BFI. It depends on whether there are artifacts or whether it makes things look weird / "spooky" to some people, and whether it dims the screen too much for their viewing preference and viewing environment. I'd assume that eventually even live sports will start to use hybrid log-gamma HDR, where the peak color brightness across the HDR 3D color volume would matter a lot.

The numbers I was guessing at, if the TV used aggressive PWM~BFI, were:

..at around a 250 nit average screen brightness as a basis in order to avoid ABL kicking in -->
.. 62.5 nit average screen brightness potentially to your eyes in SDR after ~75% blur reduction via BFI strobing/blanking and dimming

..335 nit average normal SDR screen brightness, if reduced by as much as 75% with 75% blur reduction's dimming, could be down to as low as -->
.. 84 nit in effect to your eyes
..335 nit after ABL kicks in is 140 nit - before BFI is factored in. After BFI at 75% blur reduction is factored in, that could be as low as
..35 nit in effect to your eyes.

Those numbers are just guesswork but might give an idea of how much color vibrancy / overall screen brightness might be lost at a very aggressive BFI amount.
---------------------------------------------

So I'm not sure if you can adjust low - medium - high BFI rather than just On/Off in the final shipped version of the 48 CX. If you can in fact moderate how aggressive the PWM / Black frame insertion is - then you could reduce how much the screen is dimmed but it would also reduce the amount of blur reduction by a similar amount. If you reduce the BFI you are still suffering a proportionate amount of screen dimming and still viewing PWM which can cause eye fatigue.

More options are always a good thing though. I'm all for more features as long as they don't infringe on each other's performance and can be turned off.
 
At least on the LG C9, the "OLED Motion" setting which turns on BFI is a single toggle switch in the all-too-deep submenu. I've only tried it on Netflix content, and with HDR I felt there wasn't much of a difference in brightness, but that's with 24 Hz content. On the CX it might depend on refresh rate.

My understanding is that the BFI on these screens is more like a rolling scan rather than full frames and that helps retain more brightness. There was a video of it somewhere but I can't find it now.
 
Dnice in AVS forum confirmed there are 3 settings for BFI: Low, medium and high. He said the high setting lowered brightness dramatically but that was almost definitely with a 60hz source. With a 120hz source like a PC the brightness reduction should be less drastic. Dnice also provided his calibration results.
 
Reviews have begun! https://www.flatpanelshd.com/review.php?subaction=showfull&id=1585020920&utm_source=MadMimi&utm_medium=email&utm_content=Review:+LG+CX+OLED+-+Disney++launches+in+UK,+Germany,+Spain,+more+-+In-cinema+movies+now+available+at+home+for+$20&utm_campaign=20200324_m157528335_FlatpanelsHD+-+March+24,+2020&utm_term=_0D_0AReview_3A+LG+CX+OLED

The input lag increase with BFI scares me. They didn't test with 120hz input though so it will probably cut that number in half. I don't trust their input lag results because we've seen much lower numbers demonstrated on stage. The wait for the Rtings review is on.

"Another new development this year is 'OLED Motion Pro', which is a more effective version of the BFI (Black Frame Insertion) system found in previous OLED TVs. BFI inserts black frames into the video stream to 'reset' the human eye in order to make motion appear less blurry. You can achieve similar results (plus more) by increasing the frame rate of the content (i.e. 4K120 or higher) but it is nice to have BFI as an option for lower frame rate content, too, in order to replicate more "plasma-like" motion on an OLED panel. The improved 120Hz BFI system was actually intended for LG Display's 2019 OLED panel but was pulled before release. Now it has reappeared in the 2020 OLED panel and as such it will also be available in OLED TVs from competing brands, too. FlatpanelsHD saw the 2019 implementation in action and although we did not get a change to thoroughly examine it, it seems to us that the 2020 implementation sacrifices brightness to a higher degree. In LG CX there are five levels for OLED Motion Pro (Off, Low, Medium, High and Auto). With a special test pattern we measured 'Off' to 318 nits brightness, 'Low' to 273 nits, 'Medium' to 187 nits and 'High' to 77 nits. The exact brightness values are not important so focus on the relative change in brightness: 'Low' will reduce brightness by 15%, 'Medium' by 40% and 'High' by 75%. The 'High' setting produces visible flicker and is not recommended for any type of content (it should probably be removed). 'Medium' is more effective at increasing motion resolution than 'Low' but brightness obviously takes a more significant hit. Lastly, there is an 'Auto' option that varies between 'Low' and 'Medium' but avoids 'High'. The conclusion? Well, at its two lower settings the BFI system is definitely useful now, as opposed to BFI in previous years' OLED TVs, but improved motion resolution comes at the expense of a reduction in brightness that is a little higher than we had hoped. Also note that by engaging 'OLED Motion Pro' input lag increases slightly to 22 ms. Further improvement is welcome.

Even without HDMI 2.1, LG 2020 OLED TVs are excellent performers in the area. We measured input lag to 13 ms in Game Mode - 1080p SDR, 4K SDR and 4K HDR. At this time, we cannot measure input lag in VRR and 4K120 where it will be as low as 6 ms with, according to the company. "
 
Thanks for that post Pastuch!

So, since I keep my OLED light setting so low for daily use anyway, I wonder if I can just bump it up to compensate for the brightness hit with BFI enabled and achieve the same end result in terms of nits. It will be interesting to play with this feature and see what ends up looking and performing best. Thankfully, some of that work is being done for us as we speak. :)
 
Shame about the input lag increase. BFI would be useful for games with fast motion and those are usually also the ones that require the lowest input lag. That said, 22 is quite alright. It's about what my Samsung KS8000 had at best and I did not really notice it as a problem even in games that require fairly precise input timing like the Souls series for example.
 

Yeah, according to Rtings my B7 is 21.x ms, so I would be fine with that. Plus, there's always a chance that it can be improved via firmware updates. It'll probably get better as they continue to develop and tweak the feature. There was a firmware update for the 2016 sets that dramatically reduced input lag from what I remember, though I don't know if they've made similar updates since then or just tweaked it year by year as new sets are released.
 

If you read the rest of the article, they say the input lag should be about 7 ms with a 120Hz input. What I'm searching for is the following: Windows 10 + 2000-series Nvidia card + 1440p @ 120Hz or 4k @ 120Hz with BFI enabled on the High setting. What is the input lag in this configuration?
 

That's exactly what I was planning on doing. My C7 is at 40 OLED light; with BFI on ultra I suspect 100 will be required.
 
You can achieve similar results (plus more) by increasing the frame rate of the content (i.e. 4K120 or higher) but it is nice to have BFI as an option for lower frame rate content, too, in order to replicate more "plasma-like" motion on an OLED panel

Nice to have another tool in the toolbox, but as they said, it has diminishing returns vs. high-fps + high-Hz gameplay. The less BFI and dimming you engage, the narrower the gap between it and the blur-reduction benefits of raw fps + high Hz at full color brightness / highlight range and vibrancy. Of course it wouldn't work correctly with HDR content either, as that is based on a range of absolute values, not something you dial higher and lower like SDR brightness controls, and OLED is already more limited in what color volumes it can do at what % of screen, especially for HDR material.

The nits they quoted were the average screen brightness and did not take into consideration that if ABL kicks in, the screen drops to 140nits before the BFI cuts it by up to 25%, 50%, or 75%. That is, unless you otherwise keep your screen brightness at around 250 if it's anything like the C9:
The C9 has a new Peak Brightness setting, which adjusts how the ABL performs. Setting this to 'Off' results in most scenes being displayed at around 303 cd/m², unless the entire screen is bright, in which case the luminosity drops to around 139 cd/m². Increasing this setting to 'Low', 'Med', or 'High' increases the peak brightness of small highlights. If ABL bothers you, setting the contrast to '80' and setting Peak Brightness to 'Off' essentially disables ABL, but the peak brightness is quite a bit lower (246-258 cd/m² in all scenes).

-------------------------------------------------------------
FlatPanelsHD Review: LG CX OLED 24 Mar 2020 | Rasmus Larsen
With a special test pattern we measured 'Off' to 318 nits brightness, 'Low' to 273 nits, 'Medium' to 187 nits and 'High' to 77 nits. The exact brightness values are not important so focus on the relative change in brightness: 'Low' will reduce brightness by 15%, 'Medium' by 40% and 'High' by 75%. The 'High' setting produces visible flicker and is not recommended for any type of content (it should probably be removed). 'Medium' is more effective at increasing motion resolution than 'Low' but brightness obviously takes a more significant hit.
--------------------------------------------------------------

So assuming you rule out 75% outright, the 318 nits they are using as a basis with BFI off (and presumably with ABL enabled, not avoided) gets:
..cut 40% at the medium BFI setting, which mathematically results in 190.8 nits, so around what they measured at 187 nits.
..on low BFI, a 25% cut would mathematically be 238.5 nits, which is lower than the 273 nits they measured, so perhaps it's not as aggressive as a 25% brightness reduction (the review puts Low at about a 15% reduction), or perhaps interpolation is on too, which would make sense if the input lag is higher. I'm not sure the tradeoffs would be worth it running low BFI rather than no BFI with a raw fps hovering around a 120fps+120Hz graph, considering the PWM effect on your eyes and the input lag increase from BFI.

..Neither of those quotes shows the after-ABL values. ABL on regular SDR brightness cuts it down to around 140 nits on a C9 before BFI brightness reduction is considered, so if you used medium (40%) you could end up with a result of 84 nits* of scene color brightness/highlights/detail in areas of color post-ABL.

..If you run at around a 250 nit color brightness/color detail ceiling in SDR, with the contrast at 80 and peak brightness set to "Off" in order to avoid ABL kicking in, medium BFI reducing 250 by 40% probably gives a result of around 150 nits of color detail/brightness (which is roughly the same as what you get in the previous (default) ABL scenario after ABL kicks in, before any BFI brightness reduction).

--------------------------------------------------------------------
So would you want to run SDR on the screen:
..normally, at ~320 nits of color detail, but seeing 140 nit ABL kick in intermittently
..normally + medium BFI --> 190 nits of color detail (320 nits minus 40%) until ABL kicks in, then down to as low as *84 nits (the 40% strobe effect subtracted from 140 nit ABL). 40% blur reduction (compared to 60fps+60Hz, but not 120fps+120Hz), PWM, input lag.

..ABL-avoiding (80 contrast, peak brightness "off") ~> 250 nits with no ABL triggering
..ABL-avoiding (80 contrast, peak brightness "off") + BFI active (250 nits minus 40%), lowering it to ~150 nits average screen brightness (and detail limit in fields of color) throughout. 40% blur reduction compared to 60fps+60Hz, perhaps 20% vs. 120fps+120Hz, plus PWM and an input lag increase.

*That is because, as I understand it, BFI's dimming effect is on the way our eyes perceive the screen, meaning LG's ABL burn-in safety tech would not register the after-BFI effective brightness and would still operate normally as if BFI weren't a factor.
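
A compact version of the scenario math above, as a sketch (the relative reductions come from FlatpanelsHD's measured OLED Motion Pro values; the ABL and ABL-avoiding baselines are still the C9-derived guesses, and the assumption that ABL ignores BFI dimming is untested):

```python
# Derive the relative BFI reductions from FlatpanelsHD's measured values, then
# apply the 'Medium' reduction to the SDR setups discussed above.
# Baselines are C9-derived guesses, not CX measurements.
measured = {"Off": 318, "Low": 273, "Medium": 187, "High": 77}
reduction = {k: 1 - v / measured["Off"] for k, v in measured.items()}
print({k: f"{r:.0%}" for k, r in reduction.items()})  # ~ Off 0%, Low 14%, Medium 41%, High 76%

med = reduction["Medium"]
scenarios = {
    "default settings, before ABL": 318,
    "default settings, after ABL kicks in": 140,   # assumes ABL doesn't 'see' BFI dimming
    "ABL-avoiding (contrast 80, peak brightness off)": 250,
}
for name, nits in scenarios.items():
    print(f"{name}: {nits} nits -> ~{nits * (1 - med):.0f} nits with medium BFI")
```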
 
Why can’t they measure VRR now? Hdmi thing?

What do you mean by measure? HDMI 2.1 can do 4k 120Hz at 4:4:4 chroma; HDMI 2.0 can't. You can run 1440p at 120Hz on HDMI 2.0 with full chroma though, and I think you can run 4k 120Hz at 4:2:0, 8-bit, on the LG CX from an HDMI 2.0 GPU for now. Lower chroma affects text rendering and, for color detail purists, color detail in fields of color. It can look similar to upscaling a lower resolution, which looks kind of "muddy".
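
A rough way to see why (back-of-the-envelope only: active pixels × refresh × bits per pixel, ignoring blanking intervals and HDMI encoding overhead, so real requirements are somewhat higher; the ~14.4 and ~42 Gbit/s usable-payload figures for HDMI 2.0 and 2.1 are approximate):

```python
# Back-of-the-envelope video data rates (active pixels only; blanking and
# protocol overhead ignored, so real link requirements are somewhat higher).
def data_rate_gbps(w, h, hz, bits_per_channel=8, chroma="4:4:4"):
    # 4:4:4 carries 3 full samples per pixel, 4:2:2 averages 2, 4:2:0 averages 1.5
    samples = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    return w * h * hz * bits_per_channel * samples / 1e9

modes = [
    ("4K 120Hz 8-bit 4:4:4",    3840, 2160, 120, 8,  "4:4:4"),
    ("4K 120Hz 8-bit 4:2:0",    3840, 2160, 120, 8,  "4:2:0"),
    ("1440p 120Hz 8-bit 4:4:4", 2560, 1440, 120, 8,  "4:4:4"),
    ("4K 120Hz 10-bit 4:4:4",   3840, 2160, 120, 10, "4:4:4"),
]
# Approximate usable payload: HDMI 2.0 ~14.4 Gbit/s, HDMI 2.1 ~42 Gbit/s.
for name, w, h, hz, bpc, chroma in modes:
    rate = data_rate_gbps(w, h, hz, bpc, chroma)
    print(f"{name:24} ~{rate:5.1f} Gbit/s  (under HDMI 2.0 payload: {rate < 14.4}, under 2.1: {rate < 42})")
```

Which lines up with the rule of thumb above: 1440p 120Hz full chroma and 4K 120Hz 4:2:0 8-bit squeeze into HDMI 2.0, while 4K 120Hz 4:4:4 needs HDMI 2.1.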

Nvidia G-Sync-compatible VRR screens typically start syncing at 40 fps-Hz on the low end, so going beneath that frame rate - in your frame rate graph at all, not just your frame rate average - isn't recommended. That means if a particular game's frame rate fluctuated +/- 30 fps from its average, you might have to dial that game's graphics settings in so that it runs at a 70 fps-Hz average or better in order to stay above 40 fps-Hz on the low end. In order to get appreciable benefits in motion clarity (blur reduction) and motion definition (more unique pages in a flip book, flipping faster) out of higher Hz, you should probably be shooting for a 100 fps-Hz average or better anyway on a 120Hz+ monitor.

Hope this helps, I wasn't exactly sure what you were asking.
 
Nvidia G-Sync-compatible VRR screens typically start syncing at 40 fps-Hz on the low end, so going beneath that frame rate - in your frame rate graph at all, not just your frame rate average - isn't recommended.
To my understanding, this is still better than non-VRR, or at least non-VRR with V-Sync. What I'm not sure about is how quickly a follow-up frame could be transmitted, or if tearing could be allowed to limit the lag. What do you think?
 
It's way better than non-VRR... I didn't mean to imply that at all. VRR itself is great.

What I was saying is that you already have a high-Hz monitor that gets appreciable blur reduction and motion definition increases starting around a 100 fps average - in some cases equating to a 70 - 100 - 130 graph (120 average or higher is even better). So if you're running high enough frame rate ranges to get appreciable benefit out of the higher Hz, you would already be running your frame rate roller coaster graph, +/- your average, high enough to avoid sinking below 40 fps-Hz. That is, imo, you should already be running frame rates higher than having to worry about sub-40 fps in order to get benefit out of 100 to 120Hz (or 115Hz capped).

Some VR headsets use a type of interpolation that cuts anything below 90 fps down to 45 fps and doubles it. I'd rather not suffer 45 fps in motion definition (sloshy, staying on the same freeze-frame longer), but it would keep the blur from sinking back to full smearing and would also keep the frame rate higher than the VRR low-end cutoff. Neither Nvidia nor LG does this AFAIK; I just thought I'd mention it since it exists as a tech.

I think in the future some much better interpolation without considerable input lag or artifacts will be needed to fill high refresh rates, even if you are at 80 - 100fps solid someday you could benefit from interpolation x3, x4 , x10 for a 240hz, 360hz, 480hz, 1000hz monitor in the future.

----------

I'm not sure about the next frame time. You could probably suffer the tearing, but from what I read of early FreeSync it would actually stutter when going below the VRR minimum. I'm not sure if that is still the case.

Personally I would just keep the frame rate higher by dialing in the graphics settings to a 100 fps-Hz average or better and use a beefy GPU like a 3080 Ti.
If not, I'd say you are better off running a lower resolution; even on this monitor you could do a 1:1 ultrawide resolution, or a 1440p resolution letterboxed all around with ultra-black OLED bars "off", and still have a full-sized monitor. With graphs that span into 40 fps-Hz and lower, you aren't getting appreciable benefits from the high Hz at that low end in the first place. To avoid going off the VRR range's rails I'd say at least a 75 fps-Hz average for games that swing +/- 30 fps from the average - it would depend on the game, some have less variance. In that example you'd have 45 fps-Hz <---- 75 fps-Hz ----> 105 fps-Hz. That is still low for keeping in the 100 - 115 fps-Hz range for more of the graph (to reduce blur and increase motion definition considerably), but it would keep your graph from hitting or going below the 40 fps-Hz VRR limit for the most part.
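
The same rule of thumb as a couple of lines of arithmetic (the +/- 30 fps swing and the ~40 fps-Hz floor are the assumptions used above; real games vary):

```python
# Quick check: given an average fps and a typical swing around it, does the
# frame-rate graph stay above a G-Sync-compatible VRR floor of ~40 fps-Hz?
VRR_FLOOR = 40

def stays_in_vrr_range(avg_fps: float, swing: float = 30) -> bool:
    return (avg_fps - swing) >= VRR_FLOOR

for avg in (60, 75, 100, 120):
    verdict = "stays above" if stays_in_vrr_range(avg) else "dips below"
    print(f"{avg} fps average -> roughly {avg - 30}-{avg + 30} fps graph, {verdict} the {VRR_FLOOR} fps-Hz floor")
```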
 
The flatpanel review said “ At this time, we cannot measure input lag in VRR and 4K120 where it will be as low as 6 ms with, according to the company.”

Just curious why that isn't possible. 6 ms would be amazing.
 

6ms is very possible based on what we saw at CES. They can't measure it because TFTcentral probably doesn't have the new version of the Leo Bodnar input lag tester that does 4k. The original Leo Bodnar was 1080p only.
 
LG already stated in some CES interview that they are not interested in chasing lower input lag when it is already in the sub-1-frame area for supported refresh rates. You have to remember that OLED also does not have the response-time-related input lag that LCDs do.

So both input lag and response time on these TVs are issues you don't have to consider at all.
 