LG 48CX

The best settings for this display, if you have the horsepower, are: 4K, 120Hz, BFI "high", an FPS cap of 119.993 set via RTSS, and V-Sync set to "On" in-game.

That way you don't get the V-Sync ON input lag penalty, the frame rate stays closely enough synced for BFI that the occasional single micro-stutter is minutes apart and not noticeable, and you get no screen tearing. Roughly 4.16ms MPRT motion clarity.
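For reference, that MPRT figure is just the refresh period times the fraction of each cycle a pixel stays lit. A quick sketch of the arithmetic - the 50% on-time for BFI "high" at 120Hz is my assumption based on the Blur Busters numbers quoted further down, not a measured value:

Code:
# Minimal sketch: estimate MPRT/persistence from refresh rate and BFI duty cycle.
# Assumes BFI "high" at 120Hz keeps each pixel lit for ~50% of the refresh cycle
# (my assumption from the Blur Busters figures quoted below, not a measurement).

def mprt_ms(refresh_hz: float, visible_fraction: float) -> float:
    """Persistence (ms) = refresh period * fraction of the cycle a pixel is lit."""
    return (1000.0 / refresh_hz) * visible_fraction

print(mprt_ms(120, 0.50))  # ~4.17 ms, the figure above
print(mprt_ms(120, 1.00))  # ~8.33 ms, plain sample-and-hold at 120Hz with no BFI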

I'm not a fan of BFI in its current forms with current display limitations. It's essentially PWM, and it dims/mutes color vibrancy. That means detail-in-colors can be lost and color brilliance muted. Therefore BFI obviously doesn't work with HDR color brightness and HDR ranges of detail-in-colors - which are already restricted on OLED in order to avoid burn-in. Additionally, reviewers say high BFI on the CX is noticeably flickery. The other levels are still flickering even if you can't really notice it consciously, like PWM, and that causes eye fatigue and even eye strain.

To me the tradeoffs for medium BFI seem like way too much for too little. For high, the flickering rules it out entirely. For low, the effect is way too small compared to just running high frame rates to reduce sample-and-hold blur, so I wouldn't bother taking any tradeoffs for that. Anyway, with its current limitations BFI is not for me personally.

That said, I did look up some info about it back when it was being heavily discussed in this thread and found details about how different the OLED rolling-scan BFI is from the LCD backlight strobing most people are familiar with, so I thought I'd post that reply again here with the relevant quotes.

Does BFI work with any refresh rate, like 75Hz, 90Hz, 101Hz, etc.? Or do you have to choose either 60 or 120Hz?

I wanted to know the details of how it works as well, especially with VRR activated and working fully, because your frame rate and Hz would be varying and, for most people, dropping considerably on the low end of a game's graph. A 60-90-130 fps graph didn't seem like it would play nice with BFI.

It seems like "chief blurbuster" mark r is saying
the hz/fps is unlinked from the black frame since its a per pixel strobe.

"Hz and persistence can be unlinked/unrelated thanks to strobe duty cycle adjustment."

"Does not matter if full strobe or rolling scan, as it is per-pixel duty cycle.
Note: That said, non-global illumination can cause artifacts (e.g. skewing during scan of any CRT, OLED (including strobed and nonstrobed rolling scans) or nonstrobed LCD"


"rolling strobe on OLED can be fractional refreshes, so OLED BFI can actually be arbitrary lengths unrelated to refresh cycle length. Since the off-pass can chase behind the on-pass simultaneously on the same screen at an arbitrary distance"

from blurbusters.com

Strobing on OLEDs sometimes have to behave differently because there’s no independent light source separate from pixel refresh source like for LCDs.

As a result, strobing on most OLEDs are almost always rolling-scan strobe (some exceptions apply, as some panels are designed differently OLED transistors can be preconfigured in scanout refresh, and then a illumination voltage does a global illumination at the end).

However, most large OLED panels have to do a rolling-scan strobe for reducing persistence. Also, rolling strobe on OLED can be fractional refreshes, so OLED BFI can actually be arbitrary lengths unrelated to refresh cycle length. Since the off-pass can chase behind the on-pass simultaneously on the same screen at an arbitrary distance, much like it did on the Dell UP3017Q monitor. Like changing the phosphor of a CRT to a shorter or medium persistence (except it’s a squarewave rather than a fade wave), CRT phosphor persistence length being unrelated to refresh cycle length. So BFI isn’t necessarily integer divisors here, and the meaning of “strobing” vs “BFI” is blurred into one meaning.

- An OLED with a 50%:50% on BFI will reduce motion blur by 50% (half original motion blur)
- An OLED with a 25%:75% on BFI will reduce motion blur by 75% (quarter original motion blur)

Typically, most OLED BFI is only in 50% granularity (8ms persistence steps), though the new 2019 LG OLEDs can do BFI in 25% granularity at 60Hz and 50% granularity at 120Hz (4ms persistence steps)

Except for the virtual reality OLEDs (Oculus Rift 2ms persistence), no OLEDs currently can match the short pulse length of a strobe backlight just yet, though I'd expect that a 2020 or 2021 LG OLED would be able to do so. <edit by elvn: they dropped BFI from the 2019 models, but the numbers apply to the 2020 OLEDs as 15% / 40% it seems.>


Black duty cycle is independent of refresh rate. However, percentage of black duty cycle is directly proportional to blur reduction (at the same (any) refresh rate). i.e. 75% of the time black = 75% blur reduction. Or from the visible frame perspective: Twice as long frame visibility translates to twice the motion blur.

Does not matter if full strobe or rolling scan, as it is per-pixel duty cycle.

Note: That said, non-global illumination can cause artifacts (e.g. skewing during scan of any CRT, OLED (including strobed and nonstrobed rolling scans) or nonstrobed LCD -- when viewing http://www.testufo.com/blurtrail will skew in nonlightboost at 32pps -- but stops skewing in lightboost or global strobe. Also, if viewing animation on iPad, rotate display until you see line skew)

Bell curve strobe rather than squarewave strobe can be useful and may look better for some applications other than VR, or slower motion/unsynchronized(VSYNC OFF) motion. As a slight persistence softening can reduce the harshness of microstutters from non-perfect refreshrate-framerate synchronization. But other ultrafast refresh-rate-synchronized motion, minimum motion blur dictates point strobing (as short as possible persistence, which is electronically easier with squarewave...just turn off pixel automatically mid-refresh....independent of refresh rate exact interval....aka Rolling Scan).

There is no such thing as "180Hz internal operation" as Oculus is already 80-90% black duty cycle (2ms persistence, despite 1/90sec = ~11ms refresh cycles) -- it is just a pixel-turnoff time delay, in a rolling scan algorithm. You adjust persistence simply by chasing the off-scan closer to the on-scan. See high speed videos. Screen gets darker the more you shorten a rolling scan OLED persistence, and you may get into color nonlinearities as active matrix transistors turning on/off OLED pixels aren't perfectly squarewave at the sub-millisecond level.

Good VR makes perfect framerate-refreshrate vsyncing mandatory, unfortunately for the Holodeck immersion. No ifs, buts, no protest, it is just the way the cookie crumbles for VR. In this case, striving for CRT-style curve strobing (instead of square wave strobing) is rather useless UNLESS you need it to hide imperfections (e.g. flaws in the display) or reduce eyestrain at lower strobe rates.

1ms of persistence translates to 1 pixel of motion blur for every 1000 pixels/second motion. (We call this the "BlurBusters Law") For motion on a 4K display going one screenwidth per second, at 1ms strobe flashes, all 1-pixel fine details gets motionblurred to 4 pixels from the act of eyetracking against the visible part of the refresh cycle.

This can also be observed in action by the motion-induced optical illusion that only appears during eye tracking: http://www.testufo.com/eyetracking

If you have an adjustable-persistence monitor:
You can even see the relationship between persistence and motion blur, then use your persistence adjustment (strobe duty cycle adjustment, e.g. ULMB strobe adjustment, BENQ Blur Reduction strobe length, etc) while watching http://www.testufo.com/blurtrail ... The line blurring goes exactly twice the blurring (no matter the refresh rate) for double the persistence, and follows BlurBusters Law (1ms strobe length = 1 pixel added motion blur during 1000 pixel/sec motion, but becomes 2 pixels of blurring at 2ms persistence)

Hz and persistence can be unlinked/unrelated thanks to strobe duty cycle adjustment. But if you wanted to completely eliminate black periods, a 2ms persistence display with no strobing would need to run 500fps@500Hz flickerfree to match Lightboost 2ms, or Oculus flicker rolling scan 2ms. This is because 1 second divided by 2ms equals 500 visible distinct frames needed to fill all 2ms slots of 1 second to avoid any blackness, while keeping motion of each frame perfectly in sync with eye movements. Even a 2000fps@2000Hz display (needed for 0.5ms full persistence with zero black cycle) would still have human-visible motion blur in extreme conditions (e.g. fast head turning with 4K or 8K VR while trying to read fine text on a wall). Oculus knows this. Michael Abrash confirms the relationship between persistence and motion blur.
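The two rules in that quote boil down to very simple arithmetic. A sketch (nothing display-specific, just the stated relationships):

Code:
# Sketch of the two relationships quoted above: Blur Busters Law, and the
# refresh rate a flicker-free display would need to match a given persistence.

def eye_tracking_blur_px(persistence_ms: float, motion_px_per_sec: float) -> float:
    """1ms of persistence = 1 pixel of motion blur per 1000 px/sec of tracked motion."""
    return persistence_ms * motion_px_per_sec / 1000.0

def flickerfree_hz_equivalent(persistence_ms: float) -> float:
    """Refresh rate needed to match a given persistence with zero black time."""
    return 1000.0 / persistence_ms

print(eye_tracking_blur_px(1.0, 3840))  # ~3.8 px: one 4K screen-width/sec at 1ms (the "4 pixels" above)
print(flickerfree_hz_equivalent(2.0))   # 500 Hz to match 2ms strobing, as stated
print(flickerfree_hz_equivalent(0.5))   # 2000 Hz for 0.5ms persistence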
 
Why would you not get VSync input lag in this situation? Just enabling it should cause it.

Because RTSS does its FPS capping CPU-side at the render queue, with frame cap precision down to thousandths of a frame per second. If you never hit 120.000 fps, you don't get the input lag penalty from V-Sync ON. Most testing has shown that capping around 0.007 fps below the native refresh rate works best. Hence the 119.993 fps setting. If the refresh rate is slightly off, say 120.002 Hz, you simply adjust the RTSS FPS cap; in this situation, 119.995 fps.
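In other words: measure your actual refresh rate (e.g. on testufo), then subtract the small margin. A sketch of that arithmetic (the 0.007 figure is the one quoted above, not something I've verified myself):

Code:
# Sketch: choose an RTSS cap just under the measured refresh rate.
# The 0.007 fps margin is the figure from the post above, not a verified constant.

def rtss_cap(measured_refresh_hz: float, margin_fps: float = 0.007) -> float:
    return round(measured_refresh_hz - margin_fps, 3)

print(rtss_cap(120.000))  # 119.993
print(rtss_cap(120.002))  # 119.995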
 
That's interesting. I was always hearing 3fps, and even up to 5fps to be safe, supposedly since caps weren't reliably tight about cutting out overages. From 3-5 fps down to 0.007 fps is quite a difference in margins.

https://graphicscardhub.com/fps-limiter/

How much FPS Limit should you Set?

It is better to limit your FPS a little below the monitor's maximum refresh rate because sometimes the FPS limiter can overshoot the target FPS a little bit (due to error margin), and this can result in screen tearing. For example, if your monitor has a maximum refresh rate of 75Hz and your graphics card is generating in excess of 100FPS, then it is better to cap the frame rate 2-3 FPS below the monitor's refresh rate, which in this case would be 72-73 FPS, to remain on the safer side and avoid any kind of unaccounted marginal error that can happen with FPS limiters. You may set this number even to 70 FPS if you are still getting screen tearing at 73 FPS with FreeSync or G-Sync enabled.
 
Because RTSS does its FPS capping CPU-side at the render queue, with frame cap precision down to thousandths of a frame per second. If you never hit 120.000 fps, you don't get the input lag penalty from V-Sync ON. Most testing has shown that capping around 0.007 fps below the native refresh rate works best. Hence the 119.993 fps setting. If the refresh rate is slightly off, say 120.002 Hz, you simply adjust the RTSS FPS cap; in this situation, 119.995 fps.
Would like to see some sources on "most testing." The lag from V-Sync ON comes from the buffer needed, which is not eliminated by doing the above.
 
Would like to see some sources on "most testing." The lag from V-Sync ON comes from the buffer needed, which is not eliminated by doing the above.

Yes it is. Why so quick to try and discredit something you don't understand? The RTSS FPS cap is really clever and cheats V-Sync - "ON" by keeping the queued frame buffer minimal while keeping the image perfectly tear free.

https://blurbusters.com/howto-low-lag-vsync-on/

https://forums.guru3d.com/threads/the-truth-about-pre-rendering-0.365860/page-12#post-5380262
 
Yes it is. Why so quick to try and discredit something you don't understand? The RTSS FPS cap is really clever and cheats V-Sync - "ON" by keeping the queued frame buffer minimal while keeping the image perfectly tear free.

https://blurbusters.com/howto-low-lag-vsync-on/

https://forums.guru3d.com/threads/the-truth-about-pre-rendering-0.365860/page-12#post-5380262

Sucks we have to jump through so many hoops to achieve this. Each game should have a "minimal buffer vsync" option.
 
Should be no problem. BFI is a separate setting you can use in any mode.

As for competitive gaming, I would not get the LG 48" for that. Not if your plan is to play fps games in fullscreen. It's just too big for that. I think it's great for immersion in a lot of games, but in a fast paced game where graphics mostly don't matter and you are more concerned with seeing and reacting to things, the sheer size of the display is going to be a factor. Now, if we ignore the requirements to run it at 120 Hz (HDMI 2.1 GPU etc), running with 1:1 scaling at 1080p and 1440p would probably shrink it to a more usable size for gaming.
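The 1:1 scaling math works out to a pretty normal monitor size, by the way (simple geometry, assuming the 48" 3840x2160 panel and same-aspect-ratio windows):

Code:
# Diagonal of a 1:1 (unscaled) 16:9 window centered on the 48" 3840x2160 panel.

def scaled_diagonal_inches(panel_diag_in: float, panel_width_px: int, window_width_px: int) -> float:
    return panel_diag_in * window_width_px / panel_width_px

print(scaled_diagonal_inches(48, 3840, 1920))  # 24.0 -> 1080p at 1:1 looks like a 24" monitor
print(scaled_diagonal_inches(48, 3840, 2560))  # 32.0 -> 1440p at 1:1 looks like a 32" monitor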

It's just way more cost effective and straightforward to buy a 24" 1080p 240 Hz display for that purpose.

And it won't add lag?
 
I'm not a fan of BFI in its current forms with current display limitations. It's essentially PWM, and it dims/mutes color vibrancy. That means detail-in-colors can be lost and color brilliance muted. Therefore BFI obviously doesn't work with HDR color brightness and HDR ranges of detail-in-colors - which are already restricted on OLED in order to avoid burn-in. Additionally, reviewers say high BFI on the CX is noticeably flickery. The other levels are still flickering even if you can't really notice it consciously, like PWM, and that causes eye fatigue and even eye strain.

To me the tradeoffs for medium BFI seem like way too much for too little. For high, the flickering rules it out entirely. For low, the effect is way too small compared to just running high frame rates to reduce sample-and-hold blur, so I wouldn't bother taking any tradeoffs for that. Anyway, with its current limitations BFI is not for me personally.

That said, I did look up some info about it back when it was being heavily discussed in this thread and found details about how different the OLED rolling-scan BFI is from the LCD backlight strobing most people are familiar with, so I thought I'd post that reply again here with the relevant quotes.




I wanted to know the details of how it works as well, especially with VRR activated and working fully, because your frame rate and Hz would be varying and, for most people, dropping considerably on the low end of a game's graph. A 60-90-130 fps graph didn't seem like it would play nice with BFI.

It seems like "chief blurbuster" mark r is saying
the hz/fps is unlinked from the black frame since its a per pixel strobe.

"Hz and persistence can be unlinked/unrelated thanks to strobe duty cycle adjustment."

"Does not matter if full strobe or rolling scan, as it is per-pixel duty cycle.
Note: That said, non-global illumination can cause artifacts (e.g. skewing during scan of any CRT, OLED (including strobed and nonstrobed rolling scans) or nonstrobed LCD"


"rolling strobe on OLED can be fractional refreshes, so OLED BFI can actually be arbitrary lengths unrelated to refresh cycle length. Since the off-pass can chase behind the on-pass simultaneously on the same screen at an arbitrary distance"

from blurbusters.com

Well let's not kid ourselves here: right now, if a game has HDR then it was developed quite recently, and there's a high chance that maintaining a minimum frame rate of 120 at 4K in it is going to be nearly impossible. I'd say BFI at this point in time is more useful for playing non-demanding or very old titles, and those kinds of games more than likely lack any HDR support, so BFI killing brightness isn't really going to be an issue there. Heck, if it works extremely well then I'm going to ditch my 240Hz Omen X27. The only reason I bought that monitor in the first place wasn't because I'm some super hardcore competitive gamer who wanted any possible edge over other players, but because I do enjoy good motion clarity, and my Acer X27 just wasn't cutting it since it lacks any form of backlight strobing - and even if it had it, it would most likely be full of craptastic crosstalk like the vast majority of strobed IPS panels out there. Its sample-and-hold blur at 120Hz combined with its mediocre response times wasn't enough for me. I still play a lot of old, easy-to-run games like L4D2 and found the better motion clarity more enjoyable on my 240Hz Omen vs having 4K resolution on the Acer. 1440p with AA is enough for me when it comes to really old games, as 4K doesn't add much since the assets are all lower res anyway. But if the CX with BFI can offer up a better experience then I'm all for it.

I should also add that I game in a dark room at a brightness of only 120 nits in SDR, so even with BFI killing brightness in SDR it's still a non-issue for me. Even the ABL is a non-issue, since it caps brightness to around 150 nits, which is above what I use anyway.
 
Like I mentioned earlier, if these TVs can support arbitrary refresh rates like 75Hz, 90Hz, 100Hz and BFI at the same time, that opens up all sorts of possibilities. And you can always use resolution scaling too, to increase frame rate.

One person has told me they got BFI at 80Hz at 1440p on an LG 48CX, but that's one person, and he may have set up his custom resolution improperly or something. We need more people with these TVs to test this out to be sure.
 
Okay - I've got the 55" CX set up. I enabled deep color for HDMI1, changed the display type to PC, and it is connected to my PC, which is running Windows 10 Pro on an RTX 2080 Super. I also have my Samsung 48" JU6700 connected via a DP to HDMI converter, and deep color is set up and working there too. Both sets are running 3840x2160 60Hz, 4:4:4. I checked and validated the chroma via this link:
https://www.rtings.com/images/test-materials/2017/chroma-444.png I also tried changing the color settings in the Nvidia control panel - both "use default color settings" and "use NVIDIA color settings with 32-bit depth, 8 bpc, RGB" - and played with each setting.

I've been comparing text side by side on the 2 displays, and the CX is definitely not as nice. Specifically, I'm seeing the individual dots, which makes text on the CX look less sharp (it seems most noticeable when I view black text against a white background). To offset the PPI difference, I created a Word document where I set the text a little larger on the 48" than on the 55" CX so that they have the same apparent size, and I'm still seeing a difference. I did try both with and without ClearType. With is better - no jagged-edged text.

My viewing distance is approx. 34-36 inches. No, I'm not willing to change that - I've been running 3 of the Samsung curved 48" 4K displays for the last 4 years and love it. I am interested in better color fidelity, which is why I picked up the CX - I'm testing whether or not a 55" is workable for me, and the early indication is that I'll be fine with the larger size. My adjustment is going from a curved display to a flat one...

I'm looking for any pointers on getting the text to look better - any help is greatly appreciated.

Thanks
 
You can 'tune' ClearType, something that's recommended for monitors that use an 'inverted' sub-pixel layout, something that's common in TVs with VA panels (some, but not all). I had to do this with my 31.5" 1440p panel, as it just plain looks weird without the tuning.

Probably worth looking up to see if it helps?
 
Apparently BuyDig/Beachcamera have some units in stock, list 19 available currently.

Went to cancel my BB order and was unable, apparently shipping soon. Maybe the boatload finally arrived.
 
You can 'tune' ClearType, something that's recommended for monitors that use an 'inverted' sub-pixel layout, something that's common in TVs with VA panels (some, but not all). I had to do this with my 31.5" 1440p panel, as it just plain looks weird without the tuning.

Probably worth looking up to see if it helps?

I did try that. I went through the tuning for both displays. Still having the issue. Beautiful screen and video is wonderful on it, but I use this 8 hours a day so I might have to try the Samsung Q90T...

Any other suggestions would be great!
 
Apparently BuyDig/Beachcamera have some units in stock, list 19 available currently.

Went to cancel my BB order and was unable, apparently shipping soon. Maybe the boatload finally arrived.

Oh snap, I just got a text alert from BB. Order has shipped! I submitted a request to cancel my order from B&H.

Crossing my fingers for you, hopefully it ships soon!
 
I did try that. I went through the tuning for both displays. Still having the issue. Beautiful screen and video is wonderful on it, but I use this 8 hours a day so I might have to try the Samsung Q90T...

Any other suggestions would be great!

I know this may seem obvious, but you just went from using a 48" to a 55"...text on the 55" is definitely not going to be as sharp as on the 48", because the pixels are larger. A far more direct comparison would be your 48" JU6700 vs. the 48CX. You can try to improve it, but there's only so much you can do to overcome the larger pixel size. It's the way it's always been.
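The pixel-size difference is easy to put a number on (straightforward math for 3840x2160 panels, nothing specific to these models):

Code:
import math

# Pixel density of a 3840x2160 panel at different diagonal sizes.
def ppi(diag_inches: float, width_px: int = 3840, height_px: int = 2160) -> float:
    return math.hypot(width_px, height_px) / diag_inches

print(round(ppi(48), 1))  # ~91.8 PPI on a 48" 4K panel
print(round(ppi(55), 1))  # ~80.1 PPI on a 55" 4K panel - visibly coarser at desk distances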

Unfortunately I just gave my dad my old 48" JS9000 yesterday (what timing, lol) to replace his dying plasma, or else I'd set it up alongside my 48CX when it gets here and compare the text quality. But I'm sure that it's going to be an improvement over the text quality of my 55B7, which (to me) has been perfectly acceptable for the past two and a half years so perhaps I'm not as picky as some.
 
Oh snap, I just got a text alert from BB. Order has shipped! I submitted a request to cancel my order from B&H.

Crossing my fingers for you, hopefully it ships soon!
When did you order? I had an order from 6/20 that still hasn't shipped and saw someone else post they ordered 6/21 and already received theirs...not sure what's going on.

*Edit - I see you posted that you just ordered it a few days ago. Guess I'll have to call BB to see what the deal is. Sigh.
 
When did you order? I had an order from 6/20 that still hasn't shipped and saw someone else post they ordered 6/21 and already received theirs...not sure what's going on.

June 27 (this past Saturday; two days ago). That is weird for sure.
 
June 27th? I ordered mine on June 26th and have heard nothing lol.

Maybe they'll send me a follow-up text later tonight and be like "jk, lol" :p

The only thing I can think of, and this is purely grasping at straws, is that if they're set up to ship from distribution centers like Amazon is, maybe some distribution centers are receiving their shipments before others and the people receiving these earlier just happen to be lucky enough to live within the region that's normally served by a DC that received a shipment of displays. Otherwise, one would think that the orders would be shipped in the order received. But FIFO would only really make sense if they were all shipping from one central location. If (for example) I live near an east coast DC that has received some of these, and you live near a west coast DC that hasn't, I can see them not shipping your set from my DC in order to cut way down on shipping costs. That would really add up across millions of orders per year. But like I said, that's purely a theory.
 
Well let's not kid ourselves here: right now, if a game has HDR then it was developed quite recently, and there's a high chance that maintaining a minimum frame rate of 120 at 4K in it is going to be nearly impossible. I'd say BFI at this point in time is more useful for playing non-demanding or very old titles, and those kinds of games more than likely lack any HDR support, so BFI killing brightness isn't really going to be an issue there. Heck, if it works extremely well then I'm going to ditch my 240Hz Omen X27. The only reason I bought that monitor in the first place wasn't because I'm some super hardcore competitive gamer who wanted any possible edge over other players, but because I do enjoy good motion clarity, and my Acer X27 just wasn't cutting it since it lacks any form of backlight strobing - and even if it had it, it would most likely be full of craptastic crosstalk like the vast majority of strobed IPS panels out there. Its sample-and-hold blur at 120Hz combined with its mediocre response times wasn't enough for me. I still play a lot of old, easy-to-run games like L4D2 and found the better motion clarity more enjoyable on my 240Hz Omen vs having 4K resolution on the Acer. 1440p with AA is enough for me when it comes to really old games, as 4K doesn't add much since the assets are all lower res anyway. But if the CX with BFI can offer up a better experience then I'm all for it.

I should also add that I game in a dark room at a brightness of only 120 nits in SDR, so even with BFI killing brightness in SDR it's still a non-issue for me. Even the ABL is a non-issue, since it caps brightness to around 150 nits, which is above what I use anyway.


If you are into it that's fine, but it's still PWM and eye-fatiguing.
To avoid ABL on the C9, Rtings said to set the contrast to '80' and set the peak brightness to "Off". Peak color brightness and detail-in-color is then limited to ~250 nits in all scenes.

75% BFI, if it were usable, could cut your perceived brightness of 250 nits down by 75% to 62.5 nits.
If you use 60% BFI it could cut that 250 nits down by 60% (minus 150 nits) to 100 nits, due to the nature of pixel blanking's effect on perceived brightness.
40% BFI would be around 150 nits (roughly the color brightness limit at which ABL kicks in, were you not avoiding ABL and not cutting brightness with BFI).
15% isn't worth the tradeoffs vs just running high fps+Hz, so it's not really worth mentioning imo.
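Those numbers are just linear duty-cycle math. A rough sketch of the arithmetic (the 250-nit baseline is the ABL-avoidance figure from the Rtings settings above; this ignores any compensation the TV itself might apply):

Code:
# Sketch: perceived brightness under BFI is roughly the lit fraction of the
# duty cycle times the steady-state brightness. The 250-nit baseline is the
# ABL-avoidance figure mentioned above; real panels may compensate somewhat.

def perceived_nits(baseline_nits: float, bfi_black_fraction: float) -> float:
    return baseline_nits * (1.0 - bfi_black_fraction)

for black in (0.75, 0.60, 0.40, 0.15):
    print(f"{int(black * 100)}% BFI -> ~{perceived_nits(250, black):.0f} nits")
# prints roughly 62, 100, 150 and 212 nits respectively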

BFI on high is reportedly obvious, in-your-face flickering, but all of the other BFI settings are flickering too, just not as consciously noticeable (just like PWM). Like PWM, they will be eye-fatiguing and even eye-straining over time. I'd also like to know whether BFI has been separated from the "OLED Motion Pro" setting now, because it was said that BFI via OLED Motion Pro increased input lag to 22ms.

It's good that there are options for people who want to mess with it, but I've pretty much ruled it out personally in its current incarnations and tech limitations.
 
If you are into it that's fine, but it's still PWM and eye-fatiguing.
To avoid ABL on the C9, Rtings said to set the contrast to '80' and set the peak brightness to "Off". Peak color brightness and detail-in-color is then limited to ~250 nits in all scenes.

75% BFI, if it were usable, could cut your perceived brightness of 250 nits down by 75% to 62.5 nits.
If you use 60% BFI it could cut that 250 nits down by 60% (minus 150 nits) to 100 nits, due to the nature of pixel blanking's effect on perceived brightness.
40% BFI would be around 150 nits (roughly the color brightness limit at which ABL kicks in, were you not avoiding ABL and not cutting brightness with BFI).
15% isn't worth the tradeoffs vs just running high fps+Hz, so it's not really worth mentioning imo.

BFI on high is reportedly obvious, in-your-face flickering, but all of the other BFI settings are flickering too, just not as consciously noticeable (just like PWM). Like PWM, they will be eye-fatiguing and even eye-straining over time. I'd also like to know whether BFI has been separated from the "OLED Motion Pro" setting now, because it was said that BFI via OLED Motion Pro increased input lag to 22ms.

It's good that there are options for people who want to mess with it, but I've pretty much ruled it out personally in its current incarnations and tech limitations.

Yeah, there's definitely the issue of PWM comfort, so I just have to test it and see if it's actually usable or not. Even without BFI, though, I think OLED's near-instant response time might be enough for me anyway. But then again it's being bottlenecked by a 120Hz refresh rate, so wouldn't it, at least in theory, be limited to no better than 8.3ms of persistence?
 
BFI might be eye fatiguing for reading text on a white webpage, but I imagine it's fine for playing games.

I'm still using a CRT for games, some of which I play at 60hz (for 60fps locked games like Mega Man 11, Street Fighter 5)
 
BFI might be eye fatiguing for reading text on a white webpage, but I imagine it's fine for playing games.

I'm still using a CRT for games, some of which I play at 60hz (for 60fps locked games like Mega Man 11, Street Fighter 5)

Don't VR headsets also use BFI? How is it that we can use 72-90Hz BFI VR headsets but can't use 120Hz BFI TVs because of eyestrain, eh? I have an Oculus Quest, which is 72Hz, and I get more discomfort from the front-heavy weight distribution than from the 72Hz BFI.
 
I know this may seem obvious, but you just went from using a 48" to a 55"...text on the 55" is definitely not going to be as sharp as on the 48", because the pixels are larger. A far more direct comparison would be your 48" JU6700 vs. the 48CX. You can try to improve it, but there's only so much you can do to overcome the larger pixel size. It's the way it's always been.

Unfortunately I just gave my dad my old 48" JS9000 yesterday (what timing, lol) to replace his dying plasma, or else I'd set it up alongside my 48CX when it gets here and compare the text quality. But I'm sure that it's going to be an improvement over the text quality of my 55B7, which (to me) has been perfectly acceptable for the past two and a half years so perhaps I'm not as picky as some.

I wasn't sure about how the size difference would factor in. I have compared the Samsung 40" JU6700, last year's 43" Samsung 4K, and my 48" JU6700 and JS9000. While there is a difference in text between the 40 and the 48, the 55" CX seems to be different. This is my first OLED, so maybe I'm just used to the LCD look.

I’ll keep evaluating it, and maybe the Samsung Q90T...

Thanks for taking the time to respond!
 
I wasn't sure about how the size difference would factor in. I have compared the Samsung 40" JU6700, last year's 43" Samsung 4K, and my 48" JU6700 and JS9000. While there is a difference in text between the 40 and the 48, the 55" CX seems to be different. This is my first OLED, so maybe I'm just used to the LCD look.

I’ll keep evaluating it, and maybe the Samsung Q90T...

Thanks for taking the time to respond!

You're welcome. I don't know it all - some people in this thread are a lot more knowledgeable about display tech than I am (it's a lot to keep up with!) - but generally, at the same resolution, smaller panels are going to be crisper due to having smaller pixels (which reduces the "fuzzy text" effect). There are some software tricks that can mitigate this to some degree; Macs and probably Linux seem to handle this very well due to what I assume are different rendering algorithms.

You're correct that text quality on even those older Samsung models (when 4:4:4 is enabled) is excellent. The JU/JS series was great, and what an exciting year for large 4K gaming! But now we have native 120Hz support, G-Sync support, BFI, VRR, and much lower input lag with stunning OLED image quality. I'm so stoked to get this thing!
 
Don't VR headsets also use BFI? How is it that we can use 72-90Hz BFI VR headsets but can't use 120Hz BFI TVs because of eyestrain, eh? I have an Oculus Quest, which is 72Hz, and I get more discomfort from the front-heavy weight distribution than from the 72Hz BFI.

BFI is not at the same eye-fatiguing level as LCD PWM. LCD PWM pulses the backlight indiscriminately, regardless of the panel's refresh/pixel state. BFI on an OLED at 120 Hz will cause very little eye strain, since it is perfectly timed. Just like when I used my FW900 CRT at 96 Hz, I didn't have any eye strain - the scanning was perfectly in sync and paced.
 
I did try that. I went through the tuning for both displays. Still having the issue. Beautiful screen and video is wonderful on it, but I use this 8 hours a day so I might have to try the Samsung Q90T...

Any other suggestions would be great!

Try the app Better ClearType Tuner. It lets you select the type of font smoothing used. You will want to try both RGB and grayscale font smoothing.

Note that on the LG you need to go to the Home dashboard screen and set the input icon to PC so that it operates in PC mode. Otherwise it will have green fringing on text.

48 vs 55" will always apply so using the same viewing distance for the larger monitor is not going to work.
 
Yes it is. Why so quick to try and discredit something you don't understand? The RTSS FPS cap is really clever and cheats V-Sync - "ON" by keeping the queued frame buffer minimal while keeping the image perfectly tear free.

https://blurbusters.com/howto-low-lag-vsync-on/

https://forums.guru3d.com/threads/the-truth-about-pre-rendering-0.365860/page-12#post-5380262
I wasn't trying to discredit anything. When I see "most" I want to see something that backs up that qualifier.

Reading through both articles confirmed what I was saying. You said nothing about the V-Sync buffer queue in the post I replied to, you said that you don't get the input lag penalty from V-Sync:
Because RTSS does its FPS capping CPU-side at the render queue, with frame cap precision down to thousandths of a frame per second. If you never hit 120.000 fps, you don't get the input lag penalty from V-Sync ON. Most testing has shown that capping around 0.007 fps below the native refresh rate works best. Hence the 119.993 fps setting. If the refresh rate is slightly off, say 120.002 Hz, you simply adjust the RTSS FPS cap; in this situation, 119.995 fps.
The input lag from the buffer is still there, but the FPS limiter eliminates the added latency from the queue backing up which causes the game you're running to stall and judder. Adding the FPS limiter ensures that the buffer is constantly being cleared on every other refresh cycle without the queue backing up, preventing the game from stalling.

Using the RTSS limiter gives you minimal V-Sync input lag.
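As a rough mental model of the difference (illustrative numbers only - the queue depth is an assumption, not a measurement):

Code:
# Rough illustration of why the cap matters with V-Sync ON (conceptual, not measured).
# Uncapped and GPU-bound above the refresh rate, the pre-render/back-buffer queue
# fills up, so each displayed frame was sampled several refresh cycles earlier.
# Capped just below the refresh rate, the queue stays essentially empty.

REFRESH_MS = 1000.0 / 120.0   # 8.33 ms per refresh at 120Hz
QUEUE_DEPTH = 3               # assumed back buffers + pre-rendered frames

uncapped_latency = (QUEUE_DEPTH + 1) * REFRESH_MS  # frame waits behind a full queue
capped_latency = 1 * REFRESH_MS                    # frame shown on the next vblank

print(f"uncapped V-Sync ON: ~{uncapped_latency:.1f} ms of sync-related lag")
print(f"capped at 119.993:  ~{capped_latency:.1f} ms of sync-related lag")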
 
BFI is not at the same eye-fatiguing level as LCD PWM. LCD PWM pulses the backlight indiscriminately, regardless of the panel's refresh/pixel state. BFI on an OLED at 120 Hz will cause very little eye strain, since it is perfectly timed. Just like when I used my FW900 CRT at 96 Hz, I didn't have any eye strain - the scanning was perfectly in sync and paced.


It's not the same, but it's still pulsing. PWM pulses in relation to the average screen brightness. Its strobing effect is way less prominent at 100% brightness in the OSD for that reason, and it bottoms out if your OSD is at a moderate brightness like 40%. During periods of the same average screen brightness its pulse is a normalized "beat"; it fluctuates its rate to produce darker or brighter average screen brightnesses dynamically. Either way, very annoying. 3D shutter glasses are also paced to the refresh rate and are quite eye-fatiguing at 60Hz per eye, though to be fair they alternate eyes, so that is more aggressive.

If high BFI is flickering, you know medium is flickering too; it's just barely outside of your "always able to register it consciously" threshold. Sometimes people can still see things like that if they look at a screen sidelong, in their periphery. Anyway, for someone like me - I can see certain light bulbs flicker - even if I'm not consciously seeing the flicker's dark/light periods as a slap in the eyes, the current limitations of BFI are going to be eye-fatiguing over time.

https://iristech.co/pwm-flicker-test/



Combined with the muted color brilliance/range and the potential detail-in-colors lost even in SDR, the incompatibility with HDR, and, per some reports, the increase in input lag to 22ms (unless that has changed?), the tradeoffs for medium BFI's gains - compared to just getting 40%-50% blur reduction by running 100fps to 120fps - make BFI more of a tech demo to me, given the current hardware limitations, rather than something I'd normally use in games. But that's me personally. I'm glad it's included for people who want it. More options are good.

--------------------------------------------------------------------------
VR on Oculus Quest .. since someone mentioned it
---------------------------------------------------------------------------

I do play VR on a Quest, which causes some eye fatigue, but I tend to play much longer gaming sessions in PC games.
A Quest uses asynchronous time warp to increase frame rate, more or less as a form of interpolation. That isn't really an issue for eyestrain. What is an issue is that current VR hardware limits the resolution so much that things that appear virtually farther away than, say, 20' or so are screen-door affected as well as being very low resolution and "blurry"/"fuzzy" the farther out they are. That, and the fact that by default the Quest, like most VR setups, uses foveated rendering, so the areas considered periphery are blurry/lower resolution. However, I can manually turn the foveated rendering setting down for easier-to-render games, or for those I'm playing tethered off of my PC's horsepower that happen to have enough frame rate to spare.

https://developer.oculus.com/documentation/native/android/mobile-timewarp-overview/?locale=en_US
In a basic VR game loop, the following occurs:
  1. The software requests your head orientation.
  2. The CPU processes the scene for each eye.
  3. The GPU renders the scenes.
  4. The Oculus Compositor applies distortion and displays the scenes on the headset.

"ATW is a technique that shifts the rendered image slightly to adjust for changes in head movement. Although the image is modified, your head does not move much, so the change is slight.

Additionally, to smooth issues with the user’s computer, game design, or the operating system, ATW can help fix irregularities or moments when the frame rate unexpectedly drops."

"If the time warp runs asynchronous to the stereoscopic rendering, then it may also be used to increase the perceived frame rate and to smooth out inconsistent frame rates"


What Asynchronous Time Warp (ATW) is:
-------------------------------------------------------------
Stereoscopic eye views are rendered to textures, which are then warped onto the display to correct for the distortion caused by the wide angle lenses in the headset.

To reduce the motion-to-photon delay, updated orientation information is retrieved for the headset just before drawing the time warp, and a transformation matrix is calculated that warps eye textures from where they were at the time they were rendered to where they should be at the time they are displayed. The warped pixels are almost exactly correct. A sharp rotation will leave some pixels black at the edges, but this turns out to be minimally distracting.

The time warp is taken a step farther by making it an “interpolated time warp.” Because the video is scanned out at a rate of about 120 scan lines a millisecond, scan lines farther to the right have a greater latency than lines to the left. On a sluggish LCD this doesn’t really matter, but on a crisp switching OLED, users may feel like the world is subtly stretching or shearing when they turn quickly. This is corrected by predicting the head attitude at the beginning of each eye, a prediction of < 8 milliseconds, and the end of each eye, < 16 milliseconds. These predictions are used to calculate time warp transformations, and the warp is interpolated between these two values for each scan line drawn.

The time warp may be implemented on the GPU by rendering a full screen quad with a fragment program that calculates warped texture coordinates to sample the eye textures. However, for improved performance the time warp renders a uniformly tessellated grid of triangles over the whole screen where the texture coordinates are setup to sample the eye textures. Rendering a grid of triangles with warped texture coordinates basically results in a piecewise linear approximation of the time warp.

If the time warp runs asynchronous to the stereoscopic rendering, then it may also be used to increase the perceived frame rate and to smooth out inconsistent frame rates. By default, the time warp currently runs asynchronously for both native and Unity applications.
.........
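As an aside, the "interpolated between these two values for each scan line" part is just a per-line blend between two pose predictions. A toy sketch of the idea (my own illustration, not Oculus code; the yaw-only correction and the numbers are made up):

Code:
import numpy as np

# Toy sketch of "interpolated time warp": predict the head pose for the start
# and end of an eye's scanout, then blend the correction per scan line, since
# lines further down the scanout are displayed later. Not Oculus' actual code.

def per_scanline_yaw_correction(yaw_start_pred: float, yaw_end_pred: float, num_lines: int):
    """Yaw correction (radians) applied to each scan line, first to last."""
    t = np.linspace(0.0, 1.0, num_lines)  # 0 = first line scanned out, 1 = last
    return (1.0 - t) * yaw_start_pred + t * yaw_end_pred

# Example: head predicted to rotate an extra 1 -> 3 milliradians over the scanout;
# early lines get the small correction, late lines the large one.
corrections = per_scanline_yaw_correction(0.001, 0.003, num_lines=1440)
print(corrections[0], corrections[-1])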

Flicker in the periphery is a side effect of running inappropriate / inadequate (very bad) frame rates
-----------------------------------------------------------------------------------------------------------------------------------

When frame rate is maintained, the experience feels real and is enjoyable. When it doesn’t happen in time, the previous frame is shown which can be disorienting. The following graphic shows an example of judder during the basic game loop:

(figure from the Oculus documentation: "Basic Game Loop with Judder")

When you move your head and the world doesn’t keep up, this can be jarring and break immersion.

ATW is a technique that shifts the rendered image slightly to adjust for changes in head movement. Although the image is modified, your head does not move much, so the change is slight.

Additionally, to smooth issues with the user’s computer, game design, or the operating system, ATW can help fix irregularities or moments when the frame rate unexpectedly drops.

The following graphic shows an example of frame drops when ATW is applied:

(figure from the Oculus documentation: "Game Loop with ATW")

At the refresh interval, the Compositor applies ATW to the last rendered frame. As a result, a frame with ATW will always be shown to the user, regardless of frame rate.
If the frame rate is very bad, flicker will be noticeable at the periphery of the display. But, the image will still be stable.

ATW is automatically applied by the Oculus Compositor; you do not need to enable or tune it. Although ATW reduces latency, make sure that your application or experience makes frame rate.
 
I wasn't trying to discredit anything. When I see "most" I want to see something that backs up that qualifier.

Reading through both articles confirmed what I was saying. You said nothing about the V-Sync buffer queue in the post I replied to, you said that you don't get the input lag penalty from V-Sync:

The input lag from the buffer is still there, but the FPS limiter eliminates the added latency from the queue backing up which causes the game you're running to stall and judder. Adding the FPS limiter ensures that the buffer is constantly being cleared on every other refresh cycle without the queue backing up, preventing the game from stalling.

Using the RTSS limiter gives you minimal V-Sync input lag.

from the provided link: https://forums.guru3d.com/threads/the-truth-about-pre-rendering-0.365860/page-12#post-5380262
Please note that all this is for vsync. You don't need to do any of this when using G-Sync or FreeSync. For those two sync methods, just cap to 3FPS below your refresh rate and you're done! Also note that all this also doesn't apply to vsync off. This guide really, really is just for non-gsync, non-freesync monitors with vsync enabled. (Or if you for some reason disabled gsync/freesync on your monitor and want to use plain vsync.)
even if you do the above, setting this value to 1 still helps as a guard against temporary framerate fluctuations (no frame capper is perfect), giving a more consistent input lag value.


all this requires a PC that's able to render frames fast enough. Since the above keeps render/frame buffers empty, it means there's no guard against frame rendering time spikes. If your PC can't maintain a solid 60FPS (or whatever your refresh is), then you might be better off not doing any of this.

This seems like it would only be useful for easier to render games (or some with game settings stripped bare) that are capable of a solid 120fps or better minimum frame rate. That makes it pretty niche especially on a 4k screen. Maybe it would work better with DLSS 2.0's rendering and AA to increase frame rates drastically, but I'm not sure how compatible everything would be and if DLSS2.0 would increase input lag at all making this pretty moot for the overall pipeline.

I'd also refer back to my post about online gaming latency and interpolation between all players making these minute differences pretty moot outside of LAN games and local/single player games.

" Put into English this means that once
you pull the trigger and this information package gets sent to the server
, (??? ms)
it then goes back from the current server time (the time the pulling the trigger package was received) by your ping (??? ms)
plus your interpolation time. (15.6ms on a 128 tick server, way more on most game's 64 tick , 22, tick and 12 tick servers)
Only then it is determined if the client hit the shot or not. "

So you won't see the effective game world update state until your shot travels to the server, then goes back from the current server time (the time the trigger package was received), by your ping (???ms) plus your interpolation time (15.6ms itself, or much more on a lot of games). That goes for everything you do, and everything everyone else is doing.
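The arithmetic behind that quote is simple enough to sanity-check (a sketch; the two-tick interpolation window is the usual Source-engine-style default and is my assumption - games configure this differently):

Code:
# Sketch of the netcode delay math quoted above. Assumes the common two-tick
# interpolation window (a Source-engine-style default); games vary.

def interp_delay_ms(tickrate: float, interp_ticks: int = 2) -> float:
    return 1000.0 * interp_ticks / tickrate

def hit_confirm_delay_ms(ping_ms: float, tickrate: float) -> float:
    """Rough time before the authoritative result of your shot is reflected back to you."""
    return ping_ms + interp_delay_ms(tickrate)

print(interp_delay_ms(128))          # ~15.6 ms, the figure quoted above
print(interp_delay_ms(64))           # ~31.3 ms on a 64-tick server
print(hit_confirm_delay_ms(30, 64))  # ~61 ms with a 30 ms ping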
 
Yeah it's a nice trick and the only way I can bear to play with a mouse at 60hz on PC these days. But it's no substitute for VRR of course.

I haven't measured it but it does have very minimal lag. It's easy to toggle RTSS on and off and see for yourself, especially at 60hz where v-sync adds so much damn lag. There's some minuscule micro-stutter that happens once in a while too, which might be noticeable sometimes. It's not much but the smoothness and consistency of proper VRR is still on another level.
 
I wasn't trying to discredit anything. When I see "most" I want to see something that backs up that qualifier.

Reading through both articles confirmed what I was saying. You said nothing about the V-Sync buffer queue in the post I replied to, you said that you don't get the input lag penalty from V-Sync:

The input lag from the buffer is still there, but the FPS limiter eliminates the added latency from the queue backing up which causes the game you're running to stall and judder. Adding the FPS limiter ensures that the buffer is constantly being cleared on every other refresh cycle without the queue backing up, preventing the game from stalling.

Using the RTSS limiter gives you minimal V-Sync input lag.

Most of the input lag of V-Sync is the backed-up queue, which only clears out when it is completely full and forced to clear. That is why traditional V-Sync ON has such horrid input lag. Hardly anything is absolute. I'll change it to "you don't get most of the input lag penalty from V-Sync ON". It still doesn't change anything; the method I described is still the best configuration for BFI on the 48CX. Blur Busters calls it "low-lag V-Sync ON" for a reason: it's low lag. Everything adds some input lag, even VRR. Scan-line sync, by comparison, produced quite poor results in my testing.
 
Pointing back to my other post again, where I was showing quotes about how BFI isn't tied to the frame rate like LCD backlight strobing is. Since BFI can blank per pixel, it can "strobe" at a rate independent of the refresh rate, and even fractionally at the scan-line level (a "rolling scan").

per Blur Busters' site and forums, from Mark R.:


" , strobing on most OLEDs are almost always rolling-scan strobe (some exceptions apply, as some panels are designed differently OLED transistors can be preconfigured in scanout refresh, and then a illumination voltage does a global illumination at the end). "

"rolling strobe on OLED can be fractional refreshes, so OLED BFI can actually be arbitrary lengths unrelated to refresh cycle length. Since the off-pass can chase behind the on-pass simultaneously on the same screen at an arbitrary distance"

"Black duty cycle is independent of refresh rate. However, percentage of black duty cycle is directly proportional to blur reduction (at the same (any) refresh rate). i.e. 75% of the time black = 75% blur reduction. Or from the visible frame perspective: Twice as long frame visibility translates to twice the motion blur.

Does not matter if full strobe or rolling scan, as it is per-pixel duty cycle. "

" That said, non-global illumination can cause artifacts (e.g. skewing during scan of any CRT, OLED (including strobed and nonstrobed rolling scans) or nonstrobed LCD "


What settings are you using exactly? I'm assuming you mean you are playing a non-demanding game, or stripping a game's settings bare, at least to the point where you can get a 120fps-or-better minimum - so that your frame rate isn't varying. While the BFI is "independent" of the refresh rate / frame rate, in that case the fixed BFI rate would always be acting in concert with a now-solid frame rate. Also, 120fps on a 120Hz monitor already cuts the blur by 50% compared to 60fps at 60Hz even before BFI is added, so that could be a big difference as a starting point itself.

What happened when you were using variable frame rates / non-120fps minimums? Were you getting artifacts? What made your results "poor" ?
 
https://www.theverge.com/2020/6/30/21308026/vizio-2021-p-series-quantum-x-m-v-4k-hdr-tv-prices-specs

Looks like Vizio is being aggressive on price.

Launching in a nebulous "fall"


Even though it’s coming later, you might be curious about how Vizio is pricing its OLED compared to LG. The 65-inch model will run $1,999.99, with the 55-inch set priced at $1,299.99. So it’s fair to say the company is being aggressive in trying to lure shoppers away from the OLED leader.

The OLED, P-Series Quantum X, and P-Series Quantum support variable refresh rate from 48 to 120Hz at up to 4K, making them excellent gaming choices since they’ll avoid screen tearing. They run what Vizio refers to as its ProGaming Engine. And the entirety of Vizio’s new lineup supports HDMI 2.1 on every HDMI port, so you’ll have the latest tech like eARC no matter if you’re buying a top-end model or something like the M-Series Quantum. They’re pretty well future proofed.


Full 48Gbps too.

 
https://www.theverge.com/2020/6/30/21308026/vizio-2021-p-series-quantum-x-m-v-4k-hdr-tv-prices-specs

Looks like Vizio is being aggressive on price.

Launching in a nebulous "fall"


Even though it’s coming later, you might be curious about how Vizio is pricing its OLED compared to LG. The 65-inch model will run $1,999.99, with the 55-inch set priced at $1,299.99. So it’s fair to say the company is being aggressive in trying to lure shoppers away from the OLED leader.

The OLED, P-Series Quantum X, and P-Series Quantum support variable refresh rate from 48 to 120Hz at up to 4K, making them excellent gaming choices since they’ll avoid screen tearing. They run what Vizio refers to as its ProGaming Engine. And the entirety of Vizio’s new lineup supports HDMI 2.1 on every HDMI port, so you’ll have the latest tech like eARC no matter if you’re buying a top-end model or something like the M-Series Quantum. They’re pretty well future proofed.


Full 48Gbps too.

Wow and VRR too? NICE definitely a nice alternative to the C9 and CX especially if it does not suffer from any of the same issues. Too bad there is no 48" version...
 
My Bestbuy order just shipped 🙂

I emailed the CEO last night and told him about your plight. :D

Got a tracking # from UPS earlier this morning - mine is scheduled to arrive tomorrow. Woo! It's getting spicy around here.

Wow and VRR too? NICE definitely a nice alternative to the C9 and CX especially if it does not suffer from any of the same issues. Too bad there is no 48" version...

Yeah, and I would want to see a full review from rtings. It certainly sounds promising, but that quote didn't have a ton of info. I'd want to see every aspect compared, because it's very possible that each will have pros/cons. LG has had years of OLED manufacturing under its belt. Hopefully Vizio comes through when it comes to features, input lag, and the implementation of their electronics package.
 
Very interested. Vizio went from being a weak TV brand back in the 720p days to a high-quality TV manufacturer in modern times, imo, at least in their P and M series. My living room TV - a VA 4K set with a low number of FALD zones - is a Vizio, and I've seen others. TCL is pretty decent now too, and makes practically everything in their TVs themselves without outsourcing. TCL was even subcontracted to supply Samsung panels before, if I remember correctly.


https://www.cnet.com/news/vizios-oled-tvs-are-coming-this-fall-starting-at-1300/
"Vizio will compete against established OLED names LG and Sony, as well as newcomers Konka and Skyworth, which both also announced OLED TVs for the US at CES 2020. Vizio sells more TVs than any of those brands, however -- its market share is No. 3 in the US after Samsung and TCL, neither of which sell OLED TVs (yet). Vizio has a reputation for both cheap TVs and, at the higher end, models that deliver excellent picture quality for the money, like the 2019 M-Series and P-Series Quantum X. In any case Vizio's entry into the OLED game, and the inevitable competition between it and LG, is a very good thing for high-end TV buyers."


We saw Vizio's OLED TV in person at its CES 2020 suite, and at first glance it looked as good as expected from any OLED TV. Vizio's representatives wouldn't confirm to CNET that its OLED panel will be sourced from LG Display, but did coyly mention the fact that LGD is the only company producing large OLED panels. Both LG and Sony use LG Display panels, as does Skyworth.

We're looking forward to comparing Vizio's OLED TVs to LG's 2020 models like the CX series, and also to the inevitable Black Friday 2020 price drops.
 