LG 48CX

I think the "low" setting will be my set-and-forget choice, and I believe it will probably be ideal for most people (just turn the brightness up a few notches and you're nearly equal to "off", since you're not going to use 100% brightness with it off anyway), though it's good if "auto" jumps between low and medium depending on the occasion. I still expect a big difference in motion clarity at "low" versus off. But I'm specifically a motion clarity enthusiast who didn't jump onto LCDs until 120Hz arrived, and on my current 240Hz monitor I keep strobing on at all times because it improves motion smoothness so much (for example, 144Hz with strobing looked noticeably smoother to me than 240Hz with strobing off).

If the generalization holds that the % brightness BFI gives up is in the neighborhood of the % it reduces sample-and-hold blur, then on Low you're only lowering blur by 15%. Running a solid 100fps at 100Hz already gives you 40% blur reduction compared to a 60fps-Hz baseline, and in addition shows 5 frames of motion definition, path articulation, and smoothness for every 3 at a solid 60fps-Hz. If you ran a 120Hz frame rate graph of something like 90 - 120 - 130 (capped at 115), you'd get around 35% to nearly 50% blur reduction across the varying graph and up to nearly double the frames of motion definition shown -- without suffering the PWM effect of BFI or any brightness loss (including for HDR content).
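To make that concrete, here's a quick back-of-the-envelope sketch (assuming, as in the generalization above, that sample-and-hold blur scales directly with how long each frame stays visible; illustrative numbers only, not measurements of the CX):

```python
# Sample-and-hold blur is proportional to frame visibility time, so blur
# reduction vs. a solid 60fps-Hz baseline is simply 1 - (60 / fps).

def blur_reduction_vs_60(fps_hz):
    """Fractional blur reduction relative to a solid 60fps at 60Hz (no BFI)."""
    return 1.0 - 60.0 / fps_hz

for fps in (90, 100, 115, 120):
    print(f"{fps:>3} fps-Hz solid: {blur_reduction_vs_60(fps):.0%} less blur than 60fps-Hz")
# 90 -> ~33%, 100 -> 40%, 115 -> ~48%, 120 -> 50%, which is where the
# "around 35% to nearly 50% across the graph" range comes from.
```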

In my opinion, from the details I've read, on Low you aren't gaining much and on Medium you are losing too much. I'm not sure whether interpolation is being used to keep the frame rate / Hz and the BFI rate higher (and perhaps contributing to the 22ms of input lag). Otherwise, if the BFI frequency is tied to your raw variable frame rate, the low end of your graph is going to be even worse and you'd be experiencing variable PWM. The FlatpanelsHD review said the High BFI setting produces visible flicker, but don't forget that the other settings are still flickering too, just not as much. Most people avoid PWM on monitors altogether because it causes eye strain over time.

Perhaps for some frame-rate-capped indie/arcade/console games the Low setting (perhaps with interpolation) might be worth trying out for modest sessions, since those games can't get any benefit from higher fps+Hz in the first place.
 
Not necessarily. At least on my LG C9 I could not get it to output anything more than the 16:9 resolutions it supports out of the box. It just gives me garbled or no image if I try to run for example 3840x1600 using the TV's scaler. Only way to do that is to use GPU scaling, which with HDMI 2.0 limits you to 60 Hz (or probably 120 Hz 8-bit 4:2:0 on the CX) since the GPU upscales the image to 4K.

If anything, running a lower resolution that cuts some of the screen away at the top and bottom should free up more display bandwidth.

I just set up a 3840x1600 custom resolution on my TCL S405 4K 60Hz screen from a 1080 Ti over HDMI 2.0 with no problem using the Nvidia control panel. The size and position defaults were "Select scaling mode: Aspect ratio" and "Perform scaling on: Display", but I switched to "Select scaling mode: No scaling". It works perfectly with a black bar at the top and bottom.

All monitors I have ever tried can do it; you simply get black bars. Normally it doesn't make sense (unless a specific game lets you see more at 21:9 and you want an advantage), but on such a big screen, with bandwidth constraints on current GPUs, it could be worth it. Maybe even permanently on a shallow desk or in a depth-limited space.

I have zero problem with black bars, especially once I get a 48" OLED where BLACK = OFF pixel emitter. If I was more picky about it I would just paint my wall black or use a black backdrop.

You still end up with a pretty huge screen at 17.4" tall and 41.8" wide.

I could also sit closer for 21:10 gameplay, which would make the screen even larger from my perspective -- especially if I'm playing a racing/offroad game for immersion, where I wouldn't have the content at the sides in focus without moving my head anyway.

-------------------------------------------------


A 48" display is about 23.5" tall; 2160px divided by 23.5" = ~91.9 pixels per inch (or just figure it by the fraction of the vertical pixel count in use).

So, if my calculations are correct, 1600px of height works out to ~17.4" tall letterboxed, leaving roughly 3" bars at the top and bottom.

- A 48" 16:9 TV displaying 21:10 letterboxed 3840x1600 content 1:1 would give a ~17.4" tall viewable screen out of the original 23.5".

- That 17.4" tall letterboxed 1600p 21:10 mode on the 48" is about the same height as a 35.7" 16:9 screen, but the 48" screen is 10.8" wider (adding 5.4" to each side of a 35.7" screen).

- A 40" 16:9 monitor's regular display is around 19.6" tall.

- A 43" 16:9 monitor's regular display is around 21.1" tall. The bars on my 43" screen at 21:10 end up being 2 7/8" tall top and bottom (measured with a tape measure), which is in the ballpark of my per-pixel height calculations.

- My 32" 16:9 is around 15.7" tall, so the 48" in 21:10 would be just under 2" taller (and about 14" wider, i.e. +7" per side).
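For anyone who wants to sanity-check those numbers, here's the same math as a quick sketch (assuming a 16:9 4K panel and a 1:1 3840x1600 letterbox; the 43" and 48" diagonals are just example inputs):

```python
# Pixel pitch from the panel's physical height, then the viewable strip and
# bar heights for a 3840x1600 mode shown 1:1 on a 16:9 4K panel.
import math

def panel_dims(diagonal_in, ar_w=16, ar_h=9):
    d = math.hypot(ar_w, ar_h)
    return diagonal_in * ar_w / d, diagonal_in * ar_h / d   # width, height in inches

def letterbox(diagonal_in, active_rows=1600, panel_rows=2160):
    width_in, height_in = panel_dims(diagonal_in)
    ppi = panel_rows / height_in
    return width_in, height_in, active_rows / ppi, (panel_rows - active_rows) / 2 / ppi

for diag in (43, 48):
    w, h, strip, bar = letterbox(diag)
    print(f'{diag}": {w:.1f}" x {h:.1f}" -> 1600-row strip {strip:.1f}" tall, {bar:.2f}" bars')
# 48": ~41.8" x 23.5" -> strip ~17.4", bars ~3.0" each
# 43": ~37.5" x 21.1" -> strip ~15.6", bars ~2.7" each (close to the measured 2 7/8")
```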
 
I hadn't tried 1440@120 since the first week or so that I owned it and the lower resolution really bothered me back then. I played some Division 2 and Last Oasis.

Division 2 didn't look great at first and the lower resolution was noticeable. The smoothness was also noticeable. After spending 2 minutes in nvidia freestyle tuning the filter for some sharpness though, I couldn't tell that I was running at the lower resolution. It looks fantastic.

Freestyle doesn't support Last Oasis. I don't think things looked as bad as Division 2 did initially but it does definitely suffer from lack of sharpness/resolution. I think I prefer my usual 3840x1600@60 in this game for now.

Until we get our 3080 Tis, I think it'll just be a matter of trying a few different resolution configurations in each game to find which works best. One thing is for sure: 60Hz on an OLED is not the same as 60Hz on an LCD... to me it feels maybe like 90Hz or so, and I only choose that number because it's slightly worse than the 100Hz I was used to playing at on my ultrawide.
 
I think the "low" setting will be my set-and-forget choice, and I believe it will probably be ideal for most people (just turn the brightness up a few notches and you're nearly equal to "off", since you're not going to use 100% brightness with it off anyway), though it's good if "auto" jumps between low and medium depending on the occasion. I still expect a big difference in motion clarity at "low" versus off. But I'm specifically a motion clarity enthusiast who didn't jump onto LCDs until 120Hz arrived, and on my current 240Hz monitor I keep strobing on at all times because it improves motion smoothness so much (for example, 144Hz with strobing looked noticeably smoother to me than 240Hz with strobing off).

I agree with everything you just said. I'm addicted to this 280hz now, Valorant is going to be easy on my 1080TI too!
 
Looks like BFI can be enabled without enabling interpolation?

Any motion tests done for the CX versus CRT and LightBoost-descended LCDs? (Or maybe it's still just too early? Didn't see any.)

Ok with flicker as long as I can't perceive it. My CRT flickers and I have no issues at 85 Hz or above. (Looks like the High setting on the CX, with its reported visible flicker, will not be usable for me though.)

Also ok with a dimmer screen. Would be coming from a CRT in a relatively low ambient light room.
 
You can achieve similar results (plus more) by increasing the frame rate of the content (i.e. 4K120 or higher) but it is nice to have BFI as an option for lower frame rate content, too, in order to replicate more "plasma-like" motion on an OLED panel.

I'm not sure how they tested it at Flatpanelshd in that review. They said it's probably useful for some people for gaming but they didn't go into detail.
FlatpanelsHD saw the 2019 implementation in action and although we did not get a chance to thoroughly examine it, it seems to us that the 2020 implementation sacrifices brightness to a higher degree.

It didn't sound like they had actually tested a game on it at all, let alone one over 60fps-Hz (at 1440p for now). They may have been watching 24fps - 30fps video content with BFI.

-----------------------------------------------------------------------------

I'm not sure how LG BFI on the CX series works in relation to the frame rate of the content fed to it. If it's fed a low frame rate throughout like when viewing film content, console 30fps or 60fps, or the bottom end of a modest average frame rate's graph in pc gaming - I'd assume it would have to use interpolation if the BFI is tied to the framerate. That could be contributing to the high input lag in BFI modes.

Really, the more I think about it, that would have to be true. Interpolation would have to keep a constant frame rate to strobe rate. Otherwise not only would the strobing vary (and go way too slow, bothering your eyes even more aggressively), but the actual brightness of the screen would fluctuate wildly up and down within whichever BFI % reduction mode you were using.. and that is assuming you were running 80 contrast with peak brightness "off" in order to avoid ABL kicking in at all on top of the BFI concerns. From what I've always read, you can't just strobe/black-frame at a constant rate with varying frame rates or you will get a mess. You also can't drop and raise the brightness and change the amount and rate of PWM your eyes are seeing throughout.

https://blurbusters.com/faq/advanced-strobe-crosstalk-faq/

This is not crosstalk but double images/stroboscopic effects. With VRR you would of course be keeping the frame rate and the Hz the same - however, if the BFI were on its own Hz cycle, some ugly things like this could potentially happen at different ratios of frame rate to BFI strobe cycle Hz. But you can't tie BFI to the raw variable frame rate either, because it would vary the brightness and peak color detail all over the place while making the PWM even worse (and it would also vary the % of BFI blur reduction). So the only way I can see of keeping the BFI (and the color detail peaks, brightness, and PWM effect) steady easily is by using interpolation.
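To illustrate what I mean, here's a purely hypothetical sketch - nobody outside LG has documented how the CX actually schedules its rolling black bar - of what would happen if a fixed-length black period rode along with a varying VRR frame rate:

```python
# Hypothetical: a fixed 4ms black period per refresh cycle while VRR lets the
# frame time vary. The lit fraction of each cycle (i.e. perceived brightness
# and the amount of blur reduction) then swings with the frame rate.

BLACK_MS = 4.0  # assumed fixed black period, NOT a known CX figure

def lit_fraction(fps, black_ms=BLACK_MS):
    frame_ms = 1000.0 / fps
    return max(0.0, 1.0 - black_ms / frame_ms)

for fps in (60, 90, 120):
    lit = lit_fraction(fps)
    print(f"{fps:>3} fps: {1000/fps:.1f}ms frames, lit {lit:.0%} of the time "
          f"-> ~{1 - lit:.0%} dimmer / less blur")
# 60 fps ~76% lit, 90 fps ~64% lit, 120 fps ~52% lit -- brightness and the
# effective PWM wander as the frame rate graph wanders, which is why a fixed
# output rate (interpolated if necessary) seems like the only clean option.
```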
vTw91yI.png


120fps+Hz solid with no BFI = 50% blur reduction, and 100fps+Hz solid with no BFI = 40% blur reduction -- and that is without pushing input lag to 22ms, without dropping the nit range of color detail, and without PWM eye fatigue over time.
BFI at lower frame rate ranges also lacks a higher frame rate's motion definition, path articulation, and smoothness (even for whole-viewport movement when mouse-looking, movement keying, or controller panning). However, some content is frame rate capped to begin with (console games, and unfortunately some 60fps-capped PC games).
That is just a baseline comparison. I don't mean to say that you can't make blur reduction gains above your raw 100 or 120 fps+Hz using BFI, just that you can get considerable blur reduction without those major tradeoffs. 15% from BFI gains too little to make suffering the tradeoffs worthwhile imo, and 40% BFI increases the tradeoffs way too much to make the blur reduction gain worth it either.

..normally + medium BFI --> ~190 nit of color detail (320 nit minus 40%) until ABL kicks in, then down to as low as *84 nit (40% strobe effect subtracted from the 140 nit ABL level). 40% blur reduction (compared to 60fps+60Hz but not 120fps+120Hz), PWM fatigue factor for the eyes, 22ms input lag.

..normally + low BFI --> ~320 nit minus 15% ~> 272 nit (15% off of 320.. they quoted a result of 327 nit but said "the numbers don't matter"), until ABL kicks in, then down to as low as 119 nit of color detail (15% strobe effect subtracted from the 140 nit ABL level). 15% blur reduction compared to a solid 60fps@60Hz, but not compared to 100 or 120fps at 120Hz.

..ABL-avoiding 80 contrast, peak brightness "off" settings ~> 250 nit with no ABL triggering
..ABL-avoiding 80 contrast, peak brightness "off" settings + medium BFI ~> 250 nit minus 40%, lowering it to ~150 nit of screen brightness (and the color detail limit in fields of color) throughout. 40% blur reduction compared to 60fps+60Hz, perhaps ~20% vs 120fps+120Hz.. plus PWM and the input lag increase.

..ABL-avoiding 80 contrast, peak brightness "off" ~> 250 nit with no ABL triggering
..ABL-avoiding 80 contrast, peak brightness "off" + low BFI ~> 250 nit minus 15%, lowering it to ~212 nit of screen brightness (and detail in fields of color) throughout. 15% blur reduction compared to a solid 60fps@60Hz, but not compared to a solid 100 or 120fps at 100-120Hz, and still lacking the motion definition of higher frame rates and higher frame rate graphs. Plus the 22ms input lag and PWM eye fatigue.

*That is because, as I understand it, BFI's dimming effect is in how our eyes perceive the screen, meaning LG's ABL burn-in safety tech would not account for the after-BFI effective brightness and would still operate as if BFI weren't a factor.
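Here's the arithmetic behind those nit figures as a small sketch (assuming, per the footnote, that ABL clamps the panel's drive level first and the BFI duty cycle then dims what our eyes perceive; the 320/250/140 nit levels and the 15%/40% cuts are the figures quoted above):

```python
# effective nits = (drive level after any ABL clamp) x (1 - BFI cut)

def effective_nits(base_nits, bfi_cut, abl_limit=None):
    drive = base_nits if abl_limit is None else min(base_nits, abl_limit)
    return drive * (1.0 - bfi_cut)

# Default settings (~320 nit peak, ABL can clamp large bright areas to ~140 nit):
print(effective_nits(320, 0.40))       # medium BFI, no ABL  -> 192
print(effective_nits(320, 0.40, 140))  # medium BFI + ABL    ->  84
print(effective_nits(320, 0.15, 140))  # low BFI + ABL       -> 119

# ABL-avoiding settings (80 contrast, peak brightness "off", ~250 nit):
print(effective_nits(250, 0.40))       # medium BFI -> 150
print(effective_nits(250, 0.15))       # low BFI    -> ~212
```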
 
I think the brightness at Medium BFI would still be much more than I would want. Pretty dim room here.

Sounds like the mystery is how that rolling black bar that creates the BFI effect really works. Does it run at a refresh rate independent of the source? Is there some combination that would make motion sharp, like a solid 120 Hz plus Medium BFI? Or something.

Review made it sound like interpolation could be turned off. Certainly hope that's true during BFI.
 
AU/NZ update:
Emailed LG and asked why it's not on price list;

To address your concern and to set your expectation, at this point there has been no mention of an upcoming release date for the 48CX model in either the AU or NZ market.

As we are constantly updating our inventory and introducing new products, we would encourage you to try contacting us again over the coming months and we will be more than happy to advise you should further information regarding this product come to hand.


Looks like I'll just have to grab one overseas.
 
Does BFI work with any refresh rate? Like, 75hz, 90hz, 101hz, etc? Or do you have to choose either 60 or 120hz?

I think it might be just double that of common refresh rates e.g. 48, 60, 120 Hz maybe.

Motion interpolation at least on the C9 is a separate setting from the "OLED motion" (BFI) toggle though they are in the same menu. The CX has more options for OLED motion to better tailor it.
 
Does BFI work with any refresh rate? Like, 75hz, 90hz, 101hz, etc? Or do you have to choose either 60 or 120hz?

I wanted to know the details of how it works as well, especially with VRR activated and fully working, because your frame rate and Hz would be varying and, for most people, dropping considerably at the low end of a game's graph. A 60-90-130 graph didn't seem like it would play nice with BFI.

It seems like "Chief Blur Buster" Mark R. is saying the Hz/fps is unlinked from the black frame since it's a per-pixel strobe.

"Hz and persistence can be unlinked/unrelated thanks to strobe duty cycle adjustment."

"Does not matter if full strobe or rolling scan, as it is per-pixel duty cycle.
Note: That said, non-global illumination can cause artifacts (e.g. skewing during scan of any CRT, OLED (including strobed and nonstrobed rolling scans) or nonstrobed LCD"

"rolling strobe on OLED can be fractional refreshes, so OLED BFI can actually be arbitrary lengths unrelated to refresh cycle length. Since the off-pass can chase behind the on-pass simultaneously on the same screen at an arbitrary distance"

from blurbusters.com

Strobing on OLEDs sometimes has to behave differently because there's no independent light source separate from the pixel refresh source like there is for LCDs.

As a result, strobing on most OLEDs is almost always a rolling-scan strobe (some exceptions apply, as some panels are designed differently: OLED transistors can be preconfigured in the scanout refresh, and then an illumination voltage does a global illumination at the end).

However, most large OLED panels have to do a rolling-scan strobe for reducing persistence. Also, rolling strobe on OLED can be fractional refreshes, so OLED BFI can actually be arbitrary lengths unrelated to refresh cycle length. Since the off-pass can chase behind the on-pass simultaneously on the same screen at an arbitrary distance, much like it did on the Dell UP3017Q monitor. Like changing the phosphor of a CRT to a shorter or medium persistence (except it’s a squarewave rather than a fade wave), CRT phosphor persistence length being unrelated to refresh cycle length. So BFI isn’t necessarily integer divisors here, and the meaning of “strobing” vs “BFI” is blurred into one meaning.

- An OLED with a 50%:50% on BFI will reduce motion blur by 50% (half original motion blur)
- An OLED with a 25%:75% on BFI will reduce motion blur by 75% (quarter original motion blur)

Typically, most OLED BFI is only in 50% granularity (8ms persistence steps), though the new 2019 LG OLEDs can do BFI in 25% granularity at 60Hz and 50% granularity at 120Hz (4ms persistence steps)

Except for the virtual reality OLEDs (Oculus Rift, 2ms persistence), no OLEDs currently can match the short pulse length of a strobe backlight just yet, though I'd expect that a 2020 or 2021 LG OLED would thus be able to do so. [edit by elvn: they dropped BFI from the 2019 models, but the numbers seem to apply to the 2020 OLEDs as the 15% / 40% figures.]


Black duty cycle is independent of refresh rate. However, percentage of black duty cycle is directly proportional to blur reduction (at the same (any) refresh rate). i.e. 75% of the time black = 75% blur reduction. Or from the visible frame perspective: Twice as long frame visibility translates to twice the motion blur.

Does not matter if full strobe or rolling scan, as it is per-pixel duty cycle.

Note: That said, non-global illumination can cause artifacts (e.g. skewing during scan of any CRT, OLED (including strobed and nonstrobed rolling scans) or nonstrobed LCD -- when viewing http://www.testufo.com/blurtrail will skew in nonlightboost at 32pps -- but stops skewing in lightboost or global strobe. Also, if viewing animation on iPad, rotate display until you see line skew)

Bell curve strobe rather than squarewave strobe can be useful and may look better for some applications other than VR, or slower motion/unsynchronized(VSYNC OFF) motion. As a slight persistence softening can reduce the harshness of microstutters from non-perfect refreshrate-framerate synchronization. But other ultrafast refresh-rate-synchronized motion, minimum motion blur dictates point strobing (as short as possible persistence, which is electronically easier with squarewave...just turn off pixel automatically mid-refresh....independent of refresh rate exact interval....aka Rolling Scan).

There is no such thing as "180Hz internal operation" as Oculus is already 80-90% black duty cycle (2ms persistence, despite 1/90sec = ~11ms refresh cycles) -- it is just a pixel-turnoff time delay, in a rolling scan algorithm. You adjust persistence simply by chasing the off-scan closer to the on-scan. See high speed videos. Screen gets darker the more you shorten a rolling scan OLED persistence, and you may get into color nonlinearities as active matrix transistors turning on/off OLED pixels aren't perfectly squarewave at the sub-millisecond level.

Good VR makes perfect framerate-refreshrate vsyncing mandatory, unfortunately for the Holodeck immersion. No ifs, no buts, no protest; it is just the way the cookie crumbles for VR. In this case, striving for CRT-style curve strobing (instead of squarewave strobing) is rather useless UNLESS you need it to hide imperfections (e.g. flaws in the display) or reduce eyestrain at lower strobe rates.

1ms of persistence translates to 1 pixel of motion blur for every 1000 pixels/second of motion. (We call this the "Blur Busters Law".) For motion on a 4K display going one screen width per second, at 1ms strobe flashes, all 1-pixel fine details get motion-blurred to 4 pixels from the act of eye tracking against the visible part of the refresh cycle.

This can also be observed in action by the motion-induced optical illusion that only appears during eye tracking: http://www.testufo.com/eyetracking

If you have an adjustable-persistence monitor:
You can even see the relationship between persistence and motion blur: use your persistence adjustment (strobe duty cycle adjustment, e.g. ULMB strobe adjustment, BenQ Blur Reduction strobe length, etc.) while watching http://www.testufo.com/blurtrail ... The line blurring exactly doubles (no matter the refresh rate) for double the persistence, and follows the Blur Busters Law (1ms strobe length = 1 pixel of added motion blur during 1000 pixel/sec motion, becoming 2 pixels of blurring at 2ms persistence).

Hz and persistence can be unlinked/unrelated thanks to strobe duty cycle adjustment. But if you wanted to completely eliminate black periods, a 2ms persistence display with no strobing would need to run 500fps@500Hz flicker-free to match LightBoost 2ms or the Oculus 2ms flicker rolling scan. This is because 1 second divided by 2ms equals 500 visible distinct frames needed to fill all the 2ms slots of 1 second to avoid any blackness, while keeping the motion of each frame perfectly in sync with eye movements. Even a 2000fps@2000Hz display (needed for 0.5ms full persistence with zero black cycle) would still have human-visible motion blur in extreme conditions (e.g. fast head turning with 4K or 8K VR while trying to read fine text on a wall). Oculus knows this. Michael Abrash confirms the relationship between persistence and motion blur.
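A tiny worked example of the Blur Busters Law relationship quoted above (persistence in ms times motion speed in px/s, divided by 1000; figures chosen to match the examples in the quote):

```python
def motion_blur_px(persistence_ms, speed_px_per_s):
    """Perceived blur trail in pixels per the Blur Busters Law."""
    return persistence_ms * speed_px_per_s / 1000.0

# Full-persistence sample-and-hold: persistence equals the frame time.
print(motion_blur_px(1000 / 60, 1000))   # ~16.7 px of blur at 60Hz, 1000 px/s
print(motion_blur_px(1000 / 120, 1000))  # ~8.3 px at 120Hz
# Strobed/BFI: persistence is just the lit part of the cycle, independent of Hz.
print(motion_blur_px(1.0, 3840))         # ~3.8 px -- the "one 4K screen width per
                                         # second at 1ms flashes" example above
```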
 
Hi all, new member here. I know no one "knows for sure", but was wondering if anyone had a hunch if and when LG might make an even smaller 43'' OLED? A year or two maybe? I ask (1) because it's the size I prefer and (2) there are a good chunk of 40-43" TVs out there, they just aren't OLED.

Reflection question: I have a house with a ton of windows and skylights, so I bought a pair of Samsung Q80s, which I know the OLED-heads will scoff at, but they still destroy my older displays, and with all of the windows they do an excellent job of keeping decent image quality with little to no reflections.

I plan on putting this "future" 43" (or this 48" if I get impatient) in my office as a monitor, which only has one window. Every video I see on these OLEDs has a massive amount of reflection and glare. I'm wondering why LG hasn't implemented some kind of coating? I realize overdoing it can affect image quality, but isn't there a happy medium on having the screen not look like a mirror? Someone probably knows why they do this?

Thanks in advance all, I didn't read all 18 pages word for word, but skimmed through and have been enjoying this thread.
 
Hi all, new member here. I know no one "knows for sure", but was wondering if anyone had a hunch if and when LG might make an even smaller 43'' OLED? A year or two maybe? I ask (1) because it's the size I prefer and (2) there are a good chunk of 40-43" TVs out there, they just aren't OLED.

Reflection question: I have a house with a ton of windows and skylights, so I bought a pair of Samsung Q80s, which I know the OLED-heads will scoff at, but they still destroy my older displays, and with all of the windows they do an excellent job of keeping decent image quality with little to no reflections.

I plan on putting this "future" 43" (or this 48" if I get impatient) in my office as a monitor, which only has one window. Every video I see on these OLEDs has a massive amount of reflection and glare. I'm wondering why LG hasn't implemented some kind of coating? I realize overdoing it can affect image quality, but isn't there a happy medium on having the screen not look like a mirror? Someone probably knows why they do this?

Thanks in advance all, I didn't read all 18 pages word for word, but skimmed through and have been enjoying this thread.


The screen has an excellent anti-glare coating. The problem is that the low max brightness of OLED means you have trouble making out details in sunlight, so the visible glare overwhelms them.

LED-backlit displays can keep up with that and increase their brightness automatically - you're still getting the ambient sunlight reflected on the screen, but the screen is bright enough to overcome it.

If you crave DIRECT natural light, then you're never going to be able to enjoy OLED. It's the same reason most laptop displays aren't usable in direct sunlight - ambient light overwhelms the visible image.
 
Hi all, new member here. I know no one "knows for sure", but was wondering if anyone had a hunch if and when LG might make an even smaller 43'' OLED? A year or two maybe? I ask (1) because it's the size I prefer and (2) there are a good chunk of 40-43" TVs out there, they just aren't OLED.

Reflection question: I have a house with a ton of windows and skylights, so I bought a pair of Samsung Q80s, which I know the OLED-heads will scoff at, but they still destroy my older displays, and with all of the windows they do an excellent job of keeping decent image quality with little to no reflections.

I plan on putting this "future" 43" (or this 48" if I get impatient) in my office as a monitor, which only has one window. Every video I see on these OLEDs has a massive amount of reflection and glare. I'm wondering why LG hasn't implemented some kind of coating? I realize overdoing it can affect image quality, but isn't there a happy medium on having the screen not look like a mirror? Someone probably knows why they do this?

Thanks in advance all, I didn't read all 18 pages word for word, but skimmed through and have been enjoying this thread.

Probably never. We're lucky we're even getting a 48" version. 48" and 55" cost them nearly the same to make so they wouldn't really be able to sell a smaller size any cheaper than a 48".
People buying high-end TVs typically don't buy them that small. But maybe if they see how many people are using the 48" as a monitor they'll try to make an even smaller screen. Since they would make even less money selling a smaller screen, though, they're going to hope people will be OK with a 48" screen.
So I guess there is some hope. Maybe if they can bring manufacturing costs way down and their margins are high enough to sell smaller screens they'll strip the TV components out and start selling 24" and other normal sized monitors. I doubt it though.
 
The screen has an excellent anti-glare coating. The problem is that the low max brightness of OLED means you have trouble making out details in sunlight, so the visible glare overwhelms them.

LED-backlit displays can keep up with that and increase their brightness automatically - you're still getting the ambient sunlight reflected on the screen, but the screen is bright enough to overcome it.

If you crave DIRECT natural light, then you're never going to be able to enjoy OLED. It's the same reason most laptop displays aren't usable in direct sunlight - ambient light overwhelms the visible image.

Ok, if it has excellent glare/reflection coating, I should be okay in my one window office. I can get dark blinds if needed. Thanks for the response.

I'm still not sure if I'll have the willpower to wait for 43" but if I knew if and when it was coming I probably could :) Wasn't sure if there were any insiders here. I know some people mentioned the market is not as big as it may seem for people using OLEDS as monitors, but I have to think that more people would buy a 43 than a 48 for monitor purposes.
 
It seems like "Chief Blur Buster" Mark R. is saying the Hz/fps is unlinked from the black frame since it's a per-pixel strobe.

Do you have a link to the full article/discussion?

I feel like since most customers don't really care about things like on/off cycles for motion clarity, most manufacturers won't care either. So I'll be sticking with my Diamondtron and Trinitron monitors longer than I thought
 
Could pick up a 55CX for 2000€ right now but apparently we're getting the 48 next month in Europe and the slightly smaller size will definitely make it quite a bit easier to set up for me. Most likely I'll wait but the temptation is real (and my country's lockdown is extended for 1 month minimum and is likely to go on for longer still).

I'm really bummed that my 1080 ti can't do VRR over HDMI nor integer scaling though (the RTX 2080 ti was too small a performance bump for me to care on release). Like, until the next nvidia gen comes out gaming on that TV will imply some very serious compromises.
 
My 55CX was delivered today, needless to say I'm a happy camper here in quarantine! Quick dirty setup pic:

1440p/120Hz and 4k/120Hz Gsync both working w/ a 1.4 AmazonBasics HDMI cable & Strix 2080Ti. Very impressed so far!
Since the C9 apparently supported 100Hz input without issue, I don't suppose you'd be able to confirm/refute if dropping down to 100Hz at 4k frees up enough bandwidth to do 10bit HDR? (this may require creating a custom resolution)
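For reference, a rough bandwidth check suggests 4K100 4:2:0 10-bit is borderline on HDMI 2.0 with standard timing but could fit with a reduced-blanking custom resolution. A sketch assuming the 2080 Ti's HDMI 2.0 link (~14.4 Gbit/s of video payload after 8b/10b encoding) and the common 4400x2250 total 4K timing; the 3920x2222 reduced-blanking total is a made-up illustration, and none of this guarantees the TV accepts the signal:

```python
# Required link rate = total pixels per frame (incl. blanking) x refresh x bits/px.
# HDMI 2.0 carries 18 Gbit/s TMDS; ~14.4 Gbit/s of that is video payload (8b/10b).

HDMI20_PAYLOAD_GBPS = 18 * 8 / 10  # = 14.4

def link_gbps(h_total, v_total, refresh_hz, bits_per_pixel):
    return h_total * v_total * refresh_hz * bits_per_pixel / 1e9

# bits per pixel: 4:2:0 8-bit = 12, 4:2:0 10-bit = 15, 4:4:4 8-bit = 24
print(link_gbps(4400, 2250, 120, 12))  # ~14.3 -> fits: the 4K120 4:2:0 8-bit case
print(link_gbps(4400, 2250, 100, 15))  # ~14.9 -> just over with standard blanking
print(link_gbps(3920, 2222, 100, 15))  # ~13.1 -> fits with tighter custom blanking
```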



I hadn't tried 1440@120 since the first week or so that I owned it and the lower resolution really bothered me back then. I played some Division 2 and Last Oasis.
With TVs, particularly the higher-end kinds, it's super important to make sure the TV is the one doing the upscaling and not the GPU. GPU scaling is almost always just a basic bilinear or bicubic (at least until "integer nearest neighbor" finally became an option), but most non-cheap TVs have much better upscalers (and even cheap TVs can be better; they won't be as fancy but at least can give a result comparable to something like the well-known lanczos3 upscaler).

The one big exception of course is if you want to use integer scaling, in which case that requires GPU scaling.
 
With TVs, particularly the higher-end kinds, it's super important to make sure the TV is the one doing the upscaling and not the GPU. GPU scaling is almost always just a basic bilinear or bicubic (at least until "integer nearest neighbor" finally became an option), but most non-cheap TVs have much better upscalers (and even cheap TVs can be better; they won't be as fancy but at least can give a result comparable to something like the well-known lanczos3 upscaler).

The one big exception of course is if you want to use integer scaling, in which case that requires GPU scaling.

In the past I would have agreed with you, but Nvidia has really upped their game in this regard by adding integer scaling and image sharpening features. Those do a really good job at making lower resolutions look better. At least my LG C9 is not very good at accepting non-16:9 resolutions; maybe with an HDMI 2.1 GPU it will work better, but I tend to get a blank or garbled image with custom resolutions (which means it gets confused, as if it should technically work but doesn't have the bandwidth).
 
PSA, beware of this video on YouTube that was posted in AVS Forum's 2020 OLED owners thread claiming the CX isn't VRR ready for next-gen consoles - I think he fell for Nvidia's marketing:


As I commented on both AVS Forum and the YouTube comments, G-sync compatible over HDMI is literally just Nvidia's implementation of HDMI Forum VRR, and we already know the current-gen Xbox One X supports HDMI Forum VRR on the previous-gen C9 OLED (this is similar to how G-sync compatible over DisplayPort is literally just Nvidia's implementation of VESA's adaptive sync rather than being a separate protocol).

There are only two actual VRR protocols for VRR over HDMI - freesync and HDMI Forum VRR, and the Xbox One X supports both as its VRR support works on both freesync monitors that lack HDMI Forum VRR and TVs supporting HDMI Forum VRR that lack freesync.

Also, LG's own product page for the CX states that freesync support will be coming in a future firmware: https://www.lg.com/us/tvs/lg-oled65cxpua-oled-4k-tv#pdp_spec

For reference, DisplayPort also only has two different protocols - G-sync ultimate (the type with the actual G-sync module) and adaptive sync; freesync over DisplayPort is just AMD's implementation of adaptive sync while G-sync compatible over DisplayPort is just Nvidia's implementation of adaptive sync.

Oh and BTW, there's no such thing as "freesync ultimate" - there's only Freesync (no LFC, no HDR), Freesync Premium (has LFC, no HDR), and Freesync Premium Pro (has LFC, has HDR). And because AMD's implementation of LFC is driver-side, this means that even consoles can implement it as all that's needed is that the display's VRR range has a maximum that is more than 2x that of the minimum, so LG's 40-120Hz VRR range (even on the C9 at 1440p) is already LFC-capable.
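The LFC logic itself is simple enough to sketch (assuming the 40-120Hz window mentioned above; the frame-repeat factors below are just an illustration of how driver-side LFC generally works, not anything console- or LG-specific):

```python
def lfc_capable(vrr_min_hz, vrr_max_hz):
    # LFC needs the max refresh to be more than 2x the min so that
    # low frame rates can always be multiplied back into the window.
    return vrr_max_hz > 2 * vrr_min_hz

def lfc_multiplier(fps, vrr_min_hz=40, vrr_max_hz=120):
    """Smallest frame-repeat factor that lifts fps back inside the VRR window."""
    n = 1
    while fps * n < vrr_min_hz:
        n += 1
    assert fps * n <= vrr_max_hz, "window too narrow for LFC at this frame rate"
    return n

print(lfc_capable(40, 120))   # True (3:1 ratio)
print(lfc_multiplier(25))     # 2 -> panel refreshes at 50Hz, each frame shown twice
print(lfc_multiplier(15))     # 3 -> panel refreshes at 45Hz
```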


In the past I would have agreed with you, but Nvidia has really upped their game in this regard by adding integer scaling and image sharpening features.
This is of course true with regards to integer scaling as I mentioned, but in terms of sharpening the TV should have several such options as well, since TVs have had post-process sharpening for over a decade now.

At least my LG C9 is not very good at accepting non-16:9 resolutions; maybe with an HDMI 2.1 GPU it will work better, but I tend to get a blank or garbled image with custom resolutions (which means it gets confused, as if it should technically work but doesn't have the bandwidth).
TVs are much more picky with input resolutions, to the point that you can almost think of it as a whitelist - if it's not something they expected (such as 3840x1600) or isn't something very close to an existing known resolution or refresh rate (like 1920x1081 @ 61Hz), then it almost certainly won't work. This is very likely what you're seeing with the garbled image rather than something due to bandwidth constraints, and is also very likely why non-16:9 resolutions do not work (though 4:3 resolutions should probably work, at least at lower resolutions like 1024x768, as that is an extremely standard resolution).

So my comment about using the TV's upscaling was more about for 2560x1440 since the likes of 1920x1080 can be integer scaled and 3840x1600 would not need any sort of upscaling anyway.
 
I can run 3840x1600 resolutions on my 43" 4K TVs at 60Hz (they are 60Hz TVs) over HDMI 2.0 from my 1080 Ti, but I was purposefully running it 1:1 with bars, with scaling turned off in the Nvidia control panel. All I had to do was add 3840x1600 as a custom resolution in the Nvidia display/driver panel. I didn't have to manually tinker with any timings at all.

---------------------------------------

Nvidia, AMD, and Oculus are all working on "machine learning" assisted upscaling of lower resolutions. The aim is to run a slightly lower resolution on a high resolution monitor and rely on AI upscaling and sharpening to bring the image back to near-native-resolution fidelity without artifacts - and with a large boost in frame rates. This would have a big effect on regular gaming with 4K and later 8K screens, but it is probably even more crucial for VR since it has a resolution PER EYE (for example, the 4K per eye on a Pimax "8K" X and higher might be more common in the next generation of VR)... and also especially for standalone un-tethered VR running games and VR experiences off of weaker phone chips.

NVIDIA DLSS 2.0 Tested - Too Good to be True!? | The Tech Chap (Youtube Video)

That seems like a decent cheat for the near future until they hopefully start making much better interpolation technology to multiply raw frame rates.. some VR already adds forms of interpolation to gameplay, which you could argue is ahead of what flat-screen gaming has so far in that facet. I think raw interpolation duplicating frames would be a lot better than the dynamic resolution done by consoles and some PC games, or checkerboarding. Perhaps you could combine machine learning upscaling/DLSS with dynamic resolution though. It would be good if you could set the resolution and fps ranges yourself, however. EDIT - re-watched the video and it looks like you can indeed select between a number of actual resolutions and effective resolutions with DLSS.


LUNgVHq.png

U7q4vbz.png

Q4t9S9e.png
 
Could pick up a 55CX for 2000€ right now but apparently we're getting the 48 next month in Europe and the slightly smaller size will definitely make it quite a bit easier to set up for me. Most likely I'll wait but the temptation is real (and my country's lockdown is extended for 1 month minimum and is likely to go on for longer still).

I'm really bummed that my 1080 ti can't do VRR over HDMI nor integer scaling though (the RTX 2080 ti was too small a performance bump for me to care on release). Like, until the next nvidia gen comes out gaming on that TV will imply some very serious compromises.

Same boat here; waiting to upgrade my b6p until the 48cx and 3000 series hit. Then comes the fun part: rewiring.
 
I'll grab the 48CX and use a 1080 Ti for now. Even 4K @ 60 will look pretty great while we wait for the next graphics card gen.
 
A 1080Ti has HDMI 2.0 as well, so it too can output 4k 120Hz 4:2:0 8bit.

Unless you mean having the actual GPU grunt to run games themselves at 4k 120fps?
 
I picked up the 55" CX a few weeks ago from Nebraska Furniture Mart. My thought process was to get the 55" CX to replace my 55" LCD that sits next to my Acer XB271HU gaming display and then decide whether I wanted to use the CX as a gaming display. I watch movies, shows and surf the web on the 55" LCD and game on the Acer and there is no way I could get a new gaming display that is OLED but still watch movies on the 55" LCD! So I would have to replace them both!

Today I mirrored the two displays at 1440p@120Hz on my GTX 1080 using the only game that I play right now: World of Warcraft Classic.

On first load of the game, with the picture mode set to Auto Power Save, I can see that it is doing some sort of edge processing that really brings out every line in the game:

20200420_182133_HDR_copy.jpg


I changed the mode to HDR Effect and I do notice the colors get more pronounced, and it still has the edge processing. Not sure I like the edge processing. The sun outside got brighter for a bit so you can see the window reflection, but this is only visible stepping back to where I took the picture.

20200420_190111_HDR_copy.jpg


Here is game mode, which I assume has processing off.

20200420_190144_HDR_copy.jpg


Here is another game mode shot in the tunnel leading out of Orgrimmar. It looks much better in person than in the picture.

20200420_191742_HDR_copy.jpg


So my first impressions are that, without a doubt, colors are no longer washed out and black is black! As for burn-in, *shrug*, just like everyone else I am sick of LCDs, so I am tempted to see how this goes. But I can honestly say that even for an ancient game like WoW Classic, don't buy an LCD if you have the room for one of these OLEDs and you are willing to risk the burn-in.
 
Machine learning upscaling ~ DLSS 2.0, etc. should help in the future, but otherwise it'll always look muddier, image-clarity-wise, when not running native resolution or higher. A similar result to what it looks like when you make a 480p movie fullscreen, sort of. You can try to increase sharpness and other image filters in Nvidia Freestyle to compensate on supported games, but native resolution is always best. I'm saying that because any artifacts you experienced could be worsened running a non-native resolution. Personally I would always run game mode to avoid extra processing and input lag anyway. I haven't played WoW for years, but I would be curious whether you would be capable of running WoW and other games at 3840x1600 or 3840x1200 letterboxed, and at what max Hz.

Regarding burn-in... if you are running the OLED at defaults it will have ABL kicking in, which drops the peak color brightness to protect it from burning in. HDR is where the highest peak brightnesses happen, but even then it is only a small percentage of the screen for a limited time, and usually in a moving location. While plugged in but powered down in standby, newer OLED TVs also run a maintenance program to even out the wear of the OLEDs. The %-of-screen nit limits and the ABL on OLED keep it from having as bright a color volume as a 1000nit+ FALD LCD TV, but those limitations are what protect it from most of the chance of getting burn-in. I'm still going to run mine on an all-black desktop with no taskbar or icons and limit it to games and fullscreen videos though. So it will be a media/gaming "stage" and I'll have other monitors for all of my desktop/app stuff. I realize WoW's action bars and other HUD elements are a desktop/taskbar of their own, but at least outside of the game I would have an all-black screen. I'm pretty sure there are mods for WoW to have custom action bars that let you set the opacity of each as well as move them around, while also having the option of hiding the stock action bar. You can also change the opacity and timeout of the chat windows, I think.
 
I picked up the 55" CX a few weeks ago from Nebraska Furniture Mart. My thought process was to get the 55" CX to replace my 55" LCD that sits next to my Acer XB271HU gaming display and then decide whether I wanted to use the CX as a gaming display. I watch movies, shows and surf the web on the 55" LCD and game on the Acer and there is no way I could get a new gaming display that is OLED but still watch movies on the 55" LCD! So I would have to replace them both!

Today I mirrored the two displays at 1440p@120Hz on my GTX 1080 using the only game that I play right now: World of Warcraft Classic.

On first load of the game, with the picture mode set to Auto Power Save, I can see that it is doing some sort of edge processing that really brings out every line in the game:

View attachment 239268

I changed the mode to HDR Effect and I do notice the colors get more pronounced, and it still has the edge processing. Not sure I like the edge processing. The sun outside got brighter for a bit so you can see the window reflection, but this is only visible stepping back to where I took the picture.

View attachment 239271

Here is game mode, which I assume has processing off.

View attachment 239272

Here is another game mode shot in the tunnel leading out of Orgrimmar. It looks much better in person than in the picture.

View attachment 239274

So my first impressions are that, without a doubt, colors are no longer washed out and black is black! As for burn-in, *shrug*, just like everyone else I am sick of LCDs, so I am tempted to see how this goes. But I can honestly say that even for an ancient game like WoW Classic, don't buy an LCD if you have the room for one of these OLEDs and you are willing to risk the burn-in.

You can actually use any of the modes and have "instant game response" on to have it perform like game mode. Any sharpening is most likely other settings you can tweak per preset. I run mine on the ISF dark or bright room preset regardless of source, because you can tweak them per source; so if I watch Netflix the presets are different than if I use my PS4 or PC. For PC use, whether the input is labeled PC (set behind the Home button) or something else matters. Some reported banding in PC mode on the C9, but I don't know if that is fixed. Text rendering is better in PC mode without question.
 
How's motion clarity?

Without using BFI you still won't have perfect motion clarity. Even 120 Hz is not enough for that but you pretty much don't have any response time related additional motion blur. But honestly most people are going to be perfectly fine with the motion clarity of OLED regardless of refresh rate.
 
The following video of BFI on a 60Hz input (albeit with 30fps content), recorded at 240fps, was posted on AVS Forum. Due to the amount of flickering it's not wise to watch the entire video; it may be more useful to step frame by frame (either use the , and . keys or just download the video with youtube-dl) or simply set the playback speed to 0.25x. Also note that timestamps for the various BFI modes are linked in the video description:




Today I mirrored the two displays at 1440p@120Hz on my GTX 1080 using the only game that I play right now: World of Warcraft Classic.

Are these photos of GPU scaling enabled or disabled?

Also regarding the sharpening, that stuff should be modifiable in the TV's input settings where you can configure all of the various options like "OLED light", "contrast", "color", etc.

Oh and um, I've never played nor looked into WoW Classic so forgive me, but is 4k too much for a GTX 1080 non-Ti on that game?
 
You can actually use any of the modes and have "instant game response" on to have it perform like game mode. Any sharpening is most likely other settings you can tweak per preset. I run mine on the ISF dark or bright room preset regardless of source, because you can tweak them per source; so if I watch Netflix the presets are different than if I use my PS4 or PC. For PC use, whether the input is labeled PC (set behind the Home button) or something else matters. Some reported banding in PC mode on the C9, but I don't know if that is fixed. Text rendering is better in PC mode without question.
I do not see any banding or text issues but I am not running the CX as my main display yet. It is mostly connected to my second computer that has a GTX 680 pushing 1080p. I did stream Star Trek Picard using the TV App and that was amazing quality if I ignore the video compression artifacts.
 
Are these photos of GPU scaling enabled or disabled?

Also regarding the sharpening, that stuff should be modifiable in the TV's input settings where you can configure all of the various options like "OLED light", "contrast", "color", etc.

Oh and um, I've never played nor looked into WoW Classic so forgive me, but is 4k too much for a GTX 1080 non-Ti on that game?
Pictures were taken with the default driver settings, since I have never changed that section. It looks like it is set to perform scaling on the display. I haven't tested 4K in WoW Classic but I expect it to run just fine unless you are in a populated city or raiding.

My uplift desk isn't deep enough to use the 55" for gaming so I need to figure out a hack to game on this thing before I decide to make it my permanent display, which means buying a display arm or a floor stand.

Last night I did some Scarlet Monastery dungeon runs in mirror mode while comparing the picture quality between the two displays, and every single time I came to the same conclusion: the CX produces a better picture in every way, even at 1440p 120Hz upscaled by the display. Here is another picture that I took:
20200420_193337_HDR_copy.jpg


If this display looks this much better on a game from 2004, I can only imagine how great it is on a new game.
 
Machine learning upscaling ~ DLSS 2.0, etc. should help in the future, but otherwise it'll always look muddier, image-clarity-wise, when not running native resolution or higher. A similar result to what it looks like when you make a 480p movie fullscreen, sort of. You can try to increase sharpness and other image filters in Nvidia Freestyle to compensate on supported games, but native resolution is always best. I'm saying that because any artifacts you experienced could be worsened running a non-native resolution. Personally I would always run game mode to avoid extra processing and input lag anyway. I haven't played WoW for years, but I would be curious whether you would be capable of running WoW and other games at 3840x1600 or 3840x1200 letterboxed, and at what max Hz.

Regarding burn-in... if you are running the OLED at defaults it will have ABL kicking in, which drops the peak color brightness to protect it from burning in. HDR is where the highest peak brightnesses happen, but even then it is only a small percentage of the screen for a limited time, and usually in a moving location. While plugged in but powered down in standby, newer OLED TVs also run a maintenance program to even out the wear of the OLEDs. The %-of-screen nit limits and the ABL on OLED keep it from having as bright a color volume as a 1000nit+ FALD LCD TV, but those limitations are what protect it from most of the chance of getting burn-in. I'm still going to run mine on an all-black desktop with no taskbar or icons and limit it to games and fullscreen videos though. So it will be a media/gaming "stage" and I'll have other monitors for all of my desktop/app stuff. I realize WoW's action bars and other HUD elements are a desktop/taskbar of their own, but at least outside of the game I would have an all-black screen. I'm pretty sure there are mods for WoW to have custom action bars that let you set the opacity of each as well as move them around, while also having the option of hiding the stock action bar. You can also change the opacity and timeout of the chat windows, I think.
If I make this my main display for WoW, I won't be running any mods to minimize the burn-in. If it destroys the display then so be it; I'll enjoy it while it lasts, or I will prove that burn-in is overrated. lol.

I was only running at 1440p for the mirror mode to work. A GTX 3000 is in my future!
 
Just saying you can completely replace the default action bar with its useless ornamental framing and tweak a few other things if you wanted to... most games don't have that kind of customization available, so I won't be able to do anything like that in most of my games anyway.


The things that keep OLED from having 1000nit-plus HDR color volumes, combined with ABL and the wear-evening maintenance done in downtime, are the same things that should keep burn-in from occurring. WoW in particular is known for marathon gameplay sessions though, so who knows if that would increase the risk slightly as far as HUD elements go.
 
Nice pics! Thx. I'd crank the color profile to "warm", but that's just me; it seems a bit strong on the blue.

CX looks like the best option available for both console and pc.
 
Nice pics! Thx. I'd crank the color profile to "warm", but that's just me; it seems a bit strong on the blue.

It does that in pictures for some reason. I noticed the same thing with mine...blue looks very saturated in pics but nothing like that IRL.
 
It does that in pictures for some reason. I noticed the same thing with mine...blue looks very saturated in pics but nothing like that IRL.
Might be the color filter and/or the way light is emitted from the display that messes with cameras.
 