LG 48CX

This is being posted in relation to the C9, E9 and CX OLEDs and HDR display tech in general - not trying to continue arguments about the AW55's value, limitations, or timeliness. It is what it is.

This HDR temperature mapping is pretty cool to watch.

The scene content is broken down as follows:

under 100 nits = grayscale
100 nits+ = pure green
200 nits = yellow
400 nits = orange
800 nits = red
1600 nits = pink
4000 nits+ = pure white
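
For anyone curious how a heat map like that works under the hood, here's a minimal sketch (my own illustration, not HDTVTest's actual tooling) that buckets a per-pixel luminance value into the false-color bands listed above:

# Hypothetical sketch: bucket a pixel's luminance (in nits) into the false-color
# bands used by the heat map. Thresholds are the ones listed above.
def heatmap_color(nits):
    if nits < 100:  return "grayscale"
    if nits < 200:  return "green"
    if nits < 400:  return "yellow"
    if nits < 800:  return "orange"
    if nits < 1600: return "red"
    if nits < 4000: return "pink"
    return "white"

print(heatmap_color(150))   # green
print(heatmap_color(4500))  # white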

[HDR] Star Wars: The Rise of Skywalker 4K Blu-ray HDR Analysis - HDTVTest
 
Displays always run at 100% brightness when in HDR mode. While the brightness setting certainly works differently in this mode, it could still be more likely to trigger the automatic brightness limiter (ABL) on the LG if you have, for example, a browser window open. So it's better to run in SDR mode with brightness reduced and activate HDR mode as needed.
Enabling HDR technically enables a wide color space and unlocks a much greater range of luminance levels.
If Windows, via the driver/GPU, sends a signal to display a fullscreen image with luminance 50 50 50, it will (should) produce the same result regardless of SDR or HDR-to-SDR mapping mode (assuming the HDR-to-SDR mapping is accurate and not broken - it looks accurate to the naked eye on Windows).
It seems physically impossible to me that an OLED display would experience different stress when the diodes are emitting exactly the same light. Just because a luminance level is unlocked doesn't mean it is used.

In HDR, if a luminance greater than the scope of SDR is sent, it will be brighter, but the display should never just show 100% brightness of its own accord - that would break HDR and would simply be SDR with higher brightness.
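
To put the "same signal, same emitted light" argument in concrete terms, here is a minimal sketch assuming an SDR reference white of 100 nits with a simple 2.2 gamma, and the standard SMPTE ST 2084 (PQ) curve for HDR; real display pipelines are more involved, but the point is that a 50 nit patch is a 50 nit patch either way:

# Assumed: SDR reference white of 100 nits, simple gamma 2.2, SMPTE ST 2084 (PQ).
SDR_WHITE = 100.0
m1, m2 = 2610/16384, 2523/4096*128
c1, c2, c3 = 3424/4096, 2413/4096*32, 2392/4096*32

def sdr_to_nits(code):            # SDR signal in [0,1] -> emitted nits
    return SDR_WHITE * code**2.2

def pq_encode(nits):              # inverse EOTF: nits -> PQ signal in [0,1]
    y = (nits/10000.0)**m1
    return ((c1 + c2*y) / (1 + c3*y))**m2

def pq_to_nits(code):             # ST 2084 EOTF: PQ signal -> emitted nits
    p = code**(1/m2)
    return 10000.0 * (max(p - c1, 0) / (c2 - c3*p))**(1/m1)

target = 50.0                               # nits the source wants on screen
sdr_code = (target/SDR_WHITE)**(1/2.2)
print(sdr_to_nits(sdr_code))                # ~50.0 nits via the SDR path
print(pq_to_nits(pq_encode(target)))        # ~50.0 nits via the HDR path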
 
Displays don't really run at "100% brightness" in HDR mode. All you need to do to confirm this for yourself is go into Netflix and sit on an HDR show preview. The screen will switch into HDR mode. Certain things will get brighter (like text, etc.) compared to the SDR signal, but the overall image brightness doesn't change very much.

A more correct way to describe it is that displays LOCK brightness control in HDR modes. And the reason they do that is simply that they need access to their full brightness capability at all times: no display can even reach the maximum brightness specced by HDR yet, so they need as much as they can get. It would defeat the point of HDR to let you turn down the brightness in that mode with current display tech.

It has nothing to do with how stressed the display is; that's entirely down to how bright each of the pixels is actually being driven.
 
So I've been lurking in this thread a little. I'll ask the basic question. Will I be satisfied with this monitor?

My display preference leans on CRT. I want contrast. I want motion clarity. I don't care about HDR. All I need is some fast, fluid, contrasty image that's really clear in motion and I'm happy. Those are my priorities. I'm thinking that this is the monitor for me. What say the experts in this thread?
 
OLED is definitely high contrast and very clear in motion. I was watching a video of someone playing Terraria the other day and it was almost surreal how smooth it was after remembering how the game would blur my old LCDs with black smudginess. The motion clarity is really noticeable in both side scrollers and FPS, in my opinion.

For most people, going to OLED is a game changer. Just look at the comments by SixFootDuo, delita, and madpistol in this thread.
 
So I've been lurking in this thread a little. I'll ask the basic question. Will I be satisfied with this monitor?

My display preference leans on CRT. I want contrast. I want motion clarity. I don't care about HDR. All I need is some fast, fluid, contrasty image that's really clear in motion and I'm happy. Those are my priorities. I'm thinking that this is the monitor for me. What say the experts in this thread?

It sounds like OLED is what you've been waiting for.

If you want something small-ish (ok, massive, but smaller than the others), the CX 48" is going to be amazing to you.

If you want something slightly larger, but maybe a bit cheaper, the LG B9 or C9 are excellent choices and have virtually identical picture quality to the CX.

If you turn on BFI, it will even feel like a CRT due to the strobing effect.
 
One thing I don't really get is ABL - is its purpose to protect the panel or the pixels? The reason for asking is that if I have 50% white and 50% black side by side, I won't get ABL, but if I put up 100% white I will. Now, if the purpose was to prevent burn-in and preserve the pixels, wouldn't the 50% of pixels be just as affected regardless of whether the other 50% is black? Or is it some kind of overheating protection for the whole panel?
 
OLED is definitely high contrast and very clear in motion. I was watching a video of someone playing Terraria the other day and it was almost surreal how smooth it was after remembering how the game would blur my old LCDs with black smudginess. The motion clarity is really noticeable in both side scrollers and FPS, in my opinion.

For most people, going to OLED is a game changer. Just look at the comments by SixFootDuo, delita, and madpistol in this thread.

My only hesitation is that I do remember CRT's motion clarity. And I do have some PVM's still. So I know what they brought to the table. I'm afraid that OLED still won't deliver the goods with BFI but it's better than LCD. Oh if they (LG) would implement a rolling scan. Come on LG!!
 
My only hesitation is that I do remember CRT's motion clarity. And I do have some PVM's still. So I know what they brought to the table. I'm afraid that OLED still won't deliver the goods with BFI but it's better than LCD. Oh if they (LG) would implement a rolling scan. Come on LG!!

Motion clarity of the LG OLEDs is incredible. Far superior to LCD. This is mainly due to the pixel response, which is almost instantaneous.

What video card do you have?
 
One thing I don't really get is ABL - is its purpose to protect the panel or the pixels? The reason for asking is that if I have 50% white and 50% black side by side, I won't get ABL, but if I put up 100% white I will. Now, if the purpose was to prevent burn-in and preserve the pixels, wouldn't the 50% of pixels be just as affected regardless of whether the other 50% is black? Or is it some kind of overheating protection for the whole panel?

Power use and overheating go hand in hand. It stresses the power supply and small power traces, generates excess heat in the panel, and wears everything out much quicker.

LCDs aren't immune either. The new 1400-nit G-Sync monitors are thick and have fans, as the heat sinks have to be large to dissipate all that heat from 1,024 powerful LEDs in the backlight.
 
If you turn the brightness down enough, ABL won't kick in, but you'll be limited to 80 contrast, peak brightness = off and ~250 nit. So that limits you to SDR and 250 nit worth of color highlights/detail-in-color.
I could see experimenting with running it that way while playing SDR games. I'd otherwise rather have the full range of HDR capability and detail in color and highlights on that material when available.

Adding BFI to that or any settings on this display is not worth it imo.

If you run 100fps average, or the closer you can get to a solid 120fps at 120Hz, you will cut the sample-and-hold blur down by up to 1/2 while increasing your motion definition, pathing and smoothness. 100fps+ average is not pristine CRT clarity during the fastest viewport movement/panning of the whole game world, but it's decent overall - more like a soft blur within the lines during the fastest viewport panning.. "fuzzy". That applies coming from LCDs too; some people say that OLED response time helps a little bit more, though they still suffer from the same sample-and-hold blur as any display. An OLED isn't going to have really bad grey-black-white response times like a VA, for example, which cause black smearing/trailing in some transitions. Its response is always going to keep up with the refresh rate (8.3ms at 120fps solid, not average). However, in the past that speed was an issue with low frame rate content like 24fps movies and made them appear choppy/stuttered from how abrupt it was, I think.
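
To put rough numbers on the "up to 1/2" claim, here's a quick sketch of the usual sample-and-hold persistence rule of thumb (perceived blur is roughly motion speed multiplied by how long each frame stays on screen); the 1920 px/s figure is just an example panning speed:

# Full-persistence sample-and-hold, no BFI: each frame is held for 1/fps seconds.
def persistence_ms(fps):
    return 1000.0 / fps

def blur_px(speed_px_per_s, fps):
    return speed_px_per_s * persistence_ms(fps) / 1000.0

speed = 1920                 # example: panning one screen width per second
print(blur_px(speed, 60))    # ~32 px of smear at 60fps / 60Hz
print(blur_px(speed, 120))   # ~16 px at 120fps / 120Hz, i.e. half the blur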

https://blurbusters.com/blur-buster...000hz-displays-with-blurfree-sample-and-hold/


2020 Update: NVIDIA and ASUS now have a long-term road map to 1000 Hz displays by the 2030s! Contacts at NVIDIA and vendors have confirmed this article is scientifically accurate.

---------------------------------------

https://blurbusters.com/faq/oled-motion-blur/

" Even instant pixel response (0 ms) can have lots of motion blur due to sample-and-hold "

"
Motion blur occurs on the Playstation Vita OLED even though it has virtually instantaneous pixel response time. This is because it does not shorten the amount of time a frame is actually visible for, a frame is continuously displayed until the next frame. The sample-and-hold nature of the display enforces eye-tracking-based motion blur that is above-and-beyond natural human limitations. "



-------------------------------------------------------------
FlatPanelsHD Review: LG CX OLED 24 Mar 2020 | Rasmus Larsen

"it seems to us that the 2020 implementation sacrifices brightness to a higher degree. In LG CX there are five levels for OLED Motion Pro (Off, Low, Medium, High and Auto). With a special test pattern we measured 'Off' to 318 nits brightness, 'Low' to 273 nits, 'Medium' to 187 nits and 'High' to 77 nits. The exact brightness values are not important so focus on the relative change in brightness: 'Low' will reduce brightness by 15%, 'Medium' by 40% and 'High' by 75%. The 'High' setting produces visible flicker and is not recommended for any type of content (it should probably be removed). 'Medium' is more effective at increasing motion resolution than 'Low' but brightness obviously takes a more significant hit. Lastly, there is an 'Auto' option that varies between 'Low' and 'Medium' but avoids 'High'. The conclusion? Well, at its two lower settings the BFI system is definitely useful now, as opposed to BFI in previous years' OLED TVs, but improved motion resolution comes at the expense of a reduction in brightness that is a little higher than we had hoped. Also note that by engaging 'OLED Motion Pro' input lag increases slightly to 22 ms. "
--------------------------------------------------------------

Thank you for posting here and providing the details of your setup. Can you please test BFI (OLED motion pro in the TV settings) on high at both 1440p and 4k? We would love to hear your impressions of the different BFI modes too.

Another new development this year is 'OLED Motion Pro', which is a more effective version of the BFI (Black Frame Insertion) system found in previous OLED TVs. BFI inserts black frames into the video stream to 'reset' the human eye in order to make motion appear less blurry. You can achieve similar results (plus more) by increasing the frame rate of the content (i.e. 4K120 or higher) but it is nice to have BFI as an option for lower frame rate content, too, in order to replicate more "plasma-like" motion on an OLED panel. The improved 120Hz BFI system was actually intended for LG Display's 2019 OLED panel but was pulled before release. Now it has reappeared in the 2020 OLED panel and as such it will also be available in OLED TVs from competing brands, too.

BFI is going to be too dark and flickery for me to even consider using it personally, but it would be nice to test for the people who are interested in it.

-------------------------------------------------------------------

Assuming you rule out the 75% 'High' setting outright, the 318 nit they use as a basis with BFI off (and presumably with ABL enabled, not avoided) gets:
..cut 40% at the medium BFI setting, which mathematically results in 190.8 nit, so around what they measured at 187 nit.
..on low BFI, assuming a 25% cut, the math result is 238.5 nit, which is lower than the 273 nit they quoted as measured, so perhaps it's not as aggressive as a 25% brightness reduction - perhaps interpolation is on too, which would make sense if the input lag is higher. I'm not sure the tradeoffs would be worth it running low BFI rather than no BFI with raw fps hovering around 120fps+120Hz, considering the PWM effect on your eyes and the input lag increase from BFI.
..Neither of those quotes shows the after-ABL values. ABL on regular SDR brightness cuts it down to around 140 nit on a C9 before any BFI brightness reduction is considered, so if you used medium (40%) you could end up with a result of 84 nit* of scene color brightness/highlights/detail in areas of color post-ABL.
..If you run at around a 250 nit color brightness/color detail ceiling in SDR, with contrast at 80 and peak brightness set to "Off" in order to avoid ABL kicking in, medium BFI cutting 40% from 250 probably gives a result of around 150 nit of color detail/brightness (which is roughly the same as what you get in the previous (default) ABL scenario after ABL kicks in, before any BFI brightness reduction is considered).

--------------------------------------------------------------------
So would you want to run SDR on the screen (the rough nit math is sketched after this list):

..normally at ~320 nit of color detail, but seeing 140 nit ABL kick in intermittently
..normally + medium BFI --> 190 nit of color detail (320 nit minus 40%) until ABL kicks in, then down to as low as *84 nit (the 40% strobe effect subtracted from the 140 nit ABL level). 40% blur reduction (compared to 60fps+60Hz but not 120fps+120Hz), PWM, input lag.

..ABL-avoiding 80 contrast, peak brightness "off" ~> 250 nit, with no ABL triggering
..ABL-avoiding 80 contrast, peak brightness "off" ~> BFI active (250 nit minus 40%), lowering it to ~150 nit average screen brightness (and detail limit in fields of color) throughout. 40% blur reduction compared to 60fps+60Hz, perhaps 20% vs 120fps+120Hz.. plus PWM and an input lag increase.

*That is because, as I understand it, BFI's dimming effect is in the way our eyes perceive the screen, meaning the LG ABL burn-in safety tech would not account for the after-BFI effective brightness and would keep operating normally as if BFI weren't a factor.
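
Here is the rough nit math from above in one place (the 318 nit BFI-off figure and ~140 nit ABL floor come from the quoted FlatPanelsHD/RTINGS numbers; the BFI reduction percentages are the assumptions being discussed, not measurements):

# Assumed BFI duty-cycle reductions vs. quoted baseline brightness figures.
BFI_REDUCTION = {"off": 0.0, "low": 0.25, "medium": 0.40, "high": 0.75}

def after_bfi(nits, mode):
    return nits * (1.0 - BFI_REDUCTION[mode])

base_default = 318   # measured with BFI off, default settings (FlatPanelsHD)
abl_floor    = 140   # approx. full-field brightness once ABL kicks in
base_no_abl  = 250   # approx. peak when contrast 80 / peak brightness off avoids ABL

print(after_bfi(base_default, "medium"))  # ~190.8 nit (vs 187 measured)
print(after_bfi(base_default, "low"))     # ~238.5 nit (vs 273 measured, so low is likely <25%)
print(after_bfi(abl_floor, "medium"))     # ~84 nit once ABL has kicked in
print(after_bfi(base_no_abl, "medium"))   # ~150 nit in the ABL-avoiding setup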
 
Power use and overheating go hand in hand. It stresses the power supply and small power traces, generates excess heat in the panel, and wears everything out much quicker.

LCDs aren't immune either. The new 1400-nit G-Sync monitors are thick and have fans, as the heat sinks have to be large to dissipate all that heat from 1,024 powerful LEDs in the backlight.

So ABL has nothing to do with preventing burn-in and the like then? I can understand overheating, but I am not sure the average customer would want limitations on brightness just because of power usage. I mean, who in here is using Eco mode or similar?
 
One thing I don't really get is ABL - is its purpose to protect the panel or the pixels? The reason for asking is that if I have 50% white and 50% black side by side, I won't get ABL, but if I put up 100% white I will. Now, if the purpose was to prevent burn-in and preserve the pixels, wouldn't the 50% of pixels be just as affected regardless of whether the other 50% is black? Or is it some kind of overheating protection for the whole panel?

The screen is already limited in what it can output and for how long.

RTings LG CX review:

[RTINGS LG CX peak brightness measurements chart]

In the 50%/50% scenario you outlined, the HDR peak would be 302 nit (which is in the SDR range of brightness). A sustained 50% window would be even lower at 287 nit.

You can see that the 100% sustained window drops to the ABL kick-on limit of 140nit.

OLEDs have organics in them that are subject to premature aging from heat. My analogy is that an OLED is like glassware that can take flares of stovetop burner heat but not sustained heat. The idea is to keep it on simmer with some crackling flame flares on the HDR highlights, without blasting the whole bottom of the pan with flame for too long.

The heat issue makes me consider that I'd probably have to re-arrange my whole living room setup if I ever replace my FALD LCD TV in the living room with a big OLED TV. Currently the back of the big tv is to the big picture window in the room, so it gets direct sunlight if the vertical blinds are open. I'd also consider what running an oled in un-air conditioned environments in the summer might do. My living room is 76F right now but my pc room is 83F. The PC room is on the sunny side in the morning and the living room gets hit later in the day. I have some fans cycling the PC room heat out as of just now but throughout the summer it will be air conditioned 24/7 pretty much.

Like others have said, some hot-running displays have thicker chassis with fans inside. The Panasonic OLED TVs available in the UK put a whole metal backplane inside their OLEDs as a heatsink. This reportedly allows them to run cooler and thus brighter versus burn-in limitations.
 
I always do "feel" tests between my old and new products. So in this case, I moved from a Dell 27" 1440p GSYNC monitor to a CX OLED.

How does gsync behave on low framerates between the 2? See the video below.

 
I always do "feel" tests between my old and new products. So in this case, I moved from a Dell 27" 1440p GSYNC monitor to a CX OLED.

How does gsync behave on low framerates between the 2? See the video below.



What GPU are you using? (It might have been mentioned in the video, but I have not had a chance to watch it yet.)
 
2 things-
-HDR is a gimmick (it pleases people that are attracted to box-store torch mode, but it is not realistic at all). From experience, it's just blown-out PQ. I have owned a calibrated LG C9, an AW55, and a 2016 LG OLED since 2016.
-LG's input lag/latency is TV-grade, i.e. crap, therefore not even close to being better. I have gamed on the C9 and it is not close to being as responsive. Long story short, these are TVs and not monitors. Not to mention you can't even run it at 4K 120Hz for who knows how long.

I'm getting strong "purchase justification" vibes from you guys lol. Let me get this straight: do you recommend the AW55 over the C9/CX? If so, you have to be out of your god damn mind. I played with the AW55 at a games convention. The AW55 price/performance ratio is nothing but a pain in the ass. It's so freakin dim compared to the LG, it's ridiculous. After watching Linus's videos about the AW55 and the C9 it was clear that I had to go with the C9. I'm also using a 1440p 144Hz TN monitor for games like Quake Champions and COD MW, but hell... the LG C9 is also super fast, I was really impressed. People who say LG's latency is crap should not be taken seriously. Looking forward to the new NV cards to release the full potential of this TV.
 
What GPU are you using? (It might have been mentioned in the video, but I have not had a chance to watch it yet.)

RTX 2070 Super

I'm getting strong "purchase justification" vibes from you guys lol. Let me get this straight: do you recommend the AW55 over the C9/CX? If so, you have to be out of your god damn mind. I played with the AW55 at a games convention. The AW55 price/performance ratio is nothing but a pain in the ass. It's so freakin dim compared to the LG, it's ridiculous. After watching Linus's videos about the AW55 and the C9 it was clear that I had to go with the C9. I'm also using a 1440p 144Hz TN monitor for games like Quake Champions and COD MW, but hell... the LG C9 is also super fast, I was really impressed. People who say LG's latency is crap should not be taken seriously. Looking forward to the new NV cards to release the full potential of this TV.

That's the reason I stopped responding. I've done comparisons between my CX and my old 144hz gsync monitor, and they feel the same. The input lag between them is so close that it's indistinguishable, even with VRR/gsync enabled. Anyone who says the B9/C9/CX is bad at gaming probably hasn't used one. They're REALLY good at gaming.
 
I'm getting strong "purchase justification" vibes from you guys lol. Let me get this straight: do you recommend the AW55 over the C9/CX? If so, you have to be out of your god damn mind. I played with the AW55 at a games convention. The AW55 price/performance ratio is nothing but a pain in the ass. It's so freakin dim compared to the LG, it's ridiculous. After watching Linus's videos about the AW55 and the C9 it was clear that I had to go with the C9. I'm also using a 1440p 144Hz TN monitor for games like Quake Champions and COD MW, but hell... the LG C9 is also super fast, I was really impressed. People who say LG's latency is crap should not be taken seriously. Looking forward to the new NV cards to release the full potential of this TV.


While I can see the advantages of having DisplayPort instead of HDMI (or rather both) as a PC user, it's kind of obvious that the AW55 is outdated compared to the alternatives, especially if you look at the price tag. I like the idea of OLED monitors though. That said, if people think the AW55 is the greatest option, I am happy for them.
 
I'm getting strong "purchase justification" vibes from you guys lol. Let me get this straight: do you recommend the AW55 over the C9/CX? If so, you have to be out of your god damn mind. I played with the AW55 at a games convention. The AW55 price/performance ratio is nothing but a pain in the ass. It's so freakin dim compared to the LG, it's ridiculous. After watching Linus's videos about the AW55 and the C9 it was clear that I had to go with the C9. I'm also using a 1440p 144Hz TN monitor for games like Quake Champions and COD MW, but hell... the LG C9 is also super fast, I was really impressed. People who say LG's latency is crap should not be taken seriously. Looking forward to the new NV cards to release the full potential of this TV.

Welcome to [H] and the thread, lors! It's been... mostly good. Lots of good info and people looking very much forward to the CX (as well as enjoying our older OLEDs while we parse all of the info on these new ones).
 
If you turn the brightness down enough, ABL won't kick in, but you'll be limited to 80 contrast, peak brightness = off and ~250 nit. So that limits you to SDR and 250 nit worth of color highlights/detail-in-color.
I could see experimenting with running it that way while playing SDR games. I'd otherwise rather have the full range of HDR capability and detail in color and highlights on that material when available.

Adding BFI to that or any settings on this display is not worth it imo.

If you run 100fps average, or the closer you can get to a solid 120fps at 120Hz, you will cut the sample-and-hold blur down by up to 1/2 while increasing your motion definition, pathing and smoothness. 100fps+ average is not pristine CRT clarity during the fastest viewport movement/panning of the whole game world, but it's decent overall - more like a soft blur within the lines during the fastest viewport panning.. "fuzzy". That applies coming from LCDs too; some people say that OLED response time helps a little bit more, though they still suffer from the same sample-and-hold blur as any display. An OLED isn't going to have really bad grey-black-white response times like a VA, for example, which cause black smearing/trailing in some transitions. Its response is always going to keep up with the refresh rate (8.3ms at 120fps solid, not average). However, in the past that speed was an issue with low frame rate content like 24fps movies and made them appear choppy/stuttered from how abrupt it was, I think.

https://blurbusters.com/blur-buster...000hz-displays-with-blurfree-sample-and-hold/


2020 Update: NVIDIA and ASUS now have a long-term road map to 1000 Hz displays by the 2030s! Contacts at NVIDIA and vendors have confirmed this article is scientifically accurate.

---------------------------------------

https://blurbusters.com/faq/oled-motion-blur/

" Even instant pixel response (0 ms) can have lots of motion blur due to sample-and-hold "

"
Motion blur occurs on the Playstation Vita OLED even though it has virtually instantaneous pixel response time. This is because it does not shorten the amount of time a frame is actually visible for, a frame is continuously displayed until the next frame. The sample-and-hold nature of the display enforces eye-tracking-based motion blur that is above-and-beyond natural human limitations. "


250 nit is still super bright for an SDR computer display. When the panel switches to HDR mode you can set completely different display presets for gaming. For example, I'm not nearly as concerned about input lag in games like Shadow of War, which has incredible HDR and uses different picture settings than SDR. I can't wait for testing with real 2.1 sources: 4K @ 120Hz with BFI enabled and hopefully better firmware. The problem with what I just said is that that's still a lot of maybes! I may have to wait for the C11, because BFI @ 120Hz with crazy low input lag for SDR gaming is a must IMHO. I've gotten used to my 280Hz monitor and it's a delight for Warzone.
 
250 nit is still super bright for an SDR computer display. When the panel switches to HDR mode you can set completely different display presets for gaming. For example, I'm not nearly as concerned about input lag in games like Shadow of War, which has incredible HDR and uses different picture settings than SDR. I can't wait for testing with real 2.1 sources: 4K @ 120Hz with BFI enabled and hopefully better firmware. The problem with what I just said is that that's still a lot of maybes! I may have to wait for the C11, because BFI @ 120Hz with crazy low input lag for SDR gaming is a must IMHO. I've gotten used to my 280Hz monitor and it's a delight for Warzone.

The point was that ~300 nits is in the safer range for OLED brightness, and it is not coincidentally around the peak hardware capability of common SDR displays (300 - 400 nit). OLED at ~280 to 300 nit is a safer range vs burn-in that allows a 50% window sustained at that brightness, and 140 nit is the ABL safety trigger, so a 100% sustained window drops to 140 nit. This is what they decided on to limit the chance of burn-in while keeping some kind of HDR performance balance. As I also stated, if you change some settings and run your OLED at 250 nit peak you can avoid ABL ever kicking in. In general the whole screen isn't shooting up to high nits of color, only highlights and detail-in-color fields in bright areas, dynamically throughout a changing scene. So how often all of those thresholds are activated depends on the content or scene.

Of course, when actually in SDR mode you probably aren't going to be using very high brightness unless you're in a very bright sunlit room, since brightness in SDR is relative.
When you have, for example, a 400 nit SDR display and you run it at 80 brightness, you are mathematically running at 320 nit. How that 80 brightness translates from the OSD to the output of the screen might vary between firmwares of course, so testing the hardware is probably much more accurate.

SDR brightness being relative means you are turning the brightness of all of the content or scene up or down at once - SDR's slim band of color volume getting bumped up or down 100 - 200 nit to taste, for example. With HDR you aren't supposed to be turning anything up or down. The HDR color values / color brightness values are supposed to be static, with the higher-brightness colors firing from light sources, highlights, and bright areas throughout scenes, while the rest of the scene remains "in SDR ranges". All of that color data is coded as it is supposed to be viewed.
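
A tiny sketch of that relative-vs-absolute distinction, using the 400 nit / 80-brightness example above (the linear OSD-to-output mapping is an assumption; real firmware may differ, as noted):

# SDR: the brightness control scales the whole image (assumed linear mapping).
def sdr_output_nits(panel_peak, brightness_setting):   # brightness 0-100
    return panel_peak * (brightness_setting / 100.0)

# HDR: the signal carries absolute nit values that aren't meant to be user-scaled.
def hdr_output_nits(coded_nits, panel_peak):
    return min(coded_nits, panel_peak)                  # ignoring tone mapping

print(sdr_output_nits(400, 80))    # 320 nit white level for the whole SDR image
print(hdr_output_nits(700, 800))   # a 700 nit highlight is shown at 700 nit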
 
I'm getting strong "purchase justification" vibes from you guys lol. Let me get this straight: do you recommend the AW55 over the C9/CX? If so, you have to be out of your god damn mind. I played with the AW55 at a games convention. The AW55 price/performance ratio is nothing but a pain in the ass. It's so freakin dim compared to the LG, it's ridiculous. After watching Linus's videos about the AW55 and the C9 it was clear that I had to go with the C9. I'm also using a 1440p 144Hz TN monitor for games like Quake Champions and COD MW, but hell... the LG C9 is also super fast, I was really impressed. People who say LG's latency is crap should not be taken seriously. Looking forward to the new NV cards to release the full potential of this TV.

Oh look, another person that can read the brochure, goes off Linus videos (LMFAO), and doesn't own any of these displays lol

I own a C6, C9 and AW55, all calibrated; the AW55 can't be beat in a home setting - brightness at max is too bright.
 
Not riffing off the people poking stryker again for some reason... let him be; he got a 120Hz 4K display at least a year before most of us will have a 7nm and hopefully HDMI 2.1 GPU for HDMI 2.1 TVs. He paid a lot for it, more than most of us consider it being worth knowing what was coming down the pipe, but it was/is the only thing available with those features for a while, and he loves it, even if he can be curmudgeonly about it.
----------------------------

Moving on......

Contrary to what some people think, non-hybrid HDR, where the values are hard coded, often gets complaints that it is TOO DIM. This is because HDR is designed for dim to dark home theater environments. The normal "SDR ranges" that most of a scene remains in are therefore on the dimmer side unless viewed in a dim to dark room, while the highlights, detail in bright areas, and bright light sources in scenes bring realism with HDR color ranges/HDR color volumes.
 
This Stryker guy has to be 12 years old.

Don't bother. Every other post he makes is some crusade about how HDR is fake/a gimmick despite knowledgeable people having explained to him what 'dynamic range' is in photography and filmmaking a million times. Let him live alone in his delusion that SDR is all you need lol.
 
My problem with HDR is that, for GAMES, it is not quite there yet. It's still very much a hit-or-miss situation, even in 2020. It's kind of annoying that big AAA game devs like Square Enix and Rockstar can't get the HDR implementation right in their games. Red Dead Redemption 2? Looks better in SDR. FF7 Remake? Looks better in SDR.

https://www.forbes.com/sites/johnar...-here-but-it-doesnt-really-work/#69eb46c11cf5



Not to mention, when a game's HDR is good, you still have to have the knowledge to properly set up the HDR variables in order to get an accurate and best-looking picture. I have to pray that HDTVTest or Digital Foundry puts up a video guide on what to look for and how to tweak the settings, otherwise I'm just running around blind here and may end up having the settings incorrect. Am I supposed to just research every single HDR game that comes out in order to figure out how to properly set up each game? Maybe, but I would argue that HDR should be nothing more than a simple on/off toggle and "just work" without fiddling through a bunch of sliders to get it looking proper. So until HDR becomes a 100% hit, it's always going to be something "extra but not required" for me when getting a new display. As Elvn has mentioned probably a million times before, OLED lacks peak brightness and color volume, so from that standpoint they aren't even well suited to HDR in the first place, and I don't care one bit.
 
My problem with HDR is that, for GAMES, it is not quite there yet. It's still very much a hit-or-miss situation, even in 2020. It's kind of annoying that big AAA game devs like Square Enix and Rockstar can't get the HDR implementation right in their games. Red Dead Redemption 2? Looks better in SDR. FF7 Remake? Looks better in SDR.

That's right, and it's a shame. But nevertheless, there are many games which look better in HDR (Doom Eternal, Forza Horizon 4, Division 2, etc.) and many console games (Spider-Man, Gran Turismo, Death Stranding, etc.).
And I also want to mention movies of course. Even if you use the display primarily as a monitor, don't you want to have as many features/capabilities as possible for your money?

As Elvn has mentioned probably a million times before, OLED lacks peak brightness and color volume, so from that standpoint they aren't even well suited to HDR in the first place, and I don't care one bit.

I owned a Samsung Q9FN in 2018, and due to the infinite contrast and per-pixel dimming of OLED, HDR impressed me way more on the C9 - highlight details are so nicely pronounced. Anyway, a good amount of peak brightness is important of course. I think in 6-7 years our eyes will be torn apart ;)
 
Movies have their own issue in that they are being mastered at 4000/10000 nits while TVs today are nowhere near capable of such brightness levels, leaving the manufacturers to rely on tone mapping, and everyone's tone mapping implementation is different. Some people will prefer how Samsung does it while others will prefer how Sony does it, which makes the entire thing subjective. That's another thing to talk about too... no display in existence can actually do FULL HDR when you think about it. What display can hit 10,000 nits or cover 100% of Rec.2020 color? Every display right now is only capable of partial HDR in the end, whether it's an OLED or the highest end of LCDs. I guess one can argue that HDR is "still" in its infancy, but if that's the case then yeah, it's just another reason why I'm still not going all in for HDR right now, and I only care about having 4K 120Hz VRR 4:4:4 8-bit SDR on my CX. Perhaps in a few more years.
 
Movies have their own issue in that they are being mastered at 4000/10000 nits while TVs today are nowhere near capable of such brightness levels, leaving the manufacturers to rely on tone mapping, and everyone's tone mapping implementation is different. Some people will prefer how Samsung does it while others will prefer how Sony does it, which makes the entire thing subjective. That's another thing to talk about too... no display in existence can actually do FULL HDR when you think about it. What display can hit 10,000 nits or cover 100% of Rec.2020 color? Every display right now is only capable of partial HDR in the end, whether it's an OLED or the highest end of LCDs. I guess one can argue that HDR is "still" in its infancy, but if that's the case then yeah, it's just another reason why I'm still not going all in for HDR right now, and I only care about having 4K 120Hz VRR 4:4:4 8-bit SDR on my CX. Perhaps in a few more years.

Movies vary wildly on how many nits they are mastered for. https://docs.google.com/spreadsheet...u4UI_yp7sxOVPIccob6fRe85_A/edit#gid=184653968
 
As Elvn has mentioned probably a million times before, OLED lacks peak brightness and color volume, so from that standpoint they aren't even well suited to HDR in the first place, and I don't care one bit.

Well, the specs on LCDs look better for HDR, but in practice they're not better, and that's just because local dimming isn't high enough resolution to properly display HDR. Most HDR highlights are a handful of pixels at most, far below the 2% or even 1% of the screen that RTINGS and others measure for HDR. And so LCDs are required to massively reduce their brightness on those highlights, because otherwise they would just blow out the image with haloing around it.

In practice you probably need local dimming zones measured in the hundreds of thousands for the higher brightness capability of LCD to measure up - unless you are watching in a brightly lit room, anyway, or you are talking specifically about scenes with very large high-brightness areas like the sun. The high-end local dimming LCDs are indeed better in those conditions.
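
To put rough numbers on that (the zone counts here are just illustrative; actual FALD layouts vary by model):

# A 4K OLED effectively has one "zone" per pixel; an FALD LCD shares each
# backlight zone across a large block of pixels.
PIXELS_4K = 3840 * 2160    # 8,294,400 self-emissive pixels on a 4K OLED

for zones in (384, 1024, 10000):
    print(zones, "zones ->", PIXELS_4K // zones, "pixels per backlight zone")

# Even 10,000 zones still leaves ~800+ pixels sharing one zone, so a tiny
# specular highlight forces a brightness/blooming compromise across its zone.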
 
Yes, OLED can't hit the 1000 - 1500 nit of FALD LCDs, but I've learned more about OLED over the years from experts:


Quote from the above HDTVTest C9 OLED review video:
"The peak brightness on paper of these FALD TVs operate in zones, rather than per pixel. So let's say if you ask an FALD LED LCD to light up a very small area with specular highlights, for example, reflections on a vehicle or fireworks against a night sky - these small bright elements are not going to come close to the measured peak brightness of 1000nits or 1500 nits simply because the FALD LED LCD TV has to balance backlight illumination between adjacent zones or else there will be too much blooming which will wash out the picture even further. This the reason why in many side by side comparisons that I, and maybe some of you, have seen - specular highlight detail on a 700nit OLED can often look brighter than even a 1600nit FALD LED LCD. When you add the total absence of blooming and glowing in dark scenes, again because of pixel level illumination, it's not surprise
that it is an OLED and not an LED LCD that has been voted as the best HDR TV by the public in our HDR shootout for two years in a row. Nevertheless I still have a soft spot for high performing LED LCDs that can present high APLscenes with more HDR impact than OLEDs which are restricted by ABL or auto brightness limiter which caps full field brightness at 150 nits for heat and power management. "


So you pick your tradeoffs.

HDR on PC sounds like a mess and hit-and-miss atm, but HDR itself is definitely the future and well worth it if you can get it on some material. Like I said, I'll be using my OLED as a media stage for both PC games and movies, and also PS4/PS5, so wherever I can get HDR I'll use it. There will always be plenty of SDR material, just like there is still a ton of 1080p material.

----

Yes, movies are mastered at up to 10,000 nit, but UHD discs are mostly mastered at HDR 1000, with some HDR 4000 and only a few HDR 10,000 so far, since there isn't any consumer hardware capable of showing it yet.

I'd think mapping HDR 10,000 to 1,000 would be easier. Mapping 10,000 or 1,000 to 750 - 800 nit sounds like it would be trickier. I'll have to read up on that more.

Going from 150 - 200 - 300 nit of color highlights, detail in color in bright areas, and from colored light sources in scenes to 500, 600, 700, and 800 nit on an OLED would be well worth it to me.

HDR is attempting to show more realism in scenes. We are still nowhere near reality at 10,000nit HDR but it is much more realistic than SDR. The Dolby public test I quoted was using 20,000nits.

https://www.theverge.com/2014/1/6/5276934/dolby-vision-the-future-of-tv-is-really-really-bright
"The problem is that the human eye is used to seeing a much wider range in real life. The sun at noon is about 1.6 billion nits, for example, while starlight comes in at a mere .0001 nits; the highlights of sun reflecting off a car can be hundreds of times brighter than the vehicle’s hood. The human eye can see it all, but when using contemporary technology that same range of brightness can’t be accurately reproduced. You can have rich details in the blacks or the highlights, but not both.

So Dolby basically threw current reference standards away. "Our scientists and engineers said, ‘Okay, what if we don’t have the shackles of technology that’s not going to be here [in the future],’" Griffis says, "and we could design for the real target — which is the human eye?" To start the process, the company took a theatrical digital cinema projector and focused the entire image down onto a 21-inch LCD panel, turning it into a
jury-rigged display made up of 20,000 nit pixels. Subjects were shown a series of pictures with highlights like the sun, and then given the option to toggle between varying levels of brightness. Dolby found that users wanted those highlights to be many hundreds of times brighter than what normal TVs can offer: the more like real life, the better."


------------------------------------

HDR Color temperature map based on HDR 10,000 in a game.
You can see that the brighter parts of the scene are mostly 100 - 150 nit until you get to the sun (10,000 nit), the sun's corona and the glint of the sun off of the rifle's scope and barrel, which are in the 1,000 nit range and a little higher on the brightest part of the scope's glint, up to 4000 nit. I'd expect everything above the SDR range in the scene to be tone mapped down in relation to the peak color luminances the OLED or FALD LCD is capable of - that is, rather than scaling the whole scene down in direct ratio and making the SDR range darker, which would look horrible.
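
A minimal sketch of the kind of roll-off I mean (not LG's or anyone's actual tone-mapping curve; the 100 nit knee and 750 nit panel peak are just example values): everything in the SDR range passes through untouched and only the range above the knee gets compressed toward the panel's peak.

# Simple linear roll-off above a knee point; real tone mapping curves are smoother.
def tone_map(nits, knee=100.0, content_peak=10000.0, panel_peak=750.0):
    if nits <= knee:
        return nits                      # SDR-range content untouched
    t = (nits - knee) / (content_peak - knee)
    return knee + t * (panel_peak - knee)

print(tone_map(100))     # 100  -> unchanged
print(tone_map(1000))    # ~159 -> bright highlight compressed, still above SDR
print(tone_map(10000))   # 750  -> the sun clips to the assumed panel peak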

[In-game HDR luminance heat map screenshot]

--------------------------------------
 
Just did an LFC test on my CX.




When Gsync is off, there is definite tearing. When Gsync is on, no tearing.

LG doesn't advertise LFC... but it's doing something. ;)

RTX 2070 Super
I've done comparisons between my CX and my old 144hz gsync monitor, and they feel the same. The input lag between them is so close that it's indistinguishable, even with VRR/gsync enabled. Anyone who says the B9/C9/CX is bad at gaming probably hasn't used one. They're REALLY good at gaming.
That's one thing that looks like it can be crossed off the list thanks to your testing. Really value your input to this thread - some great info and comparisons, so please stick around :)
 
I personally have no issue with the HDR implementation in most games and movies. Also, HDR brightness between an LCD and a self-emissive display can't really be compared.

That's why VESA has a separate HDR certification standard for OLEDs (DisplayHDR True Black). Also remember that a 10% window peak brightness of 800 nits sounds like a tiny portion of the screen, but that's still 800,000+ pixels and has a pronounced effect in most current content. A great example is the FF7 remake, where the hit sparks peak @ 1000 nits and force LCDs to light up 4-6 zones, while on an OLED that same effect fits within the 10% window peak brightness.
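
Quick sanity check on that "800,000+ pixels" figure, assuming a 4K panel:

# 10% of a 3840x2160 panel's pixels.
PIXELS_4K = 3840 * 2160
print(PIXELS_4K // 10)   # 829,440 pixels in a 10% window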

EDIT: I'm getting fatigued from checking for availability of the CX. Was hoping some retailers would get stock early.

With everything going on right now, I think we will be lucky to even get them on time... fingers crossed for early June availability.
 
That's one thing that looks like it can be crossed off the list thanks to your testing. Really value your input to this thread - some great info and comparisons, so please stick around :)
Appreciate your kind words, sir. Only trying to help and inform others that may be interested in the CX as well.
 