LG 48CX

New firmware for CX series dropped a few days ago at the KR mothership... 04.40.16

Release notes for most recent two:

(04.40.16)
-Minor software-related issue improvements

(04.40.10)
-Improves speech recognition performance.
-When playing Stone Vision content, Game Optimizer will automatically switch to image quality mode.
-Improved accessibility menu usability for people with disabilities.
 
I finally saw some burn in on mine, but I can only see it with a green test screen on. It is around the static Rust UI in the lower right hand corner. 2000+ hours in Rust on this screen in the last year. I cannot see the burn in during any kind of normal usage.
 
Holy smokes, that's annual FTE hours (2080) without much sick time or PTO, some dedication..🙂
 
Well, I will leave it running in the background while I am working all the time. So it is probably more time looking at a wall than anything else and listening for the furnaces to run out of fuel.
 
So I just tried out the recently released Windows HDR Calibration Tool on my CX and supposedly it helps to improve the Auto HDR image quality. So far I'm not seeing a whole lot of difference from before, if any really. Is it because I already did the CRU tweak to make windows report my display's peak brightness to be 800 nits? I did the calibration while having my TV be in HGiG mode btw, and again the CRU edit was already done to make the display recognized as having 800 nits peak brightness a while back prior to doing the calibration. Perhaps this is more useful to people who did not do the CRU edit.
 
I did it with HGIG off. Pretty sure the content/game has to support HGIG to pass the values. I would assume, with the level of HDR support we have historically had, HGIG support would be scarce if not non-existent.
As far as quality increase, it is subjective. Looks fine to me. Auto HDR seems to do something. Didn't really wow me though.
 

I'll try again with HGiG off. What I meant by not seeing a whole lot of difference was that I didn't see much difference in Auto HDR between the pre and post calibration tool image quality. Auto HDR definitely works for sure, it's just that the image before using the calibration tool and after using the calibration tool remains largely the same. The idea that the game itself has to have HGiG support in order to actually make use of it seems correct. I'm not sure why people keep on insisting that HGiG is the way to go for the most accurate or best image when the games themselves do not even support it to begin with. I've previously been using DTM ON when it comes to Auto HDR games.
 
Last edited:
I know what you mean. Maybe they read about it on an AV forum.
 
HGiG turns off all of that processing rather than using DTM. DTM takes lower colors and steps them on top of color values higher in the curve, which results in lost color detail, since values that should be distinct get muddied together with lifted colors. The curve lacks the range, so it compresses more, and more colors end up sharing the same values where they should sit at different levels. DTM also often lifts the lows and mids unnecessarily because it works on each frame dynamically using algorithms rather than logically, so things that should be shadowed and dark will be lifted, and some mids may be lifted outside of the artist's intent, in some cases giving a washed-out look to areas.

...

HDTVTEST youtube vid .... DTM On, DTM OFF, HGiG explained

We demonstrate the effects of [Dynamic Tone Mapping] "On", "Off" and "HGIG" in [HDR Game Mode] on the LG CX OLED TV

HGiG means the TV is going to be disabling [tone mapping]; it's going to follow the PQ EOTF curve up to maybe its peak brightness capability and then it's going to hard clip. By disabling the tone mapping it's going to hand the tone mapping over to the game

He's talking about HGiG handing the tone mapping over to console games that may actually have HGiG curve data, but HGiG is still the most accurate on PC since it turns off all processing and isn't passing a high-nit HDR curve to the TV.

Using DTM = Off mode will send the full brightness / color volume curve to the TV and then the TV will apply its own curve, compressing the upper part of the curve.

This from a reddit thread's replies sums it up I think:

DTM off, is still performing tone mapping, (static) up to 4,000 nits. HGiG, is a hard clip at 800 nits. Regardless of the game…or whether it supports HGiG or not.

With different TVs' HGiG, the hard clip may be set to a higher value by the manufacturer. HGiG is the most accurate as it's a hard-defined range with a cutoff and no funny business: heavy compression, lifting, etc. I think DTM off is still OK, but I do not like DTM on for the aforementioned reasons at the top of this reply.

To be clear, when using the DTM off setting your TV is still statically tone mapping taller curves down to what it can do.

Monstieur explained the static tone mapping curves well much earlier in this thread:

The curves are arbitrarily decided by LG and don't follow a standard roll-off. There does exist the BT.2390 standard for tone mapping which madVR supports (when using HGIG mode).

LG's 1000-nit curve is accurate up to 560 nits, and squeezes 560 - 1000 nits into 560 - 800 nits.
LG's 4000-nit curve is accurate up to 480 nits, and squeezes 480 - 4000 nits into 480 - 800 nits.
LG's 10000-nit curve is accurate up to 400 nits, and squeezes 400 - 10000 nits into 400 - 800 nits.

Regarding MPC for HDR movies in Windows not getting metadata:
I would use only the 1000-nit curve or HGIG for all movies, even if the content is mastered at 4000+ nits. All movies are previewed on a reference monitor and are designed to look good at even 1000 nits. It's fine to clip 1000+ nit highlights. HGIG would clip 800+ nit highlights.

LG defaults to the 4000-nit curve when there is no HDR metadata, which a PC never sends.
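
To make those numbers a bit more concrete, here is a rough Python sketch of the difference between a static roll-off and an HGiG-style hard clip. The knee points and the ~800-nit panel peak are the figures quoted above; modeling the squeeze above the knee as a simple proportional compression in nit terms is purely an illustration on my part - LG's actual curve math isn't published and operates on the PQ signal.

```python
# Illustrative only: "track 1:1 up to a knee, squeeze the rest into the
# remaining headroom" (DTM Off / static curve) vs. an HGiG-style hard clip.
# Knee values are the ones quoted above; the linear-in-nits squeeze above
# the knee is an assumption, not LG's actual (unpublished) math.

PANEL_PEAK = 800.0  # nits, approx. OLED peak discussed in this thread

# curve name -> (knee where 1:1 tracking ends, nominal max of the source curve)
STATIC_CURVES = {
    "1000-nit": (560.0, 1000.0),
    "4000-nit": (480.0, 4000.0),
    "10000-nit": (400.0, 10000.0),
}

def static_tone_map(nits: float, curve: str) -> float:
    """DTM Off (static curve): 1:1 below the knee, compress knee..max into knee..peak."""
    knee, source_max = STATIC_CURVES[curve]
    if nits <= knee:
        return nits
    frac = (min(nits, source_max) - knee) / (source_max - knee)
    return knee + frac * (PANEL_PEAK - knee)

def hgig(nits: float) -> float:
    """HGiG: 1:1 up to panel peak, then hard clip (everything above is discarded)."""
    return min(nits, PANEL_PEAK)

if __name__ == "__main__":
    for value in (300, 700, 1000, 2000, 4000):
        row = ", ".join(f"{c}: {static_tone_map(value, c):6.1f}" for c in STATIC_CURVES)
        print(f"{value:5d} nits -> HGiG: {hgig(value):6.1f} | {row}")
```

The trade-off discussed above falls out of this directly: HGiG keeps everything below the panel peak exactly where it belongs and throws the rest away, while the static curves keep some representation of the brighter detail at the cost of accuracy above the knee.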
 
Last edited:
Most accurate seems arbitrary here...sure with HGiG you would get everything accurate up to 800 nits then you lose every single bit of detail beyond that. With DTM Off now you only have accuracy up to 560 nits before it starts trying to compress the rest of it to fit within the 800 nits capability of the TV but at least you would be getting that detail somewhat rather than lose it entirely. This almost seems like a pick your poison kind of thing and there isn't really a "most accurate" choice, it just depends on what you prefer. Do you want to see the details beyond 800 nits even if it's being compressed? Or would you rather just take the hard clip instead. If you played a game where the sun was outputting 1000 nits, would you rather see that sun at 800 nits or see nothing at all? I'm guessing in a game with HGiG support the game would know that the sun cannot be anymore than 800 nits, while a game that doesn't support HGiG would completely ignore that and still output 1000 nits and let your TV just hardclip it out.
 
I finally saw some burn in on mine, but I can only see it with a green test screen on. It is around the static Rust UI in the lower right hand corner. 2000+ hours in Rust on this screen in the last year. I cannot see the burn in during any kind of normal usage.
what brightness do you keep your display at or play at?
 
Most accurate seems arbitrary here...sure with HGiG you would get everything accurate up to 800 nits then you lose every single bit of detail beyond that. With DTM Off now you only have accuracy up to 560 nits before it starts trying to compress the rest of it to fit within the 800 nits capability of the TV but at least you would be getting that detail somewhat rather than lose it entirely. This almost seems like a pick your poison kind of thing and there isn't really a "most accurate" choice, it just depends on what you prefer. Do you want to see the details beyond 800 nits even if it's being compressed? Or would you rather just take the hard clip instead. If you played a game where the sun was outputting 1000 nits, would you rather see that sun at 800 nits or see nothing at all? I'm guessing in a game with HGiG support the game would know that the sun cannot be anymore than 800 nits, while a game that doesn't support HGiG would completely ignore that and still output 1000 nits and let your TV just hardclip it out.

More or less, though as you mentioned - that otherwise clipped detail is not simply added and gained. HGiG is most accurate because otherwise multiple color luminances are sharing the same slots on the higher ~"half" of the display's range in whatever compression scheme the tv uses. And oleds typically can't even do 1000nit curves so they are squashed even a little more. I guess you could say HGiG avoids muddying out of range colors together with those accurate values in the top end of the display's scale and instead cuts off the highest ranges that the tv is technically incapable of displaying in the first place.

With a few exceptions, practically all displays have relatively short % of screen brightness duration limitations as well though, plus ABL so it's quite a juggling or plate spinning act anyway.

I agree that DTM off, which uses the display manufacturer's static tone mapping curve and its set compression scheme, is a good compromise, but you can try DTM off or HGiG on a per-game basis, since some devs drop the ball on HDR to one degree or another and the results can vary.

DTM on, however, is sort of wacky world: it lifts lows and mids unnecessarily, "breaking" shadowed areas and lifting mids away from the artist's intent. I also think it lifts some of the range that would sit beneath the static tone mapping curve's roll-off threshold up into and throughout the high end, muddying detail away more than the static tone mapping curve would. It's sort of compressing the whole range all over the place, up as well as down, dynamically on a per-frame basis, not doing it logically.
 
Last edited:
Where is the latest, greatest place to find the most up-to-date settings for gaming on the 48CX?
 
More or less, though as you mentioned - that otherwise clipped detail is not simply added and gained. HGiG is most accurate because otherwise multiple color luminances are sharing the same slots on the higher ~"half" of the display's range in whatever compression scheme the tv uses. And oleds typically can't even do 1000nit curves so they are squashed even a little more. I guess you could say HGiG avoids muddying out of range colors together with those accurate values in the top end of the display's scale and instead cuts off the highest ranges that the tv is technically incapable of displaying in the first place.

With a few exceptions, practically all displays have relatively short % of screen brightness duration limitations as well though, plus ABL so it's quite a juggling or plate spinning act anyway.

I agree that DTM off, which uses the display manufacturer's static tone mapping curve and its set compression scheme, is a good compromise, but you can try DTM off or HGiG on a per-game basis, since some devs drop the ball on HDR to one degree or another and the results can vary.

DTM on, however, is sort of wacky world: it lifts lows and mids unnecessarily, "breaking" shadowed areas and lifting mids away from the artist's intent. I also think it lifts some of the range that would sit beneath the static tone mapping curve's roll-off threshold up into and throughout the high end, muddying detail away more than the static tone mapping curve would. It's sort of compressing the whole range all over the place, up as well as down, dynamically on a per-frame basis, not doing it logically.

Right but let's say a game was mastered to 1000 nits to begin with and now it supports HGiG and you have it set to 800 nits max. Aren't you technically still tone mapping it and squeezing what should be 1000 nits down into an 800 nit container? IIRC HGiG is just to avoid any sort of "double tone mapping" and instead let the game itself do the tone mapping, but it is still tone mapping regardless and you are taking nits that are outside of the TV's capability and compressing them to fit within what it can show. That sounds exactly like what DTM off is already doing except with HGiG you are simply avoiding a double tone map. So if the purpose of HGiG is to let the game tone map and compress the dynamic range into what's appropriate, then it seems like if the game does not support HGiG you should want to use DTM off in order to get the same effect of compressing 1000 nits down to 800 nits rather than clipping it off. Unless of course I'm misunderstanding things and that is exactly what HGiG is supposed to be doing in a supported game: just hard clip off any details beyond 800 nits and don't actually tone map anything.
 
what brightness do you keep your display at or play at?

A few reminders that might help in that vein:


....You can set up different named profiles with different brightness, peak brightness, etc.. and maybe contrast in the TV's OSD. You can break down any of the original ones completely and start from scratch settings wise if you wanted to. That way you could use one named profile for lower brightness and perhaps contrast for text and static app use. Just make sure to keep the game one for gaming. I keep several others set up for different kinds of media and lighting conditions.
  • Vivid
  • Standard
  • APS
  • Cinema
  • Sports
  • Game
  • FILMMAKER MODE
  • isf Expert (Bright Room)
  • isf Expert (Dark Room)
  • Cinema Home

....You can change the TV's settings several ways. Setting up the quick menu or drilling down menus works but is tedious. Keying the mic button on the remote with voice control active is handy to change named modes or do a lot of other things. You can also use the remote control software over your LAN, even hotkeying it (see the sketch at the end of these tips). You can change a lot of parameters using that directly via hotkeys. Those hotkeys could also be mapped to a stream deck's buttons with icons and labels. In that way you could press a streamdeck button to change the brightness and contrast or to activate a different named setting. Using streamdeck functions/addons you can set up keys as toggles or multi press also, so you could toggle between two brightness settings or step through a brightness cycle for example.

....You can also do the "turn off the screen emitters" trick via the quick menu, voice command with the remote's mic button, or via the remote control over LAN software + hotkeys (+ streamdeck even easier). "Turn off the screen" (emitters) only turns the emitters off. It doesn't put the screen into standby mode. As far as your pc os, monitor array, games or apps are concerned the TV is still on and running. The sound even keeps playing unless you mute it separately. It's almost like minimizing the whole screen when you are afk or not giving that screen face time, and restoring the screen when you come back. It's practically instant. I think it should save a lot of "burn down" of the 25% reserved brightness buffer over time. You might not realize how much time cumulatively is wasted with the screen displaying when not actually viewing it - especially when idling in a game or on a static desktop/app screen.

...You can also use a stream deck + a handful of stream deck addons to manage window positions, saved window position profiles, app launch + positioning, min/restore, etc. You could optionally swap between a few different window layouts set to a few streamdeck buttons in order to prevent your window frames from being in the same place all of the time for example.

... Dark themes in OS and any apps that have one available, web browser addons (turn off the lights, color changer), taskbarhider app, translucent taskbar app, plain ultra black wallpaper, no app icons or system icons on screen (I throw mine all into a folder on my hard drive "desktop icons"). Black screen saver if any.

... Logo dimming on high. Pixel shift. A lot of people turn asbl off for desktop but I keep it on since mine is solely for media/gaming. That's one more safety measure.
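
On the remote-control-over-LAN + hotkey/Stream Deck point above, here is a minimal glue-script sketch of the sort of thing you could bind to hotkeys or Stream Deck buttons. It assumes you already have some webOS LAN-control tool paired with the TV; the `lgtv` command name, the TV name, and the sub-commands below are placeholders for whatever your particular tool actually exposes, not a documented interface.

```python
# Hypothetical hotkey / Stream Deck glue. Assumes a webOS LAN-control CLI is
# already installed and paired with the TV; "lgtv" and the sub-commands below
# are placeholders - substitute whatever your own tool provides.
import subprocess
import sys

TV_NAME = "LivingRoomCX"  # placeholder name given to the TV when pairing

ACTIONS = {
    # friendly action -> CLI arguments (all assumed, adjust to your tool)
    "screen_off": ["screenOff"],       # blank the emitters; TV stays "on" to the PC
    "screen_on": ["screenOn"],         # wake the emitters back up
    "dim": ["setOledLight", "20"],     # low OLED light for desktop/text use
    "bright": ["setOledLight", "80"],  # higher OLED light for HDR/gaming
}

def run_action(action: str) -> None:
    """Shell out to the (assumed) LAN-control CLI for the requested action."""
    subprocess.run(["lgtv", TV_NAME, *ACTIONS[action]], check=True)

if __name__ == "__main__":
    # e.g. point a Stream Deck "System: Open" action at: python tv_hotkeys.py dim
    run_action(sys.argv[1] if len(sys.argv) > 1 else "screen_off")
```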
 
Last edited:
Right but let's say a game was mastered to 1000 nits to begin with and now it supports HGiG and you have it set to 800 nits max. Aren't you technically still tone mapping it and squeezing what should be 1000 nits down into an 800 nit container? IIRC HGiG is just to avoid any sort of "double tone mapping" and instead let the game itself do the tone mapping, but it is still tone mapping regardless and you are taking nits that are outside of the TV's capability and compressing them to fit within what it can show. That sounds exactly like what DTM off is already doing except with HGiG you are simply avoiding a double tone map. So if the purpose of HGiG is to let the game tone map and compress the dynamic range into what's appropriate, then it seems like if the game does not support HGiG you should want to use DTM off in order to get the same effect of compressing 1000 nits down to 800 nits rather than clipping it off. Unless of course I'm misunderstanding things and that is exactly what HGiG is supposed to be doing in a supported game: just hard clip off any details beyond 800 nits and don't actually tone map anything.

HGiG
=======

Pretty sure HGiG will just truncate/clip near the peak nit of the display regardless. So all colors displayed will be in the range the TV is capable of and will be shown 1:1 per color value. That's why it's quoted as being the most accurate. It's not sharing or swapping color/luminance value locations with higher out-of-range ones, which is by nature less accurate (muddying together or substituting).

From user Monstieur... he said pass using a 1000nit curve *OR* HGiG. There are other sites online and reddit threads saying similar.
"I would use only the 1000-nit curve or HGIG for all movies, even if the content is mastered at 4000+ nits. All moves are previewed on a reference monitor and are designed to look good at even 1000 nits. It's fine to clip 1000+ nit highlights. HGIG would clip 800+ nit highlights."

This from a reddit thread's replies sums it up I think:
DTM off, is still performing tone mapping, (static) up to 4,000 nits. HGiG, is a hard clip at 800 nits. Regardless of the game…or whether it supports HGiG or not.

* while the most accurate, it's still not accurate throughout really, due to %brightness sustained duration limits and of course aggressive ABL dropping values reflexively. Like I said, there are a lot of plates being spun in firmware.

...................

DTM Off
========

DTM off will be an HDR scale or curve from LG that compresses the top end as intelligently as they could in a static, hard-defined way rather than dynamically as with DTM on. So while less accurate, it tries to preserve some higher detail in colors (for example, in textures in games) that would otherwise be lost.

LG's 1000-nit curve is accurate up to 560 nits, and squeezes 560 - 1000 nits into 560 - 800 nits.
LG's 4000-nit curve is accurate up to 480 nits, and squeezes 480 - 4000 nits into 480 - 800 nits.
LG's 10000-nit curve is accurate up to 400 nits, and squeezes 400 - 10000 nits into 400 - 800 nits.

*Some software will allow you to choose which curve you send - for example, the madVR renderer for MPC. Movies have their own limits too. Most are HDR1000, some are HDR4000. There are only a few HDR10,000 movie releases atm. If you have DTM off and send a 1000-nit curve from a movie or game it will compress in the first range listed above.

......................

DTM On
=======

DTM on, however, is sort of wacky world: it lifts lows and mids unnecessarily, "breaking" shadowed areas and lifting mids away from the artist's intent. I also think it lifts some of the range that would sit beneath the static tone mapping curve's roll-off threshold up into and throughout the high end, muddying detail away more than the static tone mapping curve would. It's sort of compressing the whole range all over the place, up as well as down, dynamically on a per-frame basis, not doing it logically.



........................

Like I said though, there are a bunch of games where devs dropped the ball on HDR. They don't all have an HDR peak brightness slider; they might only have a middle brightness/white point slider, and some don't have a saturation slider. Games like Elden Ring have an HDR peak brightness slider, HDR middle brightness slider, and HDR saturation slider, for example. In that case, the CRU edit to the peak brightness of the display can help. Still, since HDR is screwed up in some games it's worth experimenting with HGiG on mode or DTM off mode to see which looks best, as it probably won't be the same end result in every HDR game.
 
Last edited:
From Vincent of HDTVTest's twitter:

Based on my testing, HGiG behaves correctly on 2021 LG OLEDs on the latest firmware. However, [HDR Tone Mapping] "Off" in [Game Optimiser] mode is hard clipping - that's why you see no difference to HGiG. [HDR Tone Mapping] "Off" in other non-Game picture modes does a roll-off.

John Linneman of digital foundry on twitter:

With HGIG, the game/console does all tone mapping -


Right but let's say a game was mastered to 1000 nits to begin with and now it supports HGiG and you have it set to 800 nits max. Aren't you technically still tone mapping it and squeezing what should be 1000 nits down into an 800 nit container?

The console is doing the tone mapping if the game/console supports HGiG. HGiG is turning tone mapping (even static tone mapping) off on the TV itself.

So yes but only in that case. It would be tone mapped by the console itself not the TV, as long as the console+game supported HGiG in the first place.

"HGiG on" mode active on the TV from pc sources, or any other source that isn't tone mapping on it's own end, will map colors more accurately to their actual color value "location" all the way up through the peak nit of the display, and will hard clip without any compression. According to Vincent,[HDR Tone Mapping] "Off" in [Game Optimizer] mode will also hard clip in 2021 LG firmware, at least when he posted that May 2021.

Another from that tweet of Vincent's:

Certainly, when checked using the PS5 & Xbox Series X HGiG calibration screen, my LG G1 review sample hard clips at 800 nits, which translates well to HGiG-compliant games such as Dirt5.


...........................................
 
Last edited:
That sounds to me like HGiG has two completely different behaviors depending on whether the game supports it or not. If the game doesn't support it then HGiG is behaving more like a "static tone mapping OFF" button, just letting the display show what it can and clipping what it cannot. But if the game does support it then it is doing some actual tone mapping at the console level and compressing the dynamic range to fit within the display's capabilities while turning off the TV's tone mapping to avoid a double tone map. So what is the real intended use of HGiG here? To display everything up to 800 nits and clip everything beyond, or to let the game compress everything, including the 1000-nit details, into 800 nits?
 
Correct on the first point.

In the second usage scenario it's like this:

From Vincent of HDTVTest's twitter:

Based on my testing, HGiG behaves correctly on 2021 LG OLEDs on the latest firmware. However, [HDR Tone Mapping] "Off" in [Game Optimiser] mode is hard clipping - that's why you see no difference to HGiG. [HDR Tone Mapping] "Off" in other non-Game picture modes does a roll-off.

From what I've read, HGiG is turning off all curves on the TV and hard clipping the TV's actual luminance / HDR color volume range or scale at the TV's actual peak. That leaves it up to the HGiG console + HGiG console game, or whatever other source, to do its own tone mapping down to within the range of the TV's clip / cut-off.

This has also been found useful by users that value a greater "1:1 accuracy" for each color value, matching the actual color scale up to the peak nit of the display (~725 to 800 nits on most OLEDs), rather than "pulling down into" or muddying/substituting higher out-of-range color volumes/brightness values into the top end of the TV's actual range using compression methods and relying on what the devs of that firmware decided was best.

Still, due to the poor HDR implementation by devs of some games, you might experiment with DTM= Off and HGiG on a per game basis to see if there is any improvement. (That and maybe use CRU edit to the peak nit of your screen beforehand). Due to the various poor implementations of some games, HGiG=on and DTM=off can have different visible results between different games (one might help vs poor HDR implementation more than the other but on a per game basis so you'd have to try them both out on each game).

Personally I'd use either, whichever I find I preferred for a given game but I'd probably lean toward static tone mapping (DTM=off) if finding it was preserving some details compared to HGiG mode. I just don't like DTM=on for the reasons I said in my previous recent replies.
DTM on, however, is sort of wacky world: it lifts lows and mids unnecessarily, "breaking" shadowed areas and lifting mids away from the artist's intent. I also think it lifts some of the range that would sit beneath the static tone mapping curve's roll-off threshold up into and throughout the high end, muddying detail away more than the static tone mapping curve would. It's sort of compressing the whole range all over the place, up as well as down, dynamically on a per-frame basis, not doing it logically.


............................................

To answer your last question: yes, like you've been saying, HGiG's main use was to allow an external device to do the tone mapping itself and turn it all off on the TV, avoiding double tone mapping once on the console and once on the TV. (Technically HGiG = less processing on the TV even without an HGiG source though.) I believe HGiG-enabled (console) games, if done right, would act nearly the same as DTM=Off (static tone mapping of the display).

I think it would also allow the console/console game devs to use their own tone mapping methods/algorithms instead of whatever the TV's flavor is. I doubt they are using LG's static tone mapping firmware method. Theoretically their HGiG-mapped game could be mapped better for that specific game and the designer's intent - for good or ill, depending on how the end result compares to doing static tone mapping on your OLED on a per-game basis.

..............................................
 
Last edited:

That is what HGiG is supposed to do though. I think people should just ask LG for a fourth option called "all tone mapping OFF"
 
Last edited:
They kind of did make that an option, at least in the 2021 fw:

From Vincent of HDTVTest's twitter:


Based on my testing, HGiG behaves correctly on 2021 LG OLEDs on the latest firmware. However, [HDR Tone Mapping] "Off" in [Game Optimiser] mode is hard clipping - that's why you see no difference to HGiG. [HDR Tone Mapping] "Off" in other non-Game picture modes does a roll-off.

End result is the same either way.. semantics.

Would be nice if it was in the same menu as HGiG, DTM: On, DTM: Off though, like you said.
 
Last edited:

He's talking about the C1. Does the CX have a Game Optimizer option? I can only recall instant game response setting.
 
Yeah I get them mixed up b/c I have a C1 in my living room. You are prob right. Either way same end result.
 

I tried switching between HGiG and DTM OFF and there is definitely zero difference between the two modes after using the Windows HDR calibration tool. This makes sense: Windows Auto HDR will no longer send any signal beyond what the TV can display because it is now aware of the TV's capabilities after the calibration, so there is no need to even perform static tone mapping if the TV never receives anything above 750/800 nits.



 
There might be a way to set up the Windows HDR calibration tool at 1000 nits so it passes a 1000-nit curve, if you wanted a static tone mapping option attempting to add back some lost higher details. I'd have to look into it but it seems feasible. From what I've read it just creates an HDR color profile.

From a reddit thread:
......................................

I upgraded to Windows 11 specifically for Auto HDR and while it did work initially the calibration was off and the whites were way too bright. I'd use it on some games but would have to disable HDR on my monitor when I got off because it made browsing the web, playing other games, watching movies and such horrible. Ultimately I just stopped using HDR because it was a hassle.

With the update and the tool to calibrate your HDR profile I have enabled it and it seems to be perfect no matter what content I am viewing now. The whites are not as bright in general anymore which is fantastic, and games which natively support HDR such as Destiny 2 and Cyberpunk 2077 look so incredible now.

The app is separate but the functionality is built into Windows with ICC profiles that the tool helps generate. People always hate default applications so I think Microsoft limits them where they feel comfortable.

...............

All the app does is create an HDR color profile.
Search "Color Management" in start and open the tool, you should see the profile created by the app under: ICC Profiles (Advanced Color).

Obviously, the profile gets created only after completing the calibration process in the app at least once.

If you use an Nvidia GPU, you need to set the color accuracy to "Accurate" under: Nvidia control panel > Display > Adjust desktop color settings > 2. Color accuracy mode.

I tried Ghostwire Tokyo and it looks much better, maybe I went a little too far with the saturation in the calibration app but it looks definitely different than before using the calibration app.

...................

I am aware of a bug that prevents you from going back to Accurate if you ever touch the sliders under this option; take a look at this thread, especially post number 9: https://forums.guru3d.com/threads/color-accuracy-mode.435755/

Enhanced should be Accurate with the sliders applied to it, so if you keep the sliders in the default position, it should be equivalent to Accurate even if it says Enhanced.
I don't know if Nvidia ever provided an official comment on this behavior or the current state of it in the latest drivers.


I experienced the same issue a long time ago and in order to revert to accurate I had to reinstall the driver after using DDU. Since then, I am really careful when I use that area of the panel and I avoid touching any of the sliders.

.........................

If you want to truly calibrate HDR for Win11 you have to use CRU to set your peak brightness. Windows by default is 2000 nits.

Follow step 3 from this link
https://www.reddit.com/r/OLED_Gaming/comments/mbpiwy/lg_oled_gamingpc_monitor_recommended_settings/?utm_medium=android_app&utm_source=share

This works especially well for hgig gaming

.............................


The way this is all supposed to work is the TV/monitor advertises its peak HDR10 brightness as part of the display metadata. This is true of most PC monitors and any DisplayHDR certified monitor. When the content/Windows advertises the same or lower peak brightness the display disables tone mapping. Freesync 2 Premium Pro monitors also have a feature to explicitly disable tone mapping, though I don't know whether this is used in practice or how it is signalled. Whether the resulting output is correct is up to the monitor and frequently inaccurate, though the latest monitors are getting better.

Windows apps use this HDR display metadata together with the SDR brightness slider (which is a paper white setting for SDR content in the HDR container) to decide what to do with tone mapping. This HDR calibration app is just providing another way to tell Windows what values to provide to the game using the standard Windows HDR APIs that have been around since Windows 10. Games that have their own peak brightness sliders can override this but should be using these values as their default (but some don't so check the slider). Games that don't have peak brightness sliders (so-called HGIG games) should be using these values to decide their tone mapping.

Unfortunately for some reason most TVs that support Dolby Vision leave the HDR10 peak brightness metadata in the EDID blank. Windows falls back to using 1599 nits as a default (not sure why they picked this number, maybe it avoids tone mapping up to 1000 nits?). This is especially bad as when Windows supplies this as content metadata it kicks the TV into 4000-nit tone mapping, making everything overly dim.

HGIG calibration is a silly workaround "standard" getting users to set these values manually with a test pattern (and/or have a database of displays on the console) instead of just updating the TVs to advertise the expected HDR10 display metadata. The most annoying part is most TVs do expose this metadata as Dolby Vision metadata (to support player-lead Dolby Vision) but Windows and consoles don't read that information. I suspect making proper use of this metadata is one of the reasons Dolby Vision on consoles looks "better" (plus 12-bit dithering from the Dolby engine).

An alternative way to more objectively solve this calibration issue is to use edid-info to dump the Dolby Vision metadata and copy that into the appropriate EDID fields (or this Windows calibration app which puts the metadata in a windows specific ICC profile tag instead). This is reasonably easy for brightness but more annoying for color gamut primaries as that isn't exposed or imported by default in CRU so you have to mess with EDID editors.
Having this metadata wrong also makes the Windows 11 legacy ICC profile emulation compatibility feature in HDR broken, as it generates the wrong ICC profile, but the primaries don't seem to be used by much else yet.

If all of this works correctly end to end on something like an OLED monitor (to avoid backlight) you should get identical output between SDR sRGB mode and the windows desktop in HDR mode with matching SDR brightness settings (possibly with some extra banding due to all the transformations going on).

As a side note the calibration app saturation slider should be all the way to the left on a monitor/TV with accurate HDR output, anything else is messing with the image saturation using the ICC profile. The ICC profile isn't a "normal" one, it includes an undocumented Microsoft Advanced Color MHC2 tag which does global system wide color transformations in the GPU similar to novideo_srgb and applies to all apps. Until this HDR calibration tool that tag was mostly only used for factory calibrating laptop built in displays.

.................................
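
Side note on the EDID field that quote keeps coming back to: the advertised peak lives in the CTA-861 HDR Static Metadata data block of the EDID extension, and it's stored as a coded byte rather than raw nits (max luminance = 50 × 2^(code/32), so code 128 works out to 800 nits). Here's a rough sketch of pulling it out of a raw EDID dump, assuming a single well-formed CTA extension and that the display actually fills the block in (per the quote, many Dolby Vision TVs don't):

```python
# Rough sketch: decode the advertised HDR10 max luminance from a raw EDID dump
# (e.g. an EDID .bin exported from CRU). Walks the CTA-861 extension looking
# for the HDR Static Metadata data block (extended tag 0x06). Assumes a single,
# well-formed CTA extension; real EDIDs can have more edge cases than this.

def hdr_max_luminance_nits(edid: bytes):
    """Return the advertised max luminance in nits, or None if absent/blank."""
    if len(edid) < 256 or edid[126] == 0:
        return None                       # no extension block at all
    ext = edid[128:256]
    if ext[0] != 0x02:                    # 0x02 = CTA-861 extension tag
        return None
    dtd_start = ext[2]                    # data blocks occupy bytes 4 .. dtd_start-1
    if dtd_start < 4:
        return None
    i = 4
    while i < dtd_start:
        tag, length = ext[i] >> 5, ext[i] & 0x1F
        if tag == 7 and length >= 2 and ext[i + 1] == 0x06:
            # Layout after the header byte:
            # [ext tag 0x06][EOTFs][descriptors][max lum?][max frame-avg?][min lum?]
            if length >= 4:
                code = ext[i + 4]
                return 50.0 * 2 ** (code / 32.0)   # CTA-861.3 coding: 128 -> 800 nits
            return None                   # block present, max luminance left blank
        i += 1 + length
    return None
```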
 
Last edited:


This explains why I saw no difference in highlights before and after doing the windows 11 hdr calibration tool. I had already done the CRU edit a long time ago so that was essentially the HGiG calibration.

So I just tried out the recently released Windows HDR Calibration Tool on my CX and supposedly it helps to improve the Auto HDR image quality. So far I'm not seeing a whole lot of difference from before, if any really. Is it because I already did the CRU tweak to make windows report my display's peak brightness to be 800 nits? I did the calibration while having my TV be in HGiG mode btw, and again the CRU edit was already done to make the display recognized as having 800 nits peak brightness a while back prior to doing the calibration. Perhaps this is more useful to people who did not do the CRU edit.

Now here's my question: if I do the CRU edit to tell Windows I don't have a 2000-nit display but only an 800-nit one, doesn't that mean Windows will now tone map 2000 nits down into an 800-nit container? That would put us back at the original problem of squeezing out-of-range colors/brightness down to 800 nits. What EXACTLY is CRU doing when it tells Windows that your display is only capable of 800 nits? If it tells Windows to only send an 800-nit signal and clip everything above 800 nits up to 2000 nits, how is that any different from not doing the CRU edit, which lets Windows send the full 2000-nit signal and then lets your TV itself clip everything above 800 nits by enabling HGiG?
 
Last edited:
As I understand it, CRU is just sending the correct (well as correct as you set it to be) peak nit value of the screen to windows and then to the game. The windows HDR calibration tool is attempting to do similar. A game with a peak brightness slider is also doing similar.

Tone mapping on a game might depend on the game dev (especially a console game) but generally, from everything I've read - Tone Mapping on the display will only happen when the nits exceed the peak of the display.

So in your example where I think you are saying you hypothetically have a 2000nit peak display and you:
... use CRU edit to define the display as 800nit
...or purposefully use wrong settings in the Windows HDR calibration tool (or maybe edit its end-result ICC profile) to 800nit
...or purposefully set the peak nit slider in a game, whose devs have provided one, to ~800nit.

I think the result would be the same. You aren't exceeding the peak nit of the display so the display won't do any compression/static tone mapping in DTM=off mode.

A game can apparently override this but it should be using it as the foundation so if it's ignoring it, the HDR implementation is probably "broken".

...If you do none of the above and only turn on HGiG on a 2000nit peak display, you'd be turning off all processing/tonemapping/compression. The display would take all color ranges up to the peak of the display and then clip (e.g. if sent HDR4000 or HDR10,000 data).






Windows apps use this HDR display metadata together with the SDR brightness slider (which is a paper white setting for SDR content in the HDR container) to decide what to do with tone mapping. This HDR calibration app is just providing another way to tell Windows what values to provide to the game using the standard Windows HDR APIs that have been around since Windows 10.

Games that have their own peak brightness sliders can override this but should be using these values as their default (but some don't so check the slider). Games that don't have peak brightness sliders (so-called HGIG games) should be using these values to decide their tone mapping.
 
Last edited:

Right. Tone mapping on the display won't occur because we will not exceed 800 nits. But what happens to everything beyond 800 nits when you do the CRU edit? Is it all being tossed out or will Windows Auto HDR tone map 2000 nits down to 800 nits?
 


......

in your example where I think you are saying you hypothetically have a 2000nit peak display and you:
... use CRU edit to define the display as 800nit
...or purposefully use wrong settings in the Windows HDR calibration tool (or maybe edit its end-result ICC profile) to 800nit
...or purposefully set the peak nit slider in a game, whose devs have provided one, to ~800nit.

I think the result would be the same. You aren't exceeding the peak nit of the display so the display won't do any compression/static tone mapping in DTM=off mode.


I think the CRU edit tells Windows that an 800-nit screen is connected, so that is all it will work with. It'd be the same if you edited CRU to only show 1920x1080 resolution. No other resolution (or in this case, color range/nits) will be available.
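
One wrinkle if you go the CRU route: the max-luminance field in the EDID's HDR metadata is a coded byte, not raw nits (same CTA-861.3 coding as in the EDID-parsing sketch earlier in the thread: nits = 50 × 2^(code/32)). IIRC CRU shows the computed cd/m² next to the value, but it's worth sanity-checking what you're actually advertising - a quick converter, for illustration:

```python
# EDID "desired content max luminance" is a code, not raw nits.
# CTA-861.3: nits = 50 * 2**(code / 32). Handy for sanity-checking a CRU edit.
import math

def code_to_nits(code: int) -> float:
    return 50.0 * 2 ** (code / 32.0)

def nits_to_code(nits: float) -> int:
    return round(32 * math.log2(nits / 50.0))

if __name__ == "__main__":
    print(code_to_nits(128))     # 800.0  -> the ~800-nit peak discussed here
    print(nits_to_code(800))     # 128
    print(nits_to_code(1000))    # 138    -> if you wanted to advertise ~1000 nits
```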
 

This is why I gave up on HDR for PC gaming and didn't bother with it again until late 2020 after I got a CX and an RTX 30 series for 4K 120Hz 10-bit. It was an absolute mess on PC and even today it seems like you need to do a whole bunch of digging around to find out what the right TV settings, Windows settings, GPU settings, etc. are on a per-game basis. From all the testing I've done, once you do the CRU edit there is no longer any difference between HGiG and DTM OFF, so really the two options are either DTM ON or HGiG, and one can look better than the other depending on the game/scene. I guess we just need much, much brighter displays to avoid tone mapping altogether.
 

It's way better than it used to be. You have three ways to set the peak brightness now. The Windows HDR calibration tool does it overall for Windows, a lot like a game with a peak brightness slider does for its individual game. So that is two. Then there is the CRU edit, which gives you more precise control over the values (though you could probably edit the ICC profile that the Windows HDR calibration tool creates more precisely after the fact too).

The problem was, Windows wasn't using the right data or range before in some content/games. Now you have a few choices for how to assign the peak brightness Windows uses as, more or less, the peak nit capability of your display.

...........

Still you could probably set up windows to use a 1000nit curve using one of those three methods if you wanted DTM=off mode to allow the TV to apply static tone mapping compression - which might preserve more details (inaccurately color value wise) that would have otherwise been lost.
 
Last edited:
It seems to me that Windows HDR calibration tool is applying a global peak nit similar to how games that have them in their own menus do. That would make the CRU edit less needed - though the calibration style tools are doing the adjustments relative to how you feel examples look, kind of like cleartype tuning, rather than using a more precise number based assignment.
 

It does something similar to the PS5's HDR calibration screen but it has a few extra sections that let you adjust the saturation; I guess that's why people are thinking it's supposed to be a color profile. My assumption was that this is doing what CRU does, so people who have already done the CRU edit may not need to do this unless they wish to adjust the saturation. I left my saturation values in the middle, so I guess that's why I see no difference pre and post calibration, since CRU already took care of capping the peak nits. For people who are not keen on using CRU this would be the "official" solution I suppose.
 
According to one of the quotes I posted, regarding the Windows HDR calibration tool:

All the app does is create an HDR color profile.
Search "Color Management" in start and open the tool, you should see the profile created by the app under: ICC Profiles (Advanced Color).

Obviously, the profile gets created only after completing the calibration process in the app at least once.

If you use an Nvidia GPU, you need to set the color accuracy to "Accurate" under: Nvidia control panel > Display > Adjust desktop color settings > 2. Color accuracy mode.

I tried Ghostwire Tokyo and it looks much better, maybe I went a little too far with the saturation in the calibration app but it looks definitely different than before using the calibration app.

So apparently it does create an actual ICC profile. Even though it's called a "color profile" it covers all parameters incl. brightness, etc.

A warning regarding the Nvidia Control Panel -> Display -> Adjust desktop color settings -> Color accuracy mode "Accurate" setting (the quote below suggests Enhanced with default sliders behaves the same):
I am aware of a bug that prevents you from going back to Accurate if you ever touch the sliders under this option; take a look at this thread, especially post number 9: https://forums.guru3d.com/threads/color-accuracy-mode.435755/

Enhanced should be Accurate with the sliders applied to it, so if you keep the sliders in the default position, it should be equivalent to Accurate even if it says Enhanced.
I don't know if Nvidia ever provided an official comment on this behavior or the current state of it in the latest drivers.


I experienced the same issue a long time ago and in order to revert to accurate I had to reinstall the driver after using DDU. Since then, I am really careful when I use that area of the panel and I avoid touching any of the sliders.

Here is where the ICC profiles are referenced. You could probably find a way to automate switching/choosing with a stream deck's buttons somehow if you wanted to set up different curves, e.g. 800 nit, 1000 nit, different saturations (see the rough sketch at the end of this post).

https://pcmonitors.info/articles/using-icc-profiles-in-windows/

[screenshot: Display-Profile.png - the Display Profile utility]


Obviously it would be a pain to have to go into Colour Management and switch profiles on and off every time you wanted to play a certain game or return to the desktop, or switch between multiple profiles for different purposes. Windows 10 and 11 include a drop-down list in ‘Display Settings’ which makes this easier. Alternatively, there is an excellent and tiny utility called ‘Display Profile’ (above), created by X-Rite, which gives you a very quick and convenient way of doing this. You can download it here. This allows you to toggle between ICC profiles or use the system defaults if you essentially want to disable any ICC profile corrections. This utility lists profiles located in ‘X:\Windows\system32\spool\drivers\color’, where ‘X’ denotes the drive you’ve installed Windows to. You can therefore simply drag and drop any profiles to this folder and they should be selectable in this utility. To use system defaults and disable any specific LUT and gamma corrections simply select ‘sRGB IEC61966-2.1’ in the utility.

.....................

Yes it's like Windows added a peak brightness slider like some games have, so you wouldn't have to use CRU edit. CRU edit is just more specific. (It also allows you to add/remove resolutions e.g remove 4096x and add ultrawide resolutions). The windows HDR calibration tool is pretty recently released to PC after being on xbox. It seems like a good fix.
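
Riffing on the Stream Deck / profile-switching idea above: a tiny helper like this (a sketch, nothing more) can at least enumerate what's installed in the colour folder the quote mentions and copy a new profile into it. Actually activating a given profile is still easiest through the Display Settings drop-down or the Display Profile utility described above - no colour-management API call is attempted here.

```python
# Small helper around the ICC profile folder mentioned above
# (X:\Windows\system32\spool\drivers\color). Lists installed profiles and can
# copy a new one into place; selecting the active profile is left to the
# Display Settings drop-down / Display Profile utility described in the quote.
import os
import shutil

COLOR_DIR = os.path.join(os.environ.get("SystemRoot", r"C:\Windows"),
                         "system32", "spool", "drivers", "color")

def list_profiles():
    """Return installed .icc/.icm profiles - e.g. to label Stream Deck buttons."""
    return sorted(f for f in os.listdir(COLOR_DIR)
                  if f.lower().endswith((".icc", ".icm")))

def install_profile(src_path: str) -> str:
    """Copy a profile into the colour folder so it shows up as selectable."""
    dest = os.path.join(COLOR_DIR, os.path.basename(src_path))
    shutil.copyfile(src_path, dest)  # may require an elevated prompt on some setups
    return dest

if __name__ == "__main__":
    for name in list_profiles():
        print(name)
```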
 
Last edited:
C3 needs to up that refresh rate past 120Hz now. 4090 is ~1.7x over a 3090 as I thought, plenty for well above 120fps.

 
For now, 4090 seems like a 4K120 card... wow.

145fps is on the lower end of what it can do since the testing was done with DLSS off and ultra settings. There's wiggle room to improve performance with optimized settings and using DLSS to easily push it into 160fps+ averages.
 
So...another day and another frustrating moment with HDR for me lol. Just tried the new Plague Tale game this morning after watching this video:



He says that the game does obey the Windows HDR Calibration tool and not to deviate too far from the in-game HDR default values if you used the calibration tool, which I did already. However, I am noticing some pretty insane clipping going on here, which I'm assuming is normal behavior: if a highlight is greater than 800 nits then it will simply get hard clipped, as that is what HGiG is supposed to do, isn't it? There is no roll-off going on, so it displays 0-750/800 nits as it should be shown and then clips everything beyond that. So what's the problem? It seems like the game has a lot of highlight detail beyond 800 nits, at least in the beginning section so far, and it's all getting completely clipped, so now like half of my screen is just filled with clipped highlights. Funny enough, DTM ON actually causes even more clipping than HGiG, and DTM OFF behaves exactly the same as HGiG. The areas in red (in the attached screenshot) are where I'm seeing a bunch of clipping, mostly the ground and the clouds/sky, but like I said it takes up a large portion of the screen and I wouldn't really consider this a great HDR experience if the rest of the game is like this, full of clipped-off detail.
 

Attachments: 20221018_083607.jpg
Last edited: