LG 48CX

Can you delete the Windows HDR Calibration profile from the color profiles list in Windows without screwing things up? (idk if you can do that, or whether you can reset the calibration back to default in the calibration tool itself). The Windows calibration is done by eye rather than by numbers for some of its adjustment screens.

Here is where the ICC profiles are referenced. You could probably find a way to automate switching/choosing with a Stream Deck's buttons if you wanted to set up different curves, e.g. 800 nit, 1000 nit, or different saturations.

https://pcmonitors.info/articles/using-icc-profiles-in-windows/

Display-Profile.png



Obviously it would be a pain to have to go into Colour Management and switch profiles on and off every time you wanted to play a certain game or return to the desktop, or switch between multiple profiles for different purposes. Windows 10 and 11 include a drop-down list in ‘Display Settings’ which makes this easier. Alternatively, there is an excellent and tiny utility called ‘Display Profile’ (above), created by X-Rite, which gives you a very quick and convenient way of doing this. You can download it here. This allows you to toggle between ICC profiles or use the system defaults if you essentially want to disable any ICC profile corrections. This utility lists profiles located in ‘X:\Windows\system32\spool\drivers\color’, where ‘X’ denotes the drive you’ve installed Windows to. You can therefore simply drag and drop any profiles to this folder and they should be selectable in this utility. To use system defaults and disable any specific LUT and gamma corrections simply select ‘sRGB IEC61966-2.1’ in the utility.
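If you wanted to script something behind a Stream Deck button, a minimal Python sketch like the one below just lists whatever profiles are sitting in that color store (assuming the default install drive); actually switching the active profile still goes through Colour Management / Display Settings or the Display Profile utility.

```python
# Minimal sketch: enumerate the ICC profiles in the Windows color store folder
# mentioned above. Assumes Windows is installed on C:; switching the active
# profile is not done here.
from pathlib import Path

COLOR_STORE = Path(r"C:\Windows\System32\spool\drivers\color")

def list_profiles():
    """Return the .icc/.icm profile filenames currently in the color store."""
    if not COLOR_STORE.is_dir():
        return []
    return sorted(p.name for p in COLOR_STORE.glob("*.ic[cm]"))

if __name__ == "__main__":
    for name in list_profiles():
        print(name)
```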


All the app does is create an HDR color profile.
Search for "Color Management" in the Start menu and open the tool; you should see the profile created by the app under ICC Profiles (Advanced Color).

Obviously, the profile gets created only after completing the calibration process in the app at least once.

If you use an Nvidia GPU, you need to set the color accuracy to "Accurate" under: Nvidia control panel > Display > Adjust desktop color settings > 2. Color accuracy mode.

I tried Ghostwire: Tokyo and it looks much better. Maybe I went a little too far with the saturation in the calibration app, but it definitely looks different than it did before using it.

So apparently it does create an actual ICC profile. Even though it's called a "color profile", it covers all the parameters, incl. brightness, etc.
A warning regarding Nvidia Control Panel -> Display -> Adjust desktop color settings -> Color accuracy mode, specifically the Accurate setting (with a suggestion to use Enhanced): ----->
I am aware of a bug that prevents you from going back to Accurate if you ever touch the sliders under this option; take a look at this thread, especially post number 9: https://forums.guru3d.com/threads/color-accuracy-mode.435755/

Enhanced should be Accurate with the sliders applied to it, so if you keep the sliders in the default position, it should be equivalent to Accurate even if it says Enhanced.
I don't know if Nvidia ever provided an official comment on this behavior or the current state of it in the latest drivers.


I experienced the same issue a long time ago and in order to revert to accurate I had to reinstall the driver after using DDU. Since then, I am really careful when I use that area of the panel and I avoid touching any of the sliders.


You already have CRU edited to ~800 nits I think, so you could leave it like that and set the TV to DTM = Off so the game's curve gets statically tonemapped down by LG's curve. Also, is Auto HDR on? Could that clash with true HDR in some games? Seems like an HDR curve problem like you said, though. Are there any DTM settings in the game itself? An in-game peak brightness setting? A middle brightness/white setting?

DTM = On can step over other color values and wash out detail. There are some HDTVTest vids showing it washing out textures in bright highlights, so that's not that unexpected really. DTM lifts lower color values into the range already occupied by higher ones; it can overbrighten the picture or parts of it, and it eventually runs out of free rungs on the ladder to step onto because they're already occupied. It does it all dynamically via scene analysis, so it can have some bad results.
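To picture the difference vs a static curve, here is a toy Python sketch (my own simplification, not LG's actual tone mapping) comparing a hard clip at an ~800 nit panel peak against a simple static rolloff that compresses everything above a knee point into the remaining headroom:

```python
# Toy illustration only: hard clipping flattens every highlight above the panel
# peak to the same value, while a static rolloff keeps bright values
# distinguishable (i.e. keeps highlight texture) at the cost of lowering them.
def hard_clip(nits, peak=800.0):
    return min(nits, peak)

def static_rolloff(nits, peak=800.0, knee=0.75, source_max=4000.0):
    """Linear below knee*peak, then compress [knee*peak, source_max] into [knee*peak, peak]."""
    knee_nits = knee * peak
    if nits <= knee_nits:
        return nits
    t = (min(nits, source_max) - knee_nits) / (source_max - knee_nits)
    return knee_nits + t * (peak - knee_nits)

for source in (500, 900, 1500, 4000):
    print(f"{source} nits -> clip {hard_clip(source):.0f}, rolloff {static_rolloff(source):.1f}")
```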

I've only watched that video on a basic SDR screen so far, but those areas looked very bright in the video and obviously clipped in your screenshots. It's tough to show HDR issues using SDR screenshots and videos rather than testing hardware and graphs. Even actual HDR images/videos would look different on different screens or screen settings, or even in different ambient room lighting.


Edit: more sliders and more subjective, by-eye adjustment. Maybe this game's HDR calibration and the Windows calibration are clashing. Maybe try lowering this one?

17-10-2022_16-53-51-zhav4vp5.jpg
 

Yeah, the weird part is that I dragged that slider all the way to the left and it was still clipping. I think it's definitely some sort of conflict between Auto HDR/Windows HDR Calibration and the game. Maybe this game is supposed to follow the HDR calibration tool but it isn't doing it properly. I'll try removing the calibration profile, turning Auto HDR off, and just adjusting the in-game settings with HGiG enabled, and see if I still get a bunch of clipping.
 
Yes, wiping it might be a good starting point, but idk if you can just swap to a different profile using that Windows color profile menu instead. Otherwise, you could save that existing HDR Calibration tool profile file into a different folder or rename it, then run the Windows calibration tool again but turn things down. It could be difficult to fine-tune it that way, though. That way you'd still have your tweaked and working calibration profile around, file-wise, to swap back to without having to go through the whole calibration again trying to duplicate what you had, if you were happy with it.
 

Well, I deleted the color profile in Color Management and that seems to get me halfway to fixing the problem. What I then had to do to restore all the highlight detail was turn the in-game luminance value all the way down to the minimum, on the far left. I'm not sure why I need to go so far down on the HDR settings, but it is what it is. Here are some screenshots showing the highlight detail being recovered, starting from the default setting (the middle of the luminance slider) and gradually adjusting it to the left until I reach the end. So for now it seems like in games that supposedly follow the Windows HDR Calibration tool, such as A Plague Tale, it may actually end up screwing with your final image. That contradicts what GamingTech said about the OS-wide Windows HDR Calibration tool working as intended for this game, but hey, maybe I just did something wrong on my end.
 

Attachments: 20221018_204603.jpg, 20221018_204620.jpg, 20221018_204647.jpg, 20221018_204701.jpg
Glad you got the clipping to go away. That kind of thing looks terrible. DTM can have a lot of bad tradeoffs like that too: muddied/lost details from color values stepping over each other, and scenes being operated on dynamically from analysis that can vary (how bright the sky is, for example) just from looking at a different angle when returning to the same scene, which is why I don't use it.

Maybe the game should have a checkbox to disable the game's own calibration or to not use Windows' one. The HDR Calibration tool is still relatively new on PC, so this might just be some growing pains where the different scalings/curve mappings are stepping over one another.

Can also check a few other things:

Make sure you have HDMI Deep Color enabled for your HDMI input in the TV OSD.

Make sure you are in PC mode (PC input icon) with RGB/4:4:4 rather than 4:2:0.

You probably are in both already though.

Might check your game HDR mode settings too:

https://www.reddit.com/r/OLED_Gaming/comments/mbpiwy/lg_oled_gamingpc_monitor_recommended_settings/


If your other games are working properly it's probably just the dev's fault though like you said.
 

The only issue now is that by having to adjust the slider all the way to the left, the overall picture is way dimmer and looks more like SDR. It's still better than actually playing in SDR, so maybe it's more like HDR400 I guess. I'm not sure what method GamingTech uses to measure the in-game scene's luminance output, but I would say the peak brightness is definitely dimmer than your average HDR game on an OLED screen. My other settings like HDMI Deep Color and RGB chroma are enabled, and so is PC mode. Not sure how else I can get the peak brightness more in line (700-800 nits) without causing clipping.
 
If you don't use the Windows HDR calibration profile and use DTM = Off, it would normally map down to your 800 nits. However, that game's own HDR calibration seems to be screwed up somehow, using its own range or formula, or maybe even throwing the gamma off or something.

You could try sacrificing one of the TV's other named picture modes (the HDR version of it, with the HDR game running so you can enable it) and tweak its OSD settings to try to compensate for what's happening in that game. However, when not using the SDR or HDR Game mode, you'll get a lot more input lag, so that's not a great option.

. . . . . . . . . .

Apparently reshade supports 10-bit HDR, so you could set it up for that game. It's pretty easy to work with if you just want to adjust a few simple sliders (brightness, saturation, black detail, white point, etc.). You don't have to go crazy adding a ton of additional shaders/plugins to it. It still might be hard to compensate for a hard clip, though.

Some Reshade plugin recommendations from a 48" OLED reshade thread:

I'd recommend the fake HDR setting, make sure to put Levels on there so you can really play with that contrast and whatnot, . . . tbh just tinkering is the best thing, there is no one-size-fits-all and your preference could be different than mine.

Magic HDR, fake HDR, prod brightness contrast, prod colour space curves. I am an HDR OLED user.


https://forum.level1techs.com/t/guide-for-installing-running-reshade-and-presets/126368

.

HOW TO RESHADE w/ qUINT LIGHTROOM.fx BASICS

. .

Magic HDR is a separate filter/addon from Lightroom, but you can load multiple filters.

The Magic HDR reshade plugin (apologies that it's from a low-rez source, but you can make out the lettering).
The exposure settings might help your game.

SDtuaXj.png



. . . . . . . . . . .

Trying to think of a way to trick the game into thinking your peak nits are higher. Maybe purposely doing the Windows calibration tool "wrong" to get a higher peak out of it, or using CRU to make the screen report 1000 instead of 800, but I wouldn't want to change the CRU edit for one game. I think the Windows calibration method would be the easier thing to mess with, since you could swap your more accurate color profile out and copy/paste it back later.
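For reference, my understanding is that the CRU edit is just changing the max luminance byte in the display's HDR static metadata block; if I'm remembering the CTA-861.3 encoding right, it works out roughly like this (treat it as a sanity check, not a spec quote):

```python
# Assumed CTA-861.3 encoding for "desired content max luminance":
#   nits = 50 * 2 ** (code / 32), with code being a single byte.
# So 128 -> 800 nits, and ~138 -> roughly 1000 nits.
import math

def code_to_nits(code: int) -> float:
    return 50.0 * 2 ** (code / 32.0)

def nits_to_code(nits: float) -> int:
    return round(32.0 * math.log2(nits / 50.0))

for target in (800, 1000):
    code = nits_to_code(target)
    print(f"{target} nits -> code {code} -> {code_to_nits(code):.0f} nits actual")
```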

The scaling/range of the in-game sliders is probably just a bad HDR implementation, and it sounds like there's no way to disable the in-game calibration entirely.

I'd give reshade a shot. It looks promising to me.
 
I'll do some more digging on it later, but for now I've held off on playing the game until I can get a 4090. With the current settings I'm using, the game doesn't necessarily look bad; I just feel like it's not reaching the max peak brightness that I know the CX can pull off.
 
If you ever get around to it, I think the exposure settings could have a chance of muting that white clipping/blowout on the ground. The panel comes up as a side panel with a hotkey show/hide toggle once reshade is loaded into the game, so you can move the sliders around experimentally while the game is running and see what results you get.

When you run reshade it asks what game you want to add, so you just navigate to the game exe. Then you pick which filters you want in your library. The screenshot I posted shows a bunch of filters in his list, but if you look you can see he's only using the 2 that he has checkboxed. You really don't have to go overboard with reshade and complicate it. The Lightroom filter has sliders for saturation, brightness, black detail, etc., kind of like Photoshop/Lightroom. The Magic HDR one has a few headings, but I think in that game's case the input and output exposure might help.
 

You might be interested in this workaround for games with that problem.

Youtube Video:

compressing HDR games to show 10,000 nits on LG C1

. . . . .

He's setting the Nvidia GPU range to limited and then changing the black detail and white settings in the TV's OSD using provided test patterns.

Similar clipping to your game:

Tvxh9fM.png


XiSaasJ.png


He changes the Nvidia setting to limited. That isn't the only step though; he also changes OSD settings for black detail and the regular brightness.

He had that game at screen brightness 46 instead of 50, and at a dark area level of -1 when in limited range. He said that was equivalent in effect to brightness 50 and dark detail of -17 when in full (unlimited) range. He tweaks different values for different games though.
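For context, the full-vs-limited range remap he's playing with is the standard studio-range squeeze; here's a quick sketch with the usual 8-bit numbers (the same idea scales to 10-bit), and the TV-side black level / brightness tweaks are then compensating for where black and white land:

```python
# Standard 8-bit range remap: full range uses codes 0-255, limited/video range
# puts reference black at 16 and reference white at 235.
def full_to_limited(code):
    return 16 + (code / 255) * (235 - 16)

def limited_to_full(code):
    return (code - 16) / (235 - 16) * 255

for c in (0, 128, 255):
    print(f"full {c} -> limited {full_to_limited(c):.1f}")
```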

7rJSwNH.png
 

I'm currently playing Uncharted since my 3080 Ti can do 4K maxed with DLSS at 100+ fps. The HDR experience here has been far less frustrating, to say the least. All I did was enable HDR and adjust the brightness slider from 5 to 7, and everything seems to look just fine as far as I can tell. I really think we shouldn't have to resort to messing around with Reshade, NVCP, CRU, etc. just to get a good-looking picture. Hopefully HDR on PC gets to the point one day where it's as simple as turning it on and, as Jensen Huang says, "it just works".
 
Yes, that would be nice. If I were very into a particular game and spending a lot of time on it, I'd go the extra mile and try to get a better result, but not for every game or for so-so games.
 
To my understanding the HGiG setting is meant for exactly that, so the game itself can provide the right mapping, but I don't know if anything on PC implements it.
 

Most games that I've played do have some sliders to adjust the HDR, so I guess that's currently the best we have to use with HGiG. Some games don't have any adjustment sliders at all, so for those it might be better to just use DTM Off. Hopefully one day the Windows HDR Calibration tool will act like the PS5's and all PC games will support it.
 
Did anyone swap to the 42C2 yet? I'm thinking about it since the Black Friday prices are looking really good. It will probably get cheaper still when the C3 comes out next year. I'm so used to the 48 at this point, though, that I wonder if I'll regret downsizing.
 
I go by PPD, so the screen size just determines how far away I'd have to set up my seat. A 42 inch 4k saves 4.5 to 6 inches of view distance compared to a 48 inch 4k, which is relevant to your size-difference reasoning. Newer tech often has a few slightly better specs, like a bit more peak brightness. You'll find people who swapped in the 42" OLED threads.


..
42" 4k flat: 64deg viewing angle = 29" view distance = 60 PPD

42" 4k flat: 55deg viewing angle = 35" view distance = 70 PPD

42" 4k flat: 48 deg viewing angle = ~ 41" view distance = 80 PPD <----- view distance makes an equilateral triangle pyramid/cone viewing angle so you can see the bulk of the screen surface better

. .

48" 4k flat: 64 deg viewing angle = 33.5" view distance = 60 PPD

48" 4k flat: 55 deg viewing angle = \~ 40" view distance = 70 PPD

48" 4k flat: 48 deg viewing angle = 47" view distance = 80 PPD <----- view distance makes an equilateral triangle pyramid/cone viewing angle so you can see the bulk of the screen surface better

..

759311_VcjhpLn.png

..

At 60 PPD, massaged or alternative types of text sub-sampling are perhaps enough to deal with text fringing, and aggressive amounts of AA are enough to compensate for graphics aliasing in 3D game engines. However, on the 2D desktop there is typically no AA outside of the text sub-sampling, so desktop graphics and imagery have their pixelization uncompensated. So 70 to 80 PPD (for now, or higher someday) is preferable.

Sit closer than the 60 PPD point and your sub-sampling and AA won't be enough to compensate anymore, and you will also be sitting nearer than a 60 degree human viewpoint. That pushes the sides of the screen outside of the 60 deg viewing angle, which means you'd have to dart your eyes side to side continually during gameplay to see anything in those "eye fatigue zones". The closer you sit, the more off-axis the sides of the screen become as well, which means more viewing-angle color shift, over a larger area of the sides of the screen.

So there is a sweet-spot range of PPD and viewing angle. Going from one 4k 16:9 screen size to another just changes how far away those ranges sit. The nearer 29" view distance for 60 PPD on a 42 inch 4k could help with space constraints, but it's still outside a traditional upright-piano/sheet-music style setup's 1.5 to 2 foot viewing distance. 70 to 80 PPD is better though, especially for 2D desktop imagery and better-looking text (perhaps even more so given WOLED text sub-sampling). For the vast majority of desk depths, to get to 60 PPD, and certainly to get to 70-80 PPD distances, you'd have to decouple the screen from the desk using any of various mounting options. A simple slim rail-spine, floor-footed stand is probably the easiest way.
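For anyone who wants to check those distances for other sizes, this is the geometry I'm using (a small Python sketch assuming a flat 16:9 panel and distance measured to the screen centre):

```python
# Pixels per degree (PPD) and horizontal viewing angle from screen diagonal,
# horizontal resolution, and view distance (inches), flat 16:9 panel assumed.
import math

def viewing_angle_and_ppd(diagonal_in, h_pixels, distance_in, aspect=(16, 9)):
    aw, ah = aspect
    width = diagonal_in * aw / math.hypot(aw, ah)   # physical screen width
    angle = 2 * math.degrees(math.atan(width / 2 / distance_in))
    return angle, h_pixels / angle

for size, dist in ((48, 33.5), (48, 40), (48, 47), (42, 29), (42, 35), (42, 41)):
    angle, ppd = viewing_angle_and_ppd(size, 3840, dist)
    print(f'{size}" 4k at {dist}": {angle:.0f} deg viewing angle, {ppd:.0f} PPD')
```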
 

I've gotten used to the 48 over the last 2 years, so I don't see much reason to upgrade if the image quality and Hz are the same. Some would argue that the image quality would actually be downgraded due to the 42's lower peak brightness vs the 48. The only thing that will get me to upgrade at this point is noticeably better image quality, like a QD-OLED, or a decent bump in Hz to something like 160 Hz.
 
If anything Samsung's QD-OLED might actually be worse for desktop use if it keeps its odd pixel arrangement and they don't get Microsoft to support it. At least with the LG OLEDs you can mostly mitigate it with Cleartype adjustments and DPI scaling.

The things I want to see next for OLED from any manufacturer:
  • Higher refresh rate.
  • Higher peak and sustained brightness.
  • Curved 40-43" models.
  • 6-8K models at 40-55" size range.
 

That wouldn't bother me personally since I don't use my OLED for any sort of text heavy work, or any work in general. So the trade off of bad subpixel structure for higher peak brightness and color volume would be worth it to me.
 
At least with the LG OLEDs you can mostly mitigate it with Cleartype adjustments and DPI scaling.
------>> Or by sitting at a higher-PPD distance, like 70-80 PPD, plus text sub-sampling methods. Personally I'm not a fan of losing 4k desktop real estate by scaling text. Also, scaling text and sub-sampling doesn't help aliasing of desktop 2D graphics and imagery. I wouldn't want to use a PPD where pixels and sub-pixels are so large that you need to scale text due to pixelization. On very high PPD displays I'd of course scale text just so it's not microscopic, though.


The things I want to see next for OLED from any manufacturer:
  • Higher refresh rate. <--- Agree, but at high resolutions, 4k+. Not the 1440p 16:9 higher-Hz ones being put out with a resulting lower PPD (outside of a 49" G9 if sitting nearer to the ~40" focal point).
  • Higher peak and sustained brightness. <---- Agree, especially sustained. Heatsink tech would probably help a lot, and I'd happily trade thinness for it. QD-OLED's blue emitter base with color conversion theoretically allows brighter colors at lower energy states/heat too.
  • Curved 40-43" models. <---- Curved yes, but my preference would be the larger end. I'd go up to 48" for immersion at 4k (and additionally the UW resolutions on a 4k screen), 1000R with a ~39.4" focal point at ~70 PPD, or a 55" at 8k sitting at the 1000R curve's focal point for ~122 PPD.
  • 6-8K models at 40-55" size range. <-- Yes, but as above I'd prefer a 1000R 48" 4k up to a 55" screen. Might as well skip 6k and go right to 8k on a 55" so you'd be able to get quads of fair-sized 4k windows.
  • Gaming upscaling tech on the screen if possible, so you could send 4k high-Hz bandwidth over the port/cable bottleneck and have it upscaled to 8k on the screen without it looking muddy or introducing anything more than negligible input lag. It would be cool if gaming AI-upscaling hardware on the monitor end were possible.

Ultrawide resolutions would be great at higher Hz, especially on an 8k screen that is ~55" 1000R. On a 4k screen, when you do 32:9 or 32:10 you only get 1080 px or 1200 px of viewable height, which makes me uninterested in that aspect. The ones below look like they'd fit in HDMI 2.1's bandwidth.

From using the LTT resolution/bandwidth calculator I linked: these UW resolutions look like they'd be nice to run on a 16:9 4k or 8k screen if they upped the Hz on OLED TVs, even with 4k upscaled to 8k and sent as an 8k signal as far as the display is concerned. They fit within HDMI 2.1's bandwidth, at least when using a DSC 3:1 compression ratio. DP 2.0 would be nice, but realistically I'd probably stick with a TV model rather than pay up to triple or more in some cases for a comparable desktop gaming monitor version to get DP 2.0 someday, and potentially end up suffering AG coating to boot, which would really bother me, especially on an OLED.

(The 8k signals could even be 4k frames upscaled to 8k in order to get higher fps.)

8k at 32:10 ultrawide 7680 × 2400 rez @ 200Hz, 10bit signal at 3:1 compression: 41.03 Gbit/s (HDMI 2.1 ~> 41.92 Gbit/s)

8k at 24:10 ultrawide 7680 × 3200 rez @ 150Hz, 10bit signal at 3:1 compression: 40.02 Gbit/s (HDMI 2.1 ~> 41.92 Gbit/s)

4k at 24:10 ultrawide 3840 × 1600 rez @ 500Hz, 10bit signal at 3:1 compression: 40.73 Gbit/s (HDMI 2.1 ~> 41.92 Gbit/s)
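Those figures can be sanity-checked with a bit of Python, assuming CVT-RBv2 reduced-blanking timing (an 80-pixel horizontal blank and a 460 µs minimum vertical blank), 10-bit RGB, and a 3:1 DSC ratio; it lands within rounding of the calculator's numbers:

```python
# Rough data-rate estimate: pixel clock (including CVT-RBv2 blanking) * 30 bits
# per 10-bit RGB pixel, divided by the DSC compression ratio.
import math

def dsc_bandwidth_gbps(h, v, hz, bits_per_component=10, dsc_ratio=3.0):
    h_total = h + 80                        # CVT-RBv2 horizontal blanking
    v_blank_fraction = 460e-6 * hz          # 460 us of every frame is blanking
    v_total = math.ceil(v / (1 - v_blank_fraction))
    pixel_clock = h_total * v_total * hz
    return pixel_clock * 3 * bits_per_component / dsc_ratio / 1e9

for h, v, hz in ((7680, 2400, 200), (7680, 3200, 150), (3840, 1600, 500)):
    print(f"{h}x{v} @ {hz} Hz: {dsc_bandwidth_gbps(h, v, hz):.2f} Gbit/s")
```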
 
Finally managed to get an RTX 4090. Decided to pair it up to the CX instead of the X27. Maybe I'm just crazy but it feels like on nvidia there's noticeably more VRR flicker than there was when I was using my 6900 XT. Maybe Freesync Premium helped with this?
 
Interesting. Might want to compare which games and especially which frame rate ranges you are seeing, including the lower end, and how much fluctuation there is, i.e. how erratically the frames are being delivered. Might try lowering settings to see if it's as obvious at higher frame rate ranges, and try a different game in case one game or its settings is making the frame rate flow more choppy for some reason.

Make sure HDMI Black Level is set to Full and things like that (brightness levels, gamma, etc. not lifting dark areas), and that you're running native rez (using DLSS?). When the TV is disconnected some things revert, like the named PC icon mode and 4:4:4/RGB.

Will try to think of anything else.
 
Maybe also lock the max framerate to 119 or 118 to keep Gsync on and in range without going over max refresh and dropping back down under constantly... which a 4090 is powerful enough to do. :)
 
I don't think it has much to do with the average frame rates though. The specific example that caught my eye was the "loading screen" in AC Valhalla where you can free-run around in a dark foggy area. The 4090 seemed to exhibit much more pronounced flicker than the 6900 XT did. But it's probably just a case of more/worse frametime spikes on the 4090 during that loading screen or something. VRR flickering was something that I always experienced on loading screens where my frametime graph is all over the place, even on non-OLED displays, so this is nothing new at all. In game though I don't really see much of it, at least not with the games that I play. And yes, the 4090 is absolutely able to smash past 120fps at 4K, unless you love ray tracing lol. Might even swap it out for a 7900 XTX if the raster performance is close enough as I honestly can't tell a damn difference between RT on and off at a glance. I either need a direct side-by-side or I have to really analyze parts of the image to notice and nobody plays games like that.
 
That reminds me of the menu screen issue in "New World" that was frying GPUs by running at 1000 fps or something, with the fan module overvolting and burning up/popping like popcorn . . I forget the details but it was definitely related to an extreme uncapped-fps menu issue like that.

Idk if you ever watch "Sadly it's Bradly" on youtube. He does insider info + data mining rumors/information on upcoming VR tech. One of his more recent vids was on Valve Prism + Deckard. In it he mentions some indications of VRR brightness compensation tech.

FkAhVQ8.png


. . . . . . . . . . . . .

Steam VR's Prism - Sadly It's Bradly (youtube vid link)
 
..

Might even swap it out for a 7900 XTX if the raster performance is close enough as I honestly can't tell a damn difference between RT on and off at a glance. I either need a direct side-by-side or I have to really analyze parts of the image to notice and nobody plays games like that.

I know that shadows can make a huge difference. People used to turn them off or set them to low, perhaps more often in the past, but good global dynamic shadows can give a game an almost 3D, holographic look. I notice that kind of aesthetic gain in a big way personally. It takes away the flatness of the scene and gives things depth, especially in large outdoor games with long view distances and animated objects (and shadows) both near and far, all the way out in the distance. So RT can add a very dynamic 3D effect between shadows, highlights, and reflections in those types of games, depending on how well it's implemented.

I don't use RT due to the performance hit, but I can see how it could be valuable if the big performance drop weren't the tradeoff. The catch-22 now might be that the games with frame rate to spare are often not as detailed and/or as open-world, so RT doesn't have quite as great an effect overall. Even less detailed, stylized RPG/adventure/MMO games can be very demanding with very high view distances and lots of animated objects in the distance. Games with very long view distances unlocked plus a high number of animated objects and effects in the distance maxed out in settings usually destroy frame rates when outdoors. It could also depend to a degree on how well the dev designed the game. Still, even simple Minecraft-style/pixelated games look cool with the dynamic lighting of raytracing, so I definitely see the benefit outside of the frame rate hit.

. . .

Even dynamic shadows on High+ without raytracing, in large outdoor games with very far view distances and a large number of animated objects in the distance, can deliver a more 3D/holographic feel, so RT could do that and better with highlight/reflection/shadow dynamism. You might not notice it as much in a corridor shooter or some demolition-derby arena shooter. AC Valhalla HDR or Odyssey HDR should benefit from RT though, as they are large adventure games with long view distances (depending on your settings and where you are in the game worlds).
 

Well don't get me wrong, I do think RT is cool and that it's the future of gaming. I think the main issue right now is that RT in just about every game is limited to only certain lighting. It's not like, say, Portal vs. Portal RTX, where the entire game is path traced and the difference in visuals is immediately obvious and huge. Turning RT from off to the maxed-out preset in games like CP2077, Control, Dying Light 2, etc., there is a difference in visual fidelity, sure, but it's far more subtle and doesn't make an immediate night-and-day difference like Quake and Portal RTX do, IMO. And cutting my fps by ~40-50% on average also isn't a good proposition for turning it on, since I'm turning what is a 120+ fps experience into an 80 fps one, on a $1599+ GPU to boot. The performance hit for turning on RT is still pretty big even on Lovelace; in fact, in some games it's actually no better than Ampere, in that both architectures lose the same % of performance when turning RT on.

1668104351523.png
1668104358535.png
1668104378911.png
1668104448763.png



I was hoping that by the 3rd generation of RTX GPUs we could turn on RT without cutting fps by up to 66%. Hell, even the FIRST-gen RTX GPU, the 2080 Ti, isn't that far off from the latest and greatest Lovelace when it comes to % performance loss from turning on RT, which is kinda sad actually. It seems like Nvidia isn't really making the new RT cores more efficient at their jobs at all; instead we are just getting more brute force to push higher frame rates in general. Until either games look massively upgraded with RT on, or the performance hit is below 20% in even the heaviest RT implementations, I'd prefer to just leave it off.
 
It sounds like it's going to take a dev shift for RTX, but probably also for the other thing we'd talked about, where instead of Nvidia guessing vectors based on a buffered image, the games themselves would report the vectors of objects, backgrounds, cameras, etc. while the OS/drivers would report those of the peripherals. I believe VR dev already does this, but VR game devs expect that going in, whereas adoption by pancake-screen devs might be slow. The VR frame projection/spacewarp stuff is based on the vectors of in-game virtual objects, backgrounds, cameras, etc., but also the headset, where on PC desktop it would be the mouse and keyboard or gamepad input (mouse-looking, movement keying, controller panning).

They have to up their game, literally:

-HDR dev (maybe even Dolby Vision)
-RTX Dev
-Frame Amplification Dev (based on actual reported game engine and peripheral/OS driver vectors projecting the next frame, rather than Nvidia's "look at the next frame and guess the vectors from the difference between the two images" method)
-Surround sound / Atmos.

I'd also like it if they put gaming AI upscaling hardware on the displays themselves, if possible, to avoid the port+cable bottleneck, even if it's just 4k to 8k on future 8k screens (or their UW resolutions when gaming).
 

I'm just wondering why everyone believes Nvidia has been making improvements to RT performance every gen when the numbers don't line up with that. Let's say the upcoming 4070 Ti is supposed to be on par with the 3090 Ti in raster but faster in ray tracing because of the 3rd-gen RT cores vs 2nd-gen RT cores, and both cards get 100 fps with RT disabled. If the 3090 Ti turns on RT and loses 40% performance, dropping to 60 fps, the 4070 Ti should suffer a much smaller performance penalty for turning RT on and lose like 25% or something, but I'm pretty sure it's also going to lose 40% and be right in the same ballpark as the 3090 Ti in RT as well. So how exactly is Nvidia improving things on the RT performance front? Seems like all they did was make more powerful GPUs in general. Anyways, I'm getting off topic here so I'll end my rant about RT performance. Seeing the raster performance of the 4090, though, really makes me pray for a 4K 160+ Hz OLED next year. CES 2023 maybe?
 

Attachments: HZD 4090.png
If we look at a purely path traced game (Quake 2 RTX) you start to see where the performance gains are. Unfortunately these are rarely seen in tests.

I don't have exact numbers, but if I set a target framerate of 144 fps and let Quake 2 RTX use dynamic resolution scaling to maintain it, my 2080 Ti would be running at sub-1080p resolutions, probably closer to 720p if not even lower. My 4090 runs at around 2880x1620 so 75% of 4K.

My guess is that even if the percentages for framerate drops look the same, the fact that the 4090 runs RT games at massively higher framerates means it must be processing the RT effects faster, because they take up a good chunk of the frametime.
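A quick bit of frametime arithmetic makes that concrete (hypothetical numbers along the lines of the 100 fps / 40% example earlier, not measurements):

```python
# Same percentage hit, very different absolute RT cost per frame.
def rt_cost_ms(fps_rt_off, fps_rt_on):
    """Extra milliseconds per frame spent once RT is enabled."""
    return 1000 / fps_rt_on - 1000 / fps_rt_off

print(rt_cost_ms(100, 60))   # 100 -> 60 fps (-40%): ~6.7 ms of added work per frame
print(rt_cost_ms(200, 120))  # 200 -> 120 fps (-40%): ~3.3 ms, i.e. done twice as fast
```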

As another empirical example: with the 2080 Ti I pretty much kept RT off in Shadow of the Tomb Raider because it was an easy way to reach better framerates. With my 4090 I can turn off DLSS, turn the RT effects to max, and still maintain framerates around 200 fps at 4K!

TechPowerUp has really done a disservice by not publishing minimum framerates too. I tried my 4090 with my old Ryzen 3700X and then replaced it with a 13600K system, and the minimum framerates improved massively while the maximum framerates were not always that much higher, but the overall experience is so much smoother. Also note that a lot of TechPowerUp data uses a 5800X CPU, which is fine, but they have themselves tested that the 5800X3D or latest-gen AMD/Intel CPUs give up to 30% boosts in some games.
 

Yeah, but again that's kinda just saying the 4090 is a faster card in general and that's why it does RT better, not that it does RT more efficiently. I'd prefer a GPU generation where I can turn on RT effects in games with a minimal performance penalty. But it seems like we aren't heading in that direction and are instead just brute-forcing it, so that even with RT maxed out we can still get 200 fps at 4K, but then if you turn RT off you get 400 fps, and that reason for not using RT is still going to be there.
 
Frame amplification tech seems like a good way to at least double frame rates to start with. Perhaps even more in the future. That would have a huge effect on frame rates which could help RT be more easily digested.

. . . . . .

However this is how I understand it:

Imagine an animation flip book with a clear overlay sheet on each drawn page, and blank page in between every drawn page cell.

(Clear overlay) on Drawn Cell 1 + [Blank Cell 2] + Drawn Cell 3

Nvidia's frame generation form of frame amplification tech starts at drawn cell 1 and then flips ahead two pages to the next drawn cell 3. Then it does its best AI guess at what the vectors in the 1st cell are in order to figure out how to build a "tween" cell that gets to the 3rd cell, and draws that best guess on the middle blank page based on its imagined/guessed vectors. This has a lot of trouble with a lot of things, especially 3rd-person adventure games or any game where the virtual camera moves around independently of the character, causing artifacts from bad guesses as the camera pathing makes some things in the scene remain motionless, or move at different rates relative to the FoV, even though they are technically moving, or moving at different speeds, in the game world itself. It also artifacts more the lower the foundation frame rate is (e.g. beneath 120 fps), since that means larger time gaps between animation cells (an animation flip book with far fewer pages, so it's transitioning between fewer animation cell states, with more staggered/choppy animation cycles).

VR's spacewarp-style frame amplification tech instead has a clear overlay on top of the first page, with a z-axis grid and a number of arrows and rates for some of the vectors already written on it. VR systems' apps can use actual vector data, incl. rates, arcs, player inputs, etc., and then plot and draw measured results on the middle page going forward. So it's not completely guessing what the vectors are for everything in the scene, including the player's inputs and the effects of the virtual camera (or the player's head and hand motions in VR).

Unfortunately, Nvidia went with the former, uninformed version, at least for the near term. Hopefully the whole industry will switch to the VR method at some point. That would require PC game devs to write their games using the same kinds of tools VR uses and do the work to code for it. They have a lot of room to advance. Unfortunately, progress is a lot slower on the PC and PC display front than on the VR front, though in aspects like PPD, VR is way "behind" or inferior and will continue to be for some time yet.
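As a very rough illustration of the informed approach (my own toy sketch, nothing like a production spacewarp), shifting the last rendered frame by the yaw change the engine reports gives you a crude in-between frame; real implementations also use depth, per-object motion vectors, and disocclusion fill:

```python
# Crude yaw-only reprojection sketch: shift the previous frame horizontally by
# the number of pixels corresponding to the reported camera yaw delta.
import numpy as np

def reproject_yaw(frame, yaw_delta_deg, hfov_deg=90.0):
    """frame: (H, W, 3) array. Returns a shifted copy; the uncovered edge stays black."""
    h, w, _ = frame.shape
    shift = int(round(yaw_delta_deg / hfov_deg * w))  # small-angle approximation
    out = np.zeros_like(frame)
    if shift >= 0:
        out[:, shift:] = frame[:, :w - shift]
    else:
        out[:, :w + shift] = frame[:, -shift:]
    return out

# Example: synthesize a tween frame for a 0.5 degree yaw between real frames.
last_frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
tween = reproject_yaw(last_frame, 0.5)
```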

. . .

From blurbusters.com's forum replies (Mark R.):

reprojection could in theory be created as a parallel-layer API running between the game and the graphics drivers, much like how VR APIs do it.

One interesting innovation of reprojection that has not yet been done is making sure the pre-reprojected framerate is above flicker fusion threshold. That causes stutters in reprojection to disappear!

For VR, We Already Have Hybrid Frame Rates In The Same Scene
----------------------------------------------------------------------------------------------------


For example on Oculus Rift 45fps to 90fps, sometimes certain things stutter (hand tracking 45fps) while the background scrolls smooth (90fps) via head turns.

But if we had 100fps reprojected to 500fps, then even physics objects like enemy movements would still be smooth looking, just simply more motionblurred (due to frametime persistence) than turns (due to reprojection-based frame rate amplification).

Not everything in the game world *needs* to run at the same framerate; if motion blur is acceptable for such movements.

Different things running at different frame rates on the same screen is very common with reprojection (Oculus Rift), which is ugly when some of the framerates are below stutter/flicker detection threshold.

But if all framerates could be guaranteed perfect framepaced triple-digit, then no stuttering is visible at all! Just different amounts of persistence motion blur (if using reprojection on a non-strobed display). This will be something I will write about in my sequel to the Frame Rate Amplification Article.

Hybrid Frame Rates Stop Being Ugly if 100fps Minimum + Well Framepaced + Sample And Hold
------------------------------------------------------------------------------------------------------------------------------------------


Hybrid frame rates will probably be common in future frame rate amplification technologies, and should no longer be verboten, as long as best practices are done:

(A) Low frame rates are acceptable for slow enemy movements, but keep it triple-digit to prevent stutter
(B) High frame rates are mandatory for fast movements (flick turns, pans, scrolls, fast flying objects, etc)
(C) If not possible, then add GPU motion blur effect selectively (e.g. fast flying rocket running at only 100 frames per second, it's acceptable to motionblur its trajectory to prevent stroboscopic stepping)

The frame rates of things like explosions could continue at 100fps to keep GPU load manageable, but things like turns (left/right) would use reprojection technology. The RTX 4090 should easily be capable of >500fps reprojection in Cyberpunk 2077 at the current 1 terabyte/sec memory bandwidth -- and this is a low lying apple just waiting to be milked by game developers!

In other words, don't use 45fps. Instead of 45fps-reproject-90fps, use min-100fps-reproject-anything-higher. Then frame rate amplification can be hybridized at different frame rates for different objects on the screen -- that becomes visually comfortable once every single object is running at triple-digit frame rates!

Technically, reprojection could in theory be created as a parallel-layer API running between the game and the graphics drivers, much like how VR APIs do it. Except it's overlaid on top of non-VR games.

One major problem occurs when you're doing this on strobed displays -- sudden double/multi-image effects -- and requires GPU motion blur effect (mandatory) to fix the image duplicating (akin to CRT 30fps at 60Hz). However, this isn't as big a problem for sample-and-hold displays.
 
Decided to keep my 48CX and buy the new Zowie 25" XL2566K instead for FPS games. I need to find a monitor arm that can reach ~70-80 cm though, so I can put it in front of my CX and then move it out of the way when I'm not using it. Anyone have suggestions?
 
I need to clean my 48CX after a move; is there anything I need to be careful not to use on the screen?
 
You're not really supposed to use any cleaners on screens. Just use a dry microfiber cloth; if it's still dirty, get it a little damp, wipe off any spots, then go back to the dry microfiber cloth.

I've heard people say X cleaner is fine to use on screens, but I've never had anything on a screen a damp microfiber cloth can't get off. I wouldn't risk it if you don't have to.
 
I use Ecomoist cleaner (available in the UK, there is probably an NA equivalent); spray it on the cloth then wipe. Alternatively, if you want to be really careful, dampen a cloth with distilled water (tap water has minerals and will streak; I wouldn't use it, especially if you have hard water). The LG manual says to use a dry cloth, but it's impossible to clean it with a dry cloth imo, and you don't want to rub hard when cleaning the screen.
 
^ Agree with those folks. There may be some screen cleaners that are safe to use, but I've rarely had to use anything more than filtered water + a microfiber cloth. Anything beyond that could be detrimental and/or a waste of $. Some of them could work well but make sure you read the fine print (and try water first because why not?).
 

Yeah, I think you're right. Still, the prices this Black Friday are so good (700 right now in the UK). I still find the 48 slightly too big occasionally (like if I have to do a work call on Zoom and need to share my full screen), but the immersion is great at around ~1m away.

Edit: I couldn't help myself, I pulled the trigger on the 42; that price is too good. I'll be trying out FPS games, which were my main problem, and report back on how much it improves things.
 

FPS games are epic on a big OLED; unless you want a competitive edge in multiplayer, then yeah, I can see 48" being a little problematic. I'm enjoying Crysis Remastered on my CX with Auto HDR and maxed-out RT on a 4090 and it's amazing.
 

Attachments: 20221124_213225.jpg