Expensive Wide Gamut HDR monitors and Oversaturation

sblantipodi

As title.
Many expensive HDR monitors are wide gamut, and in Windows or in many apps some colors are really oversaturated.

In games too.

Is there a way to solve this problem?
sRGB emulation does not solve the problem; it mitigates it but does not solve it.

Is it possible that all modern expensive monitors show wrong colors?
 
Part of HDR is the bigger DCI-P3 color space. Blame Windows for sticking to the ages-old and tiny sRGB color space. Anyway, sRGB emulation that shrinks the color space is the only solution. If the monitor does not do this correctly, then there is nothing you can do outside of color-managed programs. Most TVs can do this correctly, so there is no technical reason why monitors could not do it.

Personally I do not mind sRGB content on a DCI-P3 screen. It is bigger, but not massively so, and it expands in fairly even increments in all directions. The colors still look relatively correct, just stronger, unlike Adobe RGB, which makes sRGB content look like candy.

*Edit* Actually, that reminds me. You can reduce saturation slightly in GPU control panels (Digital Vibrance control with Nvidia). It won't make colors more correct, and can even make them worse, but if strong colors are not to your liking you can make them duller with it. This was the only way to make an Adobe RGB screen tolerable in games.
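If you are wondering what a control like that actually does to a pixel, here is a rough Python sketch of the idea: it just scales HSV saturation down. To be clear, this is only an illustration, not how Nvidia's Digital Vibrance is actually implemented.

Code:
# Rough illustration of what a global "reduce saturation" control does to a pixel.
# This is NOT how Nvidia's Digital Vibrance works internally; it just scales HSV
# saturation, which is why the result is duller but not any more accurate.
import colorsys

def desaturate(rgb, amount=0.85):
    """Scale the saturation of an 8-bit RGB triplet by `amount` (< 1.0 dulls it)."""
    r, g, b = (c / 255.0 for c in rgb)
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    r2, g2, b2 = colorsys.hsv_to_rgb(h, s * amount, v)
    return tuple(round(c * 255) for c in (r2, g2, b2))

print(desaturate((255, 32, 32)))  # a strong red comes back noticeably duller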
 
Thanks for the answer, MaZa.
This sounds pretty bad for all people buying HDR monitors.

All in all, colours are not that bad even in the sRGB colour space, but there are some tints that are badly wrong.

Reds, for example, are way oversaturated.

But is this a problem of non-color-managed apps only?
For example, can I see a correctly saturated image on a web page using Chrome?
Does Chrome support color management?

Thanks
 
To get correct colors in color-managed applications you need a colorimeter to calibrate your screen. After that you have correct colors in Photoshop, Chrome and Firefox (if you enable it in the latter).

To go a little more in depth: after calibration you have correct color temperature and gamma, which benefits all applications, color-managed or not. But mapping colors (like shrinking the color space) works only in color-managed apps like those mentioned above. Only they can read the ICC profile file that contains the mapping data.
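As a sketch of what a color-managed app does with that ICC profile, here is roughly the transform in Python with Pillow. The file names my_monitor.icc and photo.jpg are placeholders, not anything from this thread: substitute whatever profile your colorimeter software produced.

Code:
# Rough sketch of what a color-managed app does with the ICC profile:
# convert image colors from the content's color space (sRGB here) into the
# monitor's measured color space before display. Requires Pillow.
# "my_monitor.icc" and "photo.jpg" are placeholder file names.
from PIL import Image, ImageCms

src_profile = ImageCms.createProfile("sRGB")             # color space the content was made for
dst_profile = ImageCms.getOpenProfile("my_monitor.icc")  # profile describing the actual panel

img = Image.open("photo.jpg").convert("RGB")
managed = ImageCms.profileToProfile(img, src_profile, dst_profile)
managed.save("photo_managed.jpg")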
 
The only way is to get an sRGB monitor, or an HDR monitor with an sRGB mode. Gaming companies tend to neglect the need for an sRGB mode in HDR monitors because it's cheaper to make a monitor that way. They actually try to push the idea that a wide gamut display is better than an sRGB gamut display because it can display more colors... which is total BS. Wide gamut is only better for HDR content, but for the rest of monitor use - Windows, the internet and games that don't support HDR - wide gamut is actually bad, because colors get stretched out to the wider primaries and look oversaturated. sRGB's level 255 is not the same color as DCI-P3's level 255, but the monitor treats them as if they were.
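To put a number on that last point, here is a small Python sketch using the commonly published RGB-to-XYZ matrices for sRGB and Display P3 (the D65 variant monitors actually use). Treat the exact figures as approximate; numpy is assumed to be installed.

Code:
# Quick numeric illustration of "sRGB 255 is not the same color as P3 255".
# Matrices are the commonly published (approximate) RGB->XYZ conversions for
# sRGB and Display P3, both with a D65 white point.
import numpy as np

SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                        [0.2126, 0.7152, 0.0722],
                        [0.0193, 0.1192, 0.9505]])
P3_TO_XYZ   = np.array([[0.4866, 0.2657, 0.1982],
                        [0.2290, 0.6917, 0.0793],
                        [0.0000, 0.0451, 1.0439]])

def srgb_to_p3_linear(rgb_linear):
    """Remap a linear sRGB triplet to the linear P3 values that display the SAME color."""
    return np.linalg.inv(P3_TO_XYZ) @ (SRGB_TO_XYZ @ rgb_linear)

print(srgb_to_p3_linear(np.array([1.0, 0.0, 0.0])))
# ~[0.82, 0.03, 0.02]: sRGB full red only needs about 82% of the P3 red primary.
# A wide gamut monitor without sRGB emulation drives its red primary at 100%
# for that same input, which is exactly why reds look oversaturated.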
 
Reading user reviews I was shocked that there are many people who actually like it. And many think that wide gamut = better color accuracy :facepalm:
 
I have a colorimeter but when I generate my ICC I don't target any color space, I target temperature and brightness.

Is that ICC good for color-managed apps, or should I tell the software to target a specific color space so that it works as sRGB in color-managed apps?
 
I recently bought an HDR400 monitor so I could see some of the issues for myself. I used HDR test patterns found on the AVS forums. With the monitor in HDR mode, the reds of the HDR patterns were more saturated than any other color. I could easily use the graphics card adjustments to get HDR that looked pretty darn nice. I never got the full range of black to display, but it was close. Unfortunately, even though Windows 10 has made improvements, I felt like all other content was not acceptable. I also found that not all browsers and movie players render things the same way. So, I set it back to SDR mode. It makes a really great SDR monitor. One of the best I have had. I am no expert by any means. I was just hoping for something reasonably correct no matter what was being displayed.
 
There are HDR monitors that properly reproduce the sRGB color space if you select 8-bit color in your graphics card's control panel. The PG27UQ and X27 are two monitors that accurately cover the sRGB color space when set to 8-bit color. These are premium monitors, though. It wouldn't surprise me if "HDR Ready" or HDR400 monitors couldn't display color properly in 8-bit color mode.
 
Personally I have had two wide gamut monitors, an Eizo SW2433W S-PVA and this XV273K.
I don't like how those monitors look outside a color-managed environment, and they tend to be even less precise in terms of panel uniformity.
No, even the high-end PG27UQ and X27 reproduce sRGB content like crap; they cover 99% of the sRGB color space, but that does not mean they reproduce that color space accurately.
 
sRGB emulation solves it entirely if it's done properly. Completely solves it on my Lenovo Legion Y27q-20.
 
Cheap monitors can't do it right, trust me.
Never mind the fact that you can't calibrate the monitor when "this emulation" is enabled.
 
I wish I had known these kinds of things before buying my wide gamut monitor, especially since HDR support is practically non-existent in those monitors, so what's the point anyway?

I mean, in a certain way I do kind of enjoy the colors, but sometimes it becomes really jarring to realize certain things aren't looking the way they are supposed to.

BTW, on my monitor (27GL850) there is an sRGB emulation mode (which is also supposed to be factory calibrated, at least that), but compared to an old 23" IPS I had, it still looks slightly more saturated. HOWEVER, if I go to my Radeon settings and change the color temperature from 6500K to "automatic", everything suddenly looks identical to the 23". Is the AMD driver capable of managing this color space? Or is this just an illusion? I.e., it may look similar to the naked eye but lack accuracy if actually measured?

I was kind of taken aback when I found this out. I now watch movies with this setting on automatic, whereas games I still play in wide gamut, just because I have to use this thing somewhere since I paid for it. Damn it.
 
Have there been any good solutions or workarounds for this?

While researching new monitors, I've noticed that pretty much everything I'm interested in is a wide gamut display. Based on reviews, it seems that for the monitors that have sRGB emulation, the sRGB modes aren't very reliable and often lack important adjustments, and other popular displays have no sRGB mode at all.

With Windows 11, I've read comments that leaving HDR turned on at all times tames the desktop and wallpapers so they are no longer oversaturated, but it isn't clear to me whether that would also tame the colors in SDR games.

I also read about an option for AMD GPUs to set Color Temperature Control to disabled, but it isn't clear to me if that is a reliable setting.
 
Monitors only use wide gamut when displaying HDR.
When not displaying HDR they use SDR colour range.
There is no problem with the displays.

There was a problem with Windows 10 not compensating for the increased range of colours when enabling HDR on a non-HDR desktop.
The previous cure always existed: just don't enable HDR on the desktop.
They have fixed this now; you can leave HDR on all the time and tone it down with a slider.
It appears to work for games too, because when I don't enable HDR in a game, it doesn't look as good as it would with HDR functioning (as expected), but it doesn't look wrong.
You can still turn HDR off in Windows as well.
 
People use their old 'sRGB' monitors as a reference for how colors should look and then complain that colors are more saturated 🤣
Native sRGB LCD monitors with proper colors are very rare. Take a random sRGB monitor and it will probably look nothing like sRGB should.

Wide gamut monitors have had decent, if not very good, sRGB emulation for years now. In fact it is often better than native sRGB, because once you have the circuitry for it, it is easier to emulate a smaller color space than to match the backlight and color filters to sRGB.
Wide gamut was only an issue on early WCG monitors that didn't have any sRGB emulation, or did it by abusing the YUV color-space controls built into scalers, which solved over-saturation but didn't look very correct.

When buying these monitors one has to accept that colors will only look good in sRGB mode. This means that if HDR is to be used, it will require some OSD manipulation on top of changing the Windows setting... probably not worth the hassle unless the monitor has FALD or is OLED or something.

BTW, Radeons do have color correction.
Unfortunately, even though it works very well and works in all applications (it is literally a set-and-forget setting, kind of like having hardware calibration in the monitor), the only source of gamut data it can use is the information stored in the monitor's EDID. This means those settings cannot be changed in any way, except maybe with some sort of EDID emulator. This is an issue because we depend on the monitor manufacturer to provide correct data. Also, the gamut information stored in the EDID does not have very high numerical precision, so this data is not very precise either. For a quick correction on monitors which cannot do it themselves it should be enough. It is especially great for 'native' sRGB monitors, which usually have some over/under coverage or just wrong hues of the color primaries, and this setting corrects these nicely. I used this option on a SONY GDM-FW900 with an EDID emulator and measured gamut information, and it improved color rendering.
There is, however, one issue with this setting. If the white point stored in the monitor's EDID does not exactly match sRGB, this color correction goes all wonky: instead of just correcting the gamut it also tries to correct the color temperature, and it looks completely broken. So the monitor cannot have a different white point stored, which limits the number of monitors this setting can be used on. When this issue exists it is immediately visible; when it does not, only color saturation changes and the grayscale does not change at all. People have reported being able to use it to get correct colors on some of these older wide color gamut monitors. For new wide gamut monitors with an sRGB mode I recommend using the monitor's built-in emulation.
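For reference, here is a small Python sketch of where that gamut data lives and how little precision it has: the base EDID block stores each chromaticity coordinate as a 10-bit fraction. The /sys path is only an example of where a Linux system exposes the EDID; substitute whatever applies to your setup.

Code:
# Sketch of decoding the gamut (chromaticity) data stored in a monitor's EDID.
# Each coordinate is a 10-bit fraction (value / 1024) packed into bytes 25-34
# of the 128-byte base block, so roughly three decimal digits of precision.
def edid_chromaticities(edid: bytes):
    lo_rg, lo_bw = edid[25], edid[26]        # low-order 2-bit pairs
    hi = edid[27:35]                         # high 8 bits of Rx Ry Gx Gy Bx By Wx Wy
    lows = [(lo_rg >> 6) & 3, (lo_rg >> 4) & 3,   # red x, red y
            (lo_rg >> 2) & 3,  lo_rg       & 3,   # green x, green y
            (lo_bw >> 6) & 3, (lo_bw >> 4) & 3,   # blue x, blue y
            (lo_bw >> 2) & 3,  lo_bw       & 3]   # white x, white y
    coords = [((h << 2) | l) / 1024 for h, l in zip(hi, lows)]
    return {name: (coords[2 * i], coords[2 * i + 1])
            for i, name in enumerate(["red", "green", "blue", "white"])}

# Example path on Linux; adjust the connector name for your own system.
with open("/sys/class/drm/card0-DP-1/edid", "rb") as f:
    print(edid_chromaticities(f.read(128)))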
 
The Color Temperature Control thingy works. The sRGB emulation works too, but I dislike how you are locked out of everything except brightness.

I can't vouch for how accurate either of those options really is, but then again, if you care about accuracy you are probably a graphic designer, and wide gamut is not an issue there.

All in all everything worked out since I bought this monitor, but the cheaper sRGB version of my LG screen would probably have been preferable considering my needs. No point in recriminating now, I guess.
 
People use their old 'sRGB' monitors as a reference for how colors should look and then complain that colors are more saturated 🤣

If the content was made for sRGB, that's how it should look. I have no idea what you are even trying to say here. Are you arguing that the saturated look is the "real" one, and we just didn't realize it until now because we were using sRGB screens all along?

I'm an old-school gamer, and I'll tell you right now those games were made for sRGB and don't look normal at all in wide gamut.
 
Thank you for the detailed response; I appreciate your reply. Many people ignore this topic or claim that the oversaturated colors look better to them. I'm all for having a wider gamut of colors available for games that will actually utilize it.

I currently have a Dell U3818DW, and it is accurate enough for my needs on its built-in 6500K setting. I'm not concerned with 100% sRGB calibration as I don't do professional photo editing, but I also don't want significantly oversaturated colors when playing SDR games that don't utilize the wider color gamut. The monitor I'm currently looking at purchasing is the AW3821DW, which doesn't have any sRGB mode, so I would be relying on workarounds such as the Radeon setting. But even for monitors that have sRGB emulation, I've not found a single one with a review saying it looks good. They always have issues, and the settings are locked in sRGB mode. Just look at the following page, for example: many of the top gaming monitors don't have sRGB modes, and the ones that do, do it poorly. https://www.tomshardware.com/reviews/best-gaming-monitors,4533.html

The most helpful article I've found on the topic so far is https://pcmonitors.info/articles/taming-the-wide-gamut-using-srgb-emulation/.
 
When using the Color Temperature Control, do you then have to open up the Radeon settings and set it back to normal when viewing actual HDR content, such as HDR YouTube and games with an HDR mode? I'm not a designer, so I'm not concerned with perfection. I'm just trying to decide if it's worth the hassle to move to a wide gamut monitor for a higher refresh rate. I'm still stuck in the 60Hz world now, and my current monitor doesn't even have adaptive sync.

How does Windows look when leaving the HDR setting turned on? My understanding is that turning the HDR setting on results in the desktop using sRGB colors, similar to Color Temperature Control, albeit with potential side effects caused by the monitor running in HDR mode, such as black crush and local dimming concerns, depending on the specific monitor.
 
Yes, you have to turn it on again, otherwise everything looks saturated. I mean, it's like the reverse problem.

HDR on Windows looks OK, but to be honest HDR400 isn't really that exciting. Not that it matters. Monitors with good HDR cost more than an OLED TV. Hell, some cost more than a car (I'm looking at you, Asus).
 
If the content was made for sRGB, that's how it should look. I have no idea what you are even trying to say here. Are you arguing that the saturated look is the "real" one, and we just didn't realize it until now because we were using sRGB screens all along?

I'm an old-school gamer, and I'll tell you right now those games were made for sRGB and don't look normal at all in wide gamut.
That they miss the sRGB gamut... often by a lot. There are plenty of "sRGB" monitors that aren't. Some have less coverage, some have more, some have the primaries in the wrong spot, etc. I had an old MSI laptop that was just abysmal. Everything looked very gray and desaturated. My previous desktop monitor was a bit the other way: advertised as sRGB, and it did cover it 99-100%... but it actually exceeded it a bit, particularly in green and blue. Not super noticeable on its own, but you could see it next to other screens.

It is just really hard to build a screen that covers the whole space and hits the primaries dead on. So the answer to getting accurate color is to exceed the space so you cover all of it, then do the color space transformation in software (or monitor hardware). That's what the pro monitors tend to do, and have done for a long time.

Really the average user shouldn't worry. Accuracy is not the goal, enjoyment is. Do whatever you like. For games in particular, more saturation often looks good. If you like the look of that then it isn't "wrong". There's not a wrong way to enjoy your media so long as you are happy.
 
There seems to be an equation in people's minds: wide gamut monitor == oversaturated colors. This thinking is BS.
All modern monitors with HDR should have an sRGB mode with a correct sRGB gamut.
If a monitor is missing it (or its mode does nothing to correct the color space), then do not buy it. There might be other limitations - I have read that some monitors lock brightness in sRGB mode or have other unacceptable quirks - and then the solution is the same: do not buy those monitors, and buy the ones which work correctly.

If the content was made for sRGB, that's how it should look. I have no idea what you are even trying to say here. Are you arguing that the saturated look is the "real" one, and we just didn't realize it until now because we were using sRGB screens all along?

I'm an old-school gamer, and I'll tell you right now those games were made for sRGB and don't look normal at all in wide gamut.
I am trying to say that if you were using a terrible 'sRGB' screen whose colors had about as little in common with the sRGB color space as claimed response times have with real ones, then you should not expect proper sRGB colors to look like that, just because it is what you got used to. Almost all sRGB monitors (both CCFL and W-LED) had washed-out colors compared to the reference. In the past I compared them to something like EBU-phosphor CRTs - not exactly sRGB but very, very close - and almost no LCD came even close in color rendering. The first monitor I think that did was the Dell U2410 in a color-managed environment (because its sRGB emulation was a workaround and not real emulation, and it was not very precise). Since I got an RGB-LED monitor with hardware calibration I compare everything to it, and surprise surprise, colors on such a monitor look almost exactly the same as on an EBU-phosphor CRT, and exactly the same if I calibrate it to EBU phosphor specs. There are sRGB screens that match these colors, but they are not very common.

Today I got an LG 27PG950, and the first thing I did was find the sRGB mode. I enabled it, and that was it as far as issues with oversaturated colors go. When I enable HDR the desktop is in sRGB; when I disable it the desktop is still in sRGB.
I have not even seen the DCI-P3 color space on it yet, and do not intend to unless I play an HDR-enabled game that has textures/effects with colors outside the sRGB color space.

If anyone has issues with oversaturated colors on their HDR monitor, then either the sRGB mode is not working correctly, or they did not enable it, or they expected bad colors because the old monitor they got used to had washed-out colors.

HDR on Windows looks OK, but to be honest HDR400 isn't really that exciting. Not that it matters. Monitors with good HDR cost more than an OLED TV. Hell, some cost more than a car (I'm looking at you, Asus).
I have about two hours of experience with HDR on Windows, so not much to say. My monitor is HDR600 with 16 zones, so pretty much like a generic HDR400 monitor, just potentially brighter.
An interesting thing I found is that my monitor has brightness control in HDR mode, so I can set "HDR/SDR brightness balance" to 100% and then limit brightness to the same values as in SDR mode, and get a very similar result for desktop/SDR content as when the monitor is in SDR mode with sRGB emulation. Gamma seems to be slightly different (lower - I can adjust it in the Nvidia panel though, so that's good) and that would be it. Maybe there are some other differences, but I would need to do some measurements to see what actually differs. I was actually surprised to find the ability to control brightness in HDR, and I do not expect every monitor to have such a setting. Maybe most do, however... who knows.

It would seem there is no point in using HDR on a monitor without FALD, but maybe with such a setup games will look better because of DCI-P3? Theoretically it should work just fine, but we will see.

And yeah, the prices of current real HDR monitors are complete BS. Surely in time FALD monitors will come down to reasonable $1000 territory, and the more enthusiastic enthusiasts will pay $1000 more than they did last time to get slightly more zones :)
 