Strange monitor behaviour when calibrating with a colorimeter

sblantipodi

2[H]4U
Joined
Aug 29, 2010
Messages
3,528
Hi all,
I'm trying to correctly calibrate my wide-gamut
Acer Nitro XV273K
using an X-Rite i1 Display Pro colorimeter and the i1Profiler software.

Something strange happens on this monitor that I have never experienced before on other monitors.

I can easily reach 6500 K by moving the Red or Blue gain, but moving the Green gain doesn't seem to influence the measured color temperature, even though I can see the color changing with my naked eye.

This is bad because it leaves me one gain control short of reaching the temperature I want. It's like calibrating the monitor using red and blue only.

Why do I have this problem?

Please help.

Thanks
 
Streuwinkel

Joined
Dec 3, 2019
Messages
31
Take a look at the color temperature curve: https://en.wikipedia.org/wiki/Color_temperature
Color temperature is based on how a black-body radiator looks when heated to that temperature. This is a good way to describe the color of many traditional light sources that "glow", such as incandescent light bulbs, the sun or glowing steel. When you look closely at the curve, you can see that at 6500 K the short labeled lines that indicate temperature-neutral color shifts point toward the green your monitor emits. That's why you don't see a (major) color temperature change when turning green up and down. What you really want to measure is an unambiguous "white point" or "target white" - in your case D65 - which is an exact point in the color diagram: https://en.wikipedia.org/wiki/White_point
D65 used to have a temperature of 6500 K until the constants in Planck's law were revised; it was concluded that it's more like the color of a black body at 6504 K, so that's your target temperature now. :p
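As an aside for anyone who wants to check the numbers: the correlated color temperature a colorimeter reports can be estimated from CIE 1931 xy chromaticity with McCamy's well-known approximation. This is a sketch I'm adding for illustration (not part of i1Profiler); it is only valid close to the Planckian locus, so it won't give meaningful values for a saturated monitor primary.

```python
def mccamy_cct(x, y):
    """Estimate correlated color temperature (kelvin) from CIE 1931 xy
    chromaticity using McCamy's cubic approximation (valid near the
    Planckian locus, roughly 2000-12500 K)."""
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

# D65 white point (x = 0.3127, y = 0.3290) lands right around 6500 K
print(round(mccamy_cct(0.3127, 0.3290)))  # ≈ 6505
```

Illuminant A (x = 0.4476, y = 0.4074) comes out near 2856 K with the same formula, matching its nominal temperature.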

P.S.:
Note that color correction comes in two parts: 1) gamma curves for red, green and blue, and 2) software that remaps colors so they look as close to intended as possible. Both take their data from an ICC profile. The gamma curves are simply loaded into the graphics card and fix white point as well as brightness differences. That's old tech and works passively in every program (although some old games used it as "free real estate" and overwrote it). The software side, on the other hand, is responsible for correcting hue and saturation differences between the capabilities of the monitor and the source material. Games don't come with that, so without HDR support they will output sRGB colors and look oversaturated on a wide-gamut monitor. In other words, your carefully crafted ICC profile won't correct game colors beyond the gamma curve. Instead, use the (hopefully factory calibrated) sRGB mode that virtually all wide-gamut monitors offer. The Acer Nitro XV273K has an excellent sRGB mode, by the way*! Colors will look pale, but technically correct, in non-HDR games. The gamma curve could be improved a little, so you may get the best results by profiling the sRGB mode and loading that profile when playing non-HDR games. YMMV.

* https://www.prad.de/testberichte/te...d-bildbearbeitung-geht-doch/4/#Farbwiedergabe
(Below the caption "Vergleich sRGB-Modus mit dem sRGB-Arbeitsfarbraum" the sRGB mode color reproduction and gamma of the tested unit is shown.)
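The "gamma curves loaded into the graphics card" part can be pictured as a simple per-channel lookup table. Here is an illustrative sketch (the function name and shape are mine, not i1Profiler's internals): it builds a vcgt-style 16-bit ramp that bends a measured display gamma toward a target gamma.

```python
import numpy as np

def vcgt_ramp(measured_gamma, target_gamma=2.2, size=256):
    """Build a 16-bit ramp (like the vcgt tag stored in an ICC profile).
    The display shows v ** measured_gamma, so loading v ** (target/measured)
    into the graphics card yields v ** target_gamma overall."""
    v = np.linspace(0.0, 1.0, size)
    corrected = v ** (target_gamma / measured_gamma)
    return np.round(corrected * 65535).astype(np.uint16)

# A display measured at gamma 2.4 is too dark in the midtones;
# the correcting ramp lifts them toward the 2.2 target.
ramp = vcgt_ramp(measured_gamma=2.4)
```

This is exactly why the correction is "passive": once the ramp sits in the GPU's LUT, every program's output passes through it, color managed or not.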
 

sblantipodi
WOW. It took me an hour to understand this post, and not even completely; you gave me so much information in a few lines.
Thanks, I appreciate it.

Are you someone from prad.de? A simple human can't have this knowledge.

What I don't understand is why this is the first monitor (and I have calibrated a lot of monitors) that doesn't let me use the green gain during white point calibration.
Why do most monitors allow it while this one doesn't?
Is it cheap hardware?

I'm disappointed by this monitor and by the old Eizo S2433W I had; the world is not color managed, so I think those monitors are expensive toys.

I have to admit it: I buy monitors on prad's recommendations. If prad.de gives a monitor 5 stars, I generally tend to trust them.

I'm a big fan of prad.de; the entire world has something to learn from them.

But how can those reviewers give 5 stars to a monitor that can't use HDR and FreeSync together?
This is supposed to be a gaming monitor that can also do some color-accurate work; no HDR + no FreeSync means you
need to choose between a stuttering/tearing mess and awful colors.

sRGB mode on this monitor can't be calibrated. The only thing I can adjust is the brightness, but if I change the brightness without adjusting the RGB gains,
the white point becomes completely wrong.

I think that I will return the monitor.

This is the first time that I completely disagree with a prad review.
To me, this monitor is not good for gaming at all.
 

sblantipodi
CORRECTION!

I have found that with the two DisplayPort cables bundled with the monitor, an RTX 2080 Ti, Windows 10 and the latest NVIDIA drivers (441.41),

at 4K 120 Hz I can have both G-SYNC and HDR enabled.

I enable G-SYNC in the NVIDIA control panel, then I enable HDR in Windows, then I enable HDR mode in the Acer Display Widget, and I'm set.
I can see wonderful colors in games while having a perfectly functioning variable refresh rate, as the monitor's internal refresh rate counter shows.
With this setup Windows reports sRGB 8-bit + dithering, but I see no artifacts and no particular color loss.
The chroma subsampling test here:
https://www.geeks3d.com/20141203/ho...-chroma-subsampling-used-with-your-4k-uhd-tv/
passes without problems.
Now I'm completely satisfied with this monitor.
Having HDR enabled in games is not really important for HDR itself, since HDR400 is something like fake HDR, but it is really important to keep colors from being too oversaturated.

But if I can do it, why do so many people complain that you can't have both HDR and G-SYNC enabled at the same time?
 
Streuwinkel
Are you someone from prad.de? A simple human can't have this knowledge.
Haha, thx. No I'm not from prad.de and so I can't answer any of the review questions. But good to hear that you found a way for HDR and GSync to work together!
Why most monitors allow me to do it and this no?
Is it the cheap hardware?
In the diagram below (which I ripped and edited from here), you see the color temperature curve with some thinner orthogonal lines with labels. Color shifts along these thin lines are neutral with respect to the color temperature. So for example at 3000 K, you could add or remove yellow and it would do nothing to the temperature. At 6500 K (edited in as that bold line) there is an (off-screen) green tone that would be neutral to that particular temperature - and your monitor happens to have that green. As a proof, you could set the red and blue gains to 0% and green to 100%, and see if your colorimeter indeed measures it as 6500 K.
The older monitors you calibrated probably had a "warmer" green, which when turned up would lower the measured color temperature noticeably. That's my hypothesis at least. :)
[Diagram: Planckian locus with iso-temperature lines (Planckian-locus.png)]
 

sblantipodi
Posts like this are rare pearls, very difficult to find on the net, especially these days when people are more "distracted by the marketing".

You have only 12 posts on the forum, but I'm glad we have a valuable new user like you to learn from.

I have a degree in Computer Science, but I'm absolutely ignorant on this matter, and I would like to learn more.
In the meantime I have found that the XV273K has a very good DDC/CI feature that finally works well with my i1 Display Pro.
My last two monitors went crazy with DDC/CI automatic calibration.
The automatic calibration did a far better job than what I had done manually.

And hey, thanks for the post. Now I know that my monitor is not broken, and I have an idea of why it behaves like that;
it drives me mad when I don't understand things :D

Thanks.
 

sblantipodi
Streuwinkel, can I say that enabling HDR in software and in games "enables" a sort of color-managed environment?
Are HDR-aware apps color managed?
 
Streuwinkel
I don't know a terrible lot about HDR, but I think the answer is no. The developers of Battlefield simply test their games on a number of consoles and TVs and look for any colors that seem wrong, so the game will appear right on the "average consumer device". It's kind of a reversed situation where instead of assuming the ideal calibrated monitor as a target, they check with uncalibrated consumer grade TVs out there and adapt their game to that. It makes sense, as their sales don't wait for monitors and TVs to improve to match HDR color standards. :) I don't know if any studio is seriously investigating using ICC profiles for tone mapping their games. Media players for movies is something different. I've been using mpv on Linux with an ICC profile for years now as an example. Media Player Classic on Windows also does that.
 

sblantipodi
I'm asking because some games look horrible on my XV273K due to wide-gamut oversaturation; when I enable HDR, the colors seem more natural and not oversaturated at all.
 
Streuwinkel
Are you saying that the games that look oversaturated turn good simply by enabling HDR in Windows? It's entirely possible that Windows is working its magic here converting the color spaces. Do those games run in windowed mode and/or use DX12?
 

sblantipodi
I'm testing with
Red Dead Redemption 2, Anthem and AC Odyssey (DX12 and Vulkan games).

Yes, I enable HDR in Windows and HDR in the game; once HDR is enabled in the game, the colors become more natural and no longer overshoot.
 

kasakka

[H]ard|Gawd
Joined
Aug 25, 2008
Messages
1,443
RDR2 at least behaves a bit weirdly with HDR. Everything looks very desaturated compared to SDR.
 

sblantipodi
Ok, so that's working just as intended then without any color management since in HDR the games target the wide gamut DCI-P3 color space that your monitor natively supports to ~90%. As for RDR2:
[video]
This explains why the colors are way better in HDR on my monitor.
I'm experiencing the same thing in other games like AC Odyssey and Anthem.
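The mechanism can be shown with a few lines of code: sRGB content sent unconverted to a DCI-P3-class panel reuses the panel's more saturated primaries, which is exactly the oversaturation described above. A sketch I'm adding for illustration, using the standard RGB-to-XYZ matrices for sRGB and Display P3 (both D65; linear-light values, transfer functions omitted for brevity):

```python
import numpy as np

# Linear RGB -> CIE XYZ matrices (D65 white) for sRGB and Display P3
SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                        [0.2126, 0.7152, 0.0722],
                        [0.0193, 0.1192, 0.9505]])
P3_TO_XYZ = np.array([[0.48657, 0.26567, 0.19822],
                      [0.22897, 0.69174, 0.07929],
                      [0.00000, 0.04511, 1.04394]])

def srgb_to_p3(rgb):
    """The correct way to show sRGB content on a Display P3 panel:
    go through XYZ so the stimulus keeps its intended chromaticity."""
    return np.linalg.inv(P3_TO_XYZ) @ SRGB_TO_XYZ @ np.asarray(rgb, dtype=float)

red = srgb_to_p3([1.0, 0.0, 0.0])
# sRGB pure red sits inside the P3 gamut, so its P3 encoding has R < 1
# with a little green mixed in. Sending [1, 0, 0] unconverted instead
# displays the panel's more saturated native red -> oversaturated games.
```

An HDR pipeline performs this kind of gamut mapping as part of its signal chain, which is consistent with HDR colors looking more natural on a wide-gamut panel.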
 

sblantipodi
Sure, all HDR games should correctly use the wider color space.
Some reviewers say that HDR400 and HDR600 are useless because they generally don't use FALD and don't add the "real HDR experience".

I completely agree with them, but if the monitor is wide gamut, HDR support is really needed just to get more accurate colours.
 

sblantipodi
...but I still don't understand why, when I move the green gain, I don't see any change in the green "slider" inside i1Profiler. OK, it doesn't change the white temperature, but why doesn't i1Profiler even see the change in the green component?
 
Streuwinkel
I don't own a calibration device to compare what that slider is showing. Are you just measuring the green or is this showing after calibration as a "remaining color difference"? Is the slider/bar showing "∆E 94" or "∆E 2000" color difference? What is the target color space? AdobeRGB, DCI-P3, sRGB/Rec.709 or BT.2020?
 

sblantipodi
I'm calibrating, so I'm at the stage of the calibration where I adjust the monitor's sliders to reach 6500 K.
There is no target color space; the target color space is the monitor's native color space.
If I move the monitor's red or blue gain slider, the red and blue sliders in the i1Profiler software move accordingly; if I move the green slider, nothing happens in i1Profiler.

Pretty strange behaviour.

[Screenshot: i1Profiler RGB gain adjustment panel (attachment.jpg)]
 
Streuwinkel
Now I get the full picture. I agree with you intuitively that reducing the green gain should show up as green being too low / red and blue being too strong. You'll have to ask X-Rite what's going on there. At the bottom of the panel it shows x,y coordinates, where y should be almost proportional to the green gain. Please report back what their support says, so we can all learn from it.

P.S.: Nice Christmas tree.
 

sblantipodi
I wrote to X-Rite support; I hope for a good answer, but their support is generally pretty bad.

I'll update the thread when they answer.

Thanks for the answer and the help, I appreciate it.

PS: my Christmas tree is not that bad, I agree
 

mtrupi

Gawd
Joined
Mar 26, 2007
Messages
739
Following out of curiosity, and because I have a similar monitor, the VG271U. So when you are done calibrating, you are in the monitor's USER mode, right? Does this still look right with HDR content and Windows set to HDR mode? So far I'm not happy with HDR mode, so I have stayed with the built-in SDR mode and the factory ICC profile, and it looks pretty good. I'm wondering if I should invest in a calibration tool. I don't want to hijack your thread, I just wonder if you have more info you can share.
 

sblantipodi
You can't calibrate the HDR mode; you can't do it on pretty much any monitor.
HDR mode is factory calibrated and it looks very accurate; prad tested this mode and confirmed that it is very well factory calibrated on the Nitro monitors.
Acer does a really good job with the factory calibration on those monitors.

There are too many variables in HDR mode; you can't really calibrate it easily, and IMHO it isn't even worth calibrating.
HDR mode kicks ass in games and films that support it; that content usually doesn't support color management, and with a wide-gamut monitor you would otherwise end up with
wrong colors and a lot of oversaturation.
HDR helps a lot with this type of content.

For general usage or even color-accurate work, most software supports color management; for this reason I suggest using the standard mode with the bundled Acer ICC profile,
or, if you have a colorimeter, the User mode with your own crafted ICC file.

The problem with these "non-professional" monitors is that the white point changes a lot at different brightness levels.
My monitor, for example, is factory calibrated to a near-perfect 6500 K at the default factory settings (80% brightness), but the white point drops to 6200 K at 14% brightness (my desired 160 cd/m²).
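To put that drift in perspective: white point shifts are often compared in mireds (micro reciprocal degrees), since equal mired steps are roughly equally visible along the locus, and a rule of thumb from photography puts the just-noticeable difference somewhere around 5 mireds. A quick check on the numbers above (my sketch, not from the thread):

```python
def mired(kelvin):
    """Convert a color temperature in kelvin to mireds (micro reciprocal degrees)."""
    return 1_000_000 / kelvin

# 6500 K at 80% brightness vs. 6200 K at 14% brightness
shift = abs(mired(6200) - mired(6500))
print(round(shift, 1))  # ≈ 7.4 mireds, enough to notice side by side
```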

If you lower the brightness, yes, the white point will be off and even the rest of the calibration won't be as good.
Nothing worth spending money on a colorimeter for if you are not a "color enthusiast", but there will be some noticeable errors in both colors and white point (some people can't tell the difference between a bad TN and a good IPS, so this depends on you too).

I have used EIZO monitors for many years, and now I tend to dislike everything that strays too far from 6500 K.
At 6200 K, for example, the white point is a bit too warm; at 6800 K it is a bit too cold. For my taste you can remove the "a bit": it's simply too warm or too cold at those temperatures.

Personally I dislike everything under 6350 K or over 6650 K, so on my monitor a colorimeter is needed.
I'm not saying you will have the same needs or the same tastes.

If you want to buy a colorimeter, buy a good one; otherwise it doesn't make much sense.
The i1 Display Pro from X-Rite is a really good one, and it doesn't cost an arm and a leg.

If you buy a good colorimeter and protect it from humidity, it will last many years. Not many people share my thinking, but for me a colorimeter is an investment,
and it is well worth it.

Calibration software usually includes images made to easily compare "before calibration" and "after calibration" with the naked eye.
Every time I make this comparison, I say: "WOW".
 

mtrupi
Thanks. I think the factory settings have good color accuracy, judging from test patterns and my eyes. However, the HDR mode is less than satisfying for me; I think it's because an HDR400 monitor isn't really able to do HDR justice. I wondered if there was a way to calibrate it so HDR would be better. From what you say, the answer is no.
 

sblantipodi
Color accuracy has nothing to do with HDR400; the monitor is very accurate in its default HDR mode, but you need to use HDR content to see accurate colors.
You need HDR content, you need to enable HDR in the OS, and you need to enable HDR in the monitor menu.
 

mtrupi
Maybe the OP will also find this helpful, so I will post a bit more. What I see is that it doesn't have the contrast ratio I would really like. HDR test patterns from the AVS forum show color clipping that is not equal across all colors, and white stops at about HDR500. They also show that the black level doesn't go all the way down. It also looks different depending on the media player or browser used. I also downloaded HDR videos that had YouTube links; one posted by NASA shows the extremes of HDR capability and really showed where my monitor suffers. Reading about HDR400 afterwards seemed to support what I saw.
 

XoR_

Gawd
Joined
Jan 18, 2016
Messages
862
When not using HDR mode, sRGB emulation (set in the monitor OSD) should be used to get more or less accurate colors.
 

sblantipodi
HDR400 monitors are not strong in HDR, I'm not saying the opposite. I'm only saying that HDR400 doesn't mean weak colours, but weak contrast and weak brightness.
Personally I wouldn't give HDR too much importance, because there is too little HDR content to care about.
 

IdiotInCharge

[H]F Junkie
Joined
Jun 13, 2003
Messages
13,523
I'm only saying that hdr400 does not means weak colours but weak contrast and weak brightness.
While weak contrast is definitely an artifact of HDR400-class monitors currently available, it doesn't have to be intrinsic -- the first OLEDs, for example, likely didn't meet HDR600 yet had 'infinite' contrast.

The 'HDR rating' is really just a measure of peak brightness, and qualification that the display will be able to accept and present an HDR signal as faithfully as possible given its own limitations. Give me a desktop OLED that is hardened against burn-in that's rated at HDR400 and I'd buy right now!
 

sblantipodi
I agree; HDR400 is not that related to contrast, it refers more to peak and sustained brightness.
OLEDs are not the solution to all monitor problems, even if the burn-in problems ever get fixed:
in every "real person comparison" I have seen between OLED and IPS, the IPS is much sharper. You can distinguish small details better on IPS panels at the same DPI; I don't know why, but the difference is there.

Aging is another big problem with OLEDs; I'm in no hurry for OLED PC monitors if those problems aren't fixed.
 

IdiotInCharge
I can't say I've used my OLED TV for 4k desktop use -- it's 55", and well, it's mounted in the living room -- but I'd agree that IPS panels do set a standard worth meeting here.
 

mtrupi
This all sounds right to me as well. I found that color profiles didn't make sense when Windows and the monitor were in HDR mode, so I wondered if I was missing something. Everything I found searching online didn't really address this well. I think the info in this thread will be helpful for anyone else who runs across it. I can say that by tweaking the graphics card display settings and the Windows SDR slider, I can come up with something that is visually pleasing.
 