sRGB emulation in wide-gamut LCDs vs. sRGB LCDs

MetaGenie

albovin has stated several times that sRGB emulation is not satisfactory, and that an sRGB monitor is always better. However, Sailor_Moon has stated that an LCD with sRGB emulation enabled is very accurate when measured with a colorimeter (i.e., reporting very low Delta-E measurements).
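For readers unfamiliar with the metric: Delta-E is the distance between a measured colour and its target colour in CIELAB, and values below roughly 1.0 are usually considered invisible. Here's a minimal Python sketch to pin down what "very low" means (the measured numbers are made up for illustration):

Code:
def delta_e_76(lab1, lab2):
    """CIE76 colour difference: Euclidean distance in CIELAB.
    ~1.0 is around the threshold of visibility; 3-6 is clearly visible."""
    return sum((a - b) ** 2 for a, b in zip(lab1, lab2)) ** 0.5

target   = (53.2, 80.1, 67.2)  # L*, a*, b* of sRGB red (nominal)
measured = (53.6, 79.5, 66.8)  # hypothetical colorimeter reading
print(delta_e_76(target, measured))  # ~0.82, i.e. "very low Delta-E"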

I would like to get to the bottom of this. I have some theories, but no physical access to a Samsung XL30, Eizo CG301W or other wide-gamut LCD with sRGB-emulation.

It makes good theoretical sense that a wide-gamut sRGB-emulation monitor would be better at doing sRGB than a native sRGB monitor: a color triangle that fully encompasses and exceeds sRGB is guaranteed to have no under-coverage, whereas a monitor that endeavors to match sRGB natively will inevitably be an imperfect match, since any slight deviation of its color primaries from the sRGB standard results in under-coverage.
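To make the under-coverage point concrete, here is a small Python sketch that clips the sRGB triangle against a monitor's triangle in CIE xy and reports the covered fraction. The sRGB chromaticities are the standard ones; the slightly-off "sRGB monitor" primaries are invented for illustration:

Code:
def area(poly):
    """Signed shoelace area; positive for counter-clockwise polygons."""
    n = len(poly)
    return 0.5 * sum(poly[i][0] * poly[(i + 1) % n][1] -
                     poly[(i + 1) % n][0] * poly[i][1] for i in range(n))

def clip(subject, clipper):
    """Sutherland-Hodgman: clip polygon 'subject' to convex 'clipper'."""
    def inside(p, a, b):  # is p on the left of (or on) edge a->b?
        return (b[0]-a[0])*(p[1]-a[1]) - (b[1]-a[1])*(p[0]-a[0]) >= 0
    def cross_pt(p1, p2, a, b):  # intersection of line p1-p2 with edge a-b
        x1, y1, x2, y2 = *p1, *p2
        x3, y3, x4, y4 = *a, *b
        t = (((x1-x3)*(y3-y4) - (y1-y3)*(x3-x4)) /
             ((x1-x2)*(y3-y4) - (y1-y2)*(x3-x4)))
        return (x1 + t*(x2-x1), y1 + t*(y2-y1))
    out = list(subject)
    for i in range(len(clipper)):
        a, b = clipper[i], clipper[(i + 1) % len(clipper)]
        inp, out = out, []
        for j in range(len(inp)):
            cur, nxt = inp[j], inp[(j + 1) % len(inp)]
            if inside(cur, a, b):
                out.append(cur)
                if not inside(nxt, a, b):
                    out.append(cross_pt(cur, nxt, a, b))
            elif inside(nxt, a, b):
                out.append(cross_pt(cur, nxt, a, b))
    return out

# CIE xy chromaticities, counter-clockwise (R, G, B)
srgb    = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
monitor = [(0.635, 0.335), (0.295, 0.590), (0.150, 0.060)]  # made up

print(f"sRGB coverage: {area(clip(srgb, monitor)) / area(srgb):.1%}")  # < 100%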

It would be very nice if albovin would try to analyze and explain why sRGB LCDs look better to him than sRGB-emulation LCDs, instead of simply stating that theory differs from practice and leaving it at that. When theory differs from practice, you should either figure out why this is happening and adjust the theory to better fit practice, or figure out how to make more accurate and complete observations, or both.

Here are some reasons I can think of that the Samsung XL30 and/or Eizo CG301W sRGB emulation would not look correct to albovin:
  1. LCDs that are considered "reference sRGB monitors" don't match the theoretically defined sRGB specification, so monitors that emulate sRGB are emulating the "wrong standard": the theoretical one instead of a "practical" one. The sRGB standard was probably based on CRT phosphors, and CCFL backlighting filtered through an LCD can probably only get close to this, not match it. This would be easy to test: see whether the color triangle of an sRGB reference LCD matches the sRGB standard triangle, and compare that to how closely the sRGB-emulation LCD matches it.
  2. Incorrect implementation. The monitor is not doing the correct chain of calculations, i.e., 1) convert the sRGB gamma curve to linear, 2) apply a linear 3x3 matrix transform to convert the sRGB primaries to the monitor-specific, calibrated primaries, 3) convert the tone curve from linear to monitor-specific, calibrated for each channel (R, G, B) using individual multi-point curves (a sketch of this chain follows the list). Maybe the monitor is taking shortcuts in the chain of calculations, and somehow the calibration software is complicit in this (because it still reports low Delta-Es).
  3. Spectral sensitivities in the colorimeter (the one used to calibrate the sRGB-emulating LCD) do not match the spectral sensitivity curves of your retinal cones; this would cause the colorimeter to see different metamers than your eyes do, and thus to disagree about which mixes of monitor-specific R, G, B produce sRGB R, G, B.
  4. 30-inch monitors may have inadequate uniformity to match a 24-inch "reference sRGB" LCD (but both the XL30 and CG301W have a uniformity-correction option, which, if properly implemented, should make this point moot).
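Regarding point 2, here is a minimal Python/numpy sketch of what I mean by the correct chain of calculations. The monitor matrix is hypothetical (roughly P3-like primaries) and would really come from measuring the calibrated monitor, and a plain power law stands in for the per-channel multi-point curves:

Code:
import numpy as np

def srgb_eotf(v):
    """Step 1: sRGB-encoded [0,1] -> linear light (IEC 61966-2-1)."""
    v = np.asarray(v, dtype=float)
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

# Step 2: 3x3 matrix from linear sRGB to linear monitor RGB. The sRGB
# matrix is standard; MONITOR_TO_XYZ is hypothetical (P3-like, for demo).
SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                        [0.2126, 0.7152, 0.0722],
                        [0.0193, 0.1192, 0.9505]])
MONITOR_TO_XYZ = np.array([[0.4866, 0.2657, 0.1982],
                           [0.2290, 0.6917, 0.0793],
                           [0.0000, 0.0451, 1.0439]])
M = np.linalg.inv(MONITOR_TO_XYZ) @ SRGB_TO_XYZ

def monitor_encode(linear, gamma=2.2):
    """Step 3: linear -> panel drive levels. A real monitor would use
    individual, calibrated multi-point curves per channel."""
    return np.clip(linear, 0.0, 1.0) ** (1.0 / gamma)

def emulate_srgb(rgb):
    return monitor_encode(M @ srgb_eotf(rgb))

# Pure sRGB red becomes a desaturated mix of the wide-gamut primaries:
print(emulate_srgb([1.0, 0.0, 0.0]))  # ~[0.92, 0.21, 0.16] here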

Note that this is mainly of academic interest to me (though a strong interest nevertheless), since I have color-deficient vision (anomalous trichromatic vision with a protan defect). Perhaps because of this, I've always been interested in color theory. But any color model designed to match colors for people with normal color vision will not work for me, so in practical use I don't care whether my monitor is sRGB or not. When viewing a photo of a particular scene, no existing monitor is going to mimic to my eyes how that scene would look to a color normal, nor how the scene would look to my own eyes if I viewed it directly. At least a wide-gamut monitor makes the colors more vivid, easier for me to tell apart, and prettier. What I do care about is matching the sRGB gamma curve on each primary.

The ideal monitor for me would have at least five laser-light primaries (processed to eliminate temporal coherence, i.e. speckle), and would have a 3D LUT that pointed to five-primary table elements. The table would be calculated using a color model tuned to my retinal cones' sensitivities, such that it would trick my eyes, as closely as possible, into seeing colors as a color-normal would. (Even better would be a direct neural interface, allowing me not only to see colors as a color normal would but to see imaginary colors. But that's beyond current technology, whereas the previously-described five-primary monitor is probably attainable with current technology at great expense.)
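For what it's worth, the 3D LUT part of this is ordinary engineering. Here's a sketch of trilinear lookup into a 17x17x17 LUT whose entries hold five drive levels; the LUT contents here are random placeholders, whereas a real table would be built from a cone-tuned color model:

Code:
import numpy as np

N, PRIMARIES = 17, 5  # 17x17x17 grid, five drive levels per node
# Placeholder LUT contents; a real table would come from a colour model
# tuned to the viewer's cone sensitivities, not random numbers.
lut = np.random.default_rng(0).random((N, N, N, PRIMARIES))

def lookup(rgb):
    """Trilinear interpolation: RGB in [0,1] -> five primary drive levels."""
    pos = np.clip(np.asarray(rgb, dtype=float), 0.0, 1.0) * (N - 1)
    i0 = np.minimum(pos.astype(int), N - 2)  # lower corner of the cell
    f = pos - i0                             # fractional position in cell
    out = np.zeros(PRIMARIES)
    for corner in range(8):                  # blend the cell's 8 corners
        idx, w = [], 1.0
        for axis in range(3):
            bit = (corner >> axis) & 1
            idx.append(i0[axis] + bit)
            w *= f[axis] if bit else 1.0 - f[axis]
        out += w * lut[tuple(idx)]
    return out

print(lookup([0.25, 0.5, 0.75]))  # five drive values for one RGB input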
 
Sailor_Moon has stated that an LCD with sRGB emulation enabled is very accurate when measured with a colorimeter (i.e., reporting very low Delta-E measurements).
I should note that I can only speak to the Eizo CG series and their color space emulation in the Color Navigator workflow. There are, for example, other implementations that just invoke a factory-precalibrated preset (which depends on an accurate factory measurement and can't compensate for drift).

Best regards

Denis
 
I tried an HP DreamColor LP2480zx, and I found the sRGB mode looked totally wrong to me. The reds were too pure and dark.

The sRGB mode on the NEC LCD2690WUXi also looked weird in the same way, except the effect wasn't as strong.

This could be correct for all I know, but I don't think that's the case for two reasons:

1. The sRGB standard was designed to describe a typical monitor that was available at the time, but I've never seen a real monitor with reds like that.

2. The AdobeRGB mode on the HP looked fine to me, but AdobeRGB is supposed to have the same red primary as sRGB, so something doesn't add up.

Either the theory is flawed or the implementation is flawed. There's also the possibility that the calibration device is flawed like you mentioned.

It's definitely not uniformity though. Uniformity problems aren't significant enough to cause the difference that I saw.
 
The AdobeRGB mode on the HP looked fine to me, but AdobeRGB is supposed to have the same red primary as sRGB, so something doesn't add up.
It isn't the same primary. The brightness differs - something that gets "lost" when using only the chromaticity values.

sRGB red: x = 0.6400, y = 0.3300, Y = 0.2127

AdobeRGB red: x = 0.6400, y = 0.3300, Y = 0.2974
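These Y values follow directly from the chromaticities plus the D65 white point. A small Python sketch (my own derivation, not from any particular tool): solve for the scaling of each primary so the three sum to white; since each primary's column is normalized to Y = 1, its scale factor is exactly its luminance.

Code:
import numpy as np

def luminances(primaries_xy, white_xy):
    """Per-primary luminance Y, derived from chromaticities alone."""
    def xy_to_XYZ(x, y):
        return np.array([x / y, 1.0, (1 - x - y) / y])  # scaled to Y = 1
    P = np.column_stack([xy_to_XYZ(*p) for p in primaries_xy])
    # s_R*R + s_G*G + s_B*B = white; each column has Y = 1, so the
    # scale s_i is the luminance of primary i at full drive.
    return np.linalg.solve(P, xy_to_XYZ(*white_xy))

D65 = (0.3127, 0.3290)
print(luminances([(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)], D65))
# sRGB:     ~[0.2127, 0.7152, 0.0722]
print(luminances([(0.64, 0.33), (0.21, 0.71), (0.15, 0.06)], D65))
# AdobeRGB: ~[0.2974, 0.6273, 0.0753] -> same red xy, higher Y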

The sRGB mode on the NEC LCD2690WUXi also looked weird in the same way
The fixed sRGB mode in our NEC 2690WUXi2 test screen also had significant deviations that could easily be seen by eye.

Best regards

Denis
 
First of all, I'd assume that if you want to do sRGB emulation, the monitor needs to support hardware calibration. Monitors without hardware calibration support may have gamma issues. AFAIK, some people have tried to emulate the sRGB color space on a U2410, but the gamma looks totally wrong. Does the XL30 support hardware calibration? I know the CG301W and LCD3090WQXi support it, and you can do sRGB emulation via their own Color Navigator and SpectraView software.

Once the sRGB emulation is done, you don't need to switch into the monitor's sRGB mode; you just manage everything in Color Navigator or SpectraView, and one mouse click changes the ICC profile and the color space in about 5 seconds. Nice and easy.

I've tried the LCD2690WUXi, CG222W, CG243W and P221W in my studio; the sRGB emulation result is good, and the color looks almost the same as on a CRT, with very low Delta-E values.

I tried both my i1 Display 2 colorimeter and an EFI ES-1000 spectrophotometer.

Regards

Chum
 
I think I should give one example here.

I used Color Navigator to emulate sRGB on my CG243W.

You can see the significant changes before and after emulation (done via Color Navigator, not by switching to the sRGB mode).

[Screenshots: CG243W measurements before and after sRGB emulation in Color Navigator]


I prefer the emulation over the fixed sRGB mode and over native sRGB monitors, as those still don't cover 100% of the sRGB color space.

Regards

Chum
 
I believe the difference in opinion between albovin and Sailor_Moon on sRGB emulation is the result of differing interpretations.

Sailor_Moon is speaking of models like the CG243W that have a 3D LUT built into them. Because this allows the monitor to perform colour transformations, you can do powerful things like set the monitor's RGB chromaticities to whatever standard you require, as well as emulate protanopic and deuteranopic deficiencies.

albovin is referring to classical pre-3D LUT wide gamut monitors like the LCD3090WQXi and the 2690. The best these monitors can apparently offer regarding colour space emulation is a crippled preset.
 
ToastyX, Sailor_Moon, SlayeRBoxeR, thanks for your responses.

albovin is referring to classical pre-3D LUT wide gamut monitors like the LCD3090WQXi and the 2690. The best these monitors can apparently offer regarding colour space emulation is a crippled preset.
Neither the LCD3090WQXi nor the 2690 has a 3D LUT, true. But albovin has also referred to the Samsung XL30, which is supposed to have true hardware sRGB emulation (which would require a 3D LUT), and yet albovin did not think its emulation was accurate (and he owns one).
 