albovin has stated several times that sRGB emulation is not satisfactory, and that an sRGB monitor is always better. However, Sailor_Moon has stated that an LCD with sRGB emulation enabled is very accurate when measured with a colorimeter (i.e., it reports very low Delta-E values).
I would like to get to the bottom of this. I have some theories, but no physical access to a Samsung XL30, Eizo CG301W or other wide-gamut LCD with sRGB-emulation.
It makes very good theoretical sense that a wide-gamut sRGB-emulation monitor would be better at reproducing sRGB than a native sRGB monitor: a color triangle that fully encompasses and exceeds sRGB is guaranteed to have no under-coverage, whereas a monitor that endeavors to match sRGB natively will inevitably be an imperfect match, because any slight deviation of its color primaries from the sRGB standard results in under-coverage.
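To make the coverage argument concrete, here is a minimal sketch (Python) of the test. Since both gamuts are convex triangles in CIE 1931 xy chromaticity, a monitor's gamut covers sRGB exactly when all three sRGB primaries fall inside the monitor's triangle. The "wide-gamut" and "near-miss" primaries below are purely illustrative, not measurements of any real panel.

```python
# Minimal sketch: does a monitor's gamut triangle fully cover sRGB in CIE 1931
# xy chromaticity? Both gamuts are convex triangles, so full coverage holds
# exactly when all three sRGB primaries lie inside the monitor's triangle.
# The "wide-gamut" and "near-miss" primaries below are illustrative, not measured.

SRGB = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]   # sRGB R, G, B primaries
WIDE = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]   # hypothetical wide-gamut LCD

def edge_sign(p, a, b):
    """Signed cross product telling which side of segment a->b the point p lies on."""
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def inside_triangle(p, tri):
    """True if point p lies inside (or on an edge of) triangle tri."""
    signs = [edge_sign(p, tri[i], tri[(i + 1) % 3]) for i in range(3)]
    return not (any(s < 0 for s in signs) and any(s > 0 for s in signs))

def covers_srgb(monitor_tri):
    """A convex gamut covers sRGB iff it contains all three sRGB vertices."""
    return all(inside_triangle(p, monitor_tri) for p in SRGB)

print(covers_srgb(WIDE))                                              # True: no under-coverage
print(covers_srgb([(0.635, 0.335), (0.295, 0.595), (0.155, 0.065)]))  # False: near-miss "sRGB" panel
```

The second call makes the point about native sRGB panels: primaries only a few thousandths away from the standard already leave the sRGB red corner uncovered.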
It would be very nice if albovin would try to analyze and explain why sRGB LCDs look better to him than sRGB-emulation LCDs, instead of simply stating that theory differs from practice and leaving it at that. When theory differs from practice, you should either figure out why this is happening and adjust the theory to better fit practice, or figure out how to make more accurate and complete observations, or both.
Here are some reasons I can think of why the Samsung XL30 and/or Eizo CG301W sRGB emulation would not look correct to albovin:
- LCDs that are considered "reference sRGB monitors" don't match the theoretically defined sRGB specification, so monitors that emulate sRGB are emulating the "wrong standard" (the theoretical one instead of a "practical" one). The sRGB standard was probably based on CRT phosphors, and CCFL backlighting filtered through an LCD can probably only get close to this, not match it. This would be easy to test: check how closely the color triangle of an sRGB reference LCD matches the sRGB standard color triangle, and compare that to how closely the sRGB-emulation LCD matches it.
- Incorrect implementation. The monitor is not doing the correct chain of calculations, i.e., 1) convert the sRGB gamma curve to linear, 2) apply a linear matrix transform to convert the sRGB primaries to monitor-specific, calibrated primaries, 3) convert the tone curve from linear to the monitor's calibrated response, using individual multi-point curves for each channel (R, G, B). Maybe the monitor is taking shortcuts or cutting corners in this chain of calculations, and somehow the calibration software is complicit in this (because it still reports low Delta-Es). See the sketch after this list for what the full chain looks like.
- Spectral sensitivities of the colorimeter (the one used to calibrate the sRGB-emulating LCD) do not match the spectral sensitivity curves of albovin's retinal cones; this would cause the colorimeter to see different metamers than his eyes do, and thus it would disagree about which mixes of monitor-specific R, G, B produce sRGB R, G, B.
- 30-inch monitors have inadequate uniformity to match a "reference sRGB" 24-inch LCD (but both the XL30 and CG301W have a uniformity correction option, which, if properly implemented, should make this point moot).
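For concreteness, here is a minimal sketch of that three-step chain (Python). The 3x3 matrix and the single power-law monitor gamma are illustrative placeholders, not measurements of any real panel; a real monitor would derive the matrix from its measured primaries and use per-channel multi-point curves in step 3.

```python
import numpy as np

# Minimal sketch of the sRGB-emulation calculation chain described above.
# The matrix and the monitor gamma are illustrative placeholders only.

# Hypothetical 3x3 matrix mapping linear sRGB to the monitor's native linear RGB.
# Each row sums to 1 so that white (1,1,1) maps to white.
SRGB_TO_MONITOR = np.array([
    [0.82, 0.15, 0.03],
    [0.02, 0.93, 0.05],
    [0.01, 0.04, 0.95],
])

MONITOR_GAMMA = 2.2  # stand-in for the per-channel calibrated tone curves

def srgb_decode(v):
    """Step 1: sRGB transfer function -> linear light (piecewise, per the spec)."""
    v = np.asarray(v, dtype=float)
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

def monitor_encode(v):
    """Step 3: linear light -> monitor drive levels (simple power curve here;
    a real monitor would use individual multi-point curves per channel)."""
    return np.clip(v, 0.0, 1.0) ** (1.0 / MONITOR_GAMMA)

def emulate_srgb(rgb):
    """Full chain: decode sRGB, matrix to monitor primaries, re-encode."""
    linear_srgb = srgb_decode(rgb)
    linear_monitor = SRGB_TO_MONITOR @ linear_srgb   # step 2
    return monitor_encode(linear_monitor)

print(emulate_srgb([1.0, 0.5, 0.25]))   # drive levels the panel would be sent
```

Skipping either end of this chain (e.g., applying the matrix to gamma-encoded values) is exactly the kind of corner-cutting that could produce visible errors while still measuring reasonably well on the handful of patches a quick verification uses.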
Note that this is mainly of academic interest to me (but a strong interest nevertheless), since I have color-deficient vision (anomalous trichromatic vision with a protan defect). Perhaps due to this, I've always been interested in color theory. But any color model designed to match colors for people with normal color vision will not work for me. So in practical use, I don't care whether my monitor is sRGB or not. When viewing a photo of a particular scene, no existing monitor is going to mimic to my eyes how that scene would look to a color normal, nor is it going to mimic to my eyes how the scene would look to my own eyes if I viewed it directly. At least a wide gamut monitor makes the colors more vivid, easier for me to tell apart and prettier. What I do care about is matching the sRGB gamma curve with each primary.
The ideal monitor for me would have at least five laser-light primaries (processed to eliminate temporal coherence, i.e. speckle), and would have a 3D LUT that pointed to five-primary table elements. The table would be calculated using a color model tuned to my retinal cones' sensitivities, such that it would trick my eyes, as closely as possible, into seeing colors as a color-normal would. (Even better would be a direct neural interface, allowing me not only to see colors as a color normal would but to see imaginary colors. But that's beyond current technology, whereas the previously-described five-primary monitor is probably attainable with current technology at great expense.)
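To illustrate what "a 3D LUT that points to five-primary table elements" could look like, here is a minimal sketch (Python). The LUT resolution, the nearest-node lookup, and the random placeholder contents are all assumptions for illustration; a real table would be computed from a cone-sensitivity-based color model and would interpolate between nodes.

```python
import numpy as np

# Minimal sketch of the 3D-LUT idea: a cube indexed by the incoming R,G,B
# value, where each table element holds drive levels for five primaries.
# The LUT contents here are random placeholders, not a real color model.

LUT_SIZE = 17                      # 17x17x17 grid, a common LUT resolution
N_PRIMARIES = 5
rng = np.random.default_rng(0)
lut = rng.random((LUT_SIZE, LUT_SIZE, LUT_SIZE, N_PRIMARIES))

def lookup(rgb):
    """Nearest-node lookup of five primary drive levels for an RGB input in [0,1].
    (A real implementation would interpolate between neighboring nodes.)"""
    idx = np.clip(np.rint(np.asarray(rgb) * (LUT_SIZE - 1)).astype(int), 0, LUT_SIZE - 1)
    return lut[idx[0], idx[1], idx[2]]

print(lookup([0.25, 0.5, 0.75]))   # five drive levels for this RGB input
```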