Why don't people care about wide gamut?

sblantipodi

2[H]4U
Joined: Aug 29, 2010
Messages: 3,765
As the title says,
I see that many people say wide gamut is useless if you're not working with color.

I don't agree with these people.

When I view my photos on my wide gamut monitor, I can see colors that I cannot see on my sRGB monitor.
Everyone has a camera, and photos look far better on a wide gamut monitor than on sRGB, so why not care?
 
Because for most people 8 bit - or even 6 bit - is good enough.
 
Gamut and bit depth are not the same thing.

The reason that wide gamut can be bad in some contexts is because content is often not mastered for wide gamut, and wide gamut displays often are not calibrated to reflect this.

Things often end up looking oversaturated as a result.

Wide gamut displays that are well implemented can be great though.
 
Isn't it the same as why people are fine with taking photos on a phone instead of a DSLR, or listening to MP3 instead of FLAC (or any other superior format)? Or watching movies on iTunes/Netflix instead of buying Blu-rays? Better quality usually means a higher price and might not be as convenient. There will always be those who put quality above everything else, but for the average person, good enough is all they need.
 
It ain't the same thing at all. A wider gamut does not mean, de facto, better quality. For example, all HD content (e.g. Blu-ray) and sRGB content would have its colors rendered inaccurately on a wide gamut display using its native primaries.

In the next few years, however, wide gamut displays will become more important.
 
Because the web and most media is not made for it, and it looks like ass on a wide gamut display.
 
The shortest possible answer is that 99.99999% of content is (and should be) mastered for a standard sRGB gamut, and when you view this content on a wide gamut monitor it mangles the colors (i.e. shows the wrong saturation / shade). If you enjoy this, then you like this exaggeration of saturation, which is fine, but it looks wrong to the discerning eye.
 
It is possible to remap the primaries of a wide gamut monitor to adhere to smaller gamuts, but in order to do this without artifacts, you need a very good color management system in place. Some displays have this built internally and do a great job.
 
When I view my photos on my wide gamut monitor, I can see colors that I cannot see on my sRGB monitor.
Yes, you will see colors that your sRGB monitor cannot reproduce, but unless you are viewing them in Photoshop and have your color management set up properly, those colors are likely wrong.

For example, your computer sends a signal to the monitor that says "display 100% red." Your computer by default assumes an sRGB color space, since that's what pretty much all content is designed for. When your wide gamut monitor receives the "100% red" signal, it displays what it thinks is 100% red, which is some retina-burning neon version of red that was not part of the original image. If your system is set up to handle wide gamut, it will instead send something like "display 78% red" for the same image. So a typical image viewed on a properly set up wide gamut display shouldn't really look all that different from the same image on a good sRGB display.
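The remapping described above can be sketched numerically. This is a hypothetical illustration, not any particular monitor's pipeline: it converts full sRGB red through CIE XYZ into a generic AdobeRGB-primaries space using the standard published matrices, and the "100% red" becomes roughly 86% (the exact figure depends on the display's actual primaries).

```python
def srgb_to_linear(c):
    """Invert the sRGB transfer function (gamma decode)."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def mat_vec(m, v):
    """3x3 matrix times 3-vector."""
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

# Standard sRGB(linear) -> XYZ matrix, D65 white point.
SRGB_TO_XYZ = [[0.4124, 0.3576, 0.1805],
               [0.2126, 0.7152, 0.0722],
               [0.0193, 0.1192, 0.9505]]

# XYZ -> AdobeRGB(linear), D65 (Bruce Lindbloom's published values).
XYZ_TO_ADOBE = [[ 2.04159, -0.56501, -0.34473],
                [-0.96924,  1.87597,  0.04156],
                [ 0.01344, -0.11836,  1.01517]]

def srgb_to_adobergb(rgb):
    """Map an sRGB pixel to the equivalent AdobeRGB-primaries signal."""
    linear = [srgb_to_linear(c) for c in rgb]
    xyz = mat_vec(SRGB_TO_XYZ, linear)
    adobe_lin = [max(0.0, c) for c in mat_vec(XYZ_TO_ADOBE, xyz)]
    # AdobeRGB encodes with a pure ~2.2 gamma (formally 563/256).
    return [c ** (1 / 2.2) for c in adobe_lin]

r, g, b = srgb_to_adobergb([1.0, 0.0, 0.0])  # "100% sRGB red"
print(round(r, 2), round(g, 2), round(b, 2))  # -> 0.86 0.0 0.0
```

A monitor (or OS) without this conversion just drives its own primaries to 100%, which is the oversaturation the posts below keep describing.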

Isn't it the same as why people are fine with taking photos on a phone instead of a DSLR, or listening to MP3 instead of FLAC
Not really. If an image/film/game is designed for an sRGB system (which pretty much everything is), then it is best viewed on an sRGB system. Using a wide gamut monitor isn't going to introduce fidelity that isn't there to begin with.
 
Anything exceeding sRGB is pointless. Seriously, have you ever looked at a photo in sRGB and thought that the colors were lacking in any way?
 
Actually, well-mastered wide gamut content can be quite an experience. The DCI gamut (P3) which is used in digital cinema is quite a bit larger than sRGB.
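For a rough sense of scale, these gamuts can be compared by the area of their primary triangles on the CIE 1931 xy chromaticity diagram. This is a crude but common metric (it ignores the three-dimensional nature of color), computed here from the published primary chromaticities:

```python
# Shoelace area of a gamut triangle in CIE 1931 xy chromaticity space.
def gamut_area(primaries):
    (x1, y1), (x2, y2), (x3, y3) = primaries
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

# Published xy chromaticities of the red, green, blue primaries.
SRGB_REC709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
ADOBE_RGB   = [(0.640, 0.330), (0.210, 0.710), (0.150, 0.060)]
DCI_P3      = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]

base = gamut_area(SRGB_REC709)
print(f"AdobeRGB / sRGB area ratio: {gamut_area(ADOBE_RGB) / base:.2f}")  # ~1.35
print(f"DCI-P3   / sRGB area ratio: {gamut_area(DCI_P3) / base:.2f}")     # ~1.36
```

By this measure both AdobeRGB and P3 cover roughly a third more of the xy diagram than sRGB/Rec.709, which is why untagged sRGB content stretched across them looks visibly oversaturated.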
 
There is nothing wrong with the idea of wide (and wider) gamut. Being able to display more of the visible spectrum can't really be a bad thing. It allows for more lifelike representation of images.

The problem is the transition.
 
Seeing as most phones and tablets have wide gamut displays...I question your assertion.

Is there really a way to know if material has been created/mastered in the sRGB colorspace? I know what you're saying and that more phones and such use wide gamuts, but that doesn't automatically mean that web media is authored in wide gamut.
 
Anything exceeding sRGB is pointless. Seriously, have you ever looked at a photo in sRGB and thought that the colors were lacking in any way?

Yes, yes I have. Especially if you were the one who took the image and saw the real life version at the same time.

Humans can perceive much more than sRGB.
 
Consumer media uses the sRGB or Rec. 709 color spaces, which are very similar, and much smaller than the Adobe RGB color space which wide gamut monitors try to cover. When not color managed or using an sRGB mode (if available), wide gamut colors are inaccurate and over-saturated. Green and red are the worst offenders since they tend to swallow other shades and give skin tones nasty green and sunburnt hues. Games and consoles do not support color management, and it needs to be enabled in Chrome and Firefox. MPC-HC allegedly supports it, but I was not able to get it to work when using the wide gamut HP Z30i.

The brick buildings in this image from Gears 3 end up looking pukey green, pale skin turns pukey green, tans turn into sunburns, etc. When color management is not available, wide gamut colors are both gross and disrespectful. Nvidia GPUs offer Digital Vibrance; turn it up to 75% and watch as shades swallow and blend into each other and details disappear or become "wide gamut-ed."

Some good examples of wide gamut monitors when not color managed. These are not 100% accurate for a few reasons, but I took the second picture and it is pretty close to what I saw.

Left normal vs. wide gamut monitor right

Left wide gamut vs. right normal

Until larger color gamuts become standardized, it doesn't make sense to buy a wide gamut monitor if it's not used for work, especially since the affordable (<$1000, non-Eizo) options suck compared to standard gamut monitors: wide gamut AH-IPS corner glow is significantly worse, and most suffer from really obvious overshoot ghosting, especially compared to standard gamut 30" monitors and the old CCFL-backlit wide gamut IPS panels.
 
Seeing as most phones and tablets have wide gamut displays...I question your assertion.

No, they don't. They mostly have a combination of badly calibrated and somewhat oversaturated displays. That's not wide gamut.

The *good* phone and tablet screens all have decently calibrated sRGB displays.
 
The new Ultra HD standard brings a much larger gamut than sRGB. It's even larger than AdobeRGB.

So in the near future monitors should be wide gamut across the board, and manufacturers won't be able to charge extra for it; economies of scale will bring wide gamut to every single monitor.

The gamut depends on the light source (white LED, RGB LED, GB-LED, QED, CCFL, etc.), not the TFT matrix.
 
Because the web and most media is not made for it, and it looks like ass on a wide gamut display.

This.

The reason why I dumped my U3011 for a Qnix.

Seeing as most phones and tablets have wide gamut displays...I question your assertion.

The reason why I left Samsung: wide gamut.
I have an iPhone now and the sRGB colors are more natural.

I'm a web designer so color reproduction on the web is very important to me.
 
Web? Then you should care more about how it will look on the most common displays. The most common displays are non-wide-gamut, so of course any optimisation should be done for how it will look not when professionally printed, but on the common, cheap, non-wide-gamut displays that 95% of people have.
 
Is there really a way to know if material has been created/mastered in the sRGB colorspace? I know what you're saying and that more phones and such use wide gamuts, but that doesn't automatically mean that web media is authored in wide gamut.

Ultimately it comes down to the awareness of the content creators. An outfit producing content on wide gamut monitors without an awareness of sRGB standards will produce very odd looking content unless people view the material on the same monitors.

Similarly, even if the creators are working on "standard" gamut displays, if they're not well calibrated to sRGB, content will look different on displays that are well calibrated to sRGB.

And it's not only chromaticity, but relative luminance also. The luminance range of the display, the ambient lighting conditions, and the gamma of the display will all affect how the content creator makes judgments about the final product.

This is why it's nice to have standards that are followed. With online content, I imagine these standards are not adhered to with the same rigor as in HD movie production.
 
This.

The reason why I dumped my U3011 for a Qnix.



The reason why I left Samsung: wide gamut.
I have an iPhone now and the sRGB colors are more natural.

I'm a web designer so color reproduction on the web is very important to me.

You being a web designer is a salient detail. However, lots of Joe Publics on the street like the technically "more vibrant" (read: oversaturated) Sammy panels.

Web? Then you should care more about how it will look on the most common displays. The most common displays are non-wide-gamut, so of course any optimisation should be done for how it will look not when professionally printed, but on the common, cheap, non-wide-gamut displays that 95% of people have.

Thing is, cheap displays may not be wide-gamut capable, but they tend to be very badly "calibrated" (they're not calibrated at all, hence the quotes), and they also tend to be sickly yellow-tinted TN panels with shitty viewing angles. So it isn't like they're getting any more accurate a colorspace output in the end anyway.

About the only time gamut ever matters to non-photogs IRL is when looking at and proofing their own wedding photos. Gamut, while people tend to argue about it day and night, just doesn't matter that much to non-image-professionals 95% of the time. That crappy movie you rented last night? It is just as crappy in sRGB as in AdobeRGB. That Dragon Age 2 game you bought and feel ripped off on? Using a different color space or gamut monitor won't change that.

A great movie or game or whatever media is just as great no matter the color space.
 
A great movie or game or whatever media is just as great no matter the color space.

many home theatre enthusiasts would strongly disagree.

There are a number of dimensions to accurate color, and it's not just about the "hue" (although that is important also).

Having deep blacks is critical to image quality.

Having an appropriate luminance function means that the image has depth and is not crushed or clipped.

And having the correct hues often enhances the immersive experience of the story - skin tones look more natural, the subtle language of the artists behind the movie (of which color is a major component) is more faithfully conveyed.

Depending on how you define "color space", these considerations have an impact.
 
many home theatre enthusiasts would strongly disagree.

There are a number of dimensions to accurate color, and it's not just about the "hue" (although that is important also).

Having deep blacks is critical to image quality.

Having an appropriate luminance function means that the image has depth and is not crushed or clipped.

And having the correct hues often enhances the immersive experience of the story - skin tones look more natural, the subtle language of the artists behind the movie (of which color is a major component) is more faithfully conveyed.

Depending on how you define "color space", these considerations have an impact.

It depends on how nuts you want to get. And furthermore, OCD enthusiasts are a tiny portion of the market, just like [H] is a tiny subset of computer users. Most people are not enthusiasts; for 95% of people, as long as there's a picture they can read, it is "good enough."

That being said, no one honestly knows how the video in movies was shot and mastered, and it isn't like there's any necessary industry consistency in all likelihood anyway. Odds are they were originally shot and mastered for something wider than sRGB, would be my guess, and for the DVD release you get something remastered/post-processed a bit. So your "faithful" sRGB panel may not actually be faithful at all. Or it might be; no one really knows, not even the people whose names scroll in the credits.
 
That being said, no one honestly knows how the video in movies was shot and mastered, and it isn't like there's any necessary industry consistency in all likelihood anyway. Odds are they were originally shot and mastered for something wider than sRGB, would be my guess, and for the DVD release you get something remastered/post-processed a bit. So your "faithful" sRGB panel may not actually be faithful at all. Or it might be; no one really knows, not even the people whose names scroll in the credits.

This ain't correct at all. HD material, which includes Blu-ray, adheres to ITU-R Recommendation BT.709.

With very few exceptions, studios go to a lot of effort to ensure their mastering displays are calibrated extremely well, and operated in a well controlled lighting environment.

Digital theatre content uses a different set of standards - different white point, different primaries, and steeper luminance function. Most high end displays fluently switch between DCI (Digital Cinema Initiatives) and Rec 709.

Watching Blu-ray content with wide gamut primaries will look like utter shit, although I grant you that the degree of said shittiness depends on how developed one's sensibilities are.
 
This ain't correct at all.

He owns a mediocre wide gamut 30" and always uses silly excuses or slippery-slope arguments (need 99% accuracy, expensive colorimeters, monitors with hardware calibration, etc.) to disregard industry standards and the time and effort artists put into their work.
 
He owns a mediocre wide gamut 30" and always uses silly excuses or slippery-slope arguments (need 99% accuracy, expensive colorimeters, monitors with hardware calibration, etc.) to disregard industry standards and the time and effort artists put into their work.

My name is NCX and I troll people I disagree with, with ad homs, rather than discussing issues.
 
My name is NCX and I troll people I disagree with, with ad homs, rather than discussing issues.

You should Google ad hominem before using it in a post, since you obviously do not understand what an argumentum ad hominem is. Criticizing a monitor ≠ criticizing the author of a comment, and the fact that you own a wide gamut 30" is 100% relevant, since you try to disregard the importance of color accuracy with inaccurate claims in order to justify your purchase/preference. You would have a case if your arguments were based on logic and facts.
 
I once had a few 6-bit LCD monitors that couldn't display 100% of sRGB color space. Those looked pretty bad.

I found that 8-bit displays that can handle the full sRGB color space were good enough for me. Anything more is overkill and I can't tell a difference.
 
Is this thread for real?

People advocating wide gamut over sRGB just because it looks more vibrant. What. So you like inaccurate colours because they look nicer? Well, good for you, but accuracy always takes precedence over what you like.
 
You should Google ad hominem before using it in a post, since you obviously do not understand what an argumentum ad hominem is. Criticizing a monitor ≠ criticizing the author of a comment, and the fact that you own a wide gamut 30" is 100% relevant, since you try to disregard the importance of color accuracy with inaccurate claims in order to justify your purchase/preference. You would have a case if your arguments were based on logic and facts.

Thank you most sincerely for proving my point.

Would you like a cookie? I usually reward such out-of-the-way kindness with treats.

Is this thread for real?

People advocating wide gamut over sRGB just because it looks more vibrant. What. So you like inaccurate colours because they look nicer? Well, good for you, but accuracy always takes precedence over what you like.

People not actually reading what was said, and instead responding to what the voices in their minds think was said? Well, your version of what has been said in this thread is certainly color-accurate, so to speak.
 
Whatever, dude, be happy with your wide gamut all you want. Unless their eyes are shot, most people are going to notice the neon greens you get from watching sRGB content on a wide gamut display.
 
Because the web and most media is not made for it, and it looks like ass on a wide gamut display.

It seems that most [H] users have no idea what color management is.
Windows is color managed,
browsers are color managed,
newer media players are all color managed,
photo viewers are color managed,
good games are color managed.

So what's the point?
 
The Dell UP2414Q has sRGB and AdobeRGB modes; it was nice to have the monitor switch modes automatically depending on which application has focus. Well, until the application Dell supplied kept locking up my PC... ugh!!!

Maybe I can try again now that I'm running Windows 8.1 with the latest NVIDIA drivers, although with the UP2414Q's firmware bugs (A00 revision here) I'm not holding out hope...
 
It seems that most [H] users have no idea what color management is.
Windows is color managed,
browsers are color managed,
newer media players are all color managed,
photo viewers are color managed,
good games are color managed.

So what's the point?

The answer to your question of what the "point" is has already been given in this thread. Austinpike had a very good explanation of it; read his post.
 
It seems that most [H] users have no idea what color management is.
Windows is color managed,
browsers are color managed,
newer media players are all color managed,
photo viewers are color managed,
good games are color managed.

So what's the point?

Browsers are not necessarily color managed. I don't know if it has changed, but in Firefox you had to specifically turn on color management, and it actually slowed the browser a bit. Chrome's color management seems to be hit or miss and also has to be turned on with a flag. Images on the 'net might not have proper color profiles embedded either.

Most games are not color managed. That's why it's important for displays (when it comes to gamers) to have adequate controls for colour adjustments.
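For reference, Firefox's color management is toggled through about:config. These pref names are real, but the behavior and defaults have shifted between versions, so verify against your build:

```
gfx.color_management.mode = 1        ; 0 = off, 1 = manage all content, 2 = tagged images only (the default)
gfx.color_management.enablev4 = true ; enable support for ICC v4 profiles
```

With the default mode 2, untagged images (most of the web) are passed through unmanaged, which is exactly the case where a wide gamut monitor oversaturates them.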
 
It seems that most [H] users have no idea what color management is.
Windows is color managed,
browsers are color managed,
newer media players are all color managed,
photo viewers are color managed,
good games are color managed.

So what's the point?

I may be wrong, but AFAIK Windows isn't truly color aware. So, for example, the desktop won't be able to incorporate anything more advanced than a 1D LUT.

As for media players, I know media player classic + MadVR can do the job. Which other ones support true color management?

And which games support color management?
 
I may be wrong, but AFAIK Windows isn't truly color aware. So, for example, the desktop won't be able to incorporate anything more advanced than a 1D LUT.

As for media players, I know media player classic + MadVR can do the job. Which other ones support true color management?

And which games support color management?

Windows is fully color managed.
 
Windows is fully color managed.

Not so sure about this. From what I understand, Windows just has the ability to provide pointers for other color-aware software to use; it doesn't do any transformations itself. As far as I know (and this is based on info from last year, coming from the lead software developer of SpectraCal), the only component of an ICC profile that Windows can actually use to implement a gamut transformation is the VCGT tag, which is a 1D LUT.

I may be wrong, but this is my understanding.
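As a sketch of why a 1D LUT (what the vcgt tag stores) can't perform a gamut remap: it adjusts each channel independently, which can correct gamma and grey balance but can never mix red, green, and blue together, and mixing channels is exactly what converting between primaries requires (a 3x3 matrix or a 3D LUT). A hypothetical per-channel ramp of the kind loaded into the GPU:

```python
def make_vcgt_ramp(gamma_correction, size=256):
    """Per-channel 16-bit lookup table: out = in ** gamma_correction.
    Each entry depends only on the same channel's input value, so no
    cross-channel (gamut) transformation is possible."""
    return [round(65535 * (i / (size - 1)) ** gamma_correction)
            for i in range(size)]

ramp = make_vcgt_ramp(1.1)  # e.g. pull a display's native gamma toward target
print(ramp[0], ramp[255])   # endpoints stay pinned at 0 and 65535
```

Because output red can never depend on input green or blue, a vcgt ramp alone can tame a wide gamut monitor's gamma but not its oversaturated primaries.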
 