Why don't people care about Wide Gamut?

Windows is fully color managed.
Windows provides a simple CMM and can load vcgt data to linearize a display, but there is no color management "from above". Color-aware software explicitly uses the Windows CMM or its own solution (Adobe: ACE) to carry out color space transformations. To reproduce, for example, the desktop background image in accordance with the current display characteristic, it would have to be transformed into the display color space.
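Just to illustrate what that last transformation would look like - a minimal sketch, not how Windows does it internally. Pillow's ImageCms (a LittleCMS wrapper) is used here, and the image and profile file names are placeholders:

Code:
# Transform an sRGB image into the display's own color space before showing it.
from PIL import Image, ImageCms

src = Image.open("wallpaper.png").convert("RGB")          # assumed to be sRGB content
srgb = ImageCms.createProfile("sRGB")                     # built-in sRGB profile
display = ImageCms.getOpenProfile("my_display.icc")       # profile describing the actual panel
out = ImageCms.profileToProfile(
    src, srgb, display,
    renderingIntent=ImageCms.INTENT_RELATIVE_COLORIMETRIC)
out.save("wallpaper_in_display_rgb.png")                  # now in device-dependent RGB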

newer media players are all color managed,
There are solutions that create and apply CLUTs (based on source and target characteristics, in the form of a quasi device link) to carry out color space transformations for video playback, but this is quite rare (the madVR video renderer is an example).
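For what it's worth, the core of such a CLUT is nothing exotic - roughly something like the sketch below. This is only an illustration: a real renderer such as madVR interpolates between LUT entries and runs the lookup per pixel on the GPU, and the 17-point grid size is just a commonly used example.

Code:
import numpy as np

def apply_clut(rgb, lut):
    """Look up RGB values (0..1) in an N x N x N x 3 CLUT, nearest-neighbour only.
    The CLUT itself would be built from the source characteristic (e.g. Rec.709)
    and the measured display characteristic, like a quasi device link."""
    n = lut.shape[0]
    idx = np.clip(np.rint(rgb * (n - 1)).astype(int), 0, n - 1)
    return lut[idx[..., 0], idx[..., 1], idx[..., 2]]

# smoke test with an identity CLUT
g = np.linspace(0.0, 1.0, 17)
identity = np.stack(np.meshgrid(g, g, g, indexing="ij"), axis=-1)
print(apply_clut(np.array([0.25, 0.5, 0.75]), identity))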

good games are color managed.
I don't know of any game that uses the display profile to transform into device-dependent RGB. There may be games that can reload the vcgt linearization data from such a profile.

Is there really a way to know if material has been created/mastered in the sRGB colorspace?
No - you can only guess the target characteristic (unlike in the ICC workflow). But it's valid to assume Rec.709/sRGB for ordinary media. Please keep in mind that Rec.709 doesn't define the TRC for reproduction. That makes sense because playback conditions vary considerably (light-flooded room vs. dark cave). With an sRGB or gamma 2.2 TRC you should be fine in most cases.
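To give an idea how close those two assumptions are, here is a quick sketch printing the standard sRGB transfer function next to a pure 2.2 power law (encoded code value -> relative linear light):

Code:
def srgb_to_linear(v):
    # piecewise sRGB EOTF (IEC 61966-2-1); v is the encoded value in 0..1
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

for code in (16, 64, 128, 192, 255):
    v = code / 255.0
    print(code, round(srgb_to_linear(v), 4), round(v ** 2.2, 4))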

Odds are they are originally shot and mastered for something wider than sRGB, would be my guess... and for the DVD release it gets remastered/post-processed a bit. So your "faithful" sRGB panel may not actually be faithful at all - or it might be; no one really knows, not even the people whose names scroll in the credits.
Of course there are very different workflows. It is conceivable - take it as an educated guess, as I work in the graphic rather than the film industry - to grade with respect to the characteristic of the forthcoming film print (a wide gamut display is important here). For the broadcast/consumer media release there could be a predefined transformation to Rec.709 (an appropriate CLUT which ensures the desired intent), a new grading pass, or hybrid forms. "Pure" Rec.709 workflows will also exist (non-cinema).

Simply reproducing this material on a wide gamut (undefined) screen is not a good idea. Sony implements a kind of optional gamut mapping inversion in their Triluminos TVs. This heuristic approach can lead to quite pleasing results (important midtones are shifted only slightly), although it remains a matter of taste. But with a grading workflow like the first one mentioned above, with a colorimetric transformation to Rec.709 (=> clipping of out-of-gamut colors, precise transformation of in-gamut colors) - that means no source color space compression - it will necessarily degrade the result.
 

AMD Catalyst Control Center>display color section and check "Use Extended Display Identification Data (EDID)" in Color Temperature Control.

Apparently the above setting can 'color manage' wide gamut monitors which lack sRGB modes (single-input monitors). Do you know if this is effective? If it is, it could be very useful for those with multi-input wide gamut monitors whose sRGB modes have locked color controls: they could use the AMD setting in the modes with unlocked color controls (Dell's Custom mode, for instance).
 
most consumers do not even know wide gamut exists, how could they ask for it?
On top of that, those who do try to use it for anything that isn't critical work run into all sorts of problems with applications not supporting it, so everyone defaults to the lowest common denominator. Mobile has a chance to break this trend because it is still growing, and maybe that would bleed back to desktops. But really, given a choice between wide gamut and frames per second, I'll take frames per second. Most consumers don't even care about how accurate a display is. Look at what almost all the displays you encounter on a PC are: $100 TN junk.

Is there hope? Maybe. At some point pixel density is just going to stop mattering to people, and screen size is already pretty much a non-issue. After that we might finally start to see some real progress on issues such as motion clarity and color.
 
most consumers do not even know wide gamut exists, how could they ask for it?
On top of that, those who do try to use it for anything that isn't critical work run into all sorts of problems with applications not supporting it, so everyone defaults to the lowest common denominator. Mobile has a chance to break this trend because it is still growing, and maybe that would bleed back to desktops. But really, given a choice between wide gamut and frames per second, I'll take frames per second. Most consumers don't even care about how accurate a display is. Look at what almost all the displays you encounter on a PC are: $100 TN junk.

Is there hope? Maybe. At some point pixel density is just going to stop mattering to people, and screen size is already pretty much a non-issue. After that we might finally start to see some real progress on issues such as motion clarity and color.

I think the ROG Swift will do a good job of balancing motion clarity and color, but obviously it's not gonna be 10+ bit anyway...

Anyway, I think we're gonna go through an entire paradigm shift in display technology before what you're talking about happens though. Just a feeling... or rather a hope I suppose.
 
most consumers do not even know wide gamut exists, how could they ask for it?

[meme image]


Hehe, sorry... stupid meme kept popping into my head every time I read this thread title :D
 
First of all, thanks to everyone who has made the colorspace issue a more visible topic here on the forums as well as elsewhere on the internet. The problems of using a wide color gamut monitor for color-sensitive design or production work often go unnoticed by designers who are new to the game. When we bring new/young designers on board at my office I make sure to walk them through the pitfalls of color spaces and color management for this very reason, because many of them just assume a monitor is a monitor.

That said, as someone who does color-sensitive work from time to time and uses Quadro cards with 10-bit goodness and color-calibrated monitors at home and the office, I must say that I really don't mind the exaggerated colors of a wide gamut monitor outside of color-sensitive applications. Obviously I wouldn't ever want to edit photos on a wide gamut monitor without the proper color correction. I probably wouldn't want to sit through an uncorrected movie, either, but honestly it still wouldn't be a deal breaker.

But when it comes to games, productivity apps, or even regular (non-design-specific) web browsing, the wide gamut issue really just doesn't bother me. Sure, the colors are clearly different from what was intended, but my gaming experience isn't ruined by any means.

So my advice is to very much be aware of the pitfalls when buying a wide gamut monitor and definitely heed all warnings and advice if you're going to do any production, design, photo editing, or otherwise color-sensitive work. But if you're just going to be playing games and browsing the web, chances are good that you probably won't be bothered by it unless you're a video game color purist. :p And let's be honest: Some people actually like the exaggerated artificially vibrant colors of a wide gamut monitor, much in the same way that people enjoy the colored odd-order distortions of a tube amplifier when listening to music.

But if you're a color purist or snob, stay clear of the wide gamut monitors.
 
I have a monitor with the best gamut there is and... I do not use it at all :D

Wide gamut is just useless except maybe for people who work with images that go to professional printing, which is not that common. Everyone else, including filmmakers, webmasters, game makers, etc., is fine with sRGB.

A wide gamut monitor with gamut remapping can, for example, emulate a CRT gamut. It's surely a nice thing and what I use 99.9% of the time. With a monitor that had a fixed sRGB mode and nothing else, wide gamut would be just useless for me personally :eek:
 
The shortest possible answer is that 99.99999% of content is (and should be) mastered for the standard sRGB gamut, and when you view this content on a wide gamut monitor it mangles the colors (i.e. shows the wrong saturation/shade). If you enjoy this, then you like the exaggerated saturation, which is fine, but it looks wrong to the discerning eye.

Exactly.

I care about wide gamut. I care enough to avoid it.

I would never buy a wide gamut monitor without an excellent sRGB emulation mode, and it would be put in sRGB emulation mode from day one, and that would be that.

Color management on a monitor that is much different from sRGB is too much hassle unless you have a real professional need.

The hassle isn't the setup or using a colorimeter (I have one for my NEC 2490). It is the fact that you will have a mix of color-managed and non-color-managed applications, and it will be jarring as you move between them and the color changes. You will probably get used to the exaggerated colors in non-managed applications, and you will find that the color in color-managed applications looks duller in comparison, even though it is what is accurate.

Unless you really need a wider gamut for professional use, it is just a waste of time and will likely leave you dissatisfied.

Unless you plan to run everything unmanaged and just enjoy exaggerated colors.
 
Content is mastered for Wide Gamut but is reduced to sRGB.

All monitors should be Wide Gamut. Thanks to the new UHD standards, monitors will be wide gamut by default.
The Rec. 2020 ITU standard gives motion pictures the gamut they need and TV screens will have to support it. Current UHD codecs like H.264 and H.265 (HEVC) support this wide gamut color space. It's wider than AdobeRGB.
Finally we will get the color shades we are losing when capturing photos or shooting video. DSLR users will be happy.

And it won't cost much to get Rec. 2020 compliant monitors and TVs. With current TVs we can already get that color gamut. QDEF films allow that. The ASUS GX500 has 100% NTSC coverage thanks to a QDEF film.
 
They should be, but they are not. Software is made for the majority of the actual hardware out there. This is not an ideal world, so for most average Joes out there wide gamut is evil, and I don't expect that to change soon. Whoever wishes to deal with the problems of wide gamut should do so on their own, due to professional needs or.. religious belief in the superiority of said technology.. but should NOT advise it to others.
 
And let's be honest: Some people actually like the exaggerated artificially vibrant colors of a wide gamut monitor, much in the same way that people enjoy the colored odd-order distortions of a tube amplifier when listening to music.

But if you're a color purist or snob, stay clear of the wide gamut monitors.

Uh... I'm not sure that what you are saying about tube amps is relevant or accurate.

electrical engineer said:
A lot of music features great dynamic signal swings, and it has been well established that in tube amplifiers the onset of clip/overload as maximum power is reached is gradual and rising distortion is of predominately low even-order harmonic nature.
 
Current UHD codecs like H.264 and H.265 (HEVC) support this wide gamut color space. It's wider than AdobeRGB.
It's not a question of the codec - the color triples carry the information inherently. Of course standards must be established, because unlike in the ICC workflow we can't rely on characterized data. But it is - apart from backward compatibility constraints - no problem to transport data relative to any color space even in an MPEG-1 stream; the end device only needs to know what content it receives and how to handle it.

Finally we will get the color shades we are losing when capturing photos or shooting video. DSLR users will be happy.
They should already have been happy for about 10 years, ever since LCDs with color gamuts considerably exceeding sRGB were ready for the market and sold ;-).
 
electrical engineer said:
A lot of music features great dynamic signal swings, and it has been well established that in tube amplifiers the onset of clip/overload as maximum power is reached is gradual and rising distortion is of predominately low even-order harmonic nature.

Was that really written by an electrical engineer, or by a company selling tube amps?
 
They should already have been happy for about 10 years, ever since LCDs with color gamuts considerably exceeding sRGB were ready for the market and sold ;-).
No, they're not happy because their photos can't be seen by people in their true nature and as intended by the photographer/artist.
 
They should be, but they are not. Software is made for the majority of the actual hardware out there. This is not an ideal world, so for most average Joes out there wide gamut is evil, and I don't expect that to change soon. Whoever wishes to deal with the problems of wide gamut should do so on their own, due to professional needs or.. religious belief in the superiority of said technology.. but should NOT advise it to others.
I don't see people having any problem with badly calibrated screens like the ones used on phones (especially those using AMOLED).

So using wide gamut isn't a problem in itself. Wide gamut is not evil; what is evil is monitors that are not capable of showing the different color spaces correctly.
Wide gamut is superior: it lets you see more colors.
 
It's not superior if, in the real world, most of the content displayed is not wide gamut aware and the sRGB emulation in most implementations sucks. There are many things that are superior on paper.. but shattered by the harsh reality of sucky implementations. And unless at least half of the implementations are flawless (software part included), I cannot claim, as you did, that "All monitors should be Wide Gamut". For now this would make things worse.
 
No, they're not happy because their photos can't be seen by people in their true nature and as intended by the photographer/artist.
These screens could and can be bought by anyone - therefore I have a problem with the word "finally". A TV screen with an extended color space must also be bought, and outside an ICC workflow the source characteristic still has to match the intended reproduction characteristic. Apart from that, an important part of the workflow has always consisted of printing the images. The professionals mentioned were therefore of course very happy to have a capable display for retouching and proof simulations.
 
I wonder if said professionals who often deal with high-end printing make up even 1% of the 'people' in the thread title.. The niche of normal users is, by my guesstimate, two orders of magnitude bigger, and is what really drives the market and most product design choices. Exceptions only prove the rule.
 
Just because the codecs support them doesn't mean the actual delivery mechanisms do - DVD, Blu-ray, cable and streaming platforms are all using Rec. 709/sRGB. Web image and text standards are actually a subset of sRGB. Disc-based standards seem unlikely to evolve, and the next broadcast update is a ways away. I see nothing in the next few years that will disrupt the vast, vast majority of professionally produced content being exclusively sRGB in consumer delivery.
 
I just want to point out that color depth is different from wide gamut/Adobe RGB. A couple of people have been conflating the two. You most assuredly want a high bit-depth display; a decent panel with 10-bit depth will (hopefully) reduce banding to nothing.

I think it will be very cool when everything has switched over to Rec.2020, but until then, wide gamut color spaces aren't good for standard web browsing or computer work (not because wide gamut color spaces are bad in themselves, but because of the incompatibility between these color spaces and sRGB).
 
I just want to point out that color depth is different from wide gamut/Adobe RGB.
Agreed, the terms are often used interchangeably when they shouldn't be, but AFAIK the only 10-bit panels available also happen to be wide gamut. If you use a 10-bit panel in sRGB mode or have it calibrated down to sRGB, then it isn't using the full range of color and is thus something less than 10-bit. A full 10 bits squeezed into sRGB would certainly make for smooth gradients, but then I don't recall the last time I've even noticed banding on a decent 8-bit sRGB panel.
 
While color depth and color gamut are completely different concepts, they do share an interesting relationship (as austin alludes to), in that wider color gamuts require higher color depth to preserve smooth transitions across the gamut. Similarly, higher dynamic ranges require higher bit depth to preserve smooth transitions across the luminance range.
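A back-of-the-envelope sketch of the luminance side of that point (assuming a plain gamma 2.2 encode purely for illustration - real HDR systems use PQ): if the same encode has to span a higher peak, fewer codes are left for any given slice of the range, so steps inside that slice get coarser. The same logic applies to stretching the code range over a wider saturation axis.

Code:
GAMMA = 2.2

def codes_at_or_below(nits_limit, peak_nits, bits):
    levels = 2 ** bits
    return sum(1 for n in range(levels)
               if (n / (levels - 1)) ** GAMMA * peak_nits <= nits_limit)

for peak in (100, 600):
    for bits in (8, 10):
        print(peak, "nit peak,", bits, "bit:",
              codes_at_or_below(100, peak, bits), "codes at or below 100 nits")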

...but then I don't recall the last time I've even noticed banding on a decent 8-bit sRGB panel.

It's not typically noticeable, but it's easy to create situations where banding occurs. There are also certain types of video content that expose the limitations of 8-bit, for example fade-to-black transitions, or when a light source in the scene has a halo around it.

p.s. CRTs are capable of very high color depth, but I guess they're not really "panels" :p
 
AMD=less-to-no banding
Nvidia=banding on 99% of monitors when viewing gradients

http://forums.guru3d.com/showpost.php?p=4740861&postcount=599

I see assertions and no proof in that post.

My own experiments suggest that my Nvidia card (a GTX 660), connected to my FW900 with a DVI-VGA cable, is capable of processing the LUT with 10 or 11 bits of precision (256 steps, but at least 1024 step sizes to choose from; or, if you like, 16.7 million simultaneous colors that can be defined from a palette of over 1 billion colors).
 
It's true for 6-8 bit LCDs/99% of typical monitors; not sure about CRTs. AMD GPUs offer better image quality and display-related options like the EDID/Color Temperature setting, while Nvidia does not dither and has HDMI and DisplayPort signal range issues. 7 of the 8 GPUs I've owned have been Nvidia, btw, and I am currently using a 780 Lightning.
 
I've never owned anything other than CRTs, so I haven't had the chance to experiment. It could be that the DAC is what allows the higher bit precision with my GPU.

Have there been any objective, quantifiable investigations into AMD vs Nvidia GPU image quality? I'm talking about low-level things, like the quality of the DACs or the integrity of the digital signal, not "features" like digital vibrance or color temperature. What are you referring to specifically when you say AMD has better image quality? If it's the LUT precision issue, that certainly qualifies, but I'd be curious to see some reports on the issue. Is the dithering feature officially listed by AMD?
 
Have there been any objective, quantifiable investigations into AMD vs Nvidia GPU image quality?

None that I am aware of; keep in mind most GPU reviewers use 30" wide gamut monitors.

What are you referring to specifically when you say AMD has better image quality?

Less gradient banding, and the ability to (allegedly) limit a wide gamut monitor's color space using the Color Temperature/EDID setting:

Go into the AMD Catalyst Control Center>display color section and check "Use Extended Display Identification Data (EDID)" in Color Temperature Control

Is the dithering feature officially listed by AMD?

I googled 'AMD dithering' and did not find specific results, and I never look at official GPU specifications and websites. Yamaksoa and I both own nearly identical Qnixes (we compared preset measurements taken with an i1 Display Pro); his is banding-free before calibration while mine is not, plus he has tested friends' Qnixes with their Nvidia GPUs. I've read about people running Lagom's gradient banding test on a PS3 and an Nvidia GPU with uncalibrated monitors, and the PS3 showed significantly less banding. The PS3 vs. Nvidia difference also applies to my Crossover 2720MDP (8-bit + FRC S-IPS).
 
While color depth and color gamut are completely different concepts, they do share an interesting relationship (as austin alludes to), in that wider color gamuts require higher color depth to preserve smooth transitions across the gamut. Similarly, higher dynamic ranges require higher bit depth to preserve smooth transitions across the luminance range.
It's not entirely true - I would argue whether it's true at all.
The only reason why 10-bit is 'necessary' for wide gamut displays is the need to convert sRGB content to the wide gamut, which without higher color depth might introduce banding caused by the mismatch of brightness levels of the RGB subpixels. E.g. when the calculated channel values of a color were 200.5; 129.8; 65.3, they would have to be rounded to 201; 130; 65 on an 8-bit display. Going to 10-bit we would have 802; 519.2; 261.2, which would be rounded to 802; 519; 261, which on the 0-255 scale is 200.5; 129.75; 65.25. The absolute error on the 0-255 scale for the 8-bit panel would be 0.5; 0.2; 0.3, and for 10-bit 0; 0.05; 0.05, so there is a great improvement.
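The same arithmetic in a few lines of Python, in case anyone wants to try other values (it only computes the rounding error on the 0-255 scale, nothing more):

Code:
targets = [200.5, 129.8, 65.3]   # ideal channel values on a 0-255 scale after conversion

for bits in (8, 10):
    scale = 2 ** (bits - 8)      # same x4 scaling as above for the 10-bit case
    errors = [abs(round(v * scale) / scale - v) for v in targets]
    print(bits, "bit:", [round(e, 3) for e in errors])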

If you had content mastered for the wide gamut, then the need for 10-bit would be the same as for content mastered in any color space when viewing it on a monitor that natively has that color space - including grayscale content on a black & white monitor. The notion that wide gamut produces more banding by itself is false. Like I said, it is only a result of the necessity of displaying sRGB content accurately.

Besides, gamut conversion from sRGB to wide gamut, even when it's done by a simple 3x3 matrix multiplication without any dithering, is almost banding-free. I tested it on my 'highest gamut monitor there is' RGB-LED display, and the banding error even on silky smooth skies was much less than with *ANY* gamma change. So if someone corrects gamma on an NV card, he/she is getting much more banding than from the crappiest conversion of sRGB content to wide gamut.

One other thing: due to A-FRC, all panels today have perceptually 10-bit resolution, and even a panel that is itself 6-bit is still perceptually better than plain 8-bit, to the point that such a 6-bit panel can have gamma controls with smooth gradients at every setting. With only true 8-bit resolution that would not be possible.

So summarizing:
- wide gamut in itself does not need more bit depth than sRGB
- that need is only caused by the need to convert between color spaces
- gamma control/correction is a much greater reason to need more bit depth, as it causes a lot more banding when done in 8-bit

So IMHO, if anyone starts talking about 10-bit in a 'wide gamut' topic, it's a sign of a lack of real knowledge about wide gamut and bit depth. We do not see this 10-bit discussion when there is talk about gamma, which is (at least to me) very telling...
 
It's not entirely true - I would argue whether it's true at all.
The only reason why 10-bit is 'necessary' for wide gamut displays is the need to convert sRGB content to the wide gamut, which without higher color depth might introduce banding...

This is completely incorrect.

Higher bit depth certainly does help when converting between color gamuts, but that isn't the only reason more bits are useful.

With higher dynamic ranges and wider gamuts, you absolutely do need more individual steps to encompass the larger ranges. This is actually a major issue in the standards being developed for UHD - one of the balancing acts is between the size of the gamut, the dynamic range of luminance, and the available bit depth.

Listen to Joe Kane discuss this very issue here (from around 43:00 onwards, though I recommend the entire interview).

It's fairly obvious though - if you have a higher dynamic range (whether that range encompasses luminance or color saturation), you need more addressable points within that range to avoid banding artifacts. 256 gradations from black to white barely suffice with a dynamic range from ~0 nits to 100 nits. Do you think that when you move up to 600 nits you're not going to run into severe banding issues with only 256 addressable points?

Same thing with gamut area - larger gamuts mean more saturated primaries, which means that if you only have 8 bits to traverse the space from fully desaturated (gray) to fully saturated, you're going to see a lot of banding. Joe Kane says that a minimum of 16 bits is needed to maintain adequate color resolution with Rec. 2020's gamut, if not 24 bits.
 
I googled 'AMD dithering' and did not find specific results, and I never look at official GPU specifications and websites. Yamaksoa and I both own nearly identical Qnixes (we compared preset measurements taken with an i1 Display Pro); his is banding-free before calibration while mine is not, plus he has tested friends' Qnixes with their Nvidia GPUs. I've read about people running Lagom's gradient banding test on a PS3 and an Nvidia GPU with uncalibrated monitors, and the PS3 showed significantly less banding. The PS3 vs. Nvidia difference also applies to my Crossover 2720MDP (8-bit + FRC S-IPS).

Interesting, I'd love to experiment with an AMD card on an LCD panel. It would be pretty straightforward to measure LUT precision using Psychtoolbox or ArgyllCMS and a decent light-measuring instrument. If you happen to have a colorimeter, let me know; ArgyllCMS has a very simple switch that attempts to estimate the bit depth of the LUT (your post suggests you have an i1 Display Pro).
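Conceptually the probe is simple (a rough sketch only - set_gamma_ramp and measure_luminance below are placeholders for whatever your platform and instrument actually expose, not real APIs, and only one channel is shown): hold a single gray level on screen and nudge just that level's LUT entry by fractions of one 8-bit step. If sub-8-bit nudges produce measurable luminance changes (directly or via dithering), the LUT path has more than 8 bits of precision.

Code:
def probe_lut_depth(set_gamma_ramp, measure_luminance, level=128):
    """set_gamma_ramp: loads a 256-entry ramp of 16-bit values (like a vcgt tag).
    measure_luminance: reads a full-screen patch displayed at `level`.
    Both callables are hypothetical and must be supplied by the caller."""
    readings = {}
    for quarter in range(5):                                  # 0/4 .. 4/4 of one 8-bit step
        ramp = [i * 257 for i in range(256)]                  # identity ramp (255 * 257 = 65535)
        ramp[level] = min(ramp[level] + quarter * 64, 65535)  # 64 ~ one 10-bit step in 16-bit units
        set_gamma_ramp(ramp)
        readings[quarter] = measure_luminance()
    return readings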
 
What are you referring to specifically when you say AMD has better image quality? If it's the LUT precision issue, that certainly qualifies, but I'd be curious to see some reports on the issue. Is the dithering feature officially listed by AMD?
There must be a dithering stage for the bit depth conversion. Linearization via vcgt is visually lossless when using an AMD card in an 8-bit-per-channel output scenario.

Same thing with gamut area - larger gamuts mean more saturated primaries, which means that if you only have 8 bits to traverse the space from fully desaturated (gray) to fully saturated, you're going to see a lot of banding. Joe Kane says that a minimum of 16 bits is needed to maintain adequate color resolution with Rec. 2020's gamut, if not 24 bits.
Sometimes you have to exaggerate a little to explain the principle - but irrespective of the legitimate question about input color depth: Internal processing with high precision has always been important to allow for accurate and lossless reproduction.
 
Internal processing with high precision has always been important to allow for accurate and lossless reproduction.

Aye, this is an interesting read, especially if you're a fan of the show "Justified". It discusses not only the importance of using high precision during intermediate calculations, but also the use of a wider color space (in fact, one that can only be defined mathematically).

But yes, the issue of precision is different from the point about having higher (framebuffer) bit depth.
 
@spacediver
Because it didn't make any sense to me mathematically that wide gamut would need more bit depth, and having nothing better to do, I tested whether the simplest shader-based color correction - a 3x3 matrix multiplication - introduces any banding when converting from Rec. 709 to the massive native gamut of my RGB-LED monitor. This method is the simplest and the most prone to banding, and I even checked whether MPC-HC does any dithering: it does not - the floating point value is simply truncated to 8 bits.

Guess what: there is no visible banding at all. I tested it with various mp4 files containing nice gradients and with some PNG files containing artificial test gradients. Gamma correction done by e.g. c0 = pow(c0, 0.9) introduces nasty banding, like you would expect from gamma correction done in an 8-bit workflow.
Gamut correction, on the other hand, NEVER introduced any banding. Because an 8-bit wide gamut display can display sRGB content perceptually losslessly, there just can't be a higher bit depth requirement for wide gamut for anything.

A few hours wasted, but now I literally KNOW that all this 10-bit talk when it comes to wide gamut comes purely out of ignorance - kind of like the ignorance that made people believe for centuries that the sun revolves around the earth, because it was so obvious... kind of obvious like more colors = more banding... right? :eek:

Think hard about why wide gamut does not need more bit depth; maybe you will get smarter,
or throw another crappy podcast at me where people are making wish lists for perfect displays rather than doing science :rolleyes:
I too would like us to move past this goddamn 8-bit-per-channel / 24-bit-overall color limitation set in the early computing ages, but it is not as simple as it might seem, and I won't jump on the 'improve everything, we won't get another chance anytime soon' bandwagon, because it's fueled by lies and ignorance. 10-bit is needed, but not for wide gamut; it is needed NOW because 8-bit was never enough, and wide gamut has nothing to do with it. This kind of ignorant association is IMHO hurting both wide gamut and higher bit depth adoption :mad:

Oh, and do not confuse dynamic range with gamut. Dynamic range is its own thing and much simpler to think about, because it doesn't involve 3x3 matrix multiplication ;)
 
@spacediver

Think hard about why wide gamut does not need more bit depth; maybe you will get smarter,
or throw another crappy podcast at me where people are making wish lists for perfect displays rather than doing science :rolleyes:
I too would like us to move past this goddamn 8-bit-per-channel / 24-bit-overall color limitation set in the early computing ages, but it is not as simple as it might seem, and I won't jump on the 'improve everything, we won't get another chance anytime soon' bandwagon, because it's fueled by lies and ignorance. 10-bit is needed, but not for wide gamut; it is needed NOW because 8-bit was never enough, and wide gamut has nothing to do with it. This kind of ignorant association is IMHO hurting both wide gamut and higher bit depth adoption :mad:

You seem to be fundamentally misunderstanding a basic principle here (and btw, an hour-long, in-depth interview with Joe Kane is not a "crappy podcast").

Let's start from a really simple example, and discuss color saturation.

Suppose your color gamut triangle is super tiny - let's say a tenth the size of sRGB. In this case, the primaries will be really unsaturated compared to sRGB's primaries. Agreed?

Now, suppose you measure JNDs (just noticeable differences) in saturation, starting at the white point, and moving out towards the blue primary (and suppose we do this at full brightness). Suppose you find you have 50 JNDs.

Now, suppose you do exactly the same experiment, but on a display that has the full sRGB gamut. You will certainly traverse more JNDs to get from the white point to the blue primary. Agreed?

That is the only point being made here. The wider your gamut, the more JNDs there are within it, and the more JNDs there are, the more bits are required to keep the distance between successive addressable points from exceeding the JND distance. It is a general principle.

Perhaps on your display, you have enough bits to provide smooth transitions even with the wide gamut.

I highly doubt this, though - it is very easy to create a situation where banding occurs even in sRGB with a full 8 bits. Here's an example I just created. If you had 10 bits, this banding would be much less evident, if perceivable at all. If I kept it at 8 bits and had a gamut much larger than sRGB, the banding would become more evident.

[test image: 8-bit saturation gradient showing visible banding]
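(If anyone wants to roll their own version of a test like this, a small script along these lines will do it - the start color, band width and band count are arbitrary and this is not the exact image above; the point is simply that each band differs from its neighbour by exactly one 8-bit code.)

Code:
import numpy as np
from PIL import Image

bands, band_w, height = 64, 16, 200
gb = np.arange(200, 200 - bands, -1).astype(np.uint8)     # G = B drop by one code per band
gb = np.repeat(gb, band_w)
row = np.stack([np.full_like(gb, 200), gb, gb], axis=-1)  # R held constant -> saturation ramp
img = np.tile(row[None, :, :], (height, 1, 1))
Image.fromarray(img).save("saturation_steps.png")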
 
Or just make a gradient from black to white, say 1920 pixels wide. 8-bit is obviously not even close to enough.
 
Or just make a gradient from black to white, say 1920 pixels wide. 8-bit is obviously not even close to enough.

Yep, and while this has more to do with the luminance dynamic range than with the size of the color gamut, the principle is exactly the same: larger ranges (whether the quantity is luminance or color saturation) require more bits.

Also, the relationship between gamut size and required bit depth is acknowledged in the academic literature:

Characterizing the difference between the BT.709 and BT.2020 color gamut is best shown with the chromaticity diagram in Figure 1. In the Figure, the blue curve denotes the space of surface color, and the green and red curves denote the BT.709 and BT.2020 color gamut. As can be seen, the green and red primaries of BT.2020 are particularly expanded. Thus, BT.2020 content can convey a larger range of colors. We note that this larger color gamut is also a motivation for higher bit-depth, as the more significant bits can be used to denote the expanded space.

(reference)
 
p.s. XoR, I think part of the miscommunication is that you are talking about banding due to quantization artifacts, while the banding I'm talking about has nothing to do with rounding errors.
 
Does it look like terribly more banding to you?
[test image: gradient after conversion to the monitor's native gamut]


It was done with the worst type of conversion there is; anything else would do a better job of representing this gradient in my monitor's native gamut, especially if it had been 'mastered' in this gamut originally.

So if such a crappy conversion gives results that fine, then I cannot imagine wide gamut needing that much more bit depth than sRGB. 9 bits would most probably provide much more than enough headroom to do any conversion completely losslessly, and it would enable finer gradients on even artificially wide gamut color spaces than you can get from 8-bit sRGB.

My point is that all this 10-bit necessity for wide gamut is an exaggeration. It is not needed any more for wide gamut than it is needed to represent a nice smooth grayscale.

PS: to make one thing clear, sRGB emulation vs. native gamut show the same amount of banding on my monitor. There is perceptually no difference at all. You can check it yourself if you have a good wide gamut monitor.

edit://
my shader
Code:
sampler s0 : register(s0);
float4 p0 : register(c0);

// Rec.709 -> native panel primaries (applied in linear light)
static float4x4 r2r =
{
    0.7097178, 0.2938549, -0.0035727, 0, // calculated for LM240WU5-SLA1
    0.0548424, 0.9164684,  0.0286892, 0, // tested on LG W2420R
    0.0092153, 0.0449556,  0.9458291, 0, // should also apply to HP DreamColor LP2480zx and Quatographic 240 LED excellence
    0,         0,          0,         1  // leave alpha untouched
};

float4 main(float2 tex : TEXCOORD0) : COLOR
{
    float4 c0 = tex2D(s0, tex);
    bool linearize = true;  // set to false to disable gamma correction around the multiplication
    float gamma = 2.4;      // assume 2.4 for Rec.709

    if (linearize) c0 = pow(c0, gamma);        // decode to linear light
    c0 = mul(r2r, c0);                         // convert primaries
    c0 = saturate(c0);                         // clip out-of-range values
    if (linearize) c0 = pow(c0, 1.0 / gamma);  // re-encode for the panel
    //c0 = pow(c0, 0.9);                       // gamma-change test that produces visible banding
    return c0;
}

I cannot see much difference between it and the built-in sRGB mode. Maybe there is one, but since switching modes takes two operations and too long, testing it is very hard. Besides, if the difference cannot be discerned that way, it only tells me that this shader is very good. I consider the built-in sRGB mode the best sRGB I have ever seen on any monitor.
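For anyone without a wide gamut panel who wants to check the rounding side of this, here is a rough numpy re-run of the same transform on a neutral gray ramp (the matrix and the gamma 2.4 assumption are taken from the shader above); it just reports the biggest jump between adjacent input codes after 8-bit rounding, with no dithering:

Code:
import numpy as np

r2r = np.array([[0.7097178, 0.2938549, -0.0035727],   # Rec.709 -> panel primaries,
                [0.0548424, 0.9164684,  0.0286892],   # same matrix as in the shader
                [0.0092153, 0.0449556,  0.9458291]])

ramp = np.arange(256) / 255.0                        # encoded 8-bit neutral ramp
lin = ramp ** 2.4                                    # decode with gamma 2.4
rgb = np.stack([lin, lin, lin], axis=-1) @ r2r.T     # convert primaries in linear light
enc = np.clip(rgb, 0.0, 1.0) ** (1 / 2.4)            # re-encode
codes = np.round(enc * 255)                          # quantize to 8 bits, no dithering
print("largest per-channel jump between adjacent codes:",
      int(np.diff(codes, axis=0).max()))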
 
You keep referring to wide gamut as if it were a well-defined standard. I trust you understand that gamuts can vary in size arbitrarily.

Do you realize just how wide Rec. 2020 is? The primaries are virtually touching the spectral locus. If you go back to my original comment, you'll note I was making a general point about the relationship between gamut size and bit depth.

I'm sure you noticed banding in the image I posted. If you inspect it, you'll note the RGB values of each band are neighbours in the addressable space; in other words, that gradient represents the finest color resolution possible in 8 bits. You'll also notice that the gradient goes from completely desaturated to completely saturated red. With 8 bits, traversing that particular gradient takes 44 steps. On my sRGB display, I can clearly see banding between almost all of the steps (though some are harder). The ones that are harder are just around my JND (my perceptual threshold). Do you not understand that if this same image were rendered on a wider gamut display, the banding would become more evident, since those same 44 steps would have to cover a larger perceptual range?
 