Well, I do not have a display with an artificial gamut, only one that covers ~133% of the 1953 NTSC gamut area (as expressed in the CIE 1976 u'v' space, vs ~75% for sRGB), so my observations might not be very accurate. I would gladly test this on a monitor with a wider gamut, but somehow this is the largest-gamut monitor available to buy... unlucky me.
This is Rec. 2020: better, but not by much.
And when I view this image in the native gamut it actually seems to have less banding! The less saturated I make it with the saturation slider in the OSD custom mode, the more banding I see. How is that even possible? The monitor has a 10-bit panel and high internal precision, and this banding is definitely not caused by its internal processing: if I convert to greyscale in IrfanView, far more banding becomes visible, with each grey band spanning multiple of the original bands (the bands of one colour are very wide). That is proof the monitor is indeed calculating in well over 8 bits, because the width of the bands stays the same in the monitor's greyscale mode vs its native gamut.
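If you want to see why an 8-bit greyscale conversion collapses a saturated gradient like that, here is a minimal sketch in Python. The Rec. 709 luma weights are an assumption on my part (IrfanView's exact coefficients may differ), but the effect is the same: roughly 14 distinct blue steps land on each grey step, which is exactly why the grey bands span several of the original colour bands.

```python
import numpy as np

# a saturated blue ramp: 256 distinct 8-bit steps in the blue channel only
ramp = np.zeros((256, 3), dtype=np.uint8)
ramp[:, 2] = np.arange(256)

# assumed Rec. 709 / sRGB luma weights
weights = np.array([0.2126, 0.7152, 0.0722])
grey = np.round(ramp @ weights).astype(np.uint8)

print(len(np.unique(ramp[:, 2])))   # 256 distinct blue levels
print(len(np.unique(grey)))         # only 19 distinct grey levels
# each grey band spans roughly 1 / 0.0722 ~ 14 of the original blue steps
```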
I do not want to jump hastily to strong conclusions, but it seems to be pretty much the reverse of what you are suggesting it should be... So, setting aside the false belief that a wide-gamut display by itself makes an image more banding-prone, we can treat the issue of representing a smaller color space inside a bigger one on its own.
So I created the largest possible color space there could ever be (0.0001 instead of 0 because of a division-by-zero error):
R x 1.0000 y 0.00001
G x 0.0001 y 1.00000
B x 0.0001 y 0.00001
which gave this matrix:
0.4338944, 0.3762235, 0.1898821, 0,
0.2126390, 0.7151785, 0.0721825, 0,
0.0177555, 0.1094555, 0.8727890, 0,
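If you want to check the numbers, here is a minimal sketch of how such a gamut-conversion matrix can be derived. It assumes a D65 white point and the standard xy-to-XYZ construction (the post does not state which white point was used), and it reproduces the 3x3 part of the matrix above to within rounding:

```python
import numpy as np

def rgb_to_xyz_matrix(primaries, white):
    """Build a linear RGB -> XYZ matrix from xy primaries and an xy white point."""
    def xy_to_xyz(x, y):
        return np.array([x / y, 1.0, (1.0 - x - y) / y])
    cols = np.column_stack([xy_to_xyz(*p) for p in primaries])
    w = xy_to_xyz(*white)
    # scale each primary so that R = G = B = 1 lands on the white point
    s = np.linalg.solve(cols, w)
    return cols * s

srgb = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]
# the "largest possible" gamut from above, with the same near-zero offsets
huge = [(1.0, 0.00001), (0.0001, 1.0), (0.0001, 0.00001)]
d65 = (0.3127, 0.3290)   # assumed white point

m_srgb = rgb_to_xyz_matrix(srgb, d65)
m_huge = rgb_to_xyz_matrix(huge, d65)

# linear sRGB -> linear "huge gamut" RGB
m = np.linalg.inv(m_huge) @ m_srgb
np.set_printoptions(precision=7, suppress=True)
print(m)   # rows sum to ~1; matches the quoted matrix to within rounding
```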
And while it made your example exhibit serious banding, it is still less banding than converting the same image to greyscale! So it seems that converting any video to greyscale adds more banding than the worst-case scenario, which is encoding an sRGB signal into an impossibly wide gamut that Rec. 2020 doesn't even come close to. Now, do you see that much added banding when going to greyscale in real-life images? There doesn't seem to be any, be it in silky-smooth skies or the other fine gradients you normally see; quite unlike any gamma manipulation, which immediately produces severe banding in those scenarios.
You can test for yourself whether this artificial gamut conversion is really that bad; you will just have to install MPC-HC if you do not have it already. Like I said before, if a display had this gamut natively it would show even less banding, because our perception sees colored images as more mentally blurred.
So where is the issue, if all real-life tests done on real images show there is none, and even artificial images which already look like crap in sRGB hardly show any deterioration in quality when going to a much wider gamut near Rec. 2020, using a really bad color space conversion algorithm that could easily be improved with dithering?
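To illustrate that last point about dithering, here is a minimal sketch (Python, with made-up gradient values): adding about one LSB of noise before rounding to 8 bits breaks the bands of a shallow gradient up into fine grain without changing the average levels.

```python
import numpy as np

rng = np.random.default_rng(0)

def to_8bit(x, dither=False):
    """Requantize a float image in [0, 1] to 8 bits, optionally with dither."""
    v = x * 255.0
    if dither:
        # ~1 LSB of triangular-pdf noise before rounding trades banding for fine grain
        v += rng.random(x.shape) - rng.random(x.shape)
    return np.clip(np.round(v), 0, 255).astype(np.uint8)

# a shallow gradient, as you would get after squeezing sRGB into a huge gamut
grad = np.linspace(0.40, 0.45, 1920)
plain = to_8bit(grad)            # only ~14 distinct levels: wide, visible bands
dithered = to_8bit(grad, True)   # same mean levels, banding broken up into noise
print(len(np.unique(plain)), len(np.unique(dithered)))
```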
PS: the "10-bit is necessary for wide gamut" myth.