Chroma subsampling, why am I not experiencing it?

sblantipodi

2[H]4U
Joined
Aug 29, 2010
Messages
3,765
As title.
I'm using an Acer XV273K at 4K, 120Hz, with HDR and G-SYNC.

When using all those features my monitor goes down to 8bit + FRC.

Isn't this a condition where chroma subsampling should arise?

Is this test enough for testing chroma subsampling?

chroma-444.png

I see no "errors" in this image.

Can you help me understand? Thanks
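For what it's worth, test images like that usually work by putting colour detail at single-pixel scale: a 4:4:4 link preserves it, while 4:2:2/4:2:0 smears neighbouring colours together. Here is a minimal sketch (assuming Pillow is installed; the output filename is just an example) that generates a similar worst-case pattern:

```python
# Minimal sketch: generate a pattern that exposes chroma subsampling.
# Alternating 1-pixel red/blue columns carry full-resolution chroma detail;
# with 4:2:2 or 4:2:0 neighbouring pixels share one chroma sample, so the
# stripes smear towards purple.
from PIL import Image

W, H = 512, 256
img = Image.new("RGB", (W, H))
px = img.load()
for y in range(H):
    for x in range(W):
        px[x, y] = (255, 0, 0) if x % 2 == 0 else (0, 0, 255)
img.save("chroma_test_pattern.png")  # hypothetical filename
```

If the stripes stay crisp red and blue on screen, the link is passing 4:4:4; if they turn into a muddy purple, chroma is being subsampled somewhere in the chain.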
 
Why would that enable chroma subsampling?

With non-compressed 8bit 4k you don't need chroma subsampling until 144Hz on the x27/pg27uq, for example.
 
Why would that enable chroma subsampling?

With non-compressed 8bit 4k you don't need chroma subsampling until 144Hz on the x27/pg27uq, for example.

but the point is, why use 10bit with chroma subsampling when you can use 8bit + frc?
 
but the point is, why use 10bit with chroma subsampling when you can use 8bit + frc?

Your original question wasn't even about why to use 10bit over 8bit + FRC, it was about why you aren't getting chroma subsampling. DP 1.4 has enough bandwidth for 4k120Hz 444 8bit and will not result in chroma subsampling.
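A rough back-of-the-envelope check of that (a sketch only: the ~6% blanking overhead is an approximation, and 25.92 Gbps is the usual effective figure for DP 1.4 HBR3 on 4 lanes after 8b/10b encoding):

```python
# Approximate bandwidth check for 4K at various refresh rates / bit depths.
def required_gbps(width, height, refresh_hz, bits_per_channel,
                  channels=3, blanking_overhead=1.06):
    # blanking_overhead is a rough stand-in for reduced-blanking timing overhead
    bits_per_pixel = bits_per_channel * channels
    return width * height * refresh_hz * bits_per_pixel * blanking_overhead / 1e9

DP14_EFFECTIVE_GBPS = 25.92  # HBR3, 4 lanes, after 8b/10b encoding

for bpc, hz in [(8, 120), (10, 120), (10, 98), (8, 144)]:
    need = required_gbps(3840, 2160, hz, bpc)
    verdict = "fits" if need <= DP14_EFFECTIVE_GBPS else "needs 4:2:2 (or DSC)"
    print(f"4K {hz}Hz {bpc}bpc 4:4:4: ~{need:.1f} Gbps -> {verdict}")
```

Which also lines up with why those monitors drop to 98Hz for 10bit 4:4:4 and fall back to 4:2:2 at 144Hz.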
 
Your original question wasn't even about why to use 10bit over 8bit + FRC, it was about why you aren't getting chroma subsampling. DP 1.4 has enough bandwidth for 4k120Hz 444 8bit and will not result in chroma subsampling.

ok thanks for the info, now I'll add another question.
is there some visible difference between 10 bit without chroma subsampling and 8bit + FRC "in real life"?
and does it make sense to use chroma subsampling when there is 8bit + FRC?
 
I think you are confusing concepts.
Chroma subsampling has nothing to do with how many bits per color channel you use; it is about color resolution.
1590501197564.png

In this screenshot neither RGB nor YCbCr444 has any chroma subsampling, so color resolution is preserved and you will see no issues with images used to test chroma subsampling artifacts.
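To make the "color resolution" point concrete, here is a small sketch (assuming numpy) of what 4:2:0 does: the luma plane keeps full resolution, the chroma planes are averaged over 2x2 blocks, and bit depth is never touched.

```python
# Sketch of 4:2:0 chroma subsampling: spatial resolution of the colour
# planes is reduced, bit depth per channel stays the same.
import numpy as np

def rgb_to_ycbcr(rgb):
    # BT.601 full-range conversion, kept in float for simplicity
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.168736 * r - 0.331264 * g + 0.5 * b + 128
    cr =  0.5 * r - 0.418688 * g - 0.081312 * b + 128
    return y, cb, cr

def subsample_420(plane):
    # Average each 2x2 block, then repeat the sample back to full size.
    h, w = plane.shape
    small = plane.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    return np.repeat(np.repeat(small, 2, axis=0), 2, axis=1)

# Worst case for subsampling: alternating red/blue single-pixel columns.
rgb = np.zeros((8, 8, 3), dtype=float)
rgb[:, 0::2] = [255, 0, 0]
rgb[:, 1::2] = [0, 0, 255]

y, cb, cr = rgb_to_ycbcr(rgb)   # y stays at full resolution, untouched
cb_err = np.abs(cb - subsample_420(cb)).max()
cr_err = np.abs(cr - subsample_420(cr)).max()
print(f"max Cb error after 4:2:0: {cb_err:.1f} / 255, max Cr error: {cr_err:.1f} / 255")
```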

Output color depth is what controls how many "bits" you use.
The Windows desktop is 8-bit per color channel, so it will give correct results most of the time. Exceptions are:
- when you change the gamma response in any way, e.g. gamma controls, software monitor calibration, etc.
- when the software you use can somehow use higher bit depths
Nvidia and Intel cards cannot apply dithering to gamma correction, so any gamma control when using 8 bpc will result in banding. This is not the case with AMD cards because AMD cards use dithering, essentially identical to A-FRC in fact.
As far as I know, any software that uses 10bit surfaces will be dithered on any graphics card, at least on Nvidia and AMD.
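A quick sketch (assuming numpy) of why that dithering matters: rounding a shallow gradient straight to 8 bits leaves flat plateaus (banding), while adding sub-LSB noise before rounding lets a local average land back near the original values, which is the same trick FRC plays over time instead of space.

```python
# Quantize a shallow gradient to 8 bits with and without dither and compare
# how well a local average reproduces the original ramp.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.20, 0.21, 4096)                 # very shallow gradient, 0..1

plain    = np.round(x * 255) / 255                # straight 8-bit rounding
dithered = np.round(x * 255 + rng.uniform(-0.5, 0.5, x.shape)) / 255

def local_average_error(q, win=64):
    # The eye (or FRC across frames) averages locally; measure how far that
    # local average drifts from the original gradient.
    kernel = np.ones(win) / win
    return np.abs(np.convolve(q - x, kernel, mode="same")).max()

print("worst local-average error, plain:   ", local_average_error(plain))
print("worst local-average error, dithered:", local_average_error(dithered))
```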

For HDR it is recommended (if not mandatory?) to use at least 10 bits. If it can somehow work with 8 bits then you can/will have either dithering or banding. I am not sure because I have never had an HDR-capable monitor, so I just do not know how different GPUs behave with it enabled...
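One way to see why at least 10 bits is recommended for HDR: the PQ curve (SMPTE ST 2084) spreads its code values over 0 to 10000 nits, so with only 8 bits the luminance jump between adjacent codes gets large enough to show as banding. A rough sketch using the published PQ EOTF constants:

```python
# Luminance step between adjacent code values under the PQ EOTF
# (SMPTE ST 2084), comparing 8-bit and 10-bit quantization.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf(e):
    # normalized code value (0..1) -> luminance in nits
    p = e ** (1 / m2)
    return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

for bits in (8, 10):
    codes = 2 ** bits
    mid = codes // 2                      # mid-scale PQ code is roughly 100 nits
    step = pq_eotf((mid + 1) / (codes - 1)) - pq_eotf(mid / (codes - 1))
    print(f"{bits}-bit: ~{step:.2f} nit jump between adjacent codes near mid-scale")
```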
 
For HDR it is recommended (if not mandatory?) to use at least 10 bits. If it can somehow work with 8 bits then you can/will have either dithering or banding. I am not sure because I have never had an HDR-capable monitor, so I just do not know how different GPUs behave with it enabled...

At least on NVIDIA, you need to select at least 10bit.

Note HDR on Windows is a mess. For example, any overlay (such as the Windows Volume OSD) will cause monitors to drop back to SDR mode briefly unless the game in question has a "proper" Borderless Fullscreen option (*glares at Warzone*). People have been begging Microsoft to fix these bugs for over two years now. At this point, I've all but given up on HDR on PC, since the implementation is so piss poor.
 
On an Acer X27 you can run HDR content without 10bit. You can leave it at 4K 120Hz 8bit 444 and still run HDR content, although I'm assuming you'd run into more banding versus dropping it down to 98Hz 444 10bit. The thing is, though, the monitor doesn't even use a true 10 bit panel in the first place, so "10 bit" is really just 8bit + FRC. Personally, I have not noticed a huge difference between 120Hz 8bit and 98Hz 8bit + FRC, but maybe I'm not looking for the right things.

ok thanks for the info, now I'll add another question.
is there some visible difference between 10 bit without chroma subsampling and 8bit + FRC "in real life"?
and does it make sense to use chroma subsampling when there is 8bit + FRC?

There is no difference in your case because your monitor isn't a true 10 bit panel in the first place; it's just 8bit + FRC, so that's all you're going to get. Using the "10bit" mode is actually 8bit + FRC, and using 8bit is just 8bit. If you want to try to see any difference, use 120Hz 444 8bit, then switch to 98Hz 444 10bit (8bit + FRC). But there is no "TRUE 10bit" mode on this monitor.
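In miniature, this is roughly what FRC does: for a "10 bit" level that falls between two 8-bit codes, the panel alternates the neighbouring codes across frames so the temporal average lands on the in-between value. A toy sketch (assuming numpy):

```python
# Toy model of 8bit + FRC: approximate a 10-bit level by alternating the two
# nearest 8-bit codes over a short frame sequence.
import numpy as np

def frc_frames(target_10bit, n_frames=4):
    """Return n_frames of 8-bit codes whose temporal average equals target/4."""
    base = target_10bit // 4              # nearest lower 8-bit code
    extra = target_10bit % 4              # how many of the frames need +1
    return np.array([base + (1 if i < extra else 0) for i in range(n_frames)])

target = 513                              # a 10-bit code between 8-bit 128 and 129
frames = frc_frames(target)
print("frames shown:     ", frames)       # [129 128 128 128]
print("temporal average: ", frames.mean(), "   target / 4 =", target / 4)
```

Whether that alternation happens in the GPU's dithering or in the panel's own FRC, the time-averaged result is the same, which is part of why it is hard to see a difference between the two modes.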

1590525402012.png
 
But there is no "TRUE 10bit" mode on this monitor.
There is no TRUE HDR either, and if there were, it would not need the panel to be 10-bit to get higher than 8-bit precision.

I would not worry about dithering because you would most likely not be able to notice it even if the panel were 6-bit, let alone 8-bit, outside of maybe staring at test patterns up close. These A-FRC algorithms have gotten pretty efficient.
 
For HDR it is recommended (if not mandatory?) to use at least 10 bits. If it can somehow work with 8 bits then you can/will have either dithering or banding. I am not sure because I have never had an HDR-capable monitor, so I just do not know how different GPUs behave with it enabled...

Thanks for the reply. I can say the same: I have always used 8bit + FRC panels, even with HDR content, and never seen banding or dithering except with some special content made to test the issue.

So the conclusion of this thread is: please stop searching for a true 10bit HDR experience; 8bit + FRC in place of a true 10 bit panel is more than enough for HDR content.
It's better to have a high refresh rate and VRR than 10bit HDR with a lower refresh rate.
 