I've been doing some further investigation into the Windows 11 SDR Content Brightness setting. It seems that SDR Content Brightness actually controls the SDR reference white level (paper white), and thus also sets the SDR reference white used by Auto HDR. Microsoft stated that the minimum value is 80 nits, so 0% = 80 nits (the DirectX reference white).
Microsoft didn't specify what the maximum value is, though. So, I ran an HDR metadata test (using HDR + WCG Image Viewer, HDRImageViewer) on a 100% white SDR window in Windows HDR mode to determine maxCLL values. SDR Content Brightness of 0% = 81 nits, 25% = 181 nits, 50% = 287 nits, 75% = 401 nits, and 100% = 498 nits. Using 0% = 80 nits and 100% = 500 nits, this nearly linear relationship yields the following equivalency:
SDR Content Brightness (%) = (SDR reference white level - 80)/4.2
So, SDR Content Brightness 0% is SDR reference white level 80 nits, 5% is 100 nits, 10% is 120 nits, 17% is 150 nits, 29% is 200 nits, 52% is 300 nits, and 100% is 500 nits.
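The conversion in both directions is easy to script. Here's a quick Python sketch of the fitted formula above (function names are mine, not anything from Windows):

```python
def sdr_content_brightness(ref_white_nits):
    """Windows SDR Content Brightness slider (%) for a given
    SDR reference white level in nits, per the fitted formula:
    (nits - 80) / 4.2, where 0% = 80 nits and 100% = 500 nits."""
    return (ref_white_nits - 80) / 4.2

def sdr_ref_white(brightness_pct):
    """Inverse: slider percentage back to reference white in nits."""
    return 80 + 4.2 * brightness_pct

# Reproduce the worked values from the post (rounded to whole percent):
for nits in (80, 100, 120, 150, 200, 300, 500):
    print(f"{nits:>3} nits -> slider {round(sdr_content_brightness(nits))}%")
```

Rounding to the nearest whole percent gives the same pairs listed above (100 nits → 5%, 150 nits → 17%, 300 nits → 52%, and so on), which is as precise as the slider lets you be anyway.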
I'm curious if this formula works for your individual monitor. You can test it by doing the following:
1: Set up your SDR and HDR picture modes as similarly as possible. You can't compare if you have HDR in a saturated/vivid mode and SDR in an accurate mode, for instance.
2: Pick an SDR brightness setting and note its luminance (should be available in a TFT Central or RTINGS review of your monitor).
3: Use this luminance (in nits) as the SDR reference white level to calculate the SDR Content Brightness, and set it in Windows HDR mode.
4: Flip back and forth between SDR and HDR (Win+Alt+B) to see if SDR windows look the same.
This formula works pretty well for my monitor (LG C2). As a general aside, how do you generally set your SDR Paper White using in-game HDR settings? Do you target a specific luminance value or do you match it to your monitor's SDR brightness setting?