4:4:4 limited vs RGB full, which is preferred? and HDMI 2.0a vs 2.0b switches?

markm75

Slightly confused on the topic of setting an Nvidia 1080 to 4:4:4 limited mode or RGB full, both of which appear as options (and technically 8-bit vs 10-bit as well)..

For PC gaming or HTPC use.. what are most going with? To me I can't see a difference between the two, but maybe it depends on content (or whether HDR is in use; I'm not sure which titles or movie playback devices support HDR, e.g. Kodi for playback).

My HTPC is dual purpose, both for Blu-ray movie playback and gaming.

Any thoughts?
 
Technically limited is inferior because there are fewer steps between extreme black and white (16-235) versus full (0-255), meaning limited is more prone to banding, but in practice the difference is minuscule. If there is banding it's most likely from the game, not from the range you use. There is also YCbCr (also limited range), which movies are encoded in and may be the best idea for an HTPC if you watch a lot of movies, but in practice it does not really matter which you use as far as quality is concerned.
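To put a number on the "fewer steps" part, here's a quick Python sketch (my own toy example, nothing to do with how a driver or TV actually implements it) that quantizes a smooth gradient into full-range and limited-range 8-bit codes and counts the distinct levels:

Code:
# Toy illustration: how many distinct 8-bit code values a black-to-white
# gradient gets in full range (0-255) vs limited range (16-235).
def quantize(value, lo, hi):
    """Map a 0.0-1.0 signal level onto integer codes in [lo, hi]."""
    return round(lo + value * (hi - lo))

samples = [i / 999 for i in range(1000)]              # smooth gradient

full_codes    = {quantize(v, 0, 255) for v in samples}
limited_codes = {quantize(v, 16, 235) for v in samples}

print(len(full_codes), "levels in full range")        # 256
print(len(limited_codes), "levels in limited range")  # 220
# ~14% fewer steps between black and white, which is why limited range is
# slightly more prone to banding -- though 220 steps is still usually plenty.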

Regarding HDR, in the case of Shadow Warrior 2 you have to use either Limited or YCbCr, because if you have it on Full it for some reason conflicts with HDR and results in a washed out image. I don't know if this is the case with Resident Evil and the upcoming Mass Effect Andromeda, but it's something to keep in mind.
 

Thanks.. that makes sense on the 4:4:4 limited..

When you say YCbCr, is this the same as 4:4:4 8-bit, or 4:2:2 10-bit, or 4:2:0 12-bit etc.? Not sure if I have those combos right, but basically anything that's not RGB.. I thought they were one and the same.

So there are titles worth checking out with HDR then.. I can also check out Horizon Zero Dawn HDR via the PS4 Pro connection to my Sony 43 to see how it looks on vs off. Not sure if the Sony is supposed to report HDR being active in the info panel when it detects it or not (PC or PS4 Pro for that matter). I think the Sony has an option like HDR+, but I don't think that's the same as native HDR.
 
YCbCr can send 4:4:4. YCbCr is a signal format different from RGB and meant to be more efficient. YCbCr has luma (brightness) information separate from chroma (color), which allows subsampling of the latter (4:2:2 and 4:2:0) to save bandwidth, but when you use 4:4:4 it's virtually identical to RGB. Conversions between the two may cause some slight color inaccuracies, but the measured differences are so small that they are almost within the margin of error. Definitely not visible to the human eye.
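If you want to see what the conversion itself looks like, here's a rough sketch using the BT.709 luma weights (my own illustration with full-range 0.0-1.0 values; real GPUs and TVs also deal with the 16-235/240 offsets and integer rounding, which this ignores):

Code:
# Minimal RGB <-> YCbCr (BT.709 coefficients, full-range values 0.0-1.0).
# Y carries brightness, Cb/Cr carry color offsets -- which is what lets
# 4:2:2 / 4:2:0 subsample the color planes while keeping Y at full resolution.
KR, KB = 0.2126, 0.0722          # BT.709 luma coefficients
KG = 1.0 - KR - KB

def rgb_to_ycbcr(r, g, b):
    y  = KR * r + KG * g + KB * b
    cb = (b - y) / (2 * (1 - KB))
    cr = (r - y) / (2 * (1 - KR))
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr):
    r = y + 2 * (1 - KR) * cr
    b = y + 2 * (1 - KB) * cb
    g = (y - KR * r - KB * b) / KG
    return r, g, b

# Round-trip check: in floating point the math is essentially lossless; the
# tiny real-world differences come from 8/10-bit rounding, not the conversion.
orig = (0.25, 0.60, 0.90)
back = ycbcr_to_rgb(*rgb_to_ycbcr(*orig))
print(max(abs(a - b) for a, b in zip(orig, back)))    # ~1e-16, i.e. nothing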

*edit* If you watch movies and play games, some back and forth conversions are bound to happen. In YCbCr mode movies are "molested" less; in RGB (if your TV is in PC mode) the games and desktop take priority. And there is also the question of what the TV does internally. Some convert to YCbCr anyway for internal processing reasons no matter what signal you send, but there is no way of knowing how the TV works and whether it does this or not. So yeah, don't fret about it too much.
 

I thought I read, at least as far as the PC setting goes, that if you set it to YCbCr (4:4:4, 4:2:0 or 4:2:2 etc.), this used more overhead and processing and was less desirable? I also thought I read that if you set RGB to Full on the PC, this would crush blacks (vs Limited); I'm not sure I'm seeing that result either. It's my understanding Blu-ray/UHD movies are generally 4:2:0 (10-bit for HDR?)..

For my Sony 43 X800D, I have it set to Game mode and generally leave it there, not PC mode. Not sure if that changes things much in terms of what to set on the PC Nvidia side in terms of RGB.. basically I have the choice of 4:4:4 Limited, or RGB Full or Limited.

That brings us into the realm of 10-bit vs 8-bit, and I guess there are a few PC titles that do 10-bit, or is it that they do HDR + 8-bit, unsure... i.e. if I were to buy a splitter for any reason, I think I'd want to target one that was HDMI 2.0a and supported 4:4:4 at 10-bit, whereas most are 8-bit.. (this is mainly in my living room setup where I send a PS4 Pro to two different screens, one that does HDR (and is 10-bit), one that doesn't, but not necessarily at the same time). The PS4 Pro has HDR, but does that imply 10-bit 4:2:0 in that case (not 4:4:4)? Ah, the bits and confusion.

I'm using a DP 1.2 to HDMI 2.0 cable, which I think enables the 4:4:4 60Hz option.
 
I think the overhead claim comes from the RGB->YCbCr conversion process. Video games (and the Windows desktop in general) are in RGB, it is the PC format after all. If you set your GPU to send YCbCr it has to convert it. Movies are already in YCbCr so no conversion is needed. The conversion is a near lossless process, and the measured differences are not visible by eye. Some colors lose a few points of DeltaE, and some actually become better.

And the RGB Full causing black crush is bullshit and nothing but a user error. It only crushes blacks and whites when you send an RGB Full signal into a TV that expects RGB Limited. And when you do the opposite and send an RGB Limited signal to a TV that expects Full range, the result is a washed out image, grey blacks and dirty whites. When you switch between RGB Full and Limited you also have to choose the correct setting at the TV end. Some of them have auto detection, but in my experience it rarely works properly and it's better to choose manually. On Samsungs the setting is called HDMI Black Level and it is in the display settings. I don't know what it is called on current Sonys, but on my old Sony EX720 it was buried somewhere deep in the system settings. With YCbCr you don't have to change anything, TVs always show it correctly. It is pretty much the idiot-proof setting. When you compare RGB Full, RGB Limited and YCbCr (and the signal chain is correct, no mismatched settings in RGB) they should all look the same. Blacks are equally black and whites are equally white, and none of them should look more "contrasty and punchy" than the others.
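If it helps, here's roughly what those two mismatch cases do to the signal, as a toy Python sketch (illustrative only; real TVs may clamp or scale a bit differently):

Code:
# Rough sketch of the two range-mismatch failure modes described above.

def tv_expecting_limited(code):
    """TV maps code 16 -> black, 235 -> white; anything outside clips."""
    return min(max((code - 16) / (235 - 16), 0.0), 1.0)

def tv_expecting_full(code):
    """TV maps code 0 -> black, 255 -> white."""
    return code / 255

# Case 1: PC sends RGB Full into a TV set to Limited:
# shadow detail in codes 0-16 all collapses to black -> "crushed blacks".
print([round(tv_expecting_limited(c), 3) for c in (0, 8, 16, 24)])
# [0.0, 0.0, 0.0, 0.037]

# Case 2: PC sends RGB Limited into a TV set to Full:
# black arrives as code 16 and is shown as dark grey -> washed out image.
print(round(tv_expecting_full(16), 3))                # 0.063 instead of 0.0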

On Sonys, Game mode is the same as PC Mode.

Regarding bits, remember that at 4K 60Hz, 4:4:4 is limited to 8-bit. HDMI 2.0 does not have enough bandwidth for 10-bit. Either the resolution has to be dropped or the chroma subsampled to 4:2:2 or 4:2:0.
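The back-of-the-envelope math, if anyone wants to check it (assuming the standard CTA-861 4K60 timing with its 594 MHz pixel clock, and HDMI 2.0's roughly 14.4 Gbit/s of usable data once 8b/10b encoding overhead is taken off the 18 Gbit/s):

Code:
# Why 4K 60Hz 4:4:4 tops out at 8-bit on HDMI 2.0 (rough numbers).
PIXEL_CLOCK_HZ = 594_000_000     # 4400 x 2250 total timing x 60 Hz
USABLE_GBPS = 18.0 * 8 / 10      # 18 Gbit/s raw minus 8b/10b overhead = 14.4

for bits_per_component in (8, 10, 12):
    gbps = PIXEL_CLOCK_HZ * 3 * bits_per_component / 1e9   # 3 components at 4:4:4
    fits = "fits" if gbps <= USABLE_GBPS else "does NOT fit"
    print(f"4:4:4 {bits_per_component}-bit: {gbps:.2f} Gbit/s -> {fits}")
# 4:4:4 8-bit:  14.26 Gbit/s -> fits
# 4:4:4 10-bit: 17.82 Gbit/s -> does NOT fit
# 4:4:4 12-bit: 21.38 Gbit/s -> does NOT fit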
 
The PS4 Pro has HDR, but does that imply 10-bit 4:2:0 in that case (not 4:4:4)? Ah, the bits and confusion.

Every f'ing time you read 4K HDR over HDMI, assume low refresh rate and 4:2:0 unless openly stated otherwise:
- the HDMI cable may not handle 4K 60Hz 10-bit 4:4:4
- the panel board may not handle it
- the source may not be able to output it
- the content displayed was not created at this quality setting in the first place.

I am not aware of a single situation where at least one of those conditions is not happening when HDMI is involved.
 

In terms of 4:4:4 being limited to 8-bit: from what I'm seeing with HDMI 2.0a and 2.0b it should be possible, though most charts don't indicate it.. so I guess it's a moot point whether a switcher or splitter like this one is listed as HDMI 2.0a and does 4:4:4 60Hz at 10-bit, plus there are probably no devices aside from a PC that could even come close, I guess.

I figured the comment about crushing blacks with "Full" was misinformation.. I am seeing for the Sony 43 that many do indeed have to set it to "Full" to get the best blacks, with "Full" on the other end as well (PS4 Pro or PC). Thanks for pointing that out.

I think the only spot still confusing is "setting YCbCr" vs "setting 4:4:4" vs "setting RGB Full (at 4:4:4)" in terms of the Nvidia control panel. By "setting YCbCr" I mean setting Nvidia to YCbCr 4:4:4 (or 4:2:2 etc.), and by setting RGB in Nvidia I mean just that, the RGB option. In the end, what's best? Sounds to me like going with RGB Full should always be the best bet, unless trying to achieve 10-bit (and HDR) (are these one and the same? I'm thinking not, i.e. can you have 8-bit with HDR?).. then with 10-bit, 4:2:2 or 4:2:0 perhaps.. or would RGB Full and 8-bit still result in HDR for Blu-ray content (via Kodi in that case)? Opening up more cans of worms here, but these terminologies can be very confusing. It's a shame it has to be this messy.

Edit.. I guess only UHD Blu-rays are higher than 8-bit; the rest is still 8-bit plus HDR (at the source), vs the monitor (or TV) being 10-bit in many cases, from another source I read.
 
In terms of 4:4:4 being limited to 8-bit: from what I'm seeing with HDMI 2.0a and 2.0b it should be possible, though most charts don't indicate it.
The HDMI 2.0 spec lists the modes possible for 4K. 2.0a and 2.0b did not increase the maximum bandwidth available, so what I said remains correct: under HDMI 2.0x you either get 4K 60Hz 4:2:0, or 4K 4:4:4 at a lower refresh rate, for 10-bit (HDR) content. Curiously, the spec does have 4K 60Hz 4:2:2 12-bit.
Anyway, it is moronic to "upgrade" to 10-bit (HDR) while downgrading color information to 4:2:0 or 4:2:2.
 

Yes, RGB Full technically is the best. The only reason not to use it is that, as of now, it seems to conflict with HDR. Now, the upcoming Windows 10 Creators Update adds proper native support for HDR, so this may very well change and get fixed. If you only watch movies with your HTPC then YCbCr is technically the best option, because movies are all encoded in that format. That's why I also always put Blu-ray players in YCbCr if the option is available.
 

Ah, it's all coming together now..

So 10-bit (4:2:2 or 4:2:0) is needed for HDR (PS4 Pro, or future Windows 10 updates?).. Setting to RGB 4:4:4 and 8-bit would negate the ability (in Nvidia) to see PC-based HDR content.. So would Nvidia ever have the ability to auto-switch to 10-bit mode? (HDMI 2.1 may change this, but I assume a new video card/TV would be needed.)

The idea of it being "moronic" to set to 10-bit.. I'm not so sure.. is this implying that HDR isn't all it's cracked up to be, and that if you compared (on a PC) an HDR source at 4:2:0 10-bit and RGB 4:4:4 8-bit, you'd find the 8-bit to look better? Seems unlikely, though I guess in theory maybe.

In the case of the PS4 Pro.. I'm assuming it somehow automatically switches from RGB to the 4:2:0 mode that works with HDR on the fly with the correct TV/monitor, in my case, or my test case, the Sony X800D? (I'll have to test obviously, but maybe someone knows.)
 
HDR also works in 8-bit. It will just be very susceptible to banding. So in the case of HDR, 10-bit and above will easily be superior even if the chroma gets subsampled. The reduced chroma is only visible on sharp colored edges (like text) on a solid colored background, something you will rarely see in video games. Also, the RGB Full conflict has nothing to do with bit rate, I think. Actually I don't even know why it results in a washed out screen; it just does, and RGB Limited or YCbCr fixes it (4:4:4 or 4:2:2 makes no difference in this case). We'll see what happens in the future. I cannot speak for consoles since I have neither a PS4 nor an Xbox One.
 
I am lost at your points thus far. 4:2:0 means color information is reduced 8 times (2 from the :2 and 4 from the :0). 10-bit color, while sounding fun, only adds 2 bits of information, which is around half of what was lost going 4:2:0. 4:2:2 is a catch-22: you lose 3/4 of the chroma resolution and gain 4x the number of possible colors.

Considering that:
most gamers are male, who suck at color perception;
HDR more often than not is being marketed for TVs that DO NOT have a gamut wider than sRGB;
and despite what your gaming studio claims, they did not create content for your HDR TV using a 10-bit GPU on a wide gamut monitor, because if they had done so, colors would be oversaturated on normal monitors;

I am inclined to stand behind my statement: HDR is moronic. Call me when there are wide gamut TVs displaying content without chroma subsampling at 60Hz. There may be someone creating content in 10-bit 4K 60Hz, but if such a person is real, she/he is working with a $3k+ monitor, over DisplayPort.
 

I have no practical experience (yet) comparing non-HDR at 4:4:4 or RGB vs HDR at 4:2:0 (or 4:2:2).. my Sony will do it, and I have a PS4 Pro.. so maybe in the next few days. Has anyone compared an HDR title on a PS4 Pro in 4:2:0 mode with HDR on, vs RGB mode on the PS4? People claim to "see" a big difference, despite these obvious technical differences that should mean it's worse (in some ways).

In other digging.. it sounds as if 4:2:0 with HDR doubles the colour resolution: instead of reducing colour information horizontally and vertically, it only reduces colour information horizontally compared to 4:4:4. Not sure if this makes sense. I think that if an app or game is coded to utilize HDR it will look better, so HDR 4:2:0 or 4:2:2 for HDR content and RGB for everything else.
 
People claim to "see" a big difference

I bet there is a big difference, much like people claim using a wide gamut monitor makes colors "stand out" = oversaturated.

not sure if this makes sense
It shouldn't. HDMI 2.0 has 18Gbps to carry information; you'd better use it by matching content and display capabilities.

if an app or game is coded to utilize HDR it will look better

As I said, coding a game to look better in HDR requires using an HDR capable monitor for modeling, and such content would look like s**t on sRGB monitors. I am not saying it is impossible, I am saying it would require a completely separate art development environment, just like creating content on a wide gamut monitor.

I finish by revealing that I hope we adopt Rec. 2020 fast. It is the best of the available color formats and within range of OLED and quantum dot techs. This article has some points that I consider relevant:

At the time of introduction most CRT monitors were able to reproduce the entire sRGB color space, having color gamuts with very similar primaries, aiding greatly with its acceptance. This continued with the introduction of TFT LCD displays. Nowadays most display panels still have a color gamut with primaries that are more or less the same as the ones sRGB uses. This has led to most software being designed with the assumption that in the absence of a color profile specifying otherwise the color gamut of the display can be assumed to be equal to the sRGB color space.

The RGB color space specified by the Rec. 2020 standard is actually extremely close to the optimal RGB color space (using real primaries) for coverage of the Pointer's gamut. The red and blue primaries are already spot on and the green primary differs only 5 nm from optimal.... The reason that 527 nm wasn't even considered, Korea proposed 531 nm and Japan 532 nm, was that the proposed primaries were based on what's currently possible with LED backlit LCD and AMOLED and with Laser displays respectively. It would not make sense to use a green primary that can't be easily reproduced by any of the display technologies.

Alternatively, we could give up and create a color space ~91% of Rec. 2020 just to go back to the old sRGB nirvana, when all monitors were capable of 100% coverage and there was no one complaining about color saturation.
 


First, chroma subsampling does not affect bit rate; it affects (to my understanding) the sharpness of the color signal. Thing is, human perception of sharpness and detail comes mostly from contrast and lighting. The luma channel comes in full detail, and that matters the most for making a picture look high resolution; the chroma signal can be sacrificed quite a lot and you'd be none the wiser until you see a 4:4:4 test image or stumble upon a website that has red text on a black background. With media content the subsampled chroma simply is not a big issue. On the desktop, however, the story is different because you are more likely to see the faults of it.

I don't know what content creators have to do to make the colors work in both SDR and HDR, but it seems to be working. There is nothing moronic about HDR; I can right now turn HDR on and off in Shadow Warrior 2, or switch between SDR and HDR in The Grand Tour through Amazon Prime. I know which takes the cake, the picture quality is simply staggering in HDR. Sorry, but have you even seen an HDR picture on a good TV?

*edit* Actually I may have figured out what happens, or at least it would be a good guess. When the game was developed in the DCI-P3 color space but you game in SDR with the sRGB color space, it downsamples the bigger color space content with so-called Perceptual Intent. A very common thing in printing, and something you can toggle on in some color-managed applications (like Firefox) if you have a calibrated monitor with an ICC profile.

Simply put, what it does is take the bigger gamut picture, like Adobe RGB, and desaturate the colors that fall outside of the sRGB gamut while avoiding affecting the colors that do fit in it. A very simplified explanation, but it works for converting a larger color space into a smaller one without making it look very dull, which would happen if you just turned the saturation down.
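Something like this, as a very rough toy sketch of the "desaturate only what doesn't fit" idea (my own simplification, not how a real color management module actually implements perceptual intent):

Code:
# Toy gamut mapping: after converting from a wider color space, linear-light
# RGB channels outside [0, 1] mean the color doesn't fit the target gamut.
# Blend just enough toward its own luma (neutral grey) to pull it inside,
# leaving colors that already fit completely untouched.
def desaturate_into_gamut(r, g, b):
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b   # Rec.709 luma as neutral anchor
    t = 0.0                                    # blend factor toward neutral
    for c in (r, g, b):
        if c > 1.0 and c > y:
            t = max(t, (c - 1.0) / (c - y))
        elif c < 0.0 and c < y:
            t = max(t, (0.0 - c) / (y - c))
    return tuple((1.0 - t) * c + t * y for c in (r, g, b))

print(desaturate_into_gamut(1.25, 0.40, 0.10))   # out-of-gamut red gets toned down
print(desaturate_into_gamut(0.60, 0.30, 0.20))   # already in gamut, unchanged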
 
chroma subsampling does not affect bit rate

it does.

Perceptual Intent

Nice try. I have no f'ing idea what gaming studios are doing for HDR content. What I do know is that:
- they are not making it under HDMI 2.0 at 4K, 10-bit, 60Hz, 4:4:4 on a Rec. 2020 monitor with 4000 nits of white.
- text in game can look fuzzy under 4:2:0.
- most HDR10 TVs are basically your run of the mill TV running downgraded chroma content over 10-bit. For me it is OLED/quantum dot or stick to the basics.
- matching content and display capabilities matters more than going 10-bit, and going 10-bit in another color space that your TV or monitor cannot properly display will not convince me to use PC content on it.
I want the nirvana: 100% coverage of the color space I am using, at 60/120Hz, without chroma subsampling, with the highest contrast and resolution possible under current tech. I had that on my Seiki 39".
"What you see is what you get" is quite a thing when you are creating a game, or playing one.
 
As I said, coding a game to look better in HDR requires using an HDR capable monitor for modeling, and such content would look like s**t on sRGB monitors. I am not saying it is impossible, I am saying it would require a completely separate art development environment, just like creating content on a wide gamut monitor.
Nice try. I have no f'ing idea what gaming studios are doing for HDR content... <snip>
None of the art assets in a game need to be HDR for the game to take advantage of HDR. The dynamic range in a game is a result of the rendering engine applying light to the scene (just like real life). All modern game engines already render in HDR. At the end of the post processing phase of rendering they tone map the HDR image down to SDR for display. So fundamentally, all they need to do is author a different tone map for HDR displays (which are still a lower dynamic range than what the rendering engine is dealing with). The game then uses whichever tone map is appropriate for the display in use. Buying the equipment necessary to author the HDR tone map is peanuts. No AAA game studio would even bat an eye at it. The added testing effort is the real cost.

The above is oversimplified, there's more actual work in the nuts and bolts, but the point is... Games are already rendering in HDR, just not (usually) outputting in HDR, and the changes to support outputting in HDR are simple and well understood (without sacrificing any SDR quality).
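For anyone curious, the SDR end of that pipeline can be sketched with a classic Reinhard-style operator (a toy example of mine; engines use fancier curves plus exposure and white point controls, but the shape of the idea is the same):

Code:
# Minimal sketch: map a scene-referred HDR luminance value (arbitrary linear
# units from the renderer) down to a 0.0-1.0 display value for an SDR screen.
def reinhard_tonemap(hdr_luminance, exposure=1.0):
    x = hdr_luminance * exposure
    return x / (1.0 + x)            # compresses highlights, keeps shadows ~linear

for scene in (0.05, 0.5, 2.0, 20.0, 500.0):
    print(f"scene {scene:7.2f} -> display {reinhard_tonemap(scene):.3f}")
# Bright values (2, 20, 500) are squeezed under 1.0 instead of clipping;
# an HDR output path would simply preserve much more of that headroom.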
 

No it does not. It affects the resolution of the chroma signal. 4:2:0 basically halves the resolution of the chroma in each dimension, i.e. a 4K movie at 3840x2160 has the luma signal at that resolution but chroma only at 1920x1080. You still have the same bit depth for both: 0-255 for 8-bit and 0-1023 for 10-bit.
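In plane terms it looks like this (a quick sketch assuming the usual planar layout; exact storage varies by format):

Code:
# Plane sizes for one 3840x2160 frame: subsampling shrinks the chroma
# *resolution*, while the code-value range per sample (bit depth) is unchanged.
W, H = 3840, 2160

chroma_res = {
    "4:4:4": (W,      H),         # chroma at full resolution
    "4:2:2": (W // 2, H),         # chroma halved horizontally
    "4:2:0": (W // 2, H // 2),    # chroma halved both ways -> 1920x1080
}

for name, (cw, ch) in chroma_res.items():
    print(f"{name}: luma {W}x{H}, chroma {cw}x{ch}")

for bits in (8, 10):
    print(f"{bits}-bit samples span codes 0..{2**bits - 1}")   # 0..255 / 0..1023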



True, the lighting part is easy because deep down games are already doing it. But the color space part is still an interesting and valid question. Now, a larger colour space is not mandatory for HDR, but it is a welcome addition and I am thankful movies are pushing it forward; sRGB is simply too small and old and it should just die already. Anyway, textures for example have to be in a larger colorspace format right from the start and may appear desaturated and dull if viewed on a smaller colourspace monitor, unless it's properly downscaled like what Perceptual Intent does in color-managed apps and printing.
 
First, chroma subsampling does not affect bit rate

No it does not.

You still have the same bit depth for both

You are either being obtuse or dodging my point: bit rate is reduced when you use chroma subsampling; bit depth may remain the same. Correcting myself: 4:2:2 uses 33% less bandwidth and 4:2:0 uses half the bandwidth.
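For what it's worth, those corrected figures fall straight out of counting samples (rough sketch, ignoring blanking and link-level overhead):

Code:
# Bits per pixel at 8-bit, counting Y + Cb + Cr samples, which is where the
# "33% less / half the bandwidth" figures come from.
samples_per_pixel = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}
bits = 8
base = samples_per_pixel["4:4:4"] * bits       # 24 bits per pixel

for name, spp in samples_per_pixel.items():
    bpp = spp * bits
    saving = 1 - bpp / base
    print(f"{name}: {bpp:.0f} bits/pixel, {saving:.0%} less than 4:4:4")
# 4:4:4: 24 bits/pixel, 0% less
# 4:2:2: 16 bits/pixel, 33% less
# 4:2:0: 12 bits/pixel, 50% less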

deep down games are already doing it
As I said, I have no f'ing idea how they are doing it. I do know that they are not doing it under Rec. 2020 (because no monitor can cover 100% of it; edit: even 80% coverage of Rec. 2020 is beyond most OLED and quantum dot TVs), and certainly not with 10-bit, 4K, 4:4:4, 60Hz and HDMI 2.0 (because HDMI 2.0 cannot do it). So the perceptual intent has to be very precise: one cannot assume the image was mastered in Rec. 2020, and the perceptual intent algorithm should take into account the real coverage of the monitor used for mastering the art. Unless a game industry worker tells us otherwise, I call this claim of "studios are mastering in HDR" BS.

sRGB is simply too small
I couldn't agree more.
should just die already
It should die when we have monitors capable of displaying other color spaces, among which I prefer Rec. 2020 because it is the one in use with the best Pointer's gamut coverage.
textures for example have to be in a larger colorspace format right from the start and may appear desaturated and dull if viewed on a smaller colourspace monitor
hence "what you see is what you get" is the nirvana: you should master a rec2020 image on a rec2020 monitor and game with it in a rec2020 monitor, as we do today with sRGB. not master in a 80% rec2020 monitor and game in a rec2020 55% monitor.
 
Rolling into another sub-area of this: HDMI 2.0a vs 2.0b switches. Let's say you had the ability to render 4:2:2 at 4K 10-bit. From the charts I've seen this would require 18Gbps, but would it also only work on an HDMI switch that was HDMI 2.0b? Seems like technically it would be needed.
 
Does HDR work in Windows / PS4 Pro with RGB (Full Range) 8/10-bit at 1080p?
 