4:4:4, HDR, 60 Hz possible @ lower resolution with HDMI 2.0?

As the title says, is it possible for me to lower the resolution on my OLED TV so that I can enable HDR (10-bit?), 4:4:4 @ 60 Hz?

What resolution would it need to be? How can I do that?


Getting all of those @ 4K is not possible due to HDMI 2.0 bandwidth. I have a B7A and a GTX 1080.
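
For reference, a rough sketch of why this doesn't fit (assuming the standard CTA-861 4K60 timing of 4400x2250 total including blanking, and HDMI 2.0's 18 Gbps link rate, which is ~14.4 Gbps effective after 8b/10b encoding):

```python
# Back-of-the-envelope HDMI 2.0 bandwidth check (a sketch, not spec-exact).
EFFECTIVE_GBPS = 18.0 * 8 / 10  # ~14.4 Gbps usable after 8b/10b encoding

def needed_gbps(h_total, v_total, refresh_hz, bits_per_channel):
    """Raw data rate for RGB/4:4:4 video, in Gbps (3 channels per pixel)."""
    return h_total * v_total * refresh_hz * bits_per_channel * 3 / 1e9

print(needed_gbps(4400, 2250, 60, 8))   # ~14.26 Gbps -> 4K60 8-bit just fits
print(needed_gbps(4400, 2250, 60, 10))  # ~17.82 Gbps -> 4K60 10-bit does not
print(needed_gbps(4400, 2250, 30, 12))  # ~10.69 Gbps -> 4K30 12-bit fits fine
```

(4:2:2 fits at 60 Hz even at 12-bit because HDMI carries 4:2:2 within the same 24-bit-per-pixel budget as 8-bit RGB.)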
 
I mean you can run 1440p (60 Hz; only 2019 OLEDs support 1440p 120 Hz) or 1080p 120 Hz, but I'm not sure what you're trying to accomplish here.

Using 4:2:2 already accomplishes the same thing as running a slightly lower resolution, with the minimum possible visual impact.
 
1440p looks really shitty. I'd like to get something higher res. I'm looking for the highest resolution I could drop to and still get HDR, 4:4:4, 60 Hz.

Also Nvidia's control panel seems wonky as hell. It doesn't always let me select anything other than RGB/8bpc, and the Full/Limited toggle doesn't always switch.
 
Nvidia's control panel is a complete piece of shit.

Really, I'm losing it with Nvidia's shitty software. AMD lets you clone displays and use the higher resolution of the two; Nvidia makes you take the lower. Ugh.
 
As the title says, is it possible for me to lower the resolution on my OLED TV so that I can enable HDR (10-bit?), 4:4:4 @ 60 Hz?

What resolution would it need to be? How can I do that?


Getting all of those @ 4K is not possible due to HDMI 2.0 bandwidth. I have a B7A and a GTX 1080.

You can run 4:4:4 60 Hz with HDR at 8-bit just fine. All you get is a little extra banding, and it isn't super noticeable on most content.
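
To see what that tradeoff looks like in numbers, here's a little numpy sketch (illustrative only; the driver's actual temporal dithering is fancier): truncating a 10-bit ramp straight to 8-bit collapses every four codes into one visible band, while adding sub-step noise before rounding keeps the average tracking the original signal.

```python
import numpy as np

ramp10 = np.arange(1024)  # a 10-bit grayscale ramp, one value per pixel

# Plain truncation to 8-bit: every 4 adjacent 10-bit codes collapse into
# one 8-bit code -- this is the banding.
truncated = ramp10 // 4

# Dithered quantization: add +/- half a step of noise before rounding, so
# neighbouring pixels straddle two 8-bit codes and the average is preserved.
rng = np.random.default_rng(0)
dithered = np.clip(np.round(ramp10 / 4 + rng.uniform(-0.5, 0.5, 1024)), 0, 255)

print((truncated * 4 - ramp10).mean())  # ~-1.5: every band sits below the true level
print((dithered * 4 - ramp10).mean())   # ~0: errors cancel out on average
```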
 
You can run 4:4:4 60 Hz with HDR at 8-bit just fine. All you get is a little extra banding, and it isn't super noticeable on most content.

How?

I thought HDMI 2.0 couldn't do that. That's what I saw on this forum and others as well.

When I enable HDR and WCG in Windows with those settings, it says I have a bit depth of 8-bit with dithering, and the TV does not switch to HDR (the icon should pop up in the top right).

If I set it to 30 Hz while keeping everything else the same (RGB 4:4:4, 4K), I can enable HDR/WCG and the HDR icon does pop up, although it switches to 12-bit automatically.

I also thought you want at least 10-bit depth for HDR.
 
You can run 4:4:4 60 Hz with HDR at 8-bit just fine. All you get is a little extra banding, and it isn't super noticeable on most content.
This is what I do; it saves the PITA of changing to YCbCr 4:2:2 before using HDR and then converting back.

No matter which route you take there will be banding (unless you change settings before/after using HDR).
If you use YCbCr, each colour channel is restricted to 16–235 instead of 0–255.
If left ready for HDR, your desktop will use 4:2:2, 16–235 colour.
That isn't a compromise I like.

With RGB you can select full range (0–255); that option isn't available for YCbCr.
The desktop then uses 4:4:4, 0–255, full quality.
HDR (nominally 10-bit) will run at 8-bit, though.
I have only seen glaring banding in one film, for a moment.
YMMV depending on the screen you use, but it's worth a try.
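
For reference, limited vs full range is just a linear remap of the 8-bit code values; a quick sketch (this is the luma/RGB range; chroma actually uses 16–240, which this ignores):

```python
def full_to_limited(v):
    """Map a full-range code (0..255) to limited/video range (16..235)."""
    return round(16 + v * 219 / 255)

def limited_to_full(v):
    """Map a limited-range code (16..235) back to full range (0..255)."""
    return min(255, max(0, round((v - 16) * 255 / 219)))

print(full_to_limited(0), full_to_limited(255))   # 16 235
print(limited_to_full(16), limited_to_full(235))  # 0 255
# Limited range only has 220 usable codes per channel, which is why a
# desktop left in 16-235 mode shows slightly more banding than full range.
```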
 
Can you explain what you're trying to do - i.e.:
- Trying to set a single mode for text, media and games
- Trying to do one of the above preferentially over the others
 
Can you explain what you're trying to do - i.e.:
- Trying to set a single mode for text, media and games
- Trying to do one of the above preferentially over the others
Find a display mode with the best quality for everyday use that also works well with HDR, over HDMI 2.0b.
There has to be either a slight quality compromise (with the desktop or with HDR media) or a convenience compromise.
There are enough issues making sure everything is set up well; I much prefer using a mode that doesn't need changing, i.e. one that is already set up in a way I am happy with.
Once HDMI 2.1 is out and I have an HDMI 2.1 display it won't be an issue.
But for now it is.
 
My suggestion is not to run HDR for everyday text and browsing.
RGB 8-bit HDR is worse than non-HDR because of dithering, but RGB non-HDR is better for text.
YUV 4:2:2 10- or 12-bit is better for HDR movies and games, but not for text.

With HDMI 2.0, any setting you choose for one mode is a compromise for the other, IMHO. I have set up profiles and macros to switch between the two (using a 3rd-party macro tool). Someday it will be baked into Windows or the GPU control panel.
This is not even counting the HDR mode settings you should configure on your TV once you have things dialed in. I have different gamma and color settings for HDR vs desktop (actually two HDR settings: game and movie). This is all just IMHO; I can see why other people would not want to bother and would feel that dithered text and/or restricted-range colors are OK if they don't use desktop mode a lot.
 
My suggestion is not to run HDR for everyday text and browsing.
RGB 8-bit HDR is worse than non-HDR because of dithering, but RGB non-HDR is better for text.
YUV 4:2:2 10- or 12-bit is better for HDR movies and games, but not for text.

With HDMI 2.0, any setting you choose for one mode is a compromise for the other, IMHO. I have set up profiles and macros to switch between the two (using a 3rd-party macro tool). Someday it will be baked into Windows or the GPU control panel.
This is not even counting the HDR mode settings you should configure on your TV once you have things dialed in. I have different gamma and color settings for HDR vs desktop (actually two HDR settings: game and movie). This is all just IMHO; I can see why other people would not want to bother and would feel that dithered text and/or restricted-range colors are OK if they don't use desktop mode a lot.
I think you missed the point.
I don't use HDR for everyday use.
The point is I don't want to have to remember to change video mode when starting/ending HDR, so I have chosen the mode best suited for all uses.
 
As the title says, is it possible for me to lower the resolution on my OLED TV so that I can enable HDR (10-bit?), 4:4:4 @ 60 Hz?

What resolution would it need to be? How can I do that?


Getting all of those @ 4K is not possible due to HDMI 2.0 bandwidth. I have a B7A and a GTX 1080.

Your options are basically:

1) Use a lower resolution
2) Use a lower refresh rate (e.g. 4K 30 Hz 4:4:4 HDR)
3) Just use 4:2:2
 
And remember: 4:2:2 IS just a lower resolution, but only for color. Rather than lowering the resolution of everything, it keeps the resolution of luma (brightness) high and just lowers the resolution of chroma (color). This is a good thing, since our eyes are much less sensitive to chroma differences than luma differences. It's a good way to reduce the data rate you need on a connection that can't handle full uncompressed video: it cuts your horizontal chroma in half and leaves your vertical chroma and your luma unchanged. This is used a lot; JPEG uses it, for example.

So take a 4K display at 3840x2160 pixels. In 4:4:4 sampling, that's the resolution of both your luma and your chroma. In 4:2:2 mode you still get 3840x2160 luma, but 1920x2160 chroma. Or to put it another way: 8.3 million luma samples, 4.1 million chroma samples. If you drop down to 2560x1440 you can, of course, do 4:4:4 sampling... but there you have only 3.7 million luma and chroma samples. So even though you have no chroma subsampling, you are still getting less color data than 4K 4:2:2, and way fewer brightness samples.
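
To put actual numbers on that comparison, a quick sketch:

```python
def samples(width, height, subsampling):
    """Luma and chroma sample counts per frame for a given subsampling mode."""
    luma = width * height
    if subsampling == "4:4:4":
        chroma = luma                   # full-resolution color
    elif subsampling == "4:2:2":
        chroma = (width // 2) * height  # horizontal chroma halved
    return luma, chroma

print(samples(3840, 2160, "4:4:4"))  # (8294400, 8294400)
print(samples(3840, 2160, "4:2:2"))  # (8294400, 4147200)
print(samples(2560, 1440, "4:4:4"))  # (3686400, 3686400)
# 4K 4:2:2 still has more chroma samples than 1440p 4:4:4, plus
# more than twice the luma samples.
```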

If you want to do 4K, 60 Hz, HDR content, just do 4:2:2 subsampling and call it good. It's a better solution than cutting the resolution of everything. As for the concern with text/line art on a computer: give it a try, I bet you notice it less than you think you will. Most text relies on sharp luma transitions, not sharp chroma transitions. When you are talking about things like black on white or white on gray, the chroma of the adjacent pixels is very similar, so you don't get much if any fringing. It's only an issue when you have text of similar brightness but a different color from the background. That does happen, but not often, so it's not a big issue overall. Rtings has some good pictures to look at: in their 4:2:2 screenshot you can see that the blue and gray on black text is very crisp and readable, and it's only the blue on red and red on blue that gets muddled. Not really something you run into a lot in daily usage.

Yeah, it would be nice if interconnects were fast enough to never need any subsampling, but they aren't, and that's just life. The tradeoff of a little loss in chroma detail is not a bad one. Just run it in 4:2:2 mode and be happy. Then, in the event you encounter something where it is a problem for whatever reason, just switch back to non-HDR mode and 4:4:4. Or, given that there is very, very little HDR content out there, run in non-HDR 4:4:4 mode and only switch when you are doing something in HDR.

Both I think are better solutions than doing 2560x1440.
 
Can I use 2560x1440 @ 75 Hz, 10-bit, 4:4:4, with HDR over HDMI 2.0 or DP 1.2a?

And can I use HDR with DP 1.2a, or is HDMI 2.0 / DP 1.4 required?

I want to buy the HP Pavilion 3BZ12AA and I don't know if I can use all of that together.
 
Not sure what that particular monitor supports, but I would go by the specs or an owners' thread if you can find one. DP 1.2a should technically support 2560x1440 4:4:4/RGB 10-bit HDR as far as bandwidth goes, assuming the monitor supports it.
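
For what it's worth, here's a rough bandwidth sanity check (the total timing of 2640x1481 for 2560x1440 @ 75 Hz is an estimate in the spirit of CVT reduced blanking; DP 1.2 HBR2 over 4 lanes is 21.6 Gbps raw, ~17.28 Gbps effective after 8b/10b encoding):

```python
EFFECTIVE_GBPS = 21.6 * 8 / 10  # ~17.28 Gbps usable on DP 1.2 HBR2 x4

def needed_gbps(h_total, v_total, refresh_hz, bits_per_channel):
    """Raw data rate for RGB/4:4:4 video, in Gbps (3 channels per pixel)."""
    return h_total * v_total * refresh_hz * bits_per_channel * 3 / 1e9

need = needed_gbps(2640, 1481, 75, 10)  # 2560x1440 @ 75 Hz, 10-bit 4:4:4
print(f"{need:.1f} Gbps of {EFFECTIVE_GBPS:.2f} available")  # ~8.8 -> easily fits
```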
 