What Color Depth and 4:4:4 for PC gaming?

Blackstone

I have a 1080p plasma that is 4:4:4 compatible. I render at 4K using DSR, but as far as the display is concerned it is receiving 1080p 60Hz.

The display will also do full-range (0-255) or limited-range color depending on how you set it.

It is HDMI 2.0. My card is a 3090.

My question is, for games (Doom Eternal, Battlefield, etc.):

1. 8-bit, 10-bit, or 12-bit color in the NVIDIA Control Panel?

2. Should I use 4:4:4?

3. Should I use limited or full RGB?

I guess my question is: what will make the GAMES look their best? Do they actually benefit from these settings? One would assume the higher settings are better, but are games designed for that?

I am asking because lately the image looks a little washed out, and I am wondering if my settings are wrong.
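For context on "washed out": the classic cause is a range mismatch, where the source sends limited range (16-235) but the display expands it assuming full range, or vice versa. A quick Python sketch of that math (my own illustration, not anything NVIDIA-specific):

```python
# What a full-range display shows when fed a limited-range signal:
# black is carried as code 16 and white as 235, so blacks get
# lifted and the picture looks washed out.

def to_limited(full):
    """Quantize a full-range value (0-255) into limited range (16-235)."""
    return round(16 + full * 219 / 255)

for label, full in [("black", 0), ("mid gray", 128), ("white", 255)]:
    wire = to_limited(full)
    print(f"{label:8s}: source {full:3d} -> wire code {wire:3d} "
          f"(a full-range display shows {wire})")
```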
 
Are you attempting to use HDR on a non-HDR monitor? That'll absolutely make things look washed out.

Full range, absolutely. 12-bit color is best, but most TVs won't show a difference. If the option is available, go for it.

With my cheap monitor setup I usually get a better picture with RGB and full range than with YCbCr and limited range, but this is extremely monitor-dependent.
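On the bit-depth question, the difference is just how fine the steps per channel are; a quick sketch to put numbers on it (my own arithmetic):

```python
# Levels per channel at each bit depth. Finer steps mean less
# visible banding in smooth gradients, but the differences past
# 8-bit are tiny for SDR content.
for bits in (8, 10, 12):
    levels = 2 ** bits
    step = 100 / (levels - 1)  # brightness change per code value, percent
    print(f"{bits:2d}-bit: {levels:4d} levels, ~{step:.3f}% per step")
```

Most SDR games render 8-bit output anyway, so the higher settings mostly give the GPU processing headroom, which fits with "most TVs won't show a difference."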
 
By "use HDR," do you mean turning HDR on in-game? If so, the answer is no. The screen does not support HDR and I always keep it disabled.
 
I've seen conflicting info on whether 4:4:4 is better than RGB. Which one is actually better?
 
4:4:4 is identical to RGB. It's OK to drop to 4:2:2 if you have bandwidth restrictions; it probably won't be noticeable in games.
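If you want to put a number on the bandwidth saving: 4:2:2 carries two samples per pixel on average instead of three, and 4:2:0 carries 1.5. A back-of-the-envelope Python sketch, ignoring HDMI blanking intervals and encoding overhead:

```python
# Rough pixel-data rate for a video mode, ignoring HDMI blanking
# and encoding overhead. RGB/4:4:4 carries 3 full samples per
# pixel; 4:2:2 averages 2; 4:2:0 averages 1.5.
SAMPLES_PER_PIXEL = {"RGB / 4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

def gbps(width, height, hz, bits_per_sample, fmt):
    pixels_per_sec = width * height * hz
    return pixels_per_sec * bits_per_sample * SAMPLES_PER_PIXEL[fmt] / 1e9

for fmt in SAMPLES_PER_PIXEL:
    print(f"4K60 10-bit {fmt:11s}: ~{gbps(3840, 2160, 60, 10, fmt):5.2f} Gbit/s")
```

That roughly lines up with why HDMI 2.0 (about 14.4 Gbit/s of usable bandwidth) can do 4K60 at 8-bit RGB but has to subsample chroma for 10-bit. At 1080p 60Hz, even 12-bit 4:4:4 fits with room to spare.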
 
All I know is that with my setup, if I use YCbCr 4:4:4, "full range" and 10-bit are grayed out, but if I use RGB I can select full range and 10-bit (depending on the monitor), and full range makes an IMMEDIATE difference in black levels that can be seen on text and backgrounds.
 
I was in the same boat with a 2080 Ti and a C1 OLED. I constantly had bandwidth issues at 4K (handling 4K is the entire point of the 30-series cards, IMO) where the frame rate would absolutely tank for no reason and I'd get weird frame skipping or stutters.

If I changed the desktop to 1080p and then used upscaling it was fine, but the picture quality was nowhere near as good.

I bit the bullet and paid $1,500 for a 3080 Ti. It's now perfectly fine. I use 8-bit 4:4:4 with HDR. I didn't want to upgrade; I was hoping to wait for the 40 series, but it was insanely annoying.
 
They're different terms for the same thing.
No, they're not. YCbCr is a different color format. It really doesn't matter to the display, because it gets translated to RGB through signal processing. The point is that they are both uncompressed, but the YCbCr the GPU sends is limited range, so you only get 16-235 for luma (16-240 for chroma) IIRC instead of the full 0-255 you get with RGB.
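For anyone curious, the translation the display does is just a matrix transform. A minimal Python sketch using the standard BT.709 coefficients for a limited-range signal (my own illustration, not any vendor's actual pipeline):

```python
# Limited-range BT.709 YCbCr -> RGB, roughly the transform a display's
# signal processing applies to an HDMI YCbCr input.

def ycbcr709_to_rgb(y, cb, cr):
    yn = (y - 16) / 219        # luma: codes 16-235 -> 0.0-1.0
    pb = (cb - 128) / 224      # chroma: codes 16-240 -> -0.5-+0.5
    pr = (cr - 128) / 224
    r = yn + 1.5748 * pr
    g = yn - 0.1873 * pb - 0.4681 * pr
    b = yn + 1.8556 * pb
    clip = lambda v: min(max(round(v * 255), 0), 255)
    return clip(r), clip(g), clip(b)

print(ycbcr709_to_rgb(16, 128, 128))   # reference black -> (0, 0, 0)
print(ycbcr709_to_rgb(235, 128, 128))  # reference white -> (255, 255, 255)
```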
 
He's right, though. You're thinking of limited-range HDMI connections.

A full-range 4:4:4 monitor has the same color range available over the HDMI link as RGB does.
 
YCbCr from the GPU is limited to 235 on the white side regardless of the range setting. With YCbCr you have to use a feature called "super white" to keep the whiter-than-white codes above 235, which expand to RGB 255 when converted.
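As I understand it, "super white" just changes whether the display clips or keeps the luma codes above reference white (236-254). One simplified way to model it in Python (my own simplification; the name and exact behavior vary by TV):

```python
# Without "super white" the display clips luma at reference white
# (code 235); with it, codes up to 254 are kept, so the
# whiter-than-white detail survives instead of blowing out.

def luma_to_rgb_level(y, super_white=False):
    top = 254 if super_white else 235
    v = (y - 16) / (top - 16) * 255   # map codes 16..top onto 0..255
    return min(max(round(v), 0), 255)

for y in (235, 245, 254):
    print(f"Y={y}: clipped -> {luma_to_rgb_level(y)}, "
          f"super white -> {luma_to_rgb_level(y, super_white=True)}")
```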
 