10/12-bit output to 8-bit display

Gregow
Hi,

I don't know whether this is display-specific or graphics-card-specific, but anyhow... I've been led to believe that with Nvidia's more recent drivers you should be able to set a 10- or 12-bit output (DisplayPort or HDMI) to an 8-bit monitor, with the signal effectively being scaled down from the higher bit depth to 8 bits.

I've heard reports from others that they can select the higher bit depths in Nvidia's control panel, even on old TN displays.

Have I got this right, or have I missed something? Because if I have got it right, well... I can only select 8 bpc, and if at all possible I would of course like the graphics LUT calculated at more than 8 bits.
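
As a rough illustration of why the LUT precision matters, here is a minimal Python/NumPy sketch (not Nvidia's actual pipeline; the gamma value, ramp length, and random dither are arbitrary placeholders): if the correction curve is computed at more than 8 bits and only then dithered down to 8 bits, a gradient shows short, noisy steps instead of wide flat bands.

```python
# Rough sketch, not Nvidia's pipeline: a calibration curve computed in
# float (i.e. >8-bit) precision, then reduced to 8-bit codes either by
# plain rounding or by dithering first. Gamma value is a placeholder.
import numpy as np

def calibrate(x, gamma=2.2 / 2.4):
    """Toy calibration curve applied to normalized [0, 1] values."""
    return x ** gamma

ramp = np.linspace(0.0, 1.0, 4096)                 # smooth test gradient
curve = calibrate(ramp) * 255                      # high-precision result

plain = np.round(curve).astype(np.uint8)           # straight 8-bit rounding
noise = np.random.uniform(-0.5, 0.5, curve.shape)  # simple random dither
dithered = np.clip(np.round(curve + noise), 0, 255).astype(np.uint8)

def longest_run(codes):
    """Length of the longest flat step (a visible band) in the gradient."""
    best = run = 1
    for a, b in zip(codes[:-1], codes[1:]):
        run = run + 1 if a == b else 1
        best = max(best, run)
    return best

print("longest flat band, plain rounding:", longest_run(plain))
print("longest flat band, dithered:      ", longest_run(dithered))
```

The dithered version breaks the wide flat steps of the plainly rounded curve into shorter, noisy transitions, which is only possible if the LUT result existed at higher precision before quantization.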

Perhaps someone here can shed light on the situation?
 
I don't think you're supposed to be able to select 12-bit for a monitor that doesn't accept 10- or 12-bit input. The panel's bit depth has nothing to do with the bit depth the display can accept.
 
I'm just surprised that old, cheap TN panels would accept 10- or 12-bit input. Otherwise it does indeed make sense.
 
Also, if it's 4K then you don't have enough bandwidth for 10-bit at 4:4:4 60 Hz with HDMI 2.0... and if you select 10-bit then the Nvidia driver will drop the chroma below 4:4:4 or the refresh rate to 30 Hz.
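
A back-of-the-envelope check of that bandwidth claim, as a Python sketch assuming the standard CTA-861 4K60 timing (4400 × 2250 total pixels including blanking, i.e. a 594 MHz pixel clock) and HDMI 2.0's 18 Gbit/s TMDS limit, of which 8/10 carries pixel data because of TMDS 8b/10b coding:

```python
# Required pixel data rate for 4K60 RGB / YCbCr 4:4:4 at various bit
# depths, compared against HDMI 2.0's usable data rate.
HDMI20_DATA_RATE = 18e9 * 8 / 10        # 14.4 Gbit/s after 8b/10b coding

PIXEL_CLOCK_4K60 = 4400 * 2250 * 60     # 594,000,000 pixels/s (CTA-861 timing)

for bpc in (8, 10, 12):
    bits_per_pixel = 3 * bpc            # three full-resolution components (4:4:4)
    rate = PIXEL_CLOCK_4K60 * bits_per_pixel
    verdict = "fits" if rate <= HDMI20_DATA_RATE else "does NOT fit"
    print(f"4K60 4:4:4 at {bpc} bpc: {rate / 1e9:.2f} Gbit/s -> {verdict}")
```

8 bpc just squeezes in at about 14.26 Gbit/s, while 10 bpc needs roughly 17.8 Gbit/s, hence the forced drop in chroma subsampling or refresh rate.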
 