Hi,
I don't know whether this is display-specific or graphics-card-specific, but anyhow... I've been led to believe that with NVIDIA's more recent drivers you should be able to set 10- or 12-bit output (DisplayPort or HDMI) even to an 8-bit monitor, with the signal effectively being scaled from the higher bit depth down to 8 bits.
I've heard reports from others that they can select the higher bit depths in NVIDIA's control panel, even on old TN displays.
Have I got this right, or have I missed something? Because if I have got it right, well... I can only select 8 bpc, and if at all possible I would of course like the graphics card's LUT to be calculated in more than 8 bits.
Perhaps someone here can shed light on the situation?
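To make concrete what I mean by wanting the LUT calculated in more than 8 bits, here's a rough sketch (just an illustration with a made-up correction curve, not anything the driver actually does):

```python
# Rough illustration only: why a >8-bit calibration LUT matters.
# The correction curve below is made up; a real monitor calibration
# would come from profiling, but the quantisation effect is the same.
import numpy as np

levels_in = np.arange(256) / 255.0       # 8-bit input ramp from the desktop
correction = levels_in ** (2.2 / 2.35)   # some mild hypothetical calibration curve

lut_8bit = np.round(correction * 255)    # LUT quantised to 8 bits
lut_10bit = np.round(correction * 1023)  # LUT quantised to 10 bits

print("distinct output levels at 8 bit :", len(np.unique(lut_8bit)))
print("distinct output levels at 10 bit:", len(np.unique(lut_10bit)))

# At 8 bits, several adjacent input steps collapse onto the same output
# code (lost steps -> visible banding in gradients); at 10 bits every
# one of the 256 input steps still gets its own output code.
```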