NVIDIA or AMD for color accuracy and calibration features

GTX 560 Ti, 8bit IPS panel through DVI (Dell 2007fp), calibrated to 6500K with Spyder 5 Elite.

Game: WoW, looking at sky.

http://i.imgur.com/0WI4ts3.jpg


This shit is still real! The screenshot is from the framebuffer, before it's shown on the monitor, so if you have a bad monitor with quantization errors, the results could look even worse, hahahaa. I know my Dell has a pretty bad gamma curve, which exaggerates the issue, but still... me not like!

I need to calibrate using the native white point (which is around 5800K) to get somewhat proper results without that much banding. I get better results if I calibrate through my monitor's OSD, but I can't do that with the ZR30w because it's direct drive, so there's no OSD to play with.

I don't even want to test my ZR30w because my 560 Ti doesn't have a DP port, but my guess is that it would look hilarious if the DP port also only outputs 8-bit data on NVIDIA cards.

Now let's be honest here: I'll take the minimal noise that comes with dithering any day of the week if I don't have to see that ugly banding, which always looks out of place.
 
Not DVI analogue out on NVIDIA.
To avoid further confusion, let's always assume that DVI, as in Digital Visual Interface, means a digital video interface and not an analog one :)

Nowadays no one uses VGA-compatible signals anymore, and no one cares how they behave, because no one would choose an analog connection for their LCD over a digital one.

I once tried using VGA on a Dell U2410 to get 10-bit LUTs on a GeForce, and it sucked so much that I went back to the banding and felt relief :D
 
GTX 560 Ti, 8bit IPS panel through DVI (Dell 2007fp), calibrated to 6500K with Spyder 5 Elite.

Game: WoW, looking at sky.
Not a good test if you want to look at videoLUT banding, for these reasons:

- The Spyder5 software introduces banding by quantizing the calibration values to 8-bit ('vcgt' with entrySize = 1, which means 8-bit) before they even get written to the videoLUT.
- Game scenes are the result of on-the-fly rendering, i.e. several textures, lighting passes, alpha blending, etc. influence the actual pixel values, and errors can already be introduced at this step (e.g. optimizations in the texture mapping, texture compression or other format limitations, driver bugs, limited-precision processing, etc.), long before the videoLUT gets involved.
To test videoLUT banding, you want to have exact control over the source material (i.e. you need to use a well-defined test image where the actual source pixel values are fixed and not the result of on-the-fly rendering like in games).
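
If you want a well-defined source image, here is a minimal sketch (my own, assuming Python with numpy and Pillow, not anything from this thread) that writes a grayscale ramp where every source pixel value is fixed and known:

```python
# ramp.py - write a known-value grayscale ramp for videoLUT banding tests
# (illustrative only; assumes numpy and Pillow are installed)
import numpy as np
from PIL import Image

# 4 pixels per code value -> a 1024 x 256 ramp where every source value is known,
# so any extra steps you see on screen come from the LUT / output path, not the image.
ramp = np.repeat(np.arange(256, dtype=np.uint8), 4)
img = np.tile(ramp, (256, 1))

Image.fromarray(img, mode="L").save("gray_ramp.png")
print("Wrote gray_ramp.png - view at 100% zoom with any scaling disabled.")
```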
 
To test videoLUT banding, you want to have exact control over the source material (i.e. you need to use a well-defined test image where the actual source pixel values are fixed and not the result of on-the-fly rendering like in games).

I know games are not the best way to test, but the banding is so visible in that particular game that I decided to use it as an example. So, same source material, huh... http://www.lagom.nl/lcd-test/gradient.php

That should be good enough, and I get banding with the GTX on that image, though not as extreme as in that one game. Switch to AMD and I have a butter-smooth image after calibration.

So where is my NVIDIA card's dithering... hey dither, where did you go? Are you hiding? Come out of the closet!

Your post almost makes it sound like it's OK to have banding because the problem isn't the GPU LUT lacking precision and dithering, but rather the program, the calibration software, the position of the moon, etc., when the whole mess could be fixed by NVIDIA enabling a higher-precision GPU LUT on all outputs (not just HDMI) and using dithering.

Aargh, I need to find an NVIDIA card with DP to test my primary monitor. Too bad all my friends & family in close proximity are using AMD cards, because you can't beat the value of a used 7970 in the 100€ range.
 
Thank you for this information. I'm going to test this myself at some point and will most likely be hugely disappointed when I see extreme banding on a grayscale ramp image after making any adjustments to the LUT by loading a calibrated color profile.
Is dithering used when you are running madVR in windowed mode? Getting past the 8-bit Windows composition limit needs a DX or OpenGL application in fullscreen exclusive mode so that the application can actually output more than 8-bit data, or at least that's how I've understood Windows color management to work.
Well madVR itself can do dithering and LUT correction in Windowed Mode (it's built for the highest quality playback possible) but you will only get a 10-bit output (which the NVIDIA driver then dithers if your display is 8-bit) if you are using the D3D11 Full-Screen Exclusive Mode output.
madVR's dither implementation is actually higher quality than NVIDIA's though, so in this case it's actually better to output 8-bit so that the driver doesn't alter the output.
However, Alien: Isolation should be improved by enabling its "Deep Color" option even if you have an 8-bit display, since that should force the driver to dither the conversion to 8-bit.

There's no way to trick the NVIDIA driver into dithering with an 8-bit output for windowed applications, if that's what you were hoping for.
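
To make the difference concrete, here's a small numpy sketch (purely illustrative, not how madVR or the driver actually implements it): it reduces a barely-changing high-precision gradient to 8-bit once by plain rounding and once with roughly one LSB of random dither, then compares how far the local average drifts from the intended values.

```python
# dither_demo.py - why dithering a high-precision signal down to 8-bit hides banding.
# Illustrative only; real implementations use more sophisticated (ordered or
# error-diffusion) dithering, but the principle is the same.
import numpy as np

rng = np.random.default_rng(0)

# A very shallow gradient in high precision: spans only ~4 8-bit steps over 4096 px.
x = np.linspace(100.0, 104.0, 4096)

# 1) Plain rounding: every pixel snaps to the nearest code -> visible flat bands.
rounded = np.round(x).astype(np.uint8)

# 2) Dithered: add about one code's width of noise before rounding.
#    Individual pixels jitter, but the *average* still follows the gradient.
dithered = np.round(x + rng.uniform(-0.5, 0.5, x.shape)).astype(np.uint8)

def mean_block_error(out):
    # average deviation of 64-pixel block means from the ideal signal
    blocks_out = out.reshape(-1, 64).mean(axis=1)
    blocks_ref = x.reshape(-1, 64).mean(axis=1)
    return np.abs(blocks_out - blocks_ref).mean()

print("codes used (rounded): ", np.unique(rounded).size)   # a handful of flat bands
print("codes used (dithered):", np.unique(dithered).size)  # same codes, but mixed
print("block error, rounded :", mean_block_error(rounded))
print("block error, dithered:", mean_block_error(dithered))  # noticeably smaller
```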

GTX 560 Ti, 8bit IPS panel through DVI (Dell 2007fp), calibrated to 6500K with Spyder 5 Elite.
Game: WoW, looking at sky.
http://i.imgur.com/0WI4ts3.jpg
This shit is still real! The screenshot is from the framebuffer, before it's shown on the monitor, so if you have a bad monitor with quantization errors, the results could look even worse, hahahaa. I know my Dell has a pretty bad gamma curve, which exaggerates the issue, but still... me not like!
I need to calibrate using the native white point (which is around 5800K) to get somewhat proper results without that much banding. I get better results if I calibrate through my monitor's OSD, but I can't do that with the ZR30w because it's direct drive, so there's no OSD to play with.
I don't even want to test my ZR30w because my 560 Ti doesn't have a DP port, but my guess is that it would look hilarious if the DP port also only outputs 8-bit data on NVIDIA cards.
Now let's be honest here: I'll take the minimal noise that comes with dithering any day of the week if I don't have to see that ugly banding, which always looks out of place.
Unfortunately banding is very common in games even without modifying the GPU LUT.
Despite many engines now rendering in 16-bit, the majority of games only seem to output 8-bit, and it doesn't seem like they dither that conversion to avoid banding.
Alien Isolation is the only game I'm aware of which will output more than 8-bit color.

12-bit output to my TV via HDMI:
I would reset the GPU LUT and do all calibration in the display if you're using an NVIDIA GPU and an 8-bit digital connection until they fix this.
If the output is set to >8-bit in the NVIDIA Control Panel, banding caused by modifying the LUT is reduced but not eliminated because they still aren't dithering.
If you are using VGA, apparently the output to the display is 10-bit.
 
Tried doing an experiment.

I created a LUT that simulates 2-bit color. For all channels, the values for the first quarter of the 256 entries were set to 0, the next quarter to 0.25, the next quarter to 0.5, and the last quarter to 0.75.

I then ran Alien: Isolation and switched between Deep Color on and off. No difference.

My thinking was that if 10 bit color was being used, it would somehow scale up the LUT and interpolate, in which case there'd be 16 colors per channel.

Is my thinking correct?

Zone, if you like, I can upload the LUT and you can do the same experiment (you'll need dispwin from Argyll to load it).
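
In case it helps, here is a rough Python sketch of the LUT described above, written in what I believe is the ArgyllCMS .cal (CGATS) layout that dispwin accepts; the header fields are my best guess, so compare them against a .cal produced by dispcal before relying on it:

```python
# make_2bit_cal.py - write the 2-bit-simulating LUT described above in what I
# believe is the ArgyllCMS .cal (CGATS) layout that dispwin loads.
# Check the header against a .cal produced by dispcal before trusting it.

steps = 256
rows = []
for i in range(steps):
    v = (i // 64) * 0.25          # 0.00 for 0-63, 0.25 for 64-127, 0.50, 0.75
    inp = i / (steps - 1)         # nominal input value, 0..1
    rows.append(f"{inp:.5f} {v:.5f} {v:.5f} {v:.5f}")

cal = "\n".join([
    "CAL",
    "",
    'DESCRIPTOR "2-bit simulation test curves"',
    'ORIGINATOR "hand-made"',
    'DEVICE_CLASS "DISPLAY"',
    'COLOR_REP "RGB"',
    "",
    "NUMBER_OF_FIELDS 4",
    "BEGIN_DATA_FORMAT",
    "RGB_I RGB_R RGB_G RGB_B",
    "END_DATA_FORMAT",
    "",
    f"NUMBER_OF_SETS {steps}",
    "BEGIN_DATA",
    *rows,
    "END_DATA",
    "",
])

with open("2bit_test.cal", "w") as f:
    f.write(cal)
print("Wrote 2bit_test.cal - load with: dispwin 2bit_test.cal")
```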
 
The EDID gamut correction feature works with my Samsung 226BW connected through DVI. Not sure how accurate it is, though. There are some drastic changes in some shades of blue, the reds get less punchy and the greens lighter.
 
Does anyone know if the new Pascal GPUs from NVIDIA support the 10-bit dithering? I'm guessing the answer is "no" but just wanted to make sure before I choose the 480 over the 1060 :D

Thanks in advance.

Just created some new files for a 10-bit test. See post here
 
Does anyone know if the new Pascal GPUs from NVIDIA support the 10-bit dithering? I'm guessing the answer is "no" but just wanted to make sure before I choose the 480 over the 1060 :D

Thanks in advance.
Pascal supports HDR10 and "other HDR formats," which require at least 10-bit color output. I only have an 8-bit panel so I can't test for you. But I believe 10-bit color and higher were unlocked in the GeForce drivers sometime last year (they used to be locked to Quadro drivers). It's in NVCP, Display, Change resolution under the listbox called "Output color depth."
 
Pascal supports HDR10 and "other HDR formats," which require at least 10-bit color output. I only have an 8-bit panel so I can't test for you. But I believe 10-bit color and higher were unlocked in the GeForce drivers sometime last year (they used to be locked to Quadro drivers). It's in NVCP, Display, Change resolution under the listbox called "Output color depth."

Thanks, but that's a different kind of 10-bit support than the one we're talking about here (still an 8-bit signal, but with dithering by the GPU to produce 10-bit effective precision). It's something that's been lacking on NVIDIA cards and is the main thing holding me back.

spacediver's post got me thinking about other ways it could be tested, and here's what I've come up with:

lut test.zip

Instructions are in the file test pattern.png. If someone with a Pascal GPU could run the test it'd be much appreciated!

Also, try not to even open the NVIDIA Control Panel before running the test, because I found some strange behaviour where even just opening the control panel was offsetting the LUT by a small amount, thereby ruining the test.
 
Nice stuff, flossy. I can still read it with the LUT loaded. I did notice that when I loaded the LUT, the contrast of the pattern decreased too. I'll confirm this later with my i1 Display Pro to make sure it wasn't my imagination. NVIDIA GeForce GTX 660 and a CRT.
 
Well, I got a 1070 and it's still 8-bit. Even worse, none of my LUT-enforcing tricks work on it. Fullscreen 3D games always steal the LUT, and none of dispwin.exe, PowerStrip, Color Clutch, Color Sustainer or Monitor Calibration Wizard can override it.

On my previous AMD card (R9 270) Powerstrip would work on everything, and the others would work on most games.

This totally sucks and I'm not sure if I will keep the 1070. I'm very finicky about my 2.4 gamma at night, plus a little manual boosting of the shadow detail on top of that.

edit: just found this. There is hope!
3D LUTs for Direct3D and OpenGL applications (e.g. games) under Windows - AVS Forum | Home Theater Discussions And Reviews
 
That sucks about the 8-bit limitation. It means that even when we CRT users get adapters that convert DVI to VGA at high enough pixel clocks to run, say, the FW900, we'll face the same limitations. (edit: seems I'm wrong about this.)

afaik, the 3D LUT thing won't help this limitation. So even if you do get games to respect your LUT changes, those LUT changes will necessarily reduce the number of unique colors.


edit: according to this, the 1070 and/or? 1080 support 10-bit

Nvidia Confirms GTX 1070 Specs -1920 CUDA Cores & 1.6Ghz Boost Clock At 150W

maybe it's the DP and HDMI ports that support 10 bit. Which port did you experiment with flossy?
 
afaik, the 3D LUT thing won't help this limitation. So even if you do get games to respect your LUT changes, those LUT changes will necessarily reduce the number of unique colors.

The 3D LUT is implemented in a shader, which I believe uses high-precision tone mapping with dithering. If it didn't, I think there would be severe banding in games where shaders do extreme tone-mapping effects like dynamic HDR and bloom; you would see heavy banding on those tones if shaders were only 8-bit precision.

It seems you can do a 1D calibration through the shader, which I will probably use instead of the 3D one to reduce the number of colour changes.

I haven't experimented with it yet, as I'm still learning how to use DisplayCAL to generate the 3D LUT (the program is extremely technical and offers a huge bevy of calibration options). In any case, even if it is 8-bit, that's still better to me than being stuck with a bad gamma curve.
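
For what it's worth, here is a small numpy sketch of my mental model of what a shader-based calibration pass does (compute the curve in float, dither, quantize once at the end), compared with pushing the same curve through an 8-bit videoLUT. It is not ReShade's actual code, just an illustration of why the shader route can avoid banding:

```python
# shader_lut_sketch.py - a shader-style float 1D calibration pass with dithering
# versus the same curve applied through an 8-bit videoLUT. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def correction(v):                     # example curve: raise gamma from 2.2 to 2.4
    return v ** (2.4 / 2.2)            # v in 0..1

frame = np.linspace(0.0, 1.0, 4096)    # a smooth 0..1 ramp as the "rendered" frame

# (a) 8-bit videoLUT path: the curve is stored as 256 8-bit entries and the frame
#     is already 8-bit when it hits the LUT -> duplicated and skipped output codes.
lut8 = np.round(correction(np.arange(256) / 255.0) * 255.0).astype(np.uint8)
via_videolut = lut8[np.round(frame * 255.0).astype(np.uint8)]

# (b) shader path: apply the curve in float on the float frame, add ~1 LSB of
#     dither, then quantize once at the end.
shaded = correction(frame) * 255.0
via_shader = np.clip(np.round(shaded + rng.uniform(-0.5, 0.5, shaded.shape)),
                     0, 255).astype(np.uint8)

print("distinct output codes via 8-bit videoLUT:", np.unique(via_videolut).size)
print("distinct output codes via float shader  :", np.unique(via_shader).size)
# The videoLUT path doubles some codes and skips others (banding); the dithered
# shader path still exercises nearly all 256 codes and averages to the right curve.
```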


edit: according to this, the 1070 and/or? 1080 support 10-bit

Nvidia Confirms GTX 1070 Specs -1920 CUDA Cores & 1.6Ghz Boost Clock At 150W

maybe it's the DP and HDMI ports that support 10 bit. Which port did you experiment with flossy?

I've tried DisplayPort on an Acer XB270HU and am currently on DVI with an LG 27" IPS. Both show the 8-bit limitation: banding on a gray ramp. I think the article is referring to a 10-bit output signal rather than an 8-bit output with dithering.
 
I don't know, but I'd imagine it would be spatial.

The shader route isn't panning out for me, though; too many games are not compatible with ReShade. All the Source engine games, for example, are not compatible with it, so that pretty much rules it out for me.
 
When you said the article was probably referring to a 10 bit output signal, what does that mean? If it supports a 10 bit output, doesn't that mean that it has to have 10 bit precision?
 
When you said the article was probably referring to a 10 bit output signal, what does that mean? If it supports a 10 bit output, doesn't that mean that it has to have 10 bit precision?
The issue is that NVIDIA seem to be using the output bit-depth for their internal precision and/or rounding instead of dithering.
So if your output is 8-bit it will be much lower quality than a 10-bit or 12-bit output.
Ideally you would do all your LUT calculations in 16-bit and dither to your output bit-depth to minimize posterization/banding.
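
A back-of-the-envelope numpy sketch of that trade-off (purely illustrative, not taken from any driver): rounding at the output bit depth versus computing in high precision and dithering down to 8-bit.

```python
# output_depth_sketch.py - rounding the calibrated signal at the output bit depth
# versus high-precision calculation plus dither. Values normalized to 0..1 so the
# different depths are comparable. Illustrative only.
import numpy as np

rng = np.random.default_rng(1)
# a shallow, gamma-corrected shadow ramp (the kind of region where banding shows)
signal = np.linspace(0.10, 0.12, 8192) ** (2.4 / 2.2)

def round_to(v, bits):
    q = (1 << bits) - 1
    return np.round(v * q) / q

out8  = round_to(signal, 8)    # rounded straight at 8-bit output precision
out10 = round_to(signal, 10)   # rounded at 10-bit output precision
# high-precision calc + ~1 LSB (at 8-bit) of dither, then rounded to 8-bit
out8d = round_to(signal + rng.uniform(-0.5, 0.5, signal.shape) / 255.0, 8)

def worst_block_error(out):
    # how far the 128-pixel local average drifts from the intended value
    return np.abs(out.reshape(-1, 128).mean(axis=1) -
                  signal.reshape(-1, 128).mean(axis=1)).max()

print("max local error, plain 8-bit :", worst_block_error(out8))
print("max local error, plain 10-bit:", worst_block_error(out10))
print("max local error, dithered 8  :", worst_block_error(out8d))
# Plain 8-bit rounding drifts the most; a higher output depth helps, and dithering
# lets even an 8-bit output track the intended curve closely.
```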
 
I'm still not following (likely because I have a limited understanding of how all this works - thanks in part to the fact that this information seems to be shrouded in mystery, and there seems to be no consistent use of terminology).

If the output bit depth is 10 bit, then why wouldn't you be able to choose from a palette of 1024 unique levels per channel, when choosing your 256 levels per channel?
 
The digital output is locked at 8-bit; all of our consumer-grade LED monitors are 8-bit only, and that's all they can accept and display. Only a high-end 10-bit capable monitor would benefit from the new 10-bit support being referred to in that article. In that case the video card would output a 10-bit signal and the monitor would be capable of displaying all of those extra intermediate shades.

IIRC you are able to bypass this issue because you are on a CRT, which is on an analogue connection, and NVIDIA for some reason allows dithering over analogue signals. Or maybe they just send intermediate voltages because the signalling is not so rigid (I could be totally wrong about this).
 
Ah right, I forgot about the digital monitor part; I'm so used to CRT.

So when you were testing for 10-bit precision, you were hoping the card would dither like AMD does to achieve it.

Yes, I imagine the DACs on the NVIDIA cards that have them are true 10-bit, so the voltages are intermediate. You'd need an oscilloscope to be sure.
 
Eventually got ReShade working with the Source engine and two other games that weren't working before. Just needed a bit of fudging with paths and an older version of the ReShade .dll. So it looks like I may be keeping this 1070 after all...
 