> With a CRT or digital connection?

HDMI. DVI will always be limited to 8-bit, if I'm not mistaken.
> not DVI analogue out on nvidia.

To avoid further confusion, let's always assume that DVI means the digital video interface, not the analog one.
> GTX 560 Ti, 8bit IPS panel through DVI (Dell 2007fp), calibrated to 6500K with Spyder 5 Elite.
> Game: WoW, looking at sky.

Not a good test if you want to look at videoLUT banding, for this reason:
To test videoLUT banding, you want to have exact control over the source material (i.e. you need to use a well-defined test image where the actual source pixel values are fixed and not the result of on-the-fly rendering like in games).
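For reference, such a well-defined gradient can be generated offline so that every source pixel value is known in advance. A minimal Python sketch (NumPy and Pillow assumed to be installed; the image size and filename are arbitrary):

```python
# Minimal sketch: a ramp image with fixed, known source pixel values,
# suitable for eyeballing videoLUT banding. Size and filename are arbitrary.
import numpy as np
from PIL import Image

WIDTH, HEIGHT = 1024, 256

# Column x gets the fixed 8-bit value round(x * 255 / (WIDTH - 1)).
ramp = np.rint(np.arange(WIDTH) * 255 / (WIDTH - 1)).astype(np.uint8)

Image.fromarray(np.tile(ramp, (HEIGHT, 1)), mode="L").save("gradient.png")
```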
> Is dithering used when you are running madVR in windowed mode? Getting over that 8-bit Windows composition limit needs a DX or OGL application in fullscreen exclusive mode, so that the application can actually output more than 8-bit data. At least that's how I've understood Windows color management to work.

Well, madVR itself can do dithering and LUT correction in Windowed Mode (it's built for the highest-quality playback possible), but you will only get a 10-bit output (which the NVIDIA driver then dithers if your display is 8-bit) if you are using the D3D11 Fullscreen Exclusive Mode output.

Thank you for this information. Gonna test this myself at some point, and most likely be hugely disappointed when I see extreme banding on a grayscale ramp image after making any adjustments to the LUT by loading a calibrated color profile.
> GTX 560 Ti, 8bit IPS panel through DVI (Dell 2007fp), calibrated to 6500K with Spyder 5 Elite.
> Game: WoW, looking at sky.
> http://i.imgur.com/0WI4ts3.jpg
> This shit is still real! The screenshot is from the framebuffer, before it's shown on the monitor, so if you have a bad monitor with quantization errors the results could look even worse, hahaha. I know my Dell has a pretty bad gamma curve, which exacerbates the issue, but still... me not like!
> I need to calibrate using the native white point (which is around 5800K) to get somewhat proper results without that much banding. I get better results if I calibrate through my monitor's OSD, but I can't do that with the ZR30w because it's direct drive, so there's no OSD to play with.
> I don't even want to test my ZR30w, because my 560 Ti doesn't have a DP port, but my guess is that it would look hilarious if the DP port only outputs 8-bit data on NVIDIA cards.
> Now let's be honest here: I'll take the minimal noise that comes with dithering any day of the week if I don't have to see that ugly banding, which always looks out of place.

Unfortunately, banding is very common in games even without modifying the GPU LUT.
> So same source material, huh... http://www.lagom.nl/lcd-test/gradient.php

That's a decent test image; I'm using it as well.
> So where is my Nvidia card dithering...

It probably isn't.
> [...] the whole mess could be fixed by nvidia by enabling higher precision gpu LUT on all outputs and using dithering.

That alone won't be enough. The banding you're seeing is a compound effect of all the things I mentioned above.
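To illustrate what that part of the fix would buy, here's a toy Python sketch (NumPy assumed; the gamma tweak is an arbitrary stand-in for a real calibration LUT, not NVIDIA's pipeline) comparing plain rounding against dithering when a high-precision LUT result is quantized to 8 bits:

```python
# Toy sketch: evaluate the LUT in float precision, then either round or
# dither down to 8 bits. Rounding makes wide flat steps (bands); dithering
# trades them for fine noise whose local average tracks the true curve.
import numpy as np

rng = np.random.default_rng(0)

ramp = np.linspace(0.0, 1.0, 4096)      # smooth source gradient
corrected = ramp ** (2.2 / 2.4)         # LUT applied at high precision

rounded  = np.rint(corrected * 255).astype(np.uint8)
dithered = np.floor(corrected * 255 + rng.random(ramp.size)).astype(np.uint8)

def longest_flat_run(a):
    """Length of the longest run of identical consecutive samples."""
    edges = np.flatnonzero(np.diff(a.astype(int)) != 0)
    bounds = np.concatenate(([-1], edges, [a.size - 1]))
    return int(np.diff(bounds).max())

print("longest flat run, rounded :", longest_flat_run(rounded))   # wide bands
print("longest flat run, dithered:", longest_flat_run(dithered))  # short runs
```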
Just created some new files for a 10-bit test. See the post here.
> Does anyone know if the new Pascal GPUs from Nvidia support the 10-bit dithering? I'm guessing the answer is "no", but I just wanted to make sure before I choose the 480 over the 1060. Thanks in advance.

Pascal supports HDR10 and "other HDR formats," which require at least 10-bit color output. I only have an 8-bit panel, so I can't test for you. But I believe 10-bit color and higher were unlocked in the GeForce drivers sometime last year (they used to be locked to the Quadro drivers). It's in NVCP, Display, Change resolution, under the listbox called "Output color depth."
AFAIK, the 3D LUT thing won't help with this limitation. So even if you do get games to respect your LUT changes, those LUT changes will necessarily reduce the number of unique colors.
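A quick way to see why: apply any non-identity 8-bit-in/8-bit-out LUT to all 256 input levels and count the distinct outputs. A sketch (Python with NumPy assumed; the gamma change is just an illustrative calibration adjustment):

```python
# Sketch: any non-trivial 8-bit in / 8-bit out LUT collapses levels.
import numpy as np

levels = np.arange(256) / 255.0
lut = np.rint(255 * levels ** (2.2 / 2.4)).astype(np.uint8)

# Fewer than 256 distinct outputs: some inputs now share a value, and some
# output codes are never produced. That is exactly what shows up as banding.
print("unique output levels:", len(np.unique(lut)))
```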
Edit: according to this, the 1070 (and/or the 1080?) support 10-bit:
Nvidia Confirms GTX 1070 Specs - 1920 CUDA Cores & 1.6GHz Boost Clock At 150W
Maybe it's the DP and HDMI ports that support 10-bit. Which port did you experiment with, flossy?
> When you said the article was probably referring to a 10-bit output signal, what does that mean? If it supports a 10-bit output, doesn't that mean that it has to have 10-bit precision?

The issue is that NVIDIA seem to be using the output bit depth for their internal precision and/or rounding, instead of dithering.
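To make the distinction concrete, a purely illustrative Python sketch (NumPy assumed; this models the general point, not NVIDIA's actual pipeline): if the same float LUT result is quantized at the output bit depth, an 8-bit output means 8-bit internal precision, and the lost levels can't be recovered by anything downstream:

```python
# Sketch: quantizing the LUT result at 8 bits discards levels that a
# 10-bit output signal would have carried.
import numpy as np

lut_result = np.linspace(0.0, 1.0, 1024) ** 0.95   # float LUT output

as_10bit = np.rint(lut_result * 1023)   # 10-bit signal: nearly all 1024 levels
as_8bit  = np.rint(lut_result * 255)    # 8-bit precision: at most 256 levels

print("levels at 10-bit:", len(np.unique(as_10bit)))
print("levels at 8-bit :", len(np.unique(as_8bit)))
```

Dithering at the quantization step would preserve the extra precision on average even over an 8-bit link, which is why rounding instead of dithering is the complaint here.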