color calibration - how does everything interact (display props, nvidia wizard, etc)

spacediver

Just picked up a fw900 (interesting story that I'll share on the official fw900 thread), and trying to calibrate it.

I've let the monitor warm up, chosen the 6500K color temperature, and done the image restoration (which was fantastic!).

I'm running win xp, and have an nvidia geforce 9800GT.

I want to start calibrating using third-party software (I don't like the built-in nvidia display optimization wizard), but before I do so, I need to clear up some confusion.

From what I can tell, on the software end of things (i.e. not using my monitor controls), there are at least two separate places where I can influence brightness, gamma, etc.

A. The nvidia control panel has an "adjust desktop color settings" section where you can control five sliders (brightness, contrast, gamma, hue, and digital vibrance)

B. The nvidia control panel also has a display optimization wizard.

I've run the wizard and hated the results, so I have just done a clean reinstall of the drivers.

If I were to run a third party calibration tool (such as Monitor Calibration Wizard 1.0), would it interact with the settings in A?

And does anyone with experience in these matters have recommendations for alternative third-party software? I've heard of displaymate, but I'd rather go with a free one (unless I can be convinced otherwise). I also don't currently own a photometer, so it would have to be done by eye.
 
OK, I think I figured one thing out. The color settings in the nvidia control panel reflect the adjustments that the color calibration software imposes.
 
If you want to do it right, you should get a package with a hardware colorimeter. Without hardware you can't be quite sure what you're going to get. On a CRT, most will have you set the black point with brightness and the white point with contrast, and you can set only two points in the greyscale with RGB gain and bias. The rest of the corrections are loaded into your GPU LUT at startup based on sweeps with the colorimeter.
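
To make the "loaded into your GPU LUT" part concrete, here's a minimal sketch in C of how a loader pushes a ramp to the card on Windows, assuming the Win32 GDI gamma-ramp API (the mechanism Windows LUT loaders generally use). The identity ramp is just a stand-in for the per-channel correction curves a real colorimeter sweep would produce:

/* minimal sketch: push a LUT to the video card via the Win32 GDI
 * gamma-ramp API; the identity ramp stands in for real corrections */
#include <windows.h>

int main(void)
{
    WORD ramp[3][256];          /* R, G, B channels, 256 entries each */
    HDC  screen = GetDC(NULL);  /* device context for the whole screen */
    int  i;

    for (i = 0; i < 256; i++) {
        WORD v = (WORD)(i * 257);  /* identity: 8-bit index -> 16-bit value */
        ramp[0][i] = v;  /* red   */
        ramp[1][i] = v;  /* green */
        ramp[2][i] = v;  /* blue  */
    }

    SetDeviceGammaRamp(screen, ramp);  /* hand the LUT to the driver */
    ReleaseDC(NULL, screen);
    return 0;
}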

I used to have GDM-F520s calibrated with a DTP-94.
 
thanks for the reply.

That's what I'm thinking to do - buy a decent colorimeter.

So I guess I have two questions:

Is there any way to test out different color profiles without rebooting? When I go to color management and choose a different color profile to set as default, I'm assuming the lookup table is not replaced until reboot. It would be handy for testing purposes to be able to switch around color profiles instantly.

On my FW900, I can adjust the following variables independently using the OSD:

brightness
contrast
color temperature
R bias
G bias
B bias
R gain
G gain
B gain

Does this mean I won't have to worry about changing the default lookup table when using the colorimeter?
 
the nvidia control panel settings for brightness, contrast, and gamma adjust the video card lut. digital vibrance and hue do not.

the lut corrections in calibrated profiles take effect when you apply them, you don't need to reboot.

osd adjustments are fairly limited, and if you don't know how to adjust bias and gain you can do more harm than good.

use the osd to get to the brightness and white point you want, then let the software do the rest.

i use argyll cms and dispcalgui to calibrate and profile, then monitor calibration wizard to lock the luts. all free software.
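
for the curious, those sliders basically just rebuild the lut from a few parameters before it's written to the card. nvidia doesn't publish their exact formulas, so treat this c sketch as the conventional textbook mapping, not their implementation:

/* conventional brightness/contrast/gamma -> lut mapping (a sketch,
 * not nvidia's actual math): gamma bends the curve, contrast scales
 * it around mid-grey, brightness offsets it, then clamp to range */
#include <math.h>

/* brightness and contrast in [-1, 1] around 0, gamma around 1.0 */
void build_ramp(unsigned short ramp[256],
                double brightness, double contrast, double gamma)
{
    int i;
    for (i = 0; i < 256; i++) {
        double v = pow(i / 255.0, 1.0 / gamma);
        v = (v - 0.5) * (1.0 + contrast) + 0.5 + brightness;
        if (v < 0.0) v = 0.0;
        if (v > 1.0) v = 1.0;
        ramp[i] = (unsigned short)(v * 65535.0 + 0.5);
    }
}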
 
thanks livefast - that's useful.

I actually find that monitor calibration wizard allows me to load and apply profiles instantly, whereas the windows display properties doesn't work that way (at least in xp for me).

Your use of the terms calibrate and profile prompted me to learn something new. So if I understand correctly, the argyll cms guides you through the OSD calibration, and the dispcalgui helps adjust the LUT appropriately.

You mentioned that digital vibrance is independent of the lut - would you recommend disabling digital vibrance before beginning any calibration and profiling?
 
"Digital vibrancy" and similar are "effects". When you are trying to achieve reference grade reproduction all effects should be disabled. Same concept as ISF calibrating a TV, or setting up audiophile equipment.
 
monitor calibration wizard uses 8-bit LUTs and thus introduces a lot of banding, which is unnecessary on a CRT: all new video cards (at least nvidia ones from the GF8 up) support a 10-bit LUT (and 10-bit color in general) on CRT outputs, which translates to virtually banding-free calibration

the best you can do without a proper calibration probe is to use the OSD settings + gamma in the nvidia panel and calibrate with these:
1. about:blank
2. http://www.lagom.nl/lcd-test/gradient.php
3. http://images.google.com/

AD1. "pick" white color you like with GAIN
AD2. make grey have this color across all scale with BIAS (warrning! bias changes white too).
AD3. Test it with some nice photo and movie and adjust gamma until skin tones look natural

ps. try 1.10 gamma. I find it most adequate for Trinitron CRTs
ps2. an icc profile from a calibration probe won't help in games, so all it can do is assist in bias/gain calibration. Not worth investing in it imho...
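
ps3. to put the 8-bit banding in numbers, here is a small C sketch (my own illustration, not MCW code) that quantizes a 1.10 gamma curve to 8-bit and 16-bit LUT entries and counts distinct output levels - with 8 bits the dark end collapses, with 16 bits every step stays distinct:

/* count distinct output levels after quantizing a gamma 1.10 curve
 * to 8-bit vs 16-bit LUT entries (illustration of LUT banding) */
#include <math.h>
#include <stdio.h>

int main(void)
{
    int distinct8 = 0, distinct16 = 0, prev8 = -1, prev16 = -1, i;

    for (i = 0; i < 256; i++) {
        double corrected = pow(i / 255.0, 1.10);
        int q8  = (int)(corrected * 255.0 + 0.5);    /* 8-bit entry  */
        int q16 = (int)(corrected * 65535.0 + 0.5);  /* 16-bit entry */
        if (q8  != prev8)  { distinct8++;  prev8  = q8;  }
        if (q16 != prev16) { distinct16++; prev16 = q16; }
    }
    printf("distinct levels: 8-bit=%d, 16-bit=%d\n", distinct8, distinct16);
    return 0;
}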
 
I actually find that monitor calibration wizard allows me to load and apply profiles instantly, whereas the windows display properties doesn't work that way (at least in xp for me).

xp's color management system doesn't have a built-in lut loader for monitor profiles; that's why you don't see visible changes when the profile is applied. starting with windows 7, microsoft has implemented this capability. however, it won't 'lock' the lut, so if the lut gets changed or reset you'll have to reload the calibration profile, or use mcw, which can enforce the lut.
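
to illustrate what 'enforcing' the lut means, here's a hypothetical sketch of the idea in c using the win32 gamma-ramp api - not mcw's actual code, just the general pattern of read back, compare, restore:

/* hypothetical lut 'lock': poll the current ramp and restore the
 * saved calibration whenever something resets it (the general idea
 * behind mcw's enforcement, not its actual code) */
#include <windows.h>
#include <string.h>

void enforce_lut(const WORD saved[3][256])
{
    WORD current[3][256];
    HDC  screen = GetDC(NULL);

    for (;;) {                  /* runs until the process is killed */
        GetDeviceGammaRamp(screen, current);
        if (memcmp(current, saved, sizeof(current)) != 0)
            SetDeviceGammaRamp(screen, (LPVOID)saved);  /* restore */
        Sleep(1000);            /* check once a second */
    }
}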


So if I understand correctly, the argyll cms guides you through the OSD calibration, and the dispcalgui helps adjust the LUT appropriately.

argyll cms is the calibration software, but it is a set of command line utilities. dispcalgui is a gui front end for argyll cms.


You mentioned that digital vibrance is independent of the lut - would you recommend disabling digital vibrance before beginning any calibration and profiling?

yes, reset digital vibrance and hue to their defaults before calibration. setting digital vibrance to 0 makes the image grayscale; the default value for digital vibrance is +50%.


if what xor says above is true for crts, then you may want to pass on software calibration.
 
thanks for the great replies - yeah, i figured out after a while that vibrance @ 50% is equivalent to not using it at all, and after installing argyll and dispcalgui i figured out the relationship.

XoR,

How can I reset my LUT now that I've already loaded one in MCW - do I just load the default profile in xp display properties and reboot to get it back to a 10-bit LUT? Do I need to take any steps to ensure a 10-bit LUT? For what it's worth, I have a geforce 9800 GT with the latest drivers installed.

(edit: from what I can gather, only the geforce quadros support 10-bit color - to get 10-bit color on the other geforces you need linux and special drivers).

Btw I'm really happy to learn that I'm able to use 10-bit LUTs (I'm assuming this isn't only for greys, but for R, G, and B also?)


AD1. "pick" white color you like with GAIN
AD2. make grey have this color across all scale with BIAS (warrning! bias changes white too).
AD3. Test it with some nice photo and movie and adjust gamma until skin tones look natural

sorry if this appears daft, but what does AD mean? (I get that it refers to the 3 previous points).
and when you say make sure grey has this color across all scales, what do you mean? Which grey? And scales = spatial scales?
 
Ok I've a related question.

Right now I'm trying to go the route of just using the OSD bias and gain settings to calibrate everything (with a little gamma adjustment at the end).

I keep hearing about the D65 color profile as a good standard to use. I have a sony_d65.icm file that I can load into my color management as the default, but I'm not sure if I should.

From what I understand, these color profiles are simply customized lookup tables. But if I'm doing everything manually, should I be using a preset profile?

Or should I load the profile and use that as a baseline for adjusting everything?
 
the manufacturer-supplied files generally don't contain calibration corrections, only the device's color profile information.

more info here:

http://en.wikipedia.org/wiki/ICC_profile
http://www.imaginginfo.com/print/Studio-Photography/ICC-Color-Management-Explained


on the other hand, when you calibrate your monitor with a colorimeter, the calibration software usually calibrates and profiles, creating .icc files which include both the monitor profile and calibration corrections (adjustments to the video card lut).

in argyll cms, there is an option to create a profile only, which will still be more accurate for your monitor than the manufacturer-supplied profile, since the manufacturer simply supplies a generic one.


to answer your questions, if the sony_d65.icm file is from sony, it likely doesn't contain any lut adjustments so you can load it and do your osd calibration.
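
if you ever want to check whether a given .icm actually carries lut corrections, they live in a 'vcgt' tag inside the file. here's a small c sketch that scans the icc tag table for it, assuming the standard icc layout (128-byte header, 4-byte big-endian tag count, then 12-byte tag entries starting with the tag signature):

/* scan an icc/icm file's tag table for a 'vcgt' (video card gamma
 * table) tag; profile-only files like manufacturer .icms won't have one */
#include <stdio.h>
#include <string.h>

int has_vcgt(const char *path)
{
    unsigned char buf[12];
    unsigned long count, i;
    FILE *f = fopen(path, "rb");
    if (!f) return -1;

    fseek(f, 128, SEEK_SET);              /* skip the icc header */
    fread(buf, 1, 4, f);
    count = ((unsigned long)buf[0] << 24) | (buf[1] << 16)
          | (buf[2] << 8) | buf[3];       /* big-endian tag count */

    for (i = 0; i < count; i++) {
        fread(buf, 1, 12, f);             /* signature, offset, size */
        if (memcmp(buf, "vcgt", 4) == 0) { fclose(f); return 1; }
    }
    fclose(f);
    return 0;
}

int main(int argc, char **argv)
{
    if (argc > 1)
        printf("%s: vcgt %s\n", argv[1],
               has_vcgt(argv[1]) == 1 ? "present" : "absent");
    return 0;
}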
 