Can someone explain to me how xrite calibration works?

Tyrindor (Limp Gawd; joined May 19, 2014; 159 messages)
From what I'm reading, it will adjust the Windows color profile, not the actual monitor settings. This kind of worries me.

- Am I going to see a color shift when logging into Windows?
- I've heard some people claim games won't use the custom color profile. Is this true?

Thanks.
 
What's the correct color of that CGI tree in the background, or of the blood coming out of that demon's neck? Color spaces are part of the artistic design choice of a game; for example, Diablo 3 was criticized for using happy circus colors instead of gritty dark colors. Calibration is meaningless for games; miscalibrating for brighter and redder colors usually (but not always) makes games look better.
 
What's the correct color of that CGI tree in the background, or of the blood coming out of that demon's neck? Color spaces are part of the artistic design choice of a game; for example, Diablo 3 was criticized for using happy circus colors instead of gritty dark colors. Calibration is meaningless for games; miscalibrating for brighter and redder colors usually (but not always) makes games look better.

I'm sorry but I don't agree. Games can expand on your calibrated settings by having their own color scheme, but no way is calibration "meaningless" for games. With a perfectly calibrated monitor, the game's color scheme would be as the developers intended it to be. They decided the correct color of that CGI tree, and you won't achieve it unless your monitor is calibrated.

The most accurate settings to game with would be the standard 6500K white point, 2.2 gamma target, and 120 cd/m2 brightness, which is the whole purpose of this device.
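To put some numbers on those targets, here's a rough sketch (in Python, with illustrative values; this is not X-Rite's actual math) of what a 2.2 gamma target means for the relationship between signal level and light output:

```python
# Sketch of a pure 2.2 gamma target: luminance = signal ** 2.2,
# with signal and luminance both normalized to 0..1.
def encode(linear, gamma=2.2):
    """Convert linear light (0-1) to a video signal level."""
    return linear ** (1.0 / gamma)

def decode(signal, gamma=2.2):
    """Convert a signal level back to linear light (0-1)."""
    return signal ** gamma

# A 50% signal on a 2.2-gamma display emits only ~21.8% of peak light,
# so at the 120 cd/m2 white-level target that's roughly 26 cd/m2.
mid_luminance = decode(0.5) * 120
```

The point is that "2.2 gamma" describes this whole curve, not a single setting, which is why a calibrator measures many gray levels instead of just white.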
 
I'm sorry but I don't agree. Games can expand on your calibrated settings by having their own color scheme, but no way is calibration "meaningless" for games. With a perfectly calibrated monitor, the game's color scheme would be as the developers intended it to be. They decided the correct color of that CGI tree, and you won't achieve it unless your monitor is calibrated.

The most accurate settings to game with would be the standard 6500K white point, 2.2 gamma target, and 120 cd/m2 brightness, which is the whole purpose of this device.

It is meaningless as 99.99% of gamers do not have a calibrated or at all accurate color output on their monitor anyway...and further most people like the more vibrant (inaccurate) colors on their smart phones.
 
It is meaningless as 99.99% of gamers do not have a calibrated or at all accurate color output on their monitor anyway...and further most people like the more vibrant (inaccurate) colors on their smart phones.

So since 99.9% of people have uncalibrated gaming displays (same can be said for TVs or any display in general), that means it's meaningless for me to calibrate my display for gaming?

How about answering the questions I asked instead of telling me my preference is meaningless?
 
It is meaningless as 99.99% of gamers do not have a calibrated or at all accurate color output on their monitor anyway...and further most people like the more vibrant (inaccurate) colors on their smart phones.

Most gaming monitors have accurate color presets with linear 2.2-ish gamma, a 6500K color temperature, and coverage of the sRGB/Rec. 709 color space used by consumer media. Not everyone lacks standards and owns a grainy, glowy, wide gamut 30" monitor. There is a reason 99.99% of monitors and TVs are not wide gamut.

How about answering the questions I asked instead of telling me my preference is meaningless?

Use Color Sustainer instead of windows to force everything to use ICC profiles.
 
If your monitor is way off on colors and has an OSD for red/green/blue levels plus brightness and contrast, then the X-Rite i1Profiler will guide you in adjusting them into the ballpark of a proper standard; you can choose from lots of targets you want to try to achieve: white-level color temperature, gamma curve, etc.

The X-Rite software isn't very good, and for the most part ICC profiles are a tradeoff: you gain color accuracy but introduce gradient banding. For gamers they're really not particularly useful at all.

Now, if you are trying to get information about your monitor, or you use it to make color-sensitive graphic material, then it's useful. It also helps in situations like trying to match the color temperatures of two different displays.

A lot of games do allow profiles to work; they just tend to reset the Windows profile when loading into fullscreen. Yasamoka's Color Sustainer program is a way to try to enforce color profiles, but often in games it's not really an issue.

I do like having a way to test monitors' colors and contrast, but that's pretty much the only reason I got an i1D3 Pro, and if I'd known the extra money was for the software I would have just gotten a ColorMunki, since I use DispcalGUI+ArgyllCMS.

More and more I'm realizing that since the Korean monitors tend to have fairly good color, there's zero reason to calibrate them for gaming. Doing so will actually reduce your contrast ratio and, again, cause grayscale banding.

The other thing is that the i1D3 Pro I have (as someone suggested might be the case) thinks white should be slightly pink. So the probe I'm using to measure things like contrast ratio isn't really well calibrated itself. I need to rent an expensive spectrometer and create an error measurement for it, or find someone else with the files (you'd think they'd be available, but I don't know where to look).

People say the Yamakasi Catleap has a green tint, but I don't think mine does; it certainly has a strangely warmish white after being calibrated with the i1D3 Pro.
 
The overclockable Qnix/X-Star need ICC profiles when overclocked, since the gamma rises, making colors too dark and slightly dulled and causing black crush. The glossy Qnix/X-Star have skewed preset gamma at 60Hz... calibration makes a big difference.
 
Perhaps I should test the Catleap without a profile on it. I learned how to do this with HCFR the other day, but I think it's weird that it reads the primaries the same before and after. I guess this means you can't really change a display's primary RGB levels at the software level? No idea, just guessing. The color on the Yamakasi is definitely better after profiling, but I think perhaps the IPS displays don't suffer from the overclocking color issues the way the Qnix units do. There isn't that dramatic drop in brightness on the Catleap.
 
Well, the good ones will involve both hardware and software. You can usually adjust via hardware first (the brightness/contrast/color controls on your monitor) to get as close as possible to the calibration target. This will allow things to always look good, even at times when a profile won't be active. The software calibration (creating a profile) does the rest to get the most accurate calibration.
 
From what i'm reading it will adjust the windows color profile, not the actual monitor settings. This kind of worries me.

- Am I going to see a color shift when logging into Windows?
- I've heard some people claim games won't use the custom color profile. Is this true?

Thanks.

You usually adjust RGB in your monitor's OSD as the calibrator instructs, and then it does its calibration, so you shouldn't see a big difference before the color profile is loaded.

A lot of older games override your color profile when running in fullscreen; modern ones tend not to do that anymore, for the most part.
 
Most gaming monitors have accurate color presets with linear 2.2-ish gamma, a 6500K color temperature, and coverage of the sRGB/Rec. 709 color space used by consumer media. Not everyone lacks standards and owns a grainy, glowy, wide gamut 30" monitor. There is a reason 99.99% of monitors and TVs are not wide gamut.



Use Color Sustainer instead of windows to force everything to use ICC profiles.

I doubt that. Besides, even if they do have them, they do not use them. Ask most gamers what an ICC file is and they'd be clueless. Most gamers buy the cheapest monitor that is on sale in a given size class on NewEgg. Even among people who style themselves enthusiasts, there's minimal knowledge about what constitutes a good monitor these days.

Besides, why are you making this childishly personal? The OP asked why it was meaningless and I told him.

My grainy, glowy, gawd-awful, wide-gamut "unusable" piece of shit 30" monitor even on a bad day has better color than most kids' panels... excuse me, better than the panels their parents bought them. And furthermore, most people prefer the "vibrant" color presets their phones use, whether they're ZOMG accurate or not. Because at the end of the day, whether you have 99.99% accurate sRGB color or not, that shitty CoD game you're playing is not made any less awful by the color profile... nor is Farmville any more realistic.
 
Most gamers buy the cheapest monitor that is on sale in a given size class on NewEgg.

Most budget monitors have more accurate color presets than a sRGB-mode-less wide gamut monitor, so it's a good thing 'they' are buying cheaper monitors.

Besides why are you making this childishly personal? The OP asked why it was meaningless and I told him.

I'm not, you didn't make the monitor mediocre, HP did. It's your choice to take a critique of a display personally. Stop trying to justify your purchase and preference for inaccurate and over-saturated colors with generalizations which appeal to what you claim to be 'popular' consensus.
 
Most budget monitors have more accurate color presets than a sRGB-mode-less wide gamut monitor, so it's a good thing 'they' are buying cheaper monitors.



I'm not, you didn't make the monitor mediocre, HP did. It's your choice to take a critique of a display personally. Stop trying to justify your purchase and preference for inaccurate and over-saturated colors with generalizations which appeal to what you claim to be 'popular' consensus.

Of course you are not. :rolleyes: :rolleyes: :rolleyes: :rolleyes:

Which is exactly why you brought it into the conversation. Now get back on topic, color accuracy troll.
 
- I've heard some people claim games won't use the custom color profile. Is this true?
A display calibration involves a linearization and a characterization. The linearization - ensuring the desired whitepoint and gradation, as well as a good grey balance - can be accomplished via the display's or the video card's LUT. In the latter case this information is stored in the vcgt tag of the display profile and loaded during system start. When loading a game, the video card LUT is usually reset. But even if you manage to persist this data, the game engine will not carry out color space transformations based on the characterization data of the profile - unlike a CMM in color-aware software like Photoshop. Therefore the display should reproduce the intended source characteristic by itself (ambiguous in this case, but assuming sRGB should be fine), e.g. via color space emulation.
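A hypothetical sketch of that video card LUT (the vcgt data holds one such table per channel; a profile loader pushes it to the GPU via OS calls such as SetDeviceGammaRamp on Windows, which is also what games touch when they "reset" it):

```python
# Model of a single channel of an 8-bit video card LUT (vcgt-style).
def build_ramp(gamma_correction, size=256):
    """Build one channel's LUT: input level -> corrected output level."""
    return [round(((i / (size - 1)) ** gamma_correction) * (size - 1))
            for i in range(size)]

def apply_ramp(ramp, value):
    """Look a pixel value up in the LUT, as the video card would."""
    return ramp[value]

# An identity ramp (correction exponent 1.0) maps every level to itself;
# this is effectively what a game restores when it resets the LUT.
identity = build_ramp(1.0)
```

Note that this table only bends each channel's tone curve; it cannot perform the color space transformations described above.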

The other thing is that the ID3 pro I have (as someone suggested might be the case) thinks white should be slightly pink. So, the probe I'm using to measure things like contrast ratio isn't really calibrated very well itself. I need to rent an expensive spectrometer and create an error measurement for it or find someone else with the files (you'd think they'd be available but I don't know where to look).
Choose a whitepoint which looks good in your environment. There is - contrary to popular belief - no D65 constraint. Think of differing color matching conditions, the color constancy of human perception (which should be considered when processing colorimetric data - implicitly done for you when the profile is created, but neglected in many TV calibration solutions) and the impact of observer metamerism. But give your eyes a chance (=> time + a neutral stimulus) to adapt to each selected whitepoint when you experiment with different targets.
 
Now get back on topic, color accuracy troll.

This 30" ARGB/sRGB stuff is mostly off-topic anyway--but to be fair, your comment that 99.99% of "gamers" don't have monitors with decent color assumes that 99.99% of gamers don't have an sRGB monitor. Most consumer monitors do come with a reasonably neutral color preset available for sRGB, and most have reasonably accurate gamma curves. So you've made two sweeping generalizations about "gamers": that they don't have decent monitors (your 30" doesn't have an sRGB mode, so everyone else's doesn't need one either) and that they would prefer over-saturated colors to accurate ones... Is there some "gamer" color opinion poll we can reference to answer that last question? What does over-saturated mean to you? I'm guessing your definition of over-saturated is remarkably close to "incorrect colorspace". They actually aren't the same thing, though.

OP asked about how to use x-rite software. You answer that it is meaningless and that most gamers prefer uncalibrated color to a profiled display. Who was off topic?
 
Choose a whitepoint which looks good in your environment. There is - contrary to popular belief - no D65 constraint. Think of different colour matching conditions, color constancy of human perception (should be considered when processing colorimetric data - implicitly done for you when the profile is created but neglected in many TV calibration solutions) and impacts of observer metamerism. But give your eyes a chance (=> time + neutral stimulus) to adapt to the respectively selected whitepoint when you experiment with different targets.

This makes sense - so long as the whitepoint isn't too extreme, and the grayscale is consistently balanced, the visual system should adapt to it.

I suppose one caveat would be that you'd want to match the chromaticity of your surround and bias light to whatever white point you chose.
 
Of course you are not. :rolleyes: :rolleyes: :rolleyes: :rolleyes:

Which is exactly why you brought it into the conversation. Now get back on topic, color accuracy troll.

It's his religion, it seems, in every thread, no matter how unrelated, he has to come in and complain about wide gamut monitors. I think a wide gamut monitor beat up his cat or something :).
 
It is meaningless as 99.99% of gamers do not have a calibrated or at all accurate color output on their monitor anyway...and further most people like the more vibrant (inaccurate) colors on their smart phones.

Most people prefer sugary crap to dark chocolate; that doesn't mean sugary crap is better.

There's something to be said for the natural realism of a well-rendered piece of photographic art. A wide gamut monitor that expands the saturation unnaturally may look appealing at first glance, and grab one's attention in a big box store, but in the long run it's just nauseating, unless you're actually working with content that is mastered for wider gamuts, such as Adobe RGB.

And smartphones are generally used for different purposes than a desktop computer.
 
I went ahead and did a report with Dispcal on my Catleap with no calibration loaded (the video card LUT gamma scale is defeatable but what's the point--no one will work or play like that, so I left it on).

Uncalibrated response:
Black level = 0.1134 cd/m^2
50% level = 27.72 cd/m^2
White level = 119.02 cd/m^2
Approx. gamma = 2.10
Contrast ratio = 1050:1
White chromaticity coordinates 0.3030, 0.3300
White Correlated Color Temperature = 7036K, DE 2K to locus = 10.6
White Correlated Daylight Temperature = 7030K, DE 2K to locus = 7.4
White Visual Color Temperature = 6555K, DE 2K to locus = 10.2
White Visual Daylight Temperature = 6714K, DE 2K to locus = 7.1
Effective LUT entry depth seems to be 8 bits
White drift was 0.145775 DE

That's what I got. I'll do a report with HCFR in a bit too. This is not bad at all, and coming from an FG2421 you'd think black crush would be something I could gauge, but I don't see this display as having a particular issue with dark colors. If I had known IPS/PLS could be this good I would have bought one a long time ago. Now I'm going to have 3 and I have to choose one... not going to be easy, but I can tell you the Yamakasi is doing 120Hz with no coil whine (apparently this can be an issue for the Tempest).
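For what it's worth, the reported contrast ratio and approximate gamma follow directly from those three luminance readings; a quick sanity check (the gamma figure here is a naive single-point estimate that ignores the black level):

```python
import math

# Readings from the uncalibrated report above, in cd/m^2.
black, mid, white = 0.1134, 27.72, 119.02

contrast = white / black  # comes out around 1050:1, as reported

# Naive gamma from the 50% patch: solve mid/white = 0.5 ** gamma.
gamma = math.log(mid / white) / math.log(0.5)  # around 2.10, as reported
```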
 
Thread summary:

Owners of wide gamut 30" monitors claim calibration is worthless in order to not-so-subtly justify their purchases rather than help the OP, and claim that I am derailing the thread when I critique their inaccurate, purchase-justifying generalizations.

Meanwhile, people with standards favor calibration and appreciate the time and effort artists put into their work. Takeaway:

Use Color Sustainer.
 
I managed to get almost all my games to run with a custom profile (ICC or LUT) through Borderless Windowed Mode, or Windowed Fullscreen Mode, or whichever other names that mode has - they are all the same thing. The best programs to lock your profile are "Monitor Calibration Wizard" (MCW), which I think is the better of the two, and "CPKeeper", which I didn't like as it wasn't as forceful as MCW. Neither program is going to work for all games, which is why you also need the "Windowed Borderless Gaming" program. It works really well for games that do not support Windowed Fullscreen Mode: it forces them into such a mode, removes borders, and aligns pixels perfectly with your monitor. I tried many, many games with it and only one wouldn't work - Shadow Warrior.

People who say that games will reset your LUT / ICC profile no matter what are very, very wrong! Some games today, like BF4, even allow the use of a 1D LUT / ICC profile in normal Fullscreen Mode. ICC profile / 1D LUT resets are no longer a problem for new games and most old ones. However, Borderless Windowed Mode will not allow the use of Vsync, AFAIK. Some people report lower framerates in such a mode, but I benchmarked several games in normal Fullscreen and Borderless Windowed modes and the FPS difference was non-existent.

As far as selecting the right white point, I am not going to start an argument about it again, but D65 is the one you should aim for. If you plan on selecting a white point of your own based on your environment, then make sure to measure your environment's conditions appropriately and not select some random white point. I highly disagree with randomly selecting a white point and then trying to get your eyes to adjust to it, because if your eyes do adjust, then selecting D65 is the most optimal option anyway! I strongly encourage you to do your own research and visit AVS Forums (Display Calibration section - http://www.avsforum.com/f/139/display-calibration ), where there will be plenty of people who would also disagree with the whole "eye-adjusting" idea that Sailor_Moon and spacediver are trying to push. Selecting a random white point is a minority opinion/idea, which defeats the purpose of calibration. I advise listening to pro calibrators and reading guides. D65 is the white point you need to use, unless you know how to measure the light around your setup. Even then, i1Display Pro or ColorMunki Display colorimeters are not accurate enough to provide you with a truly precise D65 calibration on their own. That means your results will be somewhat off and your eyes will adjust to those results, but aiming for the right D65 point is a hell of a lot better idea than aiming at any other white point, which you wouldn't achieve accurately anyway due to the same colorimeter limitations.

I also advise calibrating with BT.1886 gamma, since monitor black levels are usually very poor. Power-law gamma 2.2 is a dying standard that will make it hard to see black and dark gray details on your monitor. On some monitors, power-law 2.2 gamma will completely crush your blacks. The popular G-Sync monitor VG248QE is a great example of such a monitor, where a power-law 2.2 gamma calibration would completely clip black levels 1 and 2, which has a very negative, detail-killing effect on the image.

I will probably be bashed by spacediver, NCX, and Sailor_Moon, who will try to correct most of the statements I just made, which is why I am stating the following: DON'T TAKE MY WORD FOR IT. Instead, visit the AVS Display Calibration forums ( http://www.avsforum.com/f/139/display-calibration ); read posts and guides, and ask questions there, but not here, where minority opinions prevail. Just don't let Sailor_Moon, NCX, and spacediver (no offense guys!!!) be the only opinions that influence your calibration decisions and methods, or you might as well not calibrate at all - just play around with monitor settings to your liking... It does take time to figure it all out, but that is what it takes to understand calibration and know how to do it right, rather than listening to a few minority opinions here, because they're just not representative of what professional calibrators would advise.
 
Monarch,

I personally calibrate to D65, and am not trying to "push" anything, but the idea of adapting is of theoretical interest to me, and I can't think of a good reason why some other white point wouldn't provide the same perceptual results. I'm planning on discussing this in AVS forum soon, but you should understand that this idea is based on a deeper understanding of color theory rather than a naive one.

As for BT.1886, I highly recommend it for a lot of video content, but for gaming, a 2.2 (with an input offset similar to BT.1886's) may be more appropriate, as I suspect gaming content is mastered with a 2.2 exponent and not a 2.4 exponent. ArgyllCMS has a function that allows you to implement an input offset with any desired base exponent, so you can have a 2.2 exponent that is still not purely "power law" based.
 
Isn't it true that if AdobeRGB had come before sRGB (and thus standardized differently), we would all be better off? More colors to work with seems like a good thing. It's cheaper to stick with sRGB (and cheaper to produce products that display most of that color space) so I doubt things will change.

Anyways, I've always run my U3011 in sRGB mode, but was shocked to see how far off the white point was from D65 after calibrating. Made a huge visual change. I'm not going to bother with programs that force the ICC profile in full screen games (I've tried to switch what I can to borderless windowed fullscreen) as I'm hoping that the LG 34UM95 I ordered ships. Internal calibration seems like the way to go whenever possible.
 
Actually, for emissive sources (rather than illuminated sources), perhaps chromatic adaptation is compromised, in which case it may be important to stick with D65:

http://www.color-image.com/2012/02/monitor-calibration-d65-white-point-soft-proofing/

in particular:

When hard-copy images are being viewed, the image is perceived as an object that is illuminated by the prevailing illumination. Thus both sensory mechanisms that respond to the spectral energy distribution of the stimulus and cognitive mechanisms that discount the “known” color of the light source are active. When a soft-copy display is being viewed, it cannot easily be interpreted as an illuminated object. Therefore there is no “known” illuminant color and only sensory mechanisms are active.

(quote is from Mark Fairchild)
 
Thread summary:

Owners of wide gamut 30" monitors claim calibration is worthless in order to not-so-subtly justify their purchases rather than help the OP, and claim that I am derailing the thread when I critique their inaccurate, purchase-justifying generalizations.

Meanwhile, people with standards favor calibration and appreciate the time and effort artists put into their work. Takeaway:

Use Color Sustainer.

Thank you for pointing us out to Color Sustainer. I'll have to check it out.
 
Actually, for emissive sources (rather than illuminated sources), perhaps chromatic adaptation is compromised, in which case it may be important to stick with D65:
As I said, the whitepoint is a quite flexible factor - especially when judging only one source at a time. During profile generation (ICC workflow), visual adaptation to display white is assumed and the characterization data is adapted to the reference white of the PCS (D50). However: color constancy has its limits, and environments with different sources must be aligned to one another (although that always means a visually, but not necessarily colorimetrically, identical whitepoint - modern ISO recommendations like the upcoming 14861 account for that). Moreover, color critical work requires defined and constant color matching conditions.

I will probably be bashed by spacediver, NCX, and Sailor_Moon
Again, I don't like your diction - I never "bashed" anyone. I just mentioned basic principles of human perception and colorimetry (reaching back to Graßmann and von Kries). This may collide with some best practices in the video sector (in contrast to fields of application in the graphic industry) but is valid.

Selecting a random white point is a minority opinion/idea, which defeats the purpose of calibration
A normative reference white doesn't fix the reproduction white. But even if I don't account for color constancy in the human visual system (which you experience each day), observer metamerism is a strong factor. The neutral tone reproduction of two screens calibrated to the same whitepoint can vary considerably (triggered by the sharp spectra of self-luminous devices) due to differences between the normative and the actual observer. To visually achieve the "grading whitepoint" you would have to bring the grading screen and the TV together and adjust the latter's whitepoint until there is a match for your individual perception. Just replicating the whitepoint colorimetrically will in most cases lead to deviations during parallel viewing.

However: I'm not speaking against D65 as a calibration target. It's a good starting point and should be used in the context of most TV calibration solutions (especially because there often are some limitations in the workflow). My answer was addressed to the perception of Bluesun311, irrespective of whether the color cast was caused by incomplete, disturbed (=> environmental conditions) or impossible adaptation, absolute measurement errors, or effects of observer metamerism - or any combination of these.
 
Monarch,

I personally calibrate to D65, and am not trying to "push" anything, but the idea of adapting is of theoretical interest to me, and I can't think of a good reason why some other white point wouldn't provide the same perceptual results. I'm planning on discussing this in AVS forum soon, but you should understand that this idea is based on a deeper understanding of color theory rather than a naive one.

As for BT.1886, I highly recommend it for a lot of video content, but for gaming, a 2.2 (with an input offset similar to BT.1886's) may be more appropriate, as I suspect gaming content is mastered with a 2.2 exponent and not a 2.4 exponent. ArgyllCMS has a function that allows you to implement an input offset with any desired base exponent, so you can have a 2.2 exponent that is still not purely "power law" based.

Most films aren't mastered with BT.1886 either, but BT.1886 is needed on displays with poor black levels, be it for games or movies, to prevent black detail crush. I have seen a few monitors with a BT.1886-like gamma on the dark/black end, but with a 2.2 exponent. It was like a hybrid, which I think was more or less acceptable, as long as there was a perceptual luminance rise in the darkest of gray levels to prevent dark detail crush. Games can have crushed black and dark gray detail just like films with power-law 2.2 gamma on displays with poor blacks.

I partially agree that our eyes do adjust, but I disagree with the idea of white point selection based on preference. If you are looking at a display's white point with more blue than the D65 point requires, then forcing an accurate D65 profile would immediately produce what would seem to be a red/redder image for a limited period of time, during which our eyes will adjust to the new white point. Does that mean your eyes can adjust to any white point? If so, then there should be no complaints about the blue, red, or green tints that people can see on many uncalibrated displays. That leads to the big question - where is the threshold at which your eyes stop perceiving the current white point as white and start perceiving it as having too much/little blue/red/green? Is it the same for every color? Is it the same for every person? If there really is an answer, then is it backed up by scientific evidence? Thus far, we only have one value for an answer - "dE 3". This is why a SINGLE white point was selected for each standard, and adjusting your display's white point to that standard's white point is what "display calibration" means. This also raises the question I already asked... If eyes do adjust to whichever white point, then why not use the standard D65 white point??? You might not like it at first, but your eyes will get used to it and you will get both benefits - adjusted liking AND seeing the image the way it was mastered or the way it was meant to be seen...
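For reference, the "dE" figures being thrown around are color differences; the simplest definition is CIE76 delta-E, plain Euclidean distance in L*a*b* (modern practice often prefers the more perceptually uniform CIEDE2000, which is considerably more involved). A minimal sketch with made-up sample values:

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 color difference: Euclidean distance in L*a*b* space."""
    return math.dist(lab1, lab2)

# Two near-white samples; the "dE 3" cutoff quoted above is a common
# rule of thumb for when a difference becomes clearly visible.
white_a = (95.0, 0.0, 0.0)   # neutral near-white
white_b = (95.0, 1.0, 2.5)   # same lightness, slightly warm
diff = delta_e_76(white_a, white_b)  # under 3, so a borderline case
```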
 
Most films aren't mastered with BT.1886 either, but BT.1886 is needed on displays with poor black levels, be it for games or movies, to prevent black detail crush. I have seen a few monitors with a BT.1886-like gamma on the dark/black end, but with a 2.2 exponent. It was like a hybrid, which I think was more or less acceptable, as long as there was a perceptual luminance rise in the darkest of gray levels to prevent dark detail crush. Games can have crushed black and dark gray detail just like films with power-law 2.2 gamma on displays with poor blacks.

No, you're misunderstanding. BT.1886 comprises two key elements: a standardized exponent of 2.4, and an input offset which adjusts the shape of the luminance function in order to perceptually compensate for a raised black level.

If material was mastered in a 2.2 environment, a 2.2 exponent with an input offset is better than a 2.4 exponent with an input offset. This is why Argyll is so good - it is flexible enough to have a custom exponent but still have the input offset incorporated.
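To illustrate the two elements, here is a sketch of a BT.1886-style EOTF in which both the exponent and the display's measured white/black levels are parameters (the standard fixes the exponent at 2.4; the 120 / 0.1 cd/m2 figures are just example measurements):

```python
def eotf(V, Lw=120.0, Lb=0.1, gamma=2.4):
    """BT.1886-style EOTF: L = a * max(V + b, 0) ** gamma.

    The input offset b lifts the curve so that signal 0 lands exactly on
    the display's real black level Lb, perceptually preserving shadow
    detail instead of crushing it. Passing gamma=2.2 gives the "2.2
    exponent with an input offset" variant described above.
    """
    n = 1.0 / gamma
    a = (Lw ** n - Lb ** n) ** gamma
    b = Lb ** n / (Lw ** n - Lb ** n)
    return a * max(V + b, 0.0) ** gamma
```

With these example levels the offset raises dark and mid tones relative to a pure power law, which is exactly the anti-crush behavior under discussion.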
 
That leads to the big question - where is the threshold at which your eyes stop perceiving the current white point as white and start perceiving it as having too much/little blue/red/green? Is it the same for every color? Is it the same for every person? If there really is an answer, then is it backed up by scientific evidence?

Yes, there is most certainly a threshold - there are limits to chromatic adaptation, and yes, look up chromatic adaptation and color constancy if you want to review the scientific literature.
 
For what it's worth, Monarch said the i1D3 probe would make whites look pink, and it does.

That said, I can use Argyll to adjust for it.
 
As I said, the whitepoint is a quite flexible factor - especially when judging only one source.

Ignoring observer metamerism for now, which I admit is a huge issue, based on that link I posted, it seems that chromatic adaptation will not occur when judging an emissive source, as the brain doesn't interpret the display as being illuminated (and therefore does not account for a light source that is illuminating the surface).

In this situation, even after a period of exposure, my understanding is that D65 will appear perceptually different than D50 will after a period of exposure (unlike the case of hard proofing where chromatic adaptation does occur).

And because of this, a standardized white point must be adhered to in order to maintain integrity of artistic intent, in the context of emissive sources.
 
Ignoring observer metamerism for now, which I admit is a huge issue, based on that link I posted, it seems that chromatic adaptation will not occur when judging an emissive source
Adaptation processes (think of an "RGB gain control" in the visual system) still take place (this is also not contradicted by Fairchild). But you rely solely on arithmetic calculations (via transferring and scaling the tristimulus data in a cone response domain) when determining the corresponding colors* under the new "illumination". In the ICC workflow this process is transparent to the user. However: D50 is a quite "borderline" calibration target and won't lead to a visual match with the media whitepoint of standard paper types under D50 norm light during proof simulation. 5800K (daylight or blackbody locus) is a good starting point in this case.

--
* If you define a gamut target (e.g. TV calibration, configuration of color space emulations) with a different whitepoint you must use the adapted data to achieve optimal reproduction.
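For anyone who wants to see what "transferring and scaling the tristimulus data in a cone response domain" actually looks like, here is a minimal sketch of a Bradford chromatic adaptation transform. The matrix and white point values are the standard published ones; the function names and structure are my own, not anything from the ICC tooling discussed above.

```python
# Bradford cone response matrix (XYZ -> cone-like "RGB" domain), standard values.
BRADFORD = [[ 0.8951,  0.2664, -0.1614],
            [-0.7502,  1.7135,  0.0367],
            [ 0.0389, -0.0685,  1.0296]]

D65 = (0.95047, 1.00000, 1.08883)  # XYZ of D65 white, Y normalized to 1
D50 = (0.96422, 1.00000, 0.82521)  # XYZ of D50 white, Y normalized to 1

def mat_vec(m, v):
    return tuple(sum(m[i][j] * v[j] for j in range(3)) for i in range(3))

def mat_inv(m):
    # Inverse of a 3x3 matrix via the adjugate.
    a, b, c = m[0]; d, e, f = m[1]; g, h, i = m[2]
    det = a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)
    adj = [[ (e*i - f*h), -(b*i - c*h),  (b*f - c*e)],
           [-(d*i - f*g),  (a*i - c*g), -(a*f - c*d)],
           [ (d*h - e*g), -(a*h - b*g),  (a*e - b*d)]]
    return [[adj[r][s] / det for s in range(3)] for r in range(3)]

def bradford_adapt(xyz, src_white, dst_white):
    """Corresponding color of `xyz` when the observer re-adapts from
    `src_white` to `dst_white` (von Kries scaling in the Bradford domain)."""
    src = mat_vec(BRADFORD, src_white)
    dst = mat_vec(BRADFORD, dst_white)
    cone = mat_vec(BRADFORD, xyz)
    scaled = tuple(cone[k] * dst[k] / src[k] for k in range(3))
    return mat_vec(mat_inv(BRADFORD), scaled)

# Sanity check: a D65 white, re-computed for a D50-adapted observer,
# lands exactly on the D50 white point.
adapted = bradford_adapt(D65, D65, D50)
```

This is the same kind of calculation an ICC CMM performs when it maps everything through the D50 profile connection space; the code is only meant to make the "scale in cone space" idea concrete.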
 
Last edited:
Adaptation processes (think of an "RGB gain control" in the visual system) still take place (this is not contradicted by Fairchild). But you rely solely on arithmetic calculations (by transferring and scaling the tristimulus data in a cone response domain) when determining the corresponding colors* under the new "illumination". In the ICC workflow this process is transparent to the user. However: D50 is a quite "borderline" calibration target and won't lead to a visual match with the media white point of standard paper types under D50 norm light during proof simulation. 5800 K (daylight or blackbody locus) is a good starting point in this case.

--
* If you define a gamut target (e.g. TV calibration, configuration of color space emulations) with a different white point, you must use the adapted data to achieve optimal reproduction.

OK, let's make it simple. Suppose a video contains a single image of a white screen at peak white, D65. The artist wishes to convey this impression with as much fidelity as possible (barring observer metamerism).

How are you going to reproduce this sensation if the display you're viewing is calibrated to D50?

Remember, the only thing being displayed is a "pure" white screen for the entire duration of the video.
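For a sense of how far apart those two whites actually sit, one can compute the distance between D65 and D50 in the CIE 1976 u'v' chromaticity diagram; it comes out roughly an order of magnitude above the commonly cited ~0.002-0.004 just-noticeable difference for white. A quick sketch (the xy values are the standard illuminant chromaticities; the helper name is mine):

```python
import math

def uv_prime(x, y):
    # CIE 1931 xy -> CIE 1976 u'v' chromaticity
    denom = -2 * x + 12 * y + 3
    return (4 * x / denom, 9 * y / denom)

D65_xy = (0.3127, 0.3290)
D50_xy = (0.3457, 0.3585)

u1, v1 = uv_prime(*D65_xy)
u2, v2 = uv_prime(*D50_xy)
delta_uv = math.hypot(u2 - u1, v2 - v1)  # about 0.023 - a very visible shift
```

So whatever one believes about adaptation, the raw colorimetric gap between the two targets is far from subtle.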
 
Last edited:
How are you going to reproduce this sensation if the display you're viewing is calibrated to D50?
The visual system will chromatically adapt to the neutral tone(s) - unlike a camera, where the white balance must be defined during RAW development (digital workflow). Though D50 (that is, the colour stimulus - not the "synthetic" SPD which comes into play for illumination) can be quite borderline, as I said. Please keep in mind that you have to adapt the reference data when performing an "illumination change" to account for the automatic white balance of the visual system.
 
Last edited:
The visual system will chromatically adapt to the neutral tone(s)

But the whole point of that Fairchild quote is that we don't chromatically adapt to white points on an emissive display. In the case of a physical object with a chromaticity along the Planckian locus, we will eventually adapt to it and see it as achromatic; in that case, it doesn't really matter where along the locus it falls (D50, D65, etc., within limits).

But in an emissive display, the point is that we'll still perceive the tint, because our brains are smart enough to figure out that there is no light source that is illuminating the display, and chromatic adaptation does not occur.

Please keep in mind that you have to adapt the reference data when performing an "illumination change" to account for the automatic white balance of the visual system.

Not sure what this would imply in the context of my example of the video with only a white screen. Are you suggesting that we periodically adapt to a reference signal so that we perceive the correct tint of the displayed image? Surely it's easier just to calibrate to D65, right?
 
Ok I think I can wrap my head around what the current debate is.

I have some questions specific to the X-Rite i1Profiler software and specific to monitors that have no OSD and no way to control primary RGB levels. This includes 10-bit wide gamut displays with no sRGB mode.

It would appear that X-Rite has in a way made their software "better" for use with the i1D3 "pro" probe by including factory calibration code for it (as opposed to the ColorMunki Display, which is also slower at taking measurements). It seems to me calibrations made with dispcal are higher quality and more configurable, but the white level I described above as pinkish comes out cooler and more neutral using the X-Rite software. What am I doing wrong?

There are two programs right now that are both set up to load profiles. Even though X-Rite doesn't have a profile configured, it seems to mess with the gamma loading when Windows starts. Likewise, dispcalGUI wants to load a profile and does so at startup, and set up this way dispcal tells me my profiles are getting better sRGB coverage than if I disable the X-Rite gamma loader. Very confusing. Any suggestions?

Add to that the factor of owning displays with no color, brightness, or contrast controls (these are everywhere, guys) and calibration becomes pretty gnarly and mysterious. I have asked why it is that my RGB levels don't change in delta E after being profiled. The gamma curve improves and the contrast ratio is reduced along with black crush, but the primary colors remain way off. Is this because I am doing something wrong, because you can't change a display's primary RGB levels with software, or because the best results need to include the native RGB levels of the display?
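On the "primaries don't change" point: the calibration a probe loads into the graphics card is a 1D per-channel LUT (the vcgt tag), which can only rescale each channel's drive level. To first order, dimming a channel scales that primary's X, Y and Z by the same factor, so its chromaticity coordinates cannot move. A toy illustration (the measured XYZ numbers below are made up for the example):

```python
def xy(X, Y, Z):
    # XYZ -> CIE xy chromaticity
    s = X + Y + Z
    return (X / s, Y / s)

# Hypothetical measured XYZ of a display's red primary at full drive.
red_full = (41.2, 21.3, 1.9)

# A 1D LUT can only scale the channel's output; X, Y, Z all scale together.
gain = 0.8
red_dimmed = tuple(gain * c for c in red_full)

full_xy = xy(*red_full)
dimmed_xy = xy(*red_dimmed)  # identical chromaticity: hue/saturation unchanged
```

That is why profiling fixes greyscale and gamma but leaves a wide-gamut panel's primaries where they are; moving the primaries themselves requires a 3D transform applied to the image data (color-managed applications or a gamut emulation mode), not the video-card LUT.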
 
And also, can anyone with an educated opinion please comment on the pre-calibrated measurements posted above for the Yamakasi Catleap: would you recommend forcing ICC profiles on a display like this while gaming? It doesn't seem to make a difference in the games I try on Steam. Yasamoka's Color Sustainer program has an option for 10-bit output that was supposed to reduce banding on 8-bit displays as though it were an FRC matrix, but it doesn't work for me. Isn't this a legitimate way to deal with post-calibration greyscale banding and, optionally, with wide gamut displays that have no sRGB mode? Can we really not display 8-bit material correctly on these CCFL displays with no sRGB mode, period? I'm puzzled that people don't notice it. The greens are supposed to look really unnatural when displayed this way, right?
 