WinDAS White Point Balance guide for Sony Trinitron CRTs

It tells you what to do - click the re-adjust button.

You should just skip that step and click ok. The previous step where you actually measure how the chromaticity tracks with brightness is more than sufficient. You're worrying too much about nothing :)
 
I calibrated for a few hours over the weekend. Now I finally understand what those 2 final steps are for.

After doing the lut process, how do I unload the lut and go back to the default colors if I want to start again? Resetting the PC did nothing.
 
Also, how do I measure contrast in HCFR? It worked before when I did the grayscale tests, but now it just shows a bunch of question marks.
 
Did you follow this step to implement the LUT?

Once this is complete, you'll have a file on your hard drive called mylutname.cal (or whatever you called it). To load it, type:

dispwin mylutname.cal

As for unloading the LUT:

Make sure there are no LUTs being loaded into your system. Things like Monitor Calibration Wizard, Adobe Gamma Loader, etc. should all be disabled. To really ensure things are normalized and linear, open a command window (Start, cmd) and type "dispwin -c" (without quotation marks). This will reset the video LUT, and if it wasn't reset to begin with, you'll see the colors/brightness of your display change as soon as you hit Enter. If the colors do change, make sure you figure out what was changing the LUT.
 
As for the question marks in HCFR: that's because your black level is so deep that your instrument can't measure it. If you followed my instructions for G2 and your visual system is as sensitive as mine, then your black level is probably around 0.002 cd/m2 if not lower, so you can calculate your approximate contrast by dividing peak luminance by that number.
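As a sanity check, the contrast estimate is just peak luminance divided by the (estimated) black level. A minimal sketch, with an assumed 90 cd/m2 peak white standing in for the HCFR 100% reading:

```python
def contrast_ratio(peak_cd_m2, black_cd_m2):
    """Approximate contrast ratio: peak luminance over black level."""
    return peak_cd_m2 / black_cd_m2

# Assumed peak white of 90 cd/m2, divided by the ~0.002 cd/m2 black
# level estimated above (too low for the instrument to measure).
print(round(contrast_ratio(90.0, 0.002)))  # 45000, i.e. 45000:1
```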
 
Just did some experiments.

First, I wanted to see how much precision LUT adjustments provide compared to just changing the RGB level (video level).

Luminance of full test field at video level 41: 0.365
Luminance of full test field at video level 42: 0.400

I then loaded up a test field of 42 and incrementally reduced the LUT, pausing between adjustments to see if a luminance change occurred. If none occurred, I reduced the LUT by another notch, until a change occurred, and then I wrote down this number.

Starting luminance: 0.400
Next lowest: 0.393
then: 0.386
0.379
0.372
0.365

So this tells me that LUT adjustments offer five times as much precision as video level adjustments.

If this precision increase holds across all LUT entries, then we have a total of 1,276 possible intensity levels per channel (255 × 5 + 1), which is roughly 10.3 bits of precision.
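The arithmetic above can be double-checked in a few lines (the five-notches-per-video-step figure comes from the measurements just listed):

```python
import math

# Measured luminances (cd/m2) at adjacent video levels, from above.
lum_41, lum_42 = 0.365, 0.400

# Five LUT notches spanned one video-level step, so if that factor
# holds everywhere: 255 video steps * 5 notches + 1 endpoint.
lut_notches_per_video_step = 5
levels = 255 * lut_notches_per_video_step + 1

print(levels)                          # 1276
print(round(math.log2(levels), 1))     # 10.3
```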

Second experiment: compare luminance of video level 50 with luminance of 50/0 grating.

50: 0.738 cd/m2

Assuming a 0 black level, perfect integration of sensor over grating, and no leakage from light to dark patches, we'd predict a 50/0 grating to have a luminance of 0.369

Actual measured 50/0 grating: 0.379

Not bad, and I should also say there was fluctuation in the readings - it took a while for the luminance to stabilize, so variance might account for this (minor) discrepancy.
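Those numbers are easy to verify (all figures are the measurements above; the relative error is taken against the measured value):

```python
# Full-field luminance at video level 50, measured above (cd/m2).
full_field = 0.738

# A 50/0 grating is half lit and half dark, so with a true-zero black
# level and a sensor that integrates evenly over the patches, we expect
# half the full-field value.
predicted = full_field / 2
measured = 0.379

print(predicted)                                    # 0.369
print(f"{(measured - predicted) / measured:.1%}")   # 2.6%
```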
 
Thanks. Yes I loaded the freshly created lut but was looking for a way to compare the before and after. That's why the unloading command you gave me will help a lot.
 
Hmmm, I see. How would I go about finding peak luminance then? You're saying to divide peak L by black level? How do I determine what my black level is?
 
Peak luminance is the luminance of your pure white - it's what HCFR shows as the 100% value in the grayscale measurements.

I already explained that you can't measure your black level because it's too low.
 
If you loaded the LUT, then rebooting would have certainly unloaded it. Did you notice an immediate change in the image when you loaded the LUT?
 
But how would I calculate my contrast ratio? You said divide peak luminance by something, what is that something?
 
Yes, there is an immediate change when loading and unloading LUTs. Rebooting, for whatever reason, doesn't unload the LUT.
 
I thought I was clear in my post. I said although your black level can't be measured, it's probably around 0.002 cd/m2 or lower.
 
weird, mine always resets upon reboot, which is why I created a startup batch command that loads it automatically.

I also have a couple shortcuts on my desktop - one to load it and one to unload it so I can easily switch with the click of a button.

How's the image quality with the Argyll LUT?
 
Argyll LUT has been pretty good. I'm just trying to figure out what the right gamma level is -- hard to pick one. Haven't had any chance to test games out. Just been watching some movies.
 
Stick with 2.4, as that is what HD content is now supposed to be mastered at. If you like, create another one with 2.2 for stuff that was encoded with a lower gamma.
 
Measure your gamma in HCFR to be sure, and also, try a dark room with a bias light (a small lamp facing towards the wall behind the monitor)

It also could be that you're used to a slightly washed out image. interfacelift has some great photography, some of which looks fantastic with 2.4 gamma.

This is a nice example
 
Well now that it's night time, I can properly calibrate for 2.4.

I left this monitor on for about 24 hours and am finding that with a full black screen, I'm unable to get a completely deep black. I tried lowering G2 down another 10 points during the first adjustment in the WPB procedure. Is this normal? Is there always bound to be some glow on a full black screen?
 
Yes, the luminance threshold of a dark adapted human is extremely low even in the fovea, where there are fewer rods. As such, a CRT will never be able to provide true blacks to a dark adapted visual system.

But you're never dark adapted when actually viewing content. For one, a reference viewing environment typically has a bias light, which lowers the apparent black level. Secondly, content is rarely full black for more than a second, which is not enough to dark adapt.
 
Cool. I didn't know that CRTs can't produce 100% true blacks. But you're right, there's little chance of that in practice being an actual issue when viewing content. Do you have any recommendation for setting up a good bias light? That's a concept I never had to deal with before. But looks interesting.

I was looking at some videos that showed off Sony's trimaster EL series of OLED monitors. They had darker blacks than CRTs. What are your thoughts on those?
 
The Trimasters are nice OLEDs - see my post here

I prefer the FW900 though - it supports higher resolutions and refresh rates, and has better motion resolution. The Trimaster probably has a decent amount of input lag (though I'm not sure). For watching HD content, though, the Trimaster OLED probably wins out.

And yes, OLED blacks are second to none.

I described a poor man's bias light here. If you want to go full out, you can buy one that is close to D65, but if you go that route, ensure that the wall behind your monitor has a neutral color.

One of the best resources on bias lighting is George Alan Brown:

http://www.cinemaquestinc.com/ (click on bias light basics)
 
I'll look into a bias light. Might try the Antec one - only $10!

I'm hoping one day something will be released that will beat the FW900 completely.
 
Do you think it may be worth investing in an i1 Display Pro? Would it give me better results than a DTP-94? Or are we limited by the WPB procedure?
 
what graphics card are you using?

Your 50/0 grating result agrees with what I see on my display - the grating is always a percent or so brighter than direct averaging predicts. In your case it's 2.6%.

As for stabilizing, I recall there would be some amount of drift over the span of several minutes. Can't remember exactly how much - I think it was less than a percent. I'll check again later. Or I should probably just learn the ins and outs of Argyll and write scripts so I can skip HCFR.

anyway... new crazy idea:
instead of using our eyes, how about using a cone of aluminum foil to focus all the display's light onto the colorimeter's sensor. ideally that would increase its sensitivity by about a thousand times.

wait actually i have to think a bit about how to shape the cone... it's not quite like funneling water
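For what it's worth, the "about a thousand times" figure presumably comes from the ratio of screen area to sensor aperture, i.e. the gain a perfect funnel would give. A rough sketch; the screen dimensions and the 12 mm aperture diameter are guesses for illustration, not measured values:

```python
import math

# Assumed visible screen area (roughly an FW900-class widescreen CRT)
# and an assumed ~12 mm colorimeter aperture.
screen_w_mm, screen_h_mm = 483, 302
aperture_d_mm = 12

screen_area = screen_w_mm * screen_h_mm
aperture_area = math.pi * (aperture_d_mm / 2) ** 2

# If every emitted ray could be redirected onto the sensor, the
# flux gain would be roughly the area ratio.
print(round(screen_area / aperture_area))  # 1290
```

Nothing close to this is achievable with foil, of course, but it shows the "thousand times" intuition isn't off by orders of magnitude.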
 
Nvidia GeForce GTX 660 connected via DVI-I

If I have time this weekend, I'm gonna see whether it's consistently off by 2.6%, and whether it's off by 2.6% across different gratings. If it is, then it's simple to model the offset in whatever protocol we use.

As for drift, I'm gonna try and measure the drift and see if I can find a pattern.

Funneling light through aluminum. Interesting idea!! I'll think on it too. A massive lens would probably work :) If you have a regular magnifying glass, would be easy to do some experiments. I'll try it out with my loupe, although not sure it focuses light the same way you can do it with a magnifying glass (remember using magnifying glass to focus light from the sun?)

edit: it does focus light like a magnifying glass - the skin on my hand is hurting right now from focusing sunlight :) Will try it out later this evening with a colorimeter.
 
a lens would work somewhat but it wouldn't be perfect. a lens would focus collimated light onto a focal point, but the light from crt monitors is diffuse.
 
ok... a funnel-like design would not be very effective as light will tend to reflect back onto the monitor after a few reflections.

blah this idea isn't really practical...

the only way to focus the monitor's light would be to have a huge elliptical mirror.
 
is it possible to buy "mirror paint" that you can apply to a surface? If so, then if you could build the surface somehow, then you could have your elliptical mirror.
 
you're right, loupe did absolutely nothing.

How big a mirror were you thinking of. Wonder if you could just glue some aluminum foil onto a curved surface...
 
the issue with lens and small mirrors like in those videos is that too much light escapes from the sides.

i was thinking of a 1-meter egg-type mirror. yea... not practical.

back to using our eyes I guess.
 
if you could get a ten fold boost in luminance using a satellite dish converted mirror, that would be huge :)
 
probably not possible.
the only way to focus the monitor's light would be to have the dish far away. but then too much light would be lost from the sides.

so the only way would be to use an egg/elliptical mirror that covers the sides
 
I see what you mean. But what if you only were interested in collecting light from, say, a third of the screen area, so you could use a smaller "dish"? As it is, our instruments collect light from a pretty small area, so one might not need to focus light from the entire screen to get a measurable boost.
 