spacediver
This might help
No, just move the slider up and down and make sure that the hue doesn't change. Do you understand what I mean by hue?
It tells you what to do - click the re-adjust button.
You should just skip that step and click OK. The previous step, where you actually measure how the chromaticity tracks with brightness, is more than sufficient. You're worrying too much about nothing!
I calibrated for a few hours over the weekend. Now I finally understand what those 2 final steps are for.
After doing the LUT process, how do I unload the LUT and go back to the default colors if I want to start again? Rebooting the PC did nothing.
Once this is complete, you'll have a file on your hard drive called mylutname.cal (or whatever you called it). To load it, type:
dispwin mylutname.cal
Make sure there are no LUTs being loaded into your system. Things like monitor calibration wizard, adobe gamma loader, etc. should all be disabled. To really ensure things are normalized and linear, open up a command window (start, cmd), and type "dispwin -c" (without quotations). This will reset the video LUT, and if it wasn't reset to begin with, you'll see the colors/brightness of your display change as soon as you hit enter. If the colors do change, make sure you figure out what was changing the LUT.
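So switching between the uncalibrated and calibrated states is just a matter of running dispwin -c to reset the LUT to linear, and dispwin mylutname.cal to load your calibration again (assuming your file is called mylutname.cal as above) - that also makes it easy to do before/after comparisons.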
Also, how do I measure contrast in HCFR? It worked before when I did the grayscale tests, but now it's a bunch of question marks.
Did you follow this step to implement the LUT?
As for unloading the LUT:
That's because your black level is so deep that your instrument can't measure it. If you followed my instructions for G2 and your visual system is as sensitive as mine, then your black level is probably around 0.002 cd/m2 if not lower, so you can calculate your approximate contrast by dividing peak luminance by that number.
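To put a number on it: if HCFR reports, say, 80 cd/m2 for your 100% white (a made-up figure - use your own reading), your contrast would be roughly 80 / 0.002 = 40,000:1.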
Thanks. Yes, I loaded the freshly created LUT, but I was looking for a way to compare the before and after. That's why the unloading command you gave me will help a lot.
Peak luminance is the luminance of your pure white - it's what HCFR shows as the 100% value in the grayscale measurements.
I already explained that you can't measure your black level because it's too low.
If you loaded the LUT, then rebooting would have certainly unloaded it. Did you notice an immediate change in the image when you loaded the LUT?
But how would I calculate my contrast ratio? You said divide peak luminance by something, what is that something?
Yes, there is an immediate change when loading and unloading LUTs. Rebooting, for whatever reason, doesn't unload the LUT.
You loaded it in the command window in Windows 7?
Weird, mine always resets upon reboot, which is why I created a startup batch file that loads it automatically.
I also have a couple shortcuts on my desktop - one to load it and one to unload it so I can easily switch with the click of a button.
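If you want to replicate that setup, it's just two one-line batch files - this is only a sketch, and the names and path are made up, so point them at wherever your .cal file actually lives: a loadlut.bat containing dispwin C:\calibration\mylutname.cal, and an unloadlut.bat containing dispwin -c. Put a shortcut to loadlut.bat in the Startup folder so the LUT loads automatically at login, and keep shortcuts to both on the desktop for quick A/B switching.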
How's the image quality with the Argyll LUT?
Stick with 2.4, as that is what HD content is now supposed to be mastered at. If you like, create another one with 2.2 for stuff that was encoded with a lower gamma.
Measure your gamma in HCFR to be sure, and also, try a dark room with a bias light (a small lamp facing towards the wall behind the monitor)
It could also be that you're used to a slightly washed-out image. Interfacelift has some great photography, some of which looks fantastic with 2.4 gamma.
This is a nice example
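Going back to the 2.4 vs 2.2 point: if you calibrated with Argyll's dispcal (I'm assuming you did - the filenames below are made up, and keep whatever other options you normally pass), you can just run it once per target gamma, e.g. dispcal -g 2.4 fw900_g24 and dispcal -g 2.2 fw900_g22, and then swap between fw900_g24.cal and fw900_g22.cal with dispwin depending on the content.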
Yes, the luminance threshold of a dark adapted human is extremely low even in the fovea, where there are fewer rods. As such, a CRT will never be able to provide true blacks to a dark adapted visual system.
But you're never dark adapted when actually viewing content. For one, a reference viewing environment typically has a bias light, which lowers the apparent black level. Secondly, content is rarely full black for more than a second, which is not enough to dark adapt.
The Trimasters are nice OLEDs - see my post here
I prefer the FW900 though - it supports higher resolutions and refresh rates, and has better motion resolution. The Trimaster probably has a decent amount of input lag (though I'm not sure). For watching HD content, though, the Trimaster OLED probably wins out.
And yes, OLED blacks are second to none.
I described a poor man's bias light here. If you want to go full out, you can buy one that is close to D65, but if you go that route, ensure that the wall behind your monitor has a neutral color.
One of the best resources on bias lighting is George Alan Brown:
http://www.cinemaquestinc.com/ (click on bias light basics)
Do you think it may be worth investing in an i1 Display Pro? Would it give me better results than a DTP-94? Or are we limited by the WPB procedure?
What graphics card are you using?

Just did some experiments.
First, I wanted to see how much precision LUT adjustments provide compared to just changing the RGB level (video level).
Luminance of full test field at video level 41: 0.365 cd/m2
Luminance of full test field at video level 42: 0.400 cd/m2
I then loaded up a test field of 42 and incrementally reduced the LUT, pausing between adjustments to see if a luminance change occurred. If none occurred, I reduced the LUT by another notch, until a change occurred, and then wrote down the new reading.
Starting luminance: 0.400
Next lowest: 0.393
then: 0.386
0.379
0.372
0.365
So this tells me that LUT adjustments offer five times as much precision as video level adjustments.
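In numbers, from the readings above: one video-level step spans 0.400 - 0.365 = 0.035 cd/m2, while each just-detectable LUT step is about 0.007 cd/m2, and 0.035 / 0.007 = 5.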
If this precision increase holds across all LUT entries, then we have a total of 1276 possible intensity levels per channel, which is about 10.3 bits of precision.
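That's 255 video-level steps x 5 LUT sub-steps + 1 = 1276 distinct levels, and log2(1276) is roughly 10.3.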
Second experiment: compare the luminance of a full field at video level 50 with the luminance of a 50/0 grating (alternating patches of level 50 and level 0).
50: 0.738 cd/m2
Assuming a 0 black level, perfect integration of the sensor over the grating, and no leakage from light to dark patches, we'd predict the 50/0 grating to have a luminance of 0.738 / 2 = 0.369 cd/m2.
Actual measured 50/0 grating: 0.379 cd/m2
Not bad, and I should also say there was fluctuation in the readings - it took a while for the luminance to stabilize, so variance might account for this (minor) discrepancy.
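For reference, the discrepancy works out to (0.379 - 0.369) / 0.369, or about 2.7% - small enough that measurement noise could easily explain it.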