LAGRUNAUER (Gawd · Joined Dec 7, 2006 · Messages: 745)
Thanks for your concern, Unkle Vito.
I have some more questions regarding GAIN...
As shipped, this FW900 had its 6500K set to gain R=78.4%, G=72.2%, B=70.6%. This must have been calibrated by Accurate IT, because when I tell the monitor to reset its own 6500K by pressing the RESET button, it goes to R=95.3%, G=87.5%, B=85.9%. (All R,G,B bias is set to 50.2%.) Also, the monitor was set to Brightness=32.2%, Contrast=100%. This is probably not a coincidence, since around Brightness=32% is where the blacks start to become virtually perfect, deep black.
Comment: You are correct in your assumption that the monitor was calibrated via the OSD. On the brightness: a monitor properly calibrated/adjusted via WinDAS will have its brightness level set at 50% (the factory setting) and will show perfect blacks at that level. On your unit, I would adjust the G2 level to bring the brightness up to the 50% target, which is the factory and recommended setting.
It seems that Accurate IT must have used the ECS access port to do calibration, maybe with WinDAS or WinCAT, because their 6500K color balance of R=78.4%, G=72.2%, B=70.6% had the name "6500K", whereas editing it in the OSD changes the name to "2". RESETing it changes the name back to "6500K" but with different defaults than the monitor was shipped with (as noted above).
Comment: Depending on which target point was modified: any time the BIAS/GAIN settings are changed via the OSD Expert Mode, the monitor automatically assigns a custom number (1, 2, or 3) to that setting. Resetting it brings it back to the 5000K, 6500K, or 9300K factory setting and/or the WinDAS/WinCAT calibration/adjustment setting. Now, if the unit is reset and it displays values different from the originals that were stored (factory and/or WinDAS/WinCAT adjustment settings), then the EEPROM may have lost its ability to store/save data; I've seen this before.
I took note of how 6500K was set and then turned it up to R=100%, G=100%, B=95.3%, because I wanted the maximum white to be brighter. (Due to the type of color deficiency I have it doesn't matter to my eyes exactly how much Red is in the white point as long as it's approximately okay.) I kept Brightness and Contrast the same as they were. At these settings, white was quite bright but still not blown out (gradations could still be seen in the brightest whites). (This is using the OSD only; I have not used WinDAS at all and don't even have the necessary cable yet.) Of course the native gamma is not 2.2 this way, but it's correctable through a gamma ramp on the video card.
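The video-card gamma ramp mentioned above can be sketched in code. This is a minimal illustration, not any particular driver API: it builds the kind of 256-entry, 16-bit per-channel table that Windows' SetDeviceGammaRamp (or xrandr's gamma option on Linux) accepts, and the measured native gamma value is a hypothetical stand-in for an actual reading.

```python
# Minimal sketch: build a per-channel gamma-correction ramp.
# The display applies L = v**measured_gamma; feeding v through
# ramp(v) = v**(target/measured) first gives an end-to-end
# response of L = v**target_gamma.

TARGET_GAMMA = 2.2
MEASURED_GAMMA = 2.4   # hypothetical native gamma after raising the gains

def build_gamma_ramp(measured, target, entries=256):
    """Return a 16-bit ramp (values 0..65535) with `entries` steps."""
    exponent = target / measured
    return [round(((i / (entries - 1)) ** exponent) * 65535)
            for i in range(entries)]

ramp = build_gamma_ramp(MEASURED_GAMMA, TARGET_GAMMA)
# Endpoints are always anchored: black stays black, white stays white.
assert ramp[0] == 0 and ramp[-1] == 65535
```

On Windows, one such list per channel would be packed into the 3x256 array passed to SetDeviceGammaRamp; a 10-bit DAC on the video card evaluates the same curve with finer steps, which reduces banding in the corrected gradient.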
Is it safe to set the RGB gains to maximum this way, or could it have caused the defocus/pop problem after 8 hours of use?
Comment: I reproduced your environment on one of our units at D65 and D93 and took readings and measurements with our calibrators; all the RGB and luminance readings were so far outside the calibration scale that I had to reset the unit to its original calibration/adjustment levels. Rule of thumb: if it isn't needed, don't set these controls to the maximum, because the tube will be stretched to (and beyond) its limits, and you'll run the risk of damaging the tube and/or shortening its lifespan.
I ran my original FW900 this way for all 5 years that it worked fine. (Only turned down the contrast sometimes to make text sharper at high res.)
After setting the color balance gains on this Accurate IT FW900 to the somewhat dialed-down level of "EASY" 7200K, it has continued to pop/flash (five times so far, since setting it back), but the defocus only lasts for a split second before the pop.
Comment: This is a well-known problem with the FBT (flyback transformer), and replacing it is the fix. Raising or lowering internal adjustment parameters (brightness, contrast, gains, bias) via the OSD to change the voltage the FBT supplies to the CRT will not fix the issue and may ultimately make matters worse. Take the unit to a local repair shop, have the tube checked for H-K and G1 shorts, emission, and gun function, then run a ring test on the FBT to confirm the fault, and replace the faulty transformer. That is what I would do first. Now, if the unit has faulty gun(s), low emission, or shorts, then that is a different issue.
Exactly how bright can the FW900 go before it starts to do damage to itself? Can it reach this danger level of brightness through OSD adjustments alone, or can the damage level only be reached by using WinDAS / WinCAT?
Comment: This is a good question, and I will try to answer it with some additional background information.
How high can the brightness be set before the tube damages itself? That depends on how washed out the picture looks once the brightness level is raised or lowered via the OSD, and on how long the tube is kept running at these out-of-spec levels. The danger level can be reached via the OSD, and it can also be reached via WinDAS, but only when the program is not operated properly.
In normal operating conditions, depending on the tube's condition, the brightness may need to be adjusted all the way down to zero, or all the way up to 100, to achieve a decent picture. For instance, on a properly calibrated/adjusted unit (factory brightness setting of 50) with a well-functioning tube, levels above 60 begin to show a washed-out picture and are not recommended. A brightness level kept above 60 for extended periods will corrupt the EEPROM and will ultimately shorten the life of the tube. Now, if a unit has to have its brightness turned all the way up to 100 to achieve a decent image, then the tube is quite low in emission, its luminance is low, and it has reached the end of its life. But if a monitor has to have its brightness control set all the way down to zero to achieve a decent picture, and at that level it is still too bright, then the unit has a corrupted EEPROM and its G2 level (voltage) is out of spec. At this stage, the unit may display the famous retrace lines running across and up/down the screen. That is a very well-known problem with Sony CRT monitors, and it is an easy fix if the WinDAS program, the probe, and the measurement instruments are available. I do not recommend the resistor fix featured in many blogs.
Lastly, if the G2 level is set too high in WinDAS, the brightness will be pushed out of spec, the unit will not calibrate, and the balance limiters will not pass the final test. If the unit is kept at these levels, the tube's life will be shortened, ultimately leading to irreversible damage. Hope this answers the question...
What if you use the colorimeter readings to construct a gamma ramp/LUT to correct the Delta E variances? That should reduce the Delta E values down to a very low level, especially with a 10-bit DAC on the video card... right? Even doing very simple gamma correction I got a very good-looking result, though I wished for a good way of setting gamma at two or more points. (I don't have a colorimeter, but am considering getting one.)
Comment: This is basic software color calibration/profiling that can be easily achieved with any standard commercial color calibration system. Such a system comes with a colorimeter and the software, and it will do what you want to achieve. To reduce the Delta E values to near zero (I do that all the time on my units), the brightness, contrast, GAINS, and BIAS must be properly adjusted during the course of the calibration process. That is the only way to achieve that goal. But none of this will work properly if the tube has issues such as G2 levels out of spec, hardware factory calibration out of spec, faulty gun(s), or low emission (luminance). What you'll end up with is an ICC profile containing erroneous calibration data.
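For reference, here is the arithmetic behind the Delta E numbers being discussed. This is a minimal sketch using the simple CIE76 formula (straight Euclidean distance in L*a*b* space); commercial calibration packages typically use the more perceptually accurate CIE94 or CIEDE2000 formulas, and the readings below are hypothetical.

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 colour difference: Euclidean distance in L*a*b* space."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical colorimeter reading for a mid-grey patch vs its target.
measured = (52.0, 1.5, -2.0)   # L*, a*, b* as read by the instrument
target   = (50.0, 0.0, 0.0)    # the value the patch should measure

de = delta_e_76(measured, target)   # about 3.2: a clearly visible error
# A Delta E below roughly 1 is generally taken as imperceptible;
# calibration aims to drive the average well below that.
```

A calibration package measures a series of such patches, computes the error for each, and then builds the video-card LUT (and the ICC profile) that minimizes those errors, which is why bad hardware state (G2 off spec, weak guns) poisons the resulting profile.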
Also, what do you make of this success story of using car polish to remove scratches from a CRT's anti-reflective coating?
Comment: Unless I missed something, I think that was done on a glass surface with the antiglare built into the glass itself, not on a plastic-coated surface. Applying an abrasive or a solvent to the plastic antiglare coating will only damage it further. Don't believe me? Try it out... In the past, a client brought me a GDM-FW900 for antiglare removal after he had tried toothpaste to smooth out scratches in the antiglare. The coating looked like a blackboard covered in cloudy chalk marks.
Hope this helps...
Sincerely,
Unkle Vito!