24" Widescreen CRT (FW900) From Ebay arrived,Comments.

Thanks for your concern, Unkle Vito.

I have some more questions regarding GAIN...

As shipped, this FW900 had its 6500K set to gain R=78.4%, G=72.2%, B=70.6%. This must have been calibrated by Accurate IT, because when I tell the monitor to reset its own 6500K by pressing the RESET button, it goes to R=95.3%, G=87.5%, B=85.9%. (All R,G,B bias is set to 50.2%.) Also, the monitor was set to Brightness=32.2%, Contrast=100%. This is probably not a coincidence, since around Brightness=32% is where the blacks start to become virtually perfect, deep black.

It seems that Accurate IT must have used the ECS access port to do calibration, maybe with WinDAS or WinCAT, because their 6500K color balance of R=78.4%, G=72.2%, B=70.6% had the name "6500K", whereas editing it in the OSD changes the name to "2". RESETing it changes the name back to "6500K" but with different defaults than the monitor was shipped with (as noted above).

I took note of how 6500K was set and then turned it up to R=100%, G=100%, B=95.3%, because I wanted the maximum white to be brighter. (Due to the type of color deficiency I have it doesn't matter to my eyes exactly how much Red is in the white point as long as it's approximately okay.) I kept Brightness and Contrast the same as they were. At these settings, white was quite bright but still not blown out (gradations could still be seen in the brightest whites). (This is using the OSD only; I have not used WinDAS at all and don't even have the necessary cable yet.) Of course the native gamma is not 2.2 this way, but it's correctable through a gamma ramp on the video card.
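
To make the gamma-ramp idea concrete, here is a minimal sketch; the measured gamma is just a placeholder value, since I haven't taken real readings on this unit:

```python
# Minimal sketch of a video-card gamma-correction ramp. The measured gamma below
# is a placeholder, not a real reading; 256-entry, 16-bit tables are what driver
# LUT interfaces commonly expect.
def correction_ramp(measured_gamma, target_gamma=2.2, entries=256):
    exp = target_gamma / measured_gamma
    return [round(65535 * (i / (entries - 1)) ** exp) for i in range(entries)]

ramp = correction_ramp(2.5)   # hypothetical native gamma at max OSD gain
# Load three copies (R, G, B) of `ramp` into the video card's LUT.
```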

Is it safe to set the RGB gains to maximum this way, or could it have caused the defocus/pop problem after 8 hours of use? I ran my original FW900 this way for all 5 years that it worked fine. (Only turned down the contrast sometimes to make text sharper at high res.)
After setting the color balance gains on this Accurate IT FW900 to the somewhat dialed-down level of "EASY" 7200K, it has continued to pop/flash (five times so far, since setting it back), but the defocus only lasts for a split second before the pop.

Exactly how bright can the FW900 go before it starts to do damage to itself? Can it reach this danger level of brightness through OSD adjustments alone, or can the damage level only be reached by using WinDAS / WinCAT?

Regarding what you said earlier: What if you use the colorimeter readings to construct a gamma ramp/LUT to correct the Delta E variances? That should reduce the Delta E values down to a very low level, especially with a 10-bit DAC on the video card... right? Even doing very simple gamma correction I got a very good-looking result, though I wished for a good way of setting gamma at two or more points. (I don't have a colorimeter, but am considering getting one.)
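
(For scale, by Delta E I just mean the usual CIELAB distance between a measured patch and its target; the simplest form, CIE76, with made-up numbers rather than anything measured on my unit:)

```python
import math

# CIE76 Delta E: Euclidean distance in L*a*b*. The values are invented examples.
def delta_e_76(lab1, lab2):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

target   = (50.0, 0.0, 0.0)    # mid-gray reference in L*a*b*
measured = (48.7, 1.2, -0.8)   # hypothetical colorimeter reading
print(round(delta_e_76(target, measured), 2))   # ~1.94: small but visible
```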

Also, what do you make of this success story of using car polish to remove scratches from a CRT's anti-reflective coating?


Thanks for your concern, Unkle Vito.

I have some more questions regarding GAIN...

As shipped, this FW900 had its 6500K set to gain R=78.4%, G=72.2%, B=70.6%. This must have been calibrated by Accurate IT, because when I tell the monitor to reset its own 6500K by pressing the RESET button, it goes to R=95.3%, G=87.5%, B=85.9%. (All R,G,B bias is set to 50.2%.) Also, the monitor was set to Brightness=32.2%, Contrast=100%. This is probably not a coincidence, since around Brightness=32% is where the blacks start to become virtually perfect, deep black.

Comment: You are correct in your assumption that the monitor was calibrated via OSD. On the brightness: a monitor properly calibrated/adjusted via WinDAS will have its brightness level set at 50% (factory setting), with perfect blacks at that level. On your unit, I would adjust the G2 level to bring the brightness up to the target 50%, which is the factory and recommended setting.

It seems that Accurate IT must have used the ECS access port to do calibration, maybe with WinDAS or WinCAT, because their 6500K color balance of R=78.4%, G=72.2%, B=70.6% had the name "6500K", whereas editing it in the OSD changes the name to "2". RESETing it changes the name back to "6500K" but with different defaults than the monitor was shipped with (as noted above).

Comment: Depending on which target point was modified, any time the BIAS/GAINS on the unit are changed/modified via OSD Expert Mode, it automatically assigns a custom number to that setting: 1, 2, 3. Resetting it will bring it back to the 5000, 6500, or 9300 factory setting and/or the WinDAS/WinCAT calibration/adjustment setting. Now, if the unit is reset and it displays values different from the originals that were stored (factory and/or WinDAS/WinCAT adjustment settings), then the EEPROM may have lost its ability to store/save data, and I've seen this before.

I took note of how 6500K was set and then turned it up to R=100%, G=100%, B=95.3%, because I wanted the maximum white to be brighter. (Due to the type of color deficiency I have it doesn't matter to my eyes exactly how much Red is in the white point as long as it's approximately okay.) I kept Brightness and Contrast the same as they were. At these settings, white was quite bright but still not blown out (gradations could still be seen in the brightest whites). (This is using the OSD only; I have not used WinDAS at all and don't even have the necessary cable yet.) Of course the native gamma is not 2.2 this way, but it's correctable through a gamma ramp on the video card.

Is it safe to set the RGB gains to maximum this way, or could it have caused the defocus/pop problem after 8 hours of use?

Comment: I reproduced your environment on one of our units at D65 and D93, took readings and measurements with our calibrators, and all the RGB and luminance readings were so far out of the calibration scale that I had to reset the unit to the original calibration/adjustment levels. Rule of thumb: if it isn't needed, do not set these controls to the maximum, because the tube will be stretched to and beyond its limits, and you'll run the risk of damaging the tube and/or reducing its lifespan.

I ran my original FW900 this way for all 5 years that it worked fine. (Only turned down the contrast sometimes to make text sharper at high res.)
After setting the color balance gains on this Accurate IT FW900 to the somewhat dialed-down level of "EASY" 7200K, it has continued to pop/flash (five times so far, since setting it back), but the defocus only lasts for a split second before the pop.

Comment: This is a well-known problem with the FBT, and that is the fix. Increasing the levels of internal adjustment parameters (brightness, contrast, gains, bias) via OSD to change the voltage supplied from the FBT to the CRT (increase and/or decrease) will not fix the issue and may ultimately make matters worse. Take the unit to a local repair shop, have the tube checked for HK and G1 shorts, emission, and gun function, then run the ring test on the FBT to confirm the fault, and then replace the faulty transformer. That is what I would do first. Now, if the unit has faulty gun(s), low emission, or shorts, then that will be a different issue.

Exactly how bright can the FW900 go before it starts to do damage to itself? Can it reach this danger level of brightness through OSD adjustments alone, or can the damage level only be reached by using WinDAS / WinCAT?

Comment: This is a good question, and I will try to answer it with additional background information…

How “high” can the brightness be set before the tube damages itself? That is a matter of how “washed out” the picture really looks once the brightness level is either increased or decreased via OSD, and how long the tube is kept enduring these “out of spec” levels. It can be reached via OSD, and it can also be reached via WinDAS, but only when the program is not properly operated.

In normal operating conditions, depending on the tube’s functionality, brightness may have to be adjusted all the way down to ZERO and/or all the way up to 100 in order to achieve a decent picture. For instance, on a properly calibrated/adjusted unit (factory brightness setting at 50) with a good functioning tube, levels above 60 begin to show a “washed out” picture and are not recommended. An increased and sustained brightness level above 60 will corrupt the EEPROM and will ultimately diminish the life of the tube. Now, if a unit has to have the brightness adjusted all the way up to 100 to achieve a decent image, then the tube is pretty low in emission, it has low luminance, and it has reached the end of its life. But if a monitor has to have its brightness control set all the way down to ZERO to achieve a decent picture, and at that level it is still too bright, then the unit has a corrupted EEPROM and its G2 level (voltage) is out of spec. At this stage, the unit may display the famous “retrace lines” across and up/down the screen. That is a very well-known problem with Sony CRT monitors, and it is an easy fix if the WinDAS program, the probe, and the measurement instruments are available. I do not recommend the resistor fix featured on many blogs.

Lastly, if the G2 level is set too high in WinDAS, the brightness will be pushed out of spec, the unit will not calibrate, and the balance limiters will not pass the final test. If the unit is kept at these levels, the tube’s life will be diminished and it will ultimately lead to irreversible damage. Hope this answers the question…


What if you use the colorimeter readings to construct a gamma ramp/LUT to correct the Delta E variances? That should reduce the Delta E values down to a very low level, especially with a 10-bit DAC on the video card... right? Even doing very simple gamma correction I got a very good-looking result, though I wished for a good way of setting gamma at two or more points. (I don't have a colorimeter, but am considering getting one.)

Comment: This is basic software color calibration profiling that can easily be achieved using any standard commercial color calibration system. The color calibration system should come with a colorimeter and the program, and it will do what you want to achieve. In order to reduce the Delta E values to zero (I do that all the time on my units), the brightness, contrast, GAINS and BIAS will need to be properly adjusted during the course of the calibration process. That is the only way to achieve that goal. But none of this will work properly if the tube has issues such as G2 levels out of spec, hardware factory calibration out of spec, faulty gun(s), or low emission (luminance). What you'll end up with is an ICC profile containing erroneous calibration data.


Also, what do you make of this success story of using car polish to remove scratches from a CRT's anti-reflective coating?

Comment: Unless I missed something, I think that was done on a glass surface with the antiglare built into the glass, not on a plastic-coated surface. Applying an abrasive or a solvent to the plastic antiglare will further damage it. Don’t believe me? Try it out… In the past, I had a client who brought me a GDM-FW900 for antiglare removal; he had tried toothpaste to smooth out antiglare scratches. The antiglare looked like a blackboard with cloudy chalk marks all over the surface.

Hope this helps...

Sincerely,

Unkle Vito!
 
Well, I've also calibrated mine. The blue and green phosphors have worn out a little bit, in that the contrast is lower.

It happens with all CRT technology after 10,000-20,000 hours of use. Those highly sought-after CRT projectors have separate red/green/blue units that can easily be replaced for this very reason.

I've made the white as white as possible without being way too white (around 7200K). The darkest greys are as "neutral" as possible, without displaying any reddish/yellowish/greenish/purplish/bluish hues.

Blacks are still superbly black and the whites are still rather bright (after modifying it via WinDAS; if I set the contrast any higher than 75, the text becomes blurry). I have to set the black level just a little higher due to this thing called "BLOOM". Due to the bloom problem found on all CRT's (native to CRT technology), some of the subtle dark grey details would be lost right next to a bright object on the screen. My eyes already have enough of this blooming problem, and LCD's are immune to blooming. So I have to compensate for bloom by making the blacks a tad bit less black (and also modifying the gamma ramp just a little bit without making it excessive over 2.2, only at the lower ranges). There are also ghosting trails whenever a bright object moves across a dark background (native to CRT's too), by the way.

My monitor has had 7 years of use, with nearly 30,000 hours of use, with contrast set at 75 (actually over 100 default without accounting for the additional "boost" from WinDAS modification). I have the red GAIN set at 100, green at 78, and blue at 82. Red bias is at 0, green bias at 70, and blue at 55, since the green/blue phosphors have lost some fidelity over the past 2 years. If I use sRGB default values, the black level would be so damn reddish.

I know that you could tweak away with your WinDAS stuff and keep your monitors looking fine, but did you really use both of them for 20,000 or even 30,000 hours each? (That's 80 hours a week, 50 weeks a year, for 7 years.)

I still love it.


The RED BIAS and GAIN levels may be an indication that a gun is faulty in your CRT, and the other guns are not able to compensate for the faulty gun. This can be checked with a Sencore CR7000 Beam Builder. The CRT can be tweaked in WinDAS/WinCAT, but it will never properly calibrate. After calibration/adjustment for the D93, D65 and D50 points is completed, as soon as sRGB (which adjusts automatically) is engaged, if a faulty gun(s) is present, the color cast will show and the unit will never calibrate. A known workaround trick used by some users is to stop the process as soon as the D50 (the last calibration point in WinDAS) is completed, then exit the program and save the work (I DO NOT RECOMMEND DOING THAT), but if you activate the color restore function on your unit at a later date, the white point balance data stored in the EEPROM will reset and the color cast will re-appear.

Hope this helps...

Sincerely,

Unkle Vito!
 
I just started getting the defocus/pop problem. It only happens a few times during the first few minutes after the monitor is turned on and once it has warmed up, it doesn't happen again. How concerned should I be about this?

Most likely, the culprit is the flyback transformer (FBT) and it is an easy fix that can be performed at any CRT Service Center.

Hope this helps...

Sincerely,

Unkle Vito!
 
The RED BIAS and GAIN levels may be an indication that a gun is faulty in your CRT, and the other guns are not able to compensate for the faulty gun. This can be checked with a Sencore CR7000 Beam Builder. The CRT can be tweaked in WinDAS/WinCAT, but it will never properly calibrate. After calibration/adjustment for the D93, D65 and D50 points is completed, as soon as sRGB (which adjusts automatically) is engaged, if a faulty gun(s) is present, the color cast will show and the unit will never calibrate. A known workaround trick used by some users is to stop the process as soon as the D50 (the last calibration point in WinDAS) is completed, then exit the program and save the work (I DO NOT RECOMMEND DOING THAT), but if you activate the color restore function on your unit at a later date, the white point balance data stored in the EEPROM will reset and the color cast will re-appear.

Hope this helps...

Sincerely,

Unkle Vito!

Hmm, I do not think the monitor gun is completely faulty. It's just a bit aged, it seems. I think that it's not really so bad since it only needs to be calibrated just a tiny little bit in WinDAS, then the color balance will look 100% perfect with the reds in sRGB (my eyes are very sensitive to the subtle reds of sRGB in all CRT's and LCD's, by the way).

I typed in the numbers wrong. It was from memory and I was off by a bit after I checked the OSD. The numbers are: R Bias 3, G Bias 56, B Bias 46, R Gain 100, G Gain 84, B Gain 84. The colors still look excellent--basically just as good as a relatively new CRT TV sitting next to this monitor and just as good as a calibrated PVA LCD monitor right next to this monitor too.

I just need to do some WinDAS calibration, since changing the Max value in WinDAS only increased the range of contrast value for me, so I got it to be more than 100% contrast whenever I wanted bright contrast in dark videos or games. Why do you not recommend doing that via WinDAS (stopping the calibration process after the D50)? I'd just leave it at that and be happy to enjoy my monitor more without ever messing with the sRGB image restore function.

Thanks for your help. Is there a calibration tool/instructions that I could "buy" from you? I'd like to know everything about what I need to do to try to fix it in WinDAS, and follow all of your advice. Sir!

By the way, I just checked again and all of my pictures still look excellent on this monitor. Many pictures actually look better than on my calibrated 24" PVA monitor sitting right next to it (the reds/blues/greens look pretty good). I'd very much appreciate just a little tweaking with the red phosphor in WinDAS.
 
Hmm, I do not think the monitor gun is completely faulty. It's just a bit aged, it seems. I think that it's not really so bad since it only needs to be calibrated just a tiny little bit in WinDAS, then the color balance will look 100% perfect with the reds in sRGB (my eyes are very sensitive to the subtle reds of sRGB in all CRT's and LCD's, by the way).

I typed in the numbers wrong. It was from memory and I was off by a bit after I checked the OSD. The numbers are: R Bias 3, G Bias 56, B Bias 46, R Gain 100, G Gain 84, B Gain 84. The colors still look excellent--basically just as good as a relatively new CRT TV sitting next to this monitor and just as good as a calibrated PVA LCD monitor right next to this monitor too.

I just need to do some WinDAS calibration, since changing the Max value in WinDAS only increased the range of contrast value for me, so I got it to be more than 100% contrast whenever I wanted bright contrast in dark videos or games. Why do you not recommend doing that via WinDAS (stopping the calibration process after the D50)? I'd just leave it at that and be happy to enjoy my monitor more without ever messing with the sRGB image restore function.

Thanks for your help. Is there a calibration tool/instructions that I could "buy" from you? I'd like to know everything about what I need to do to try to fix it in WinDAS, and follow all of your advice. Sir!

By the way, I just checked again and all of my pictures still look excellent on this monitor. Many pictures actually look better than on my calibrated 24" PVA monitor sitting right next to it (the reds/blues/greens look pretty good). I'd very much appreciate just a little tweaking with the red phosphor in WinDAS.

When you talk about your monitor and state "it's not really so bad since it only needs to be calibrated just a tiny little bit in WinDAS, then the color balance will look 100% perfect with the reds in sRGB (my eyes are very sensitive to the subtle reds of sRGB in all CRT's and LCD's, by the way)".... How do you know exactly whether the monitor is truly adjusted to Delta zero in WinDAS? And how are you measuring and confirming that all the white point parameters are within factory specs? Are you taking measurements with a colorimeter? If yes, which one are you using, and what is its accuracy for the unit's White (x/y), Color (RGB, x/y) and Luminance (Y)? Are you displaying the patterns at the correct dot clock, vertical refresh rate, and bandwidth?

You asked: "Why do you not recommend doing that via WinDAS (stopping the calibration process after the D50)? I'd just leave it at that and be happy to enjoy my monitor more without ever messing with the sRGB image restore function." A: Forcing erroneous calibration/adjustment parameters into the monitor's EEPROM, and/or stretching the monitor beyond its design limits-capabilities are the basic ingredients for driving your unit to failure and/or sudden death.

Beauty is in the eye of the beholder... The same analogy applies to the way a CRT may look to your eyes... If you like it, leave it the way it is.... But that does not necessarily mean it is correctly and accurately calibrated and adjusted to Sony factory specs...

As to "purchasing" written instructions on "how to" calibrate a display using WinDAS/WinCAT and professional calibration hardware (someone already accused me of being a salesman.....) I never really assembled a manual or a set of guidelines, but with so much interest in this area, I may have to find the time to start working on this project... If I ever complete one, there will way too much technical information in it, and the basics are absolutely necessary to include them in the manual.

Hope this helps...

Sincerely,

Unkle Vito!
 
When you talk about your monitor and state "it's not really so bad since it only needs to be calibrated just a tiny little bit in WinDAS, then the color balance will look 100% perfect with the reds in sRGB (my eyes are very sensitive to the subtle reds of sRGB in all CRT's and LCD's, by the way)".... How do you know exactly whether the monitor is truly adjusted to Delta zero in WinDAS? And how are you measuring and confirming that all the white point parameters are within factory specs? Are you taking measurements with a colorimeter? If yes, which one are you using, and what is its accuracy for the unit's White (x/y), Color (RGB, x/y) and Luminance (Y)? Are you displaying the patterns at the correct dot clock, vertical refresh rate, and bandwidth?

You asked: "Why do you not recommend doing that via WinDAS (stopping the calibration process after the D50)? I'd just leave it at that and be happy to enjoy my monitor more without ever messing with the sRGB image restore function." A: Forcing erroneous calibration/adjustment parameters into the monitor's EEPROM, and/or stretching the monitor beyond its design limits-capabilities are the basic ingredients for driving your unit to failure and/or sudden death.

Beauty is in the eye of the beholder... The same analogy applies to the way a CRT may look to your eyes... If you like it, leave it the way it is.... But that does not necessarily mean it is correctly and accurately calibrated and adjusted to Sony factory specs...

As to "purchasing" written instructions on "how to" calibrate a display using WinDAS/WinCAT and professional calibration hardware (someone already accused me of being a salesman.....) I never really assembled a manual or a set of guidelines, but with so much interest in this area, I may have to find the time to start working on this project... If I ever complete one, there will way too much technical information in it, and the basics are absolutely necessary to include them in the manual.

Hope this helps...

Sincerely,

Unkle Vito!


All right, thanks. I just thought that it would be safer to do it via WinDAS than having to change it in the OSD (which directly affects the voltages for the guns, something you have strongly recommended against). I do not have any calibration tools with me right now (only software programs like Nokia, Everest, etc.), so I have no idea how close the white point is to the factory recommended setting. It's very close to the sRGB default, though. I had no idea that keeping the white point as close to factory settings as possible was so important, but it must have been a good thing that I kept it really close after all. If I had been using the 9300K setting, the monitor would have been more likely to go kaput, right?
 
These are getting so hard to find and now I see I can "scratch" accurateit off my possibility list. :(
 
All right, thanks. I just thought that it would be safer to do it via WinDAS than having to change it in the OSD (which directly affects the voltages for the guns, something you have strongly recommended against). I do not have any calibration tools with me right now (only software programs like Nokia, Everest, etc.), so I have no idea how close the white point is to the factory recommended setting. It's very close to the sRGB default, though. I had no idea that keeping the white point as close to factory settings as possible was so important, but it must have been a good thing that I kept it really close after all. If I had been using the 9300K setting, the monitor would have been more likely to go kaput, right?

The monitor comes from factory calibrated and adjusted for D93, D65 and D50. The sRGB is done automatically after these three reference points are completed. Using a factory adjusted setting of 9300K, which is bluish, does not affect the unit at all.

The monitor's functionality will be directly affected by performing WinDAS/WinCAT hardware calibration/adjustments without having the required instrumentation to take accurate readings while performing the adjustments/calibration. This data (saved in a DAT file which is uploaded into the unit's EEPROM via WinDAS/WinCAT) is absolutely critical and crucial in order to accurately and professionally calibrate/adjust these monitors. This process, if done correctly and by the book, is not a walk in the park; it can take up to six (6) hours, and sometimes even more if functional issues are discovered along the way.

Hope this helps...

Sincerely,

Unkle Vito!
 
These are getting so hard to find and now I see I can "scratch" accurateit off my possibility list. :(

Not really... Grade A+ units fully calibrated and adjusted to Sony factory standards are still around, and a few brand new zero (0) hour units are also around...
 
Anybody able to modify the original (Snymon17.inf) driver to allow resolutions greater than 1600x1200 after installing Vista64? The original monitor driver is not Vista64 compatible.
 
I could really use some help. I just loaded my edited dat file to the monitor, the progress bar filled all the way, and I clicked OK. Now I have nothing on my screen. Any ideas?

Stupid me overwrote the original dat file, so if that is a recommended option, could someone upload theirs?
 
Here are the files that I am currently using on win7-64 bit...

SavedEDID.dat is my original:
http://headless.shackspace.com/FW900/SavedEDID.dat
NewEDID.dat is my replacement. It adds some key resolutions like 1680x1050 @ 100 Hz, 1280x800 @ 130 Hz, and 1600x1000 @ 120 Hz:
http://headless.shackspace.com/FW900/NewEDID.dat

testingmon.inf is the hacked 64bit inf file that I used for win7. It also sets the native res to 1280x800 so that win7's scaling shit doesn't fuck up everything.
http://headless.shackspace.com/FW900/TestingMon.inf
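
If you hand-edit one of these, it's worth at least sanity-checking the EDID checksum before loading it into the monitor: every 128-byte EDID block must sum to 0 mod 256, with the last byte acting as the checksum. A rough sketch, assuming the .dat really is raw EDID bytes:

```python
# Verify the checksum of each 128-byte EDID block in a file.
def check_edid(path):
    data = open(path, "rb").read()
    for i in range(0, len(data), 128):
        block = data[i:i + 128]
        if len(block) < 128:
            print(f"block {i // 128}: truncated at {len(block)} bytes")
        elif sum(block) % 256:
            want = (256 - sum(block[:-1]) % 256) % 256
            print(f"block {i // 128}: bad checksum, last byte should be 0x{want:02X}")
        else:
            print(f"block {i // 128}: checksum OK")

check_edid("NewEDID.dat")
```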
 
I could really use some help. I just loaded my edited dat file to the monitor, the progress bar filled all the way, and I clicked OK. Now I have nothing on my screen. Any ideas?

Stupid me overwrote the original dat file, so if that is a recommended option, could someone upload theirs?


If you did not save the original DAT file before making the adjustments, and the modified DAT file loaded into the unit's EEPROM has erroneous information and/or corruption, the only thing you can do is to MPU the unit, and then re-adjust everything (geometry, landing, focus, convergence, dynamic convergence, white point balance, and balance limiters) all from scratch. It will be a long and tedious process but it is do-able.

For white point balance and G2, you must have the appropriate instrumentation to take measurements/readings, and to generate the required patterns at the correct dot clock, refresh rate, and bandwidth. Also, there may be some adjustments that must be made inside the unit, such as on the flyback transformer (for focus), the deflection yoke (for landing and convergence), and at the tube's neck assembly (magnet poles, for convergence). You should have the unit's maintenance manual available to refer to these adjustments' ranges, if necessary.

Hope this helps...

Sincerely,

Unkle Vito!
 
Thanks for the feedback so far. Before I had checked back here for responses, I tried again to "load data to set" with a different G2 value and it worked.

My reason for even attempting this was to remedy greenish blacks with a brightness setting at around 25 with G2 of 148. I changed the value instead to 075, as I had read various posts with success at such a low value. In a last ditch attempt I set it to 125 and the monitor once again had a picture. For curiosity I then lowered it to 105, but got the same result as 075.

The blacks looked somewhat better, while everything else looked not so good. After allowing for the required warm-up time, I ran the image restoration function. So here I am at G2: 125, brightness: 50 and contrast: 70, but the image just looks dingy and still greenish.

Is my monitor a lost cause, or was I just dreaming that this would be an easy fix? Any assistance is still appreciated.

Thanks
 
Thanks for the feedback so far. Before I had checked back here for responses, I tried again to "load data to set" with a different G2 value and it worked.

My reason for even attempting this was to remedy greenish blacks with a brightness setting at around 25 with G2 of 148. I changed the value instead to 075, as I had read various posts with success at such a low value. In a last ditch attempt I set it to 125 and the monitor once again had a picture. For curiosity I then lowered it to 105, but got the same result as 075.

The blacks looked somewhat better, while everything else looked not so good. After allowing for the required warm-up time, I ran the image restoration function. So here I am at G2: 125, brightness: 50 and contrast: 70, but the image just looks dingy and still greenish.

Is my monitor a lost cause, or was I just dreaming that this would be an easy fix? Any assistance is still appreciated.

Thanks


A "green tint" and/or any kind of color cast including magenta, yellow, red, and blue may be an indication of a faulty gun(s) and/or faulty RGB transistor(s), or both. The FW900 has three RGB transistors: Red: Q101, Green: Q201, and Blue: Q301; that must be checked and they are located in the RGB video amp board (A Board).

This is the CORRECT way to fix the issue... I would check the tube for faulty gun(s), shorts and emission, and then I'd check all three transistors. If ALL check out OK, then the unit needs white point balance via WinDAS/WinCAT, and after it is performed correctly, the color cast should be gone. Now, if the tube does not check out OK, then it has reached the end of its life. If the tube checks out OK but any of the transistors do not, then the faulty transistor(s) must be replaced, and that is a relatively easy fix for someone with basic knowledge of electronics and above-average soldering skills. You must have the unit's maintenance manual, which is available on the internet.

A "work around" is to just perform a white point balance via WinDAS/WinCAT without checking the hardware, but understanding beforehand that there may be an issue with the tube, specially the gun(s) and/or the transistors. During the white point balance, if there is a faulty gun(s), the good gun(s) may compensate for the faulty gun(s) and the calibration/adjustment can be achieved; but if the tube has low emission, the luminance adjustment can go as high as 255 (0-255 WinDAS adjustment scale) and you still may not reach the factory target luminance. Any luminance adjustments above 200 during the SECOND PASS with the WHITE BACKGROUND is an indication of poor emission, and then it will be confirmed that the tube has reached the end of its life. Now, if during during the WinDAS/WinCAT white point balance process, you notice any increased level of adjustment in one and/or any of the guns, and you still see a color cast after all the adjustments are completed, then it will be confirmed that you may have a faulty gun(s) and/or faulty transistors, or both, and the unit cannot and will not achieve proper white point balance. You may have an "OK" looking image, but when grays, white, or blacks are displayed, you'll notice the color cast present.

Hope this helps...

Sincerely,

Unkle Vito!
 
This is ridiculous.

I performed a clean format of my system drive due to a few random issues, and once I'd set up my display driver again, the monitor settings (geometry etc) are now 100% perfect (everything is absolutely straight, with the 'recommended' space on the edges of the screen), when before I could never get them to be straight everywhere.

What could have triggered the monitor to do this?
 
Sensuki, did you actually use the geometry controls? Rotation, pincushion, pin balance, keystone, key balance, vertical & horizontal centering & size? In my experience these have always had more than enough range and precision to get virtually perfect edges. (I'd like twice as much precision in the horizontal centering control, but it's not a necessity.) Could you be specific as to how you were not able to get straight edges?

And what is this about "recommended" space on the edges of the screen? One of the awesome things about FD Trinitron is that you can get the image to go out precisely to the exact rectangular edges of the viewing area. I've always done that. That way the full aperture grill resolution can be used, and it just looks much better.

The only really significant problem I've had with FW900 geometry is that I can't adjust vertical pincushion. Both FW900s I've owned have had convex distortion on the top and bottom edges, a problem my GDM-F500 does not have. Can WinDAS/WinCAT adjust away this distortion?


P.S. My AccurateIT FW900 stopped doing the popping, and has stayed sharp. I've been running it at 100% full OSD gain (with brightness dialed down to get very deep blacks) for watching TV and movies. The scratch marks are virtually always invisible/unnoticeable during TV/movie watching; in occasional scenes the scuff mark will become visible (when something of high contrast appears under it).
 
Comment: You are correct in your assumption that the monitor was calibrated via OSD. On the brightness: a monitor properly calibrated/adjusted via WinDAS will have its brightness level set at 50% (factory setting), with perfect blacks at that level. On your unit, I would adjust the G2 level to bring the brightness up to the target 50%, which is the factory and recommended setting.
Well, the ideal black level is subjective. In a room with the lights not turned off, Brightness=50% might look good enough. But it's probably recommended because it tweaks the luminance curve to be very close to standard sRGB gamma, not because it's the ideal black.

I like blacks to be so deep that even in a completely dark room, with a fully black screen, and eyes adjusted to normal light levels, I can barely make out the rectangular shape of the display. This results in bringing the Brightness down to 32% or thereabouts. If this interferes with my enjoyment due to bringing out the phosphor persistence (motion trails against a dark or black background), then I'll dial up the black level a little.

Of course, dialing down the black level by lowering the brightness will result in a non-standard luminance curve, something that is not standard sRGB. But gamma adjustment via video driver or a device such as the HDFury will partially make up for this. A colorimeter-derived gamma ramp (loaded into the video card LUT) would completely make up for it, provided the colorimeter could take long exposures at very low luminance levels in order to get accurate readings. I've been putting off getting a colorimeter because I'd want to write my own calibration software, and I have doubts as to whether consumer colorimeters could be programmed to do what I want.

Comment: I reproduced your environment on one of our units at D65 and D93, took readings and measurements with our calibrators, and all the RGB and luminance readings were so far out of the calibration scale that I had to reset the unit to the original calibration/adjustment levels. Rule of thumb: if it isn't needed, do not set these controls to the maximum, because the tube will be stretched to and beyond its limits, and you'll run the risk of damaging the tube and/or reducing its lifespan.
Not sure what "reproduced my environment" means. I did not want to you set R,G,B gain to 100%,100%,95% exactly like I said. This was only an example, and it only works for me because I have color deficiency.

What I was discussing was a direct adjustment to Red,Green,Blue GAIN using the OSD, in order to get the desired white balance with a white point that's as bright as the OSD can achieve. This means at least one of the R,G,B GAIN settings will be at 100%, and at most two of them may be below 100% to get the desired white balance. Since calibrator software is probably not programmed to facilitate this, you'd have to improvise, using the colorimeter readings as a guide towards tweaking the GAINs manually, until you get as close to the desired white balance as the adjustments' quantization allows. The graph of maximum luminance in each channel versus its corresponding GAIN setting is monotonic and continuous. This means that it must be possible to do this.

At this kind of 100% gain setting, in my experience (on both FW900s I have owned), white is not washed out; gradations can still be seen in the brightest gray levels just below white. There is some bloom, but it's nothing major; all it mainly does is make the gaps between scanlines disappear, and make bright pixel graphics (such as text) less crisp and sometimes a tiny bit distorted.

I am assuming that what the OSD allows will not put unreasonable stress on the picture tube. For all I know, my original FW900 may have lasted longer than 5 years had I not driven it at 100% gain most of the time. But the immersion of running the monitor at maximum gain & contrast for certain modes of use is a big benefit for me. If I see some hard data, or even some anecdotal information giving a correlation between running the monitor at high brightness and shortening its lifespan, I may be better convinced.

It seems logical that lowering the brightness (to get a deep black level) would at least partially offset the possible damage that high gain would introduce. I didn't do that on my original FW900 very often, because my video card at the time had only an 8-bit DAC, and I used a transcoder to watch TV, bypassing the video card... so to get a decent gamma curve without posterization I had to use a black level that was not incredibly dark (I think I may have actually used Brightness=50% back then, for these reasons).

The 100% gain most likely breaks the native sRGB luminance curve (and the dialed-down black level even more so), but as I have discussed, this can be fixed using a colorimeter-derived gamma ramp to be programmed into the video card's LUT. Even if (with native gamma) the monitor exhibits hue shifts in a grayscale gradient, this can be fixed with three individual LUTs for red, green and blue. The graph of luminance versus input voltage for each channel (R,G,B) will always be monotonic and continuous (even if it gets close to flat in certain areas) which means that it can be corrected to match any curve you want, up to the limit of quantizer precision (so a 10-bit or higher DAC is needed to do this well). If a colorimeter and calibration software are not achieving it, it's probably because the software is insufficiently flexible, or the hardware is incapable of accurately reading low luminance levels.

As long as the native black point and native white point are correctly balanced, what goes in between should always be adequately correctable using gamma ramp LUTs. With a 10-bit DAC this should even result in low Delta-Es.
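
Here is a sketch of that per-channel idea, with invented colorimeter samples for one channel and a 1024-entry (10-bit) output LUT; real data would need many more points, especially near black:

```python
import bisect

# Hypothetical (input level, luminance) samples for one channel; monotonic by assumption.
samples = [(0, 0.0005), (32, 0.004), (64, 0.018), (96, 0.05), (128, 0.11),
           (160, 0.20), (192, 0.33), (224, 0.50), (255, 0.72)]
levels = [x for x, _ in samples]
lums = [y for _, y in samples]

def level_for_luminance(target):
    """Invert the measured (monotonic) curve by linear interpolation."""
    j = min(max(bisect.bisect_left(lums, target), 1), len(lums) - 1)
    x0, x1, y0, y1 = levels[j - 1], levels[j], lums[j - 1], lums[j]
    return x0 + (x1 - x0) * (target - y0) / (y1 - y0)

# Drive levels (10-bit) that make this channel follow a 2.2 power law between
# its measured black and white luminance.
black, white = lums[0], lums[-1]
lut = [round(level_for_luminance(black + (white - black) * (i / 1023) ** 2.2) / 255 * 1023)
       for i in range(1024)]
```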

What I've said here regarding LUTs is theory on my part, as I do not own a colorimeter, however I see no reason it should not be possible to put into practice. Everything I know about physics, optics, color theory, mathematics and computer science says it should be possible. Perhaps the existing calibration software and consumer-level colorimeter hardware is insufficiently flexible to do the job right, but barring that it should be possible.

I've been thinking of using my DSLR camera in RAW mode as a "colorimeter substitute" (maybe even more accurate than a colorimeter) to do the readings, but that would be extremely cumbersome. Getting the best precision would involve taking 766 exposures, one for each level from 0-255 of red, green and blue, and adjusting the shutter times to get good accuracy at the darker levels. Not to mention that the exposure times would need to be exact integer multiples of the CRT refresh period in order to avoid scan overlap. So, I haven't gotten around to doing this experiment. In the meantime, I've only calibrated by eye.
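
The integer-multiple constraint itself is trivial to compute; for example (the 85 Hz refresh and the target exposures are arbitrary, not anything I've measured):

```python
# Round an exposure time to a whole number of CRT refresh periods.
def whole_refresh_exposure(desired_s, refresh_hz):
    period = 1.0 / refresh_hz
    n = max(1, round(desired_s / period))
    return n, n * period

for desired in (1 / 250, 1 / 60, 1 / 15, 0.5):
    n, t = whole_refresh_exposure(desired, 85.0)
    print(f"want {desired:.4f} s -> shoot {n} refresh(es) = {t:.4f} s")
```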
 
And what is this about "recommended" space on the edges of the screen? One of the awesome things about FD Trinitron is that you can get the image to go out precisely to the exact rectangular edges of the viewing area. I've always done that. That way the full aperture grill resolution can be used, and it just looks much better.

Are you sure about that? I checked the manual, and though it doesn't state anything about putting a margin around the image, I have tried filling out the screen in the past. It didn't look like real 16:10 to me. A little stretched out.

I'm going out on a limb here, but the way you fill out the screen to the far edges is by increasing the strength of the magnetic field generated by the horizontal and vertical coils. One could hypothesize that this puts additional load on components already past their prime in lifespan.
 
It's not a 16:10 screen. I always filled the screen, but used resolutions that corresponded to its actual viewable dimensions...
 
Indeed, the GDM-FW900 is not 16:10. It's actually 25:16 (to within a quarter of a percent). The only standard resolution I'm aware of that's 25:16 is 1600x1024, so you'll have to create custom resolutions if you want anything else. I'd recommend 1920x1228 or 2000x1280 as a desktop resolution.
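
For anyone curious, here is a quick check of how far a few candidates sit from 25:16; anything more than a percent or two off will visibly stretch circles when the image is sized edge-to-edge:

```python
# Deviation of some resolutions from the tube's ~25:16 shape.
TARGET = 25 / 16   # 1.5625

for w, h in [(1600, 1024), (1920, 1228), (2000, 1280), (1920, 1200), (2304, 1440)]:
    err = (w / h - TARGET) / TARGET * 100
    print(f"{w}x{h}: {w / h:.4f} ({err:+.2f}% vs 25:16)")
```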
 
Indeed, the GDM-FW900 is not 16:10. It's actually 25:16 (to within a quarter of a percent). The only standard resolution I'm aware of that's 25:16 is 1600x1024, so you'll have to create custom resolutions if you want anything else. I'd recommend 1920x1228 or 2000x1280 as a desktop resolution.

Unless I missed something, the GDM-FW900 is a 16:10 aspect ratio display, and capable of displaying a 4:3 aspect ratio screen. The maximum 16:10 aspect ratio resolution is 2304x1440, and the maximum 4:3 aspect ratio resolution is 2048x1536. The recommended 16:10 aspect ratio resolution is 1920x1200. Please refer to the GDM-FW900 service manual, page # 1, and the marketing brochure, last page.

A few users force the monitor to run non-recommended resolutions and refresh rates, the so-called "stretched" settings, which for the most part the monitor is capable of displaying (the million-dollar question always is: how long can the unit sustain it?), but the owners often pay the ultimate price... Then they don't understand what went wrong with the unit once it ends up with irreversible damage. Basically... You play... You pay...

Hope this helps...

Unkle Vito!
 
Indeed, the GDM-FW900 is not 16:10. It's actually 25:16 (to within a quarter of a percent). The only standard resolution I'm aware of that's 25:16 is 1600x1024, so you'll have to create custom resolutions if you want anything else. I'd recommend 1920x1228 or 2000x1280 as a desktop resolution.

SONY MULTISCAN GDM-FW900 PREMIER PRO 24" FD TRINITRON CRT COMPUTER DISPLAY

Standard image areas:

16:10 aspect ratio:
Approx. 474 x 296 mm (w/h)
(18 3/4 x 11 3/4 inches)

4:3 aspect ratio:
Approx. 395 x 296 mm (w/h)
(15 5/8 x 11 3/4 inches)


Factory Preset Signal Timings:
VGA-G 640 x 480/60 Hz
VGA-Text 720 x 400/70 Hz
ESVGA 800 x 600/75 Hz
VESA 1024 x 768/85 Hz
VESA 1152 x 864/85 Hz
VESA 1280 x 1024/85 Hz
VESA 1600 x 1024/60 Hz
VESA 1600 x 1024/75 Hz
VESA 1600 x 1024/85 Hz
VESA 1600 x 1200/85 Hz
VESA 1920 x 1080/60 Hz
MAC-G4 1920 x 1080/72 Hz
VESA 1920 x 1080/75 Hz
VESA 1920 x 1080/85 Hz
VESA 1920 x 1200/60 Hz
VESA 1920 x 1200/75 Hz
VESA 1920 x 1200/85 Hz
VESA 2048 x 1280/60 Hz
VESA 2048 x 1280/75 Hz
VESA 2048 x 1280/85 Hz
VESA 2048 x 1536/75 Hz
VESA 2304 x 1440/60 Hz
VESA 2304 x 1440/75 Hz
VESA 2304 x 1440/80 Hz

W900 1600 x 1024/76 Hz
GWM-3000 1920 x 1080/72 Hz

Factory Maximum Refresh Rates:
1280 x 1024/115 Hz
1600 x 1200/97 Hz
1920 x 1080/108 Hz
1920 x 1200/98 Hz

NOTE: Any other refresh rates above the factory recommended settings will be considered non-recommended "stretched" settings.

Bandwidth Design Limits:
Vertical Refresh Rates range: 48 to 160 Hz
Horizontal Refresh Rates range: 30 to 121 kHz

Hope this helps...

Sincerely,

Unkle Vito!
 
SONY MULTISCAN GDM-FW900 PREMIER PRO 24" FD TRINITRON CRT COMPUTER DISPLAY

Standard image areas:

16:10 aspect ratio:
Approx. 474 x 296 mm (w/h)
(18 3/4 x 11 3/4 inches)

4:3 aspect ratio:
Approx. 395 x 296 mm (w/h)
(15 5/8 x 11 3/4 inches)
I don't know where you found those specifications, but they are subsets of the full viewable area. To use that 16:10 subset, you'd need to leave 4 mm of unused space on the left, 4 mm on the right, 6 mm on the top and 6 mm on the bottom. That would be wasted space.

Straight from the manual:
Viewable image size — Approx. 482.1 × 308.2 mm (w/h)

I've measured it myself to be 479.5 mm × 306.3 mm.

A few users force the monitor to run non-recommended resolutions and refresh rates, the so-called "stretched" settings, which for the most part the monitor is capable of displaying (the million-dollar question always is: how long can the unit sustain it?), but the owners often pay the ultimate price... Then they don't understand what went wrong with the unit once it ends up with irreversible damage. Basically... You play... You pay...
Most CRT televisions have lots of overscan. The way the overscan works is by actually scanning the electron beam beyond the edges of the viewable area. All we're talking about doing on the FW900 is making the image go out exactly to the edges of the viewable area. If overscan is safe for a CRT to routinely do, then edge-scan has to be at least as safe or safer.

NOTE: Any other refresh rates above the factory recommended settings will be considered non-recommended "stretched" settings.
What this is actually saying is, since "1600 x 1200/97 Hz" (for example) is the factory maximum 1600x1200, don't try to do 1600x1200 at 98 Hz or higher. That's all it's saying. The monitor won't let you do anything higher, anyway; it'll show an error message in an OSD-style dialog box ("OUT OF SCAN RANGE") instead of trying to scan the mode.
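
A back-of-the-envelope way to see why: the horizontal scan rate is roughly the refresh rate times the total number of scanlines, and it has to stay inside the 30-121 kHz range quoted above. The 4% vertical-blanking figure below is a guessed typical value, not a Sony number, so exact timings will shift the results slightly:

```python
# Rough scan-rate check for a candidate mode against the FW900's stated ranges.
H_KHZ_RANGE = (30.0, 121.0)
V_HZ_RANGE = (48.0, 160.0)
VBLANK = 0.04   # assumed fraction of extra (blanked) lines per frame

def mode_check(height, refresh_hz):
    h_khz = refresh_hz * height * (1 + VBLANK) / 1000.0
    ok = (H_KHZ_RANGE[0] <= h_khz <= H_KHZ_RANGE[1]
          and V_HZ_RANGE[0] <= refresh_hz <= V_HZ_RANGE[1])
    return h_khz, ok

for w, h, hz in [(1920, 1200, 85), (2304, 1440, 80), (1600, 1024, 100), (1600, 1200, 120)]:
    khz, ok = mode_check(h, hz)
    print(f"{w}x{h} @ {hz} Hz -> ~{khz:.1f} kHz horizontal: {'OK' if ok else 'OUT OF SCAN RANGE'}")
```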
 
I don't know where you found those specifications, but they are subsets of the full viewable area. To use that 16:10 subset, you'd need to leave 4 mm of unused space on the left, 4 mm on the right, 6 mm on the top and 6 mm on the bottom. That would be wasted space.

Straight from the manual:
Viewable image size — Approx. 482.1 × 308.2 mm (w/h)

I've measured it myself to be 479.5 mm × 306.3 mm.

Most CRT televisions have lots of overscan. The way the overscan works is by actually scanning the electron beam beyond the edges of the viewable area. All we're talking about doing on the FW900 is making the image go out exactly to the edges of the viewable area. If overscan is safe for a CRT to routinely do, then edge-scan has to be at least as safe or safer.

What this is actually saying is, since "1600 x 1200/97 Hz" (for example) is the factory maximum 1600x1200, don't try to do 1600x1200 at 98 Hz or higher. That's all it's saying. The monitor won't let you do anything higher, anyway; it'll show an error message in an OSD-style dialog box ("OUT OF SCAN RANGE") instead of trying to scan the mode.


The specifications are stated in the official Sony maintenance/repair manual. Sony engineers are the ones who designed the unit and issued the manuals. We, as service centers, followed factory-recommended instructions according to official factory repair/maintenance manuals and repair bulletins. The maintenance/repair manual is widely available on the internet, so before making statements, please download it, read it, and understand it; if you have any questions, I will try to give you a professional answer, but if you still have concerns, please contact Sony Technical Support directly.

Now, setting the unit to a pre-set factory recommended setting and making the image go edge-to-edge DOES NOT CHANGE THE REFRESH RATE OR THE RESOLUTION!... and it is perfectly safe to do that! The GDM-FW900 IS NOT A TELEVISION SET!!! Televisions reproduce totally different signals (NTSC/PAL) and have totally different scanning frequencies. In order for the GDM-FW900 to reproduce a TV signal, you would need a TV-to-VGA signal converter. Also, overscanning and/or forcing the GDM-FW900 to go beyond the recommended pre-set signal timings or so-called factory settings IS NOT RECOMMENDED!!! Like I said, the unit is designed to sync "stretched" settings within the ranges specified under "bandwidth design limits", but they are not recommended.

Hope this helps...

Sincerely,

Unkle Vito!
 
It ultimately comes down to choosing to run it by the book, or choosing to run it without regard for what the manual says. I don't think any of us are going to determine a better/worse or good/bad in this situation. As the owner of an FW-900, you have the right to set the thing next to an unshielded 50 kVA transformer if you see fit. Whatever floats your boat.
 
It ultimately comes down to choosing to run it by the book, or choosing to run it without regard for what the manual says. I don't think any of us are going to determine a better/worse or good/bad in this situation. As the owner of an FW-900, you have the right to set the thing next to an unshielded 50 kVA transformer if you see fit. Whatever floats your boat.

You are correct! Let the owner choose how he wants to run his unit. Then hear them complain about "why did the unit die so soon..." or "why is my unit having A, B, C, D issues...". I own a pair of GDM-FW900s with manufacturing dates of 2001/2002 that function and look as new and as sharp as the day I bought them... I keep them calibrated/adjusted to factory specs, and I've NEVER run them out of manufacturing specs. As I said before... You play, you'll pay the ultimate price!

Hope this helps...

Sincerely,

Unkle Vito!
 
I always ran it at 1600x1024 at 100Hz for years. Worked great. (And still going strong at my dad's...)

One should pick a resolution that matches the viewable dimensions and keeps the aspect ratio correct, circles as circles, and such....
 
I was using the television comparison to illustrate a point, that overscan is normal. Overscan in this context merely means that the electron guns are in an active state while their streams are aimed outside the viewable area, i.e., not hitting phosphors. Since we are in agreement that resizing a mode to go edge-to-edge is safe, there's no need to discuss this further. ;)

I do find it curious that Sony never admits that the GDM-FW900 is a 25:16 monitor, not even in the Service Manual, even though they list some "Standard image areas" and are obviously aware that the 16:10 image area doesn't match up with the Viewable Image Size, which is stated on the same page (the first page). However, they made an awesome CRT monitor, so I forgive them — but tsk, tsk, tsk at them for not following it up with an even better CRT! Just think how awesome it could be if they hadn't given up on CRTs. My GDM-F500 is actually sharper than both my FW900s, and keeps a better black level with ambient light hitting it. It has a better OSD, too. It's like Sony "forgot" some things in-between the GDM-F500R and the GDM-FW900. It would've been great to have a GDM-FW950 with the F500's constant 0.22mm grille pitch and other advantages, along with the FW900's size. (The FW900 is worse than spec in aperture grille pitch — 0.2382 mm to 0.277 mm, not just 0.23mm to 0.27mm. But the F500 actually is 0.220 mm.)

But anyway, I'm loving having a FW900 again, even with its Grade B scratches. It is absolutely beautiful to watch HDTV on, and blows away my 30" 3007WFP-HC even though it's much smaller (and necessitates sitting closer). But for viewing and editing photos, the 30" LCD wins.
 
Unkle Vito, how often do you keep those 2001 & 2002 GDM-FW900s turned on? Which factory modes do you run them at, and can you estimate how many years or hours they've been powered on and active while in your possession?

Have you owned any FD Trinitrons that failed despite similar good treatment?
 
Unkle Vito, how often do you keep those 2001 & 2002 GDM-FW900s turned on? Which factory modes do you run them at, and can you estimate how many years or hours they've been powered on and active while in your possession?

Have you owned any FD Trinitrons that failed despite similar good treatment?

Since I acquired the units brand new in the wrapper, I have used my GDM-FW900s an average of 3-4 hours a day, 5 days a week, and when the units are not in use, I TURN THEM OFF!!! (That means POWER DOWN THE UNITS, NOT LEAVING THEM ON STAND-BY!!!) That is one of the many recommendations I make to all my clients. When I am doing CAD/CAM/CAE work, I run the units at 2304x1440 @ 80 Hz. When I am performing video editing or working on my photos, I run the units at 1920x1200 @ 85 Hz.

Maintenance of my units: I perform white point balance, landing, focus, geometry, convergence, balance limiters, and luminance adjustment/calibration via WinDAS every six (6) months. Also, I clean the inside every time I perform the adjustments.

As an end user, I have had many Sony GDMs, and out of all the units that have passed through my hands, I have only had two (2) defective units: one (1) GDM-F520 (tube/FBT issue) and one GDM-FW900 (tube issue). I had the tubes and the FBTs replaced by Sony under warranty.

Hope this helps...

Sincerely,

Unkle Vito!
 
OK, I am not trolling, so stop.
I still feel you need to come out of the stone age!

You should take your own advice. Possibly excluding a really good local-dimming set, LCD is nowhere near the quality of an FW900. Arguments from authority get you nowhere, and the technology speaks for itself. LCD's are brighter, and usually have somewhat sharper text, but everything else goes in the FW900's favor: pixel density, refresh rate, response time, color, and contrast. If you like your NEC, good for you. It's not superior technology.

edit: Woah, old post I'm replying to. Lol, carry on.
 
OK, I am getting ready to upgrade my entire system. I have an FW900 in perfect condition. My question is: I was considering replacing it with an IPS panel LCD, namely the HP LP2475w 24".

Will I be satisfied with this coming from a FW900?

Are there ANY flat panels out right now that come close to the FW900 yet?
 
I just got my FW900! Got it from a guy who bought it brand new. Outside it's in mint condition; you can't tell it apart from a new one. But the problem is the image: it has two (2) very thin lines going across it horizontally, and it has the green tinge. Image restoration works for the green, but it comes back after a restart. I'm beginning to think that this monitor is a lost cause, maybe?

edit: I was able to get rid of the green tinge. AAAAmazing black, I must say, after 8 different TFTs with TN and non-TN panels. Now the only real problems are the distortion in the upper corners and the two one-pixel-thin lines across the screen. Could this be caused by the cable I use? I have a VGA cable with a DVI-VGA adapter from my GPU (Radeon 5870).

1920x1200 @ 85 Hz works straight from CCC on Windows 7. Can I use 98 Hz safely with PowerStrip or something like that?
 
That's a shame; luckily those lines aren't that visible. I have tried to read up on how to get those maximum safe refresh rates with a Radeon but can't seem to do it. I even tried PowerStrip, but it didn't allow me to put in more than 85 Hz at any resolution. Isn't 98 Hz the maximum for 1920x1200? And should I try to sharpen the picture with the two "screws" when my 1920x1200 resolution is a bit fuzzy on text but crystal sharp with, for example, 800x600 resolution in games etc.? Does that mean that the monitor is in good shape but just doesn't display higher resolutions as sharply?
 