24" Widescreen CRT (FW900) From Ebay arrived,Comments.

BTW, I really love CRTs, I'm just not going to pretend they don't have inherent flaws in picture quality, because they do, and some are quite severe. The same goes for the FW900, which does have some really serious flaws, and in my book it was, in some respects, absolutely the worst CRT monitor I ever owned.

By saying I should stick with LCDs, what exactly is meant? Just because I notice how bad the FW900 really is, how useless and overhyped the methods described in this thread are, and how stupid some advice like "remove the AG" is, doesn't mean I should use LCDs... it only means that the people doing those things should use LCDs. People who care more about sharpness, brightness, geometry and some imagined color accuracy than about actual image quality. LCDs are super bright, sharper, have perfect geometry and are much easier to calibrate, especially hardware-calibrated monitors. If someone values sharpness and good convergence to the point that they are his only measure of quality, and such an idiotic thing as removing the AG yields "better image quality" for him because it enables slightly better sharpness, then perhaps he is one of the people who should stick to LCDs. They have improved considerably recently, to the point that for gaming purposes there is no need to use a CRT.

I fixed my FW900 without using any of the methods suggested here. Using the suggested method of removing the AG, which was supposed to be such a great mod, I only broke it further...

AG-less vs with polarizer applied
CeM8n2c.jpg


and for comparison, a similar picture taken by another member while removing the original AG
REMOVEAG-4.jpg

(URL to whole page if image does not load)

The original AG barely made the screen any darker, while actually having a much bigger impact on sharpness because of its worse optical properties.

I can appreciate both good CRT and good LCD.
The stock FW900 was not really worth appreciating, and the FW900 without AG was definitely not worth appreciating. If it were not bigger than other CRTs and 16:10, I would not use it (and would not have bought it in the first place), almost no one would use it, there would be no such thread on [H], and no one would hold it up as an example of "CRT goodness". There are far superior CRTs out there that lack all of the issues I mentioned, with nice dark screens that can be used with the lights on in the room, no G2 issues, higher maximum refresh rates, even better sharpness, etc.

Only with the polarizer is it actually on par with the best CRTs in terms of picture quality, and it being widescreen makes it undoubtedly the best CRT monitor evar :cool: Now, having one of the best LCDs and the best CRT monitor ever, I can wait for strobed 4/5K OLEDs in peace :D
 
Different strokes for different folks I guess. I only suggested the Eizo because you like that dark black look, and apparently it has good blacks for an LCD. What exactly do you mean by "fake" contrast ratio?
 
The contrast ratio is measured by a probe at the perfect angle; in reality it is only seen in two small spots, each eye sees it in a different place, and the spots move along with your head. IPS has a similar issue, but the image only deteriorates visibly beyond a certain angle, which is enough to make most of the screen look good. VA also has a gamma shift which is just hideous, and no matter how well it is calibrated it will give bad colors. Some VAs are so bad that a typical TN is better. Just bleh... I should never have bought a VA screen and should have gone straight to IPS.

Now I use A-TW IPS panels and they have none of those issues. I can move my head all I want and it always looks the same, just like on a CRT. Comparing professional CRTs with professional LCDs reveals that CRTs were not that good for professional use, and here the improvement has been quite impressive. Multimedia-wise, with today's technology we could also have monitors that would easily rival CRTs in overall image quality. Those would be expensive, but still less expensive to make than a CRT :)

That most LCDs are shite is the result of a lack of demand for quality. Most of my friends who play games swapped their good 17" CRTs many years ago for crappy 17" and 19" 60Hz LCDs with really bad colors. What was the reason? Mostly more desk space...
 
That most LCDs are shite is the result of a lack of demand for quality. Most of my friends who play games swapped their good 17" CRTs many years ago for crappy 17" and 19" 60Hz LCDs with really bad colors. What was the reason? Mostly more desk space...

Same can be said for audio quality, I'm afraid. Most people who like audio quality also have an eye for visual quality - at least in my personal experience with people I know.
 
I would rather ask a different question: why are these effects so strong on the FW900 in the first place?

I have used CRTs my whole life and I do not remember any of them having these kinds of issues on the same scale as the FW900 did.

It might have something to do with the thickness of the phosphor layer (also known as phosphor screen weight). Thicker phosphor layers have a longer lifetime, but they also provide an opportunity for more scattering of light as it makes its way from the back of the layer to the front.

In WinDAS I only changed a setting that allowed for higher luminance/contrast. Color-wise, nothing was really necessary.
Which settings relate to tube health, and why?

First off, controlling overall luminance (which is determined by beam current) is important, as higher beam currents wear out the tube faster. The phosphors age faster, and more impurities are generated by the increased electron bombardment. Impurities are bad, as they accumulate on sensitive structures within the tube, reducing gun efficiency and eventually causing shorts.

I don't have a good grasp on these next two, but:

Second, the relationship between the G2 voltage and the cutoff voltage has pretty dramatic consequences. If things are not well calibrated, the electric fields around G1 cause the aperture through which the electrons flow to "appear" smaller to the electrons. This means that they are drawn from a smaller area of the cathode, which intensifies cathode loading.

Third, the cutoff voltage has an impact on the burden on the amplifier circuitry, as it determines how much the amplifier has to "swing" to generate the necessary voltages to produce the desired beam currents. I think it may also have an impact on focus considerations.

WinDAS WPB systematically adjusts ALL these various parameters to factory specs, including cutoff voltages, which ensures optimal operation. You simply cannot adjust the tube to the same state using only the OSD.
 
Got two more victims into my sadistic lair of electro-shocks and electro plugs into the rear end. :D

A Mitsubishi television made in 1991 and a Toshiba television made in 2004. I may keep the Toshiba actually. Three video inputs baby - all separate! :eek:
 
and for comparison, a similar picture taken by another member while removing the original AG
REMOVEAG-4.jpg

(URL to whole page if image does not load)

The original AG barely made the screen any darker, while actually having a much bigger impact on sharpness because of its worse optical properties.

what the hell. well that explains your thoughts
the ag on my cpd-g520p is way darker than whatever's in that picture. maybe the ag of that monitor got worn down? or maybe the fw900's ag isn't that dark by design.
 
just spent half an hour trying to get transmission measurements. Not satisfied with the way I set it up. Hard to keep the smart phone (which I was using as a flashlight) at the same angle consistently, etc.

But got some decently consistent measurements.

With AG: 4.5 nits
Without AG: 10 nits.

(4.5/10)^0.5 = 0.67. The reflected light passes through the coating twice (on the way in and again on the way back out), so the measured ratio is the transmittance squared.

So ~67% transmittance, just as flod suspected :)

Crazy, so removing the AG means the tube only has to produce about two thirds of the beam current to achieve the same brightness!

I may repeat this in the future with a better setup. If I have a regular flashlight, I can create a consistent setup from screen to screen fairly easily.
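
For anyone who wants to play with the numbers, here is a minimal Python sketch of the arithmetic above. The only assumption is that the flashlight's light crosses the coating twice (on the way in, and again on the way back out after bouncing off the phosphor/glass), which is why the square root appears.

Code:
import math

# Reflected luminance of the same flashlight spot on the tube face (nits)
with_ag = 4.5      # spot still covered by the antiglare coating
without_ag = 10.0  # bare spot where the coating was removed

# The light crosses the coating twice (in, then back out), so the
# measured ratio is the single-pass transmittance squared.
transmittance = math.sqrt(with_ag / without_ag)
print(f"single-pass transmittance ~ {transmittance:.2f}")   # ~0.67

# To reach the same on-screen brightness through the coating, the beam
# current has to scale by roughly 1/transmittance (~1.5x); equivalently,
# removing the coating lets the tube run at about 2/3 of the current.
print(f"relative beam current with AG ~ {1 / transmittance:.2f}x")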
 
So in effect, when we remove the AG and do a calibration, we reduce the amount of effort needed by the monitor by about 33%? If so, that would explain why my monitor's drive slider only needs to be set to 58 to reach 105 cd/m2 for the WPB at 6500K.
 
[specular] reflectance of g520p with antiglare coating vs. fw900 without antiglare coating: (taken on iphone 5 with exposure locked)
fw900: http://i.imgur.com/26TIzI7.jpg
g520p: http://i.imgur.com/X5sg8dr.jpg

spacediver could you replicate this comparison for your monitors? probably they'll be similar but it would be interesting to see if the ag on the g520p is optically different from the one on the fw900
 
Sure, I might be able to get a couple quick snaps tonight, though measurements may have to wait until I get an actual flashlight.
 
hmmm... so if I use an AG with even lower transmittance, then my monitor must put out even more current to achieve the same brightness... well, who cares...
BURN BABY, BURN!
ogien.gif


In my case, having no AG contributed to tube health mostly because most of the time the monitor was unusable and no fun to use at all... so most of the time it was not used. Even when the lighting conditions were perfect it still didn't look very good, definitely not the way a CRT picture should look... so I used the LCD instead and the CRT was just a piece of furniture...
 
xor fyi on my cpd-g520p with antiglare, when you push luminance above 110 or so (can't remember exactly) the luminance curve becomes s-shaped i.e. like
ugazeKO.png


this threshold of ~110 becomes lower with darker filters on the screen. so you may want to check that this isn't happening for your fw900 with polarizer when used at whatever settings you usually use
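
Not flod's method, just one rough way to check for this: take a grayscale luminance sweep with a colorimeter, fit a plain power-law (gamma) curve, and see whether the residuals flip sign between the mid-tones and the top end, which is what an s-shaped response would produce. The signal levels and luminance values below are invented placeholders, not real measurements.

Code:
import numpy as np

# Hypothetical grayscale sweep: video signal level (0..1) vs. measured
# luminance in cd/m2. These numbers are placeholders only.
signal = np.linspace(0.1, 1.0, 10)
measured = np.array([0.9, 3.2, 7.1, 13.0, 21.0, 31.5, 44.0, 59.0, 76.0, 120.0])

# Fit a simple power law L = peak * signal**gamma in log-log space.
gamma, log_peak = np.polyfit(np.log(signal), np.log(measured), 1)
fitted = np.exp(log_peak) * signal ** gamma

# An s-shaped response shows up as residuals that change sign systematically
# between the mid-tones and the top of the range.
relative_error = (measured - fitted) / fitted
print(f"fitted gamma ~ {gamma:.2f}")
print("relative residuals:", np.round(relative_error, 3))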
 
Ok tried redoing the transmittance measurements with a flashlight.

Taped the flashlight, colorimeter, and a baffle to a shoe box, as seen below:

jl3tcn.png


Then, to take consistent measurements, I held the box flush against the screen:

ta4eqg.png


I wasn't able to get consistent readings after removing the box and putting it back in the same spot. I think it might have to do with the deformability of the box. Next time I'll try a more robust platform. I had the same problem even when I placed the box so that its corner fit against the corner of the display surface.

Anyway, loose measurements indicate a transmittance of about 60%, and my IBM p275 (CPD-G520p rebrand) was about the same.

Next, I took photos of the flashlight against the screen, as per flod's request. The first two are of the FW900, one with and one without AG, although I can't remember which photo corresponds to which! The last one is the P275.

712ryc.png


v63ihg.png


cpxc0.png
 
could you set a lower exposure? I can't see any difference between the first two, but that may be from overexposure
maybe it would be easier if, instead of a flashlight, you used a white screen on your phone

not sure exactly where the flashlights are positioned but what I'm looking for is the mirror reflection of the flashlight's bulb(s) and whether the reflections are tinted. e.g. in my cpd-g520p picture the first reflection is bluish

btw just redid wpb as my g520p's black level has risen a bit over the last 8 months. contrast ratio drifted from 5000:1 to ~1500:1. white balance was still near perfect though. the cmaxbmax or whatever slider was at 101 this time, which is almost identical to what I set one year ago.
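
As a side note on those numbers, a drift like that corresponds to only a tiny absolute rise in black level. A quick illustration, assuming a post-WPB white of roughly 87 cd/m2 (an assumption; the post itself only gives the ratios):

Code:
# Back out the black level implied by each contrast ratio, assuming a
# white level of ~87 cd/m2 (the post only gives the ratios).
white = 87.0

for label, ratio in [("a year ago", 5000), ("now", 1500)]:
    black = white / ratio
    print(f"{label}: {ratio}:1 implies a black level of ~{black:.3f} cd/m2")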
 
did some fast measurements
I am able to hit around 120cd/m2 maximum
no gamma distortions

I cannot get the damned thing to work on my main OS, Win10
will try to set it up in VMware and then try to calibrate it from there in such a way as to not disturb my work
 
could you set a lower exposure? I can't see any difference between the first two, but that may be from overexposure
maybe it would be easier if, instead of a flashlight, you used a white screen on your phone

Yea, I just took some quick snaps with my smartphone. I'll try your idea later tonight :)
 
did some fast measurements
I am able to hit around 120cd/m2 maximum
no gamma distortions

I cannot get the damned thing to work on my main OS, Win10
will try to set it up in VMware and then try to calibrate it from there in such a way as to not disturb my work

ok good, I was worried that you'd boosted your peak luminance to some ridiculous value. After the WPB adjustment, the peak luminance is around 87 cd/m^2, but I don't think 120 is high enough to cause severe distortions.
 
if I had two identical monitors without AG next to each other:
the first set to 100cd/m2 in the OSD using contrast @ 50%, with bias and gain settings to tweak the colors
the second set to the same peak luminance and xy point by doing a WPB in WinDAS, so that it had 100cd/m2 at 100% contrast.

What would be the difference in image quality between these two?
 
I'm not 100% sure, but I'd guess that there wouldn't be a significant difference in image quality between those two.
 
well, 120 with the polarizer would be 240 without.
regardless, it's probably not a good idea to push it that far
 
yea, if you had two screens next to each other, one with AG and one without (unlike your question, where they both lack AG), and they measure the same peak lum, then the one with AG will probably have a blurrier image.
 
Why the obsession with sharpness?
Maybe you should concentrate on 4K LED monitors if monitor longevity and sharpness are your thing?

If the only way for the FW900 to have good image quality is to put a polarizer on it and let the phosphor and guns burn twice as hard, then I am more than happy to do it. And I do not have any "issue", not anymore. It is just freaking perfect now, pure bliss to look at.

Imho the best thing one could do to it is fix what the SONY engineers screwed up.
But on the other hand, preserving an endangered species is a very noble cause. So remove the AG, set the brightness to a level at which the daytime contrast ratio is something like <10:1, and then never power it on at all because of how horrible the image looks :eek:
 
they didn't screw up anything. the monitor is intended to be used in reasonably light-controlled environments
 
yea, if you had two screens next to each other, one with AG and one without (unlike your question, where they both lack AG), and they measure the same peak lum, then the one with AG will probably have a blurrier image.

from what i can see, a clean ag doesn't directly blur the image
but having the same luminance with an ag needs more electrons, and i think i've read somewhere that the focus is worse with more electrons because they come from a larger region of the cathode or something

ultimately i think the age of the tube is the most important thing for sharpness
 
but having the same luminance with an ag needs more electrons, and i think i've read somewhere that the focus is worse with more electrons because they come from a larger region of the cathode or something

Yes, this is part of it. I believe the emissive area of the cathode is related to the relationship between the cathode voltage (i.e. how hard you're driving the tube) and G1. In addition, the optimal focus voltages (at the focus grids) also depend on the relationship between the cathode voltage and G2, so when the drive voltage changes that much, the optimal focus voltages need to be recalibrated. Not sure if the GDMs do this automatically.

Also, higher beam currents mean more electrostatic repulsion within the beam itself, which means that the spot size increases. This is probably the most important factor, and it is why you get so much blooming at high drive voltages.

Finally, there is more electron backscatter.

These are the factors I was considering in my last post. I didn't mean to imply that the AG directly changes the focus.
 
In a well-lit environment, text is legible enough that you can view two A4 pages at once, so I guess it does what it was advertised to do... so the SONY engineers did a very fine job on it :)

It is just bad for fancy multimedia and gaming stuff. To meet those market needs, SONY had a lot of cheaper 4:3 tubes which sacrificed sharpness for contrast ratio and were a much better fit for the 4:3 content of the time.

BTW, I can detect polarization differences between my three monitors. The SONY and NEC have a clearly horizontal yellow pattern on them, the LG a vertical one. To test it, I checked a laptop that was lying around and it clearly had a patch at 45 degrees, in exactly the direction in which a piece of polarizer darkened the image when tested against my monitors. It is barely noticeable, but it is there.
 
If you look at any Trinitron televisions, you'll notice that their antiglare is much darker than on the GDM monitors. My TV looks black by comparison. My guess is that it's because Sony knew they would be used in well-lit environments. Even with all the lights on in the basement, the picture looks just fine.
 
they didn't screw up anything. the monitor is intended to be used in reasonably light-controlled environments

precisely. I'm guessing XoR's working environment looks nothing like this - this is from The Mummy: Tomb of the Dragon Emperor (2008)

fw900_mummy-tomb-design.jpg
 
and sony bvm crt's have no antiglare at all :p

BTW, I can detect polarization differences between my three monitors. The SONY and NEC have a clearly horizontal yellow pattern on them, the LG a vertical one. To test it, I checked a laptop that was lying around and it clearly had a patch at 45 degrees, in exactly the direction in which a piece of polarizer darkened the image when tested against my monitors. It is barely noticeable, but it is there.

not sure what you mean... for lcd monitors it's expected, since they work by using crossed polarizers
 
my lighting conditions are absolutely normal and very typical
It is morning on a cloudy day. With the curtains closed the room seems quite dim, definitely dimmer than most working environments the FW900 was intended for. Comparing the FW900's blacks with the W2420R's at similar white levels and... they are very similar too. With the original AG the blacks on the FW900 would be far worse, and without any AG they would be just terrible.

It would have to be pretty much night, with no other monitor on, before the AG-less FW900's screen color when powered off looked better than the W2420R's blacks when it is on, especially comparing operation at ~100cd/m2. I am utterly amazed how this serious issue can be dismissed in this way by so many people...

BTW, humans do have a means to detect polarized light; it's called Haidinger's brush
 
my lighting conditions are absolutely normal and very typical
It is morning on a cloudy day. With the curtains closed the room seems quite dim, definitely dimmer than most working environments the FW900 was intended for. Comparing the FW900's blacks with the W2420R's at similar white levels and... they are very similar too. With the original AG the blacks on the FW900 would be far worse, and without any AG they would be just terrible.

If you had done a WPB in WinDAS your blacks would be fine. Did you ever adjust your G2? Prad.de measured the W2420R's black level at between 0.04 and 0.26 nits, depending on the state of the display. My FW900 measures 0.0006 nits after an adjustment.
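
To put those black-level figures in perspective, here is the same sort of back-of-the-envelope arithmetic as above, assuming a white level of around 100 cd/m2 (the level XoR mentions working at; only the black levels come from the measurements quoted here):

Code:
# Contrast ratios implied by the quoted black levels, at an assumed
# white level of ~100 cd/m2 (the black levels come from the post above).
white = 100.0

black_levels = {
    "W2420R (Prad.de, best case)": 0.04,
    "W2420R (Prad.de, worst case)": 0.26,
    "FW900 after WinDAS WPB": 0.0006,
}

for name, black in black_levels.items():
    print(f"{name}: ~{white / black:,.0f}:1")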

I am utterly amazed how this serious issue can be dismissed in this way by so many people...

maybe because most other people don't have this issue, and haven't boosted their drive to an out of spec level?
 