24" Widescreen CRT (FW900) From Ebay arrived,Comments.

I'm considering removing the anti-glare coating on my FW900; it has a few scratches on it. What are your thoughts on this?

The room it's in is pretty well light controlled, so I don't worry about reflections too much. Is an increase in reflections the only argument against removing the anti-glare coating?

I'm curious whether there are significant benefits to doing so. I've heard that text becomes sharper without the coating and that the monitor outputs more light. I've also heard that a meter takes more accurate readings without the coating, so a calibration can be more accurate.

The scratches are not TOO bad, so I am still weighing the pros and cons of removing the coating. Since I can control the lighting in my room, what do you think about the supposed image quality benefits of removing the coating?

Thanks.
 
If you don't mind waiting until the end of December, I can provide you with some comparison results. Personally, I love the look of the screen with the AG removed. It just looks glassier and glossier, and I love that quality. As for color accuracy, it would be interesting to compare the chromaticities of the primaries with and without the AG. White balance will be unaffected, since you can tweak it with high precision anyway to compensate for any chromatic changes the AG introduces.

It certainly makes it brighter, but when you're calibrating with WinDAS you adjust the drive levels so you end up with the same brightness either way (it just means that with the AG on you need to run the tube a bit harder to get the same brightness).

One thing that the AG might be good for is reducing halation due to internal reflections (the glow around bright objects) - that's the thing I want to test.
 
the coating just makes the monitor darker. it doesn't affect colors. my dtp94 measures the same x,y coordinates for the primaries on my g520p with the coating as spacediver gets on his fw900 without it. it doesn't affect sharpness at all if it's kept clean, though due to the nature of the coating, even a few hundred nanometers of finger oils on it will show up quite visibly. calibrating it isn't really any different.

i'd keep it on because it's really not replaceable. if you ever use the monitor in a room that isn't light controlled, you'll probably regret it :p
 
I wouldn't remove it. Unlike cheap consumer LCDs, high-end CRTs generally have a proper anti-reflective coating. Note the key distinction between anti-reflective and anti-glare: anti-reflective films actually prevent some degree of reflection from external light sources, while anti-glare films are the garbage that refracts incoming and outgoing light and ruins image quality.

You may be severely disappointed when you remove the AR only to find that:

1. The picture quality hasn't improved at all (sharpness, colors, contrast, etc)

2. Reflections now bother you
 
If you want my opinion: it does not hurt to remove it, but only consider it if you use your FW900 in a room with no direct lights or sunlight that can fall on the monitor.
In reality, monitors both WITH and WITHOUT the AG coating look very good. Black level is beautiful with the AG removed and with it on, and so are sharpness and colors. You can always remove it, but once it's off you can't get it back on!!!

People say removing it improves colors and can "restore" a screen with blurry text output. So if you play in a dark room (as most gamers do, right?), you could consider taking it off. I posted some links below on how to remove the monitor's bezel (so you can access the coating and its glue) and on the best procedure for removing the AG coating.

WARNING: don't do it without reading all of this. SOME PARTS INSIDE THE MONITOR CAN GIVE YOU AN ELECTRIC SHOCK EVEN WHEN THE MONITOR IS COMPLETELY OFF THE ELECTRICAL GRID!! So be careful and follow the step-by-step guide.

Uncle Vito has this to say on the matter:

ANTIGLARE ISSUE: There are no fixes for the antiglare, as it is a thick film pressure-adhered to the screen at the factory. The only thing that can be done is to remove it. At the customer's request, I do that quite often, as I've observed through measurements that the antiglare film interferes with the calibration process, and we've obtained different calibration data with and without the antiglare. Like I said, it is the customer's preference. Now, to remove the antiglare, you have to open the case and then detach the bezel. Before you do that, MAKE ABSOLUTELY SURE THAT THE MONITOR IS OFF AND DISCONNECTED FROM THE POWER SOURCE! Removing the thick antiglare film is very tricky, and you must be patient! Carefully lift the film from any corner of the screen using a blade, and once you have enough film detached, start pulling it off with your fingers. The pulling must be done slowly, from the top and bottom corners, working inwards towards the center of the screen. If you pull too hard, you may tear the film, leaving a thick coat of glue and pieces of film that are not easy to remove.

I had links to the guides within this thread, but the post counts changed and all my links are broken. Maybe someone else can direct you to the photo guide for bezel removal and for removing the anti-glare coating.
 
I'm gonna do some thorough comparisons this December. I need to buy a second video card so I can run my units side by side and do some proper analysis.

I have three working FW900s, two of which have the coating removed. I'll calibrate them all to the same specs and do the following measurements (in addition to making side-by-side subjective picture quality comparisons in a light-controlled room).


1) chromaticity of primaries
2) analysis of the line spread function (looking at the extent and nature of the bleed of a white rectangle into the surrounding black background; I'll do this with both horizontal and vertical bars). I'll be using a method based on the work flod and I did here: my plan is to take an image using the same setup and analyze the luminance information. A rough sketch of that analysis is below.

Having two units with coating removed will also allow me to get a sense of how much the variance is due to the coating and how much is due to simple inter-unit variation.
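
As a rough illustration of the analysis in (2), here is a minimal Python sketch. The filename is hypothetical, and pixel values are treated as luminance, so a careful version would first linearize the camera's response:

```python
# Sketch of the bleed analysis: photograph a vertical white bar on a black
# background, then inspect the horizontal luminance profile around the bar.
import numpy as np
from PIL import Image

# Hypothetical filename; assumed to be a photo taken with fixed exposure.
img = np.asarray(Image.open("white_bar_photo.png").convert("L"), dtype=float)

profile = img.mean(axis=0)   # average the rows to suppress noise
profile /= profile.max()     # normalize to the peak luminance

# Extent of the bleed: columns where the profile exceeds 1% of the peak.
lit = np.flatnonzero(profile > 0.01)
print(f"light spreads across {lit[-1] - lit[0] + 1} pixel columns")
```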

If anyone has any suggestions or requests for other measurements, let me know!
 
I have the GTX 660 and as far as I can tell, it only has one DVI-I; the other DVI is DVI-D. It has HDMI and DisplayPort too, but those won't help me.
 
could be another active/passive thing

i think i've read that the image is darker for passive splitters due to the voltage getting halved or something
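
A quick back-of-the-envelope check of that idea, assuming an ideal 75 Ω source driving 75 Ω monitor terminations (a simplified model; real splitters vary). In this model the amplitude drops to about two-thirds of normal rather than exactly half, but the image would indeed be darker:

```python
# Voltage divider: V_load = V_open * Z_load / (Z_source + Z_load)
Z_SRC = 75.0                          # nominal VGA source impedance, in ohms

one_monitor = 75.0 / (Z_SRC + 75.0)   # single 75-ohm load -> 0.50 of V_open
two_monitors = 37.5 / (Z_SRC + 37.5)  # two loads in parallel -> ~0.33

# Relative to normal operation, the signal amplitude (and so the brightness
# drive) falls to roughly two-thirds in this idealized model.
print(f"{two_monitors / one_monitor:.0%} of normal amplitude")
```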
 
I'm considering removing the anti-glare coating on my FW900; it has a few scratches on it. What are your thoughts on this? [...]

I certainly wouldn't unless it's significantly damaged. Even with ambient light control, removing it will raise the black level, which is the last thing you'd want. (Well, that I would want, anyway... black level is a big reason I still use CRTs...)

Unless you're using it in a room that's totally blacked out or something...
 
Someone else here was trying an alternative film. Guess it didn't work out though. Haven't heard a word on it since...
 
Someone else here was trying an alternative film. Guess it didn't work out though. Haven't heard a word on it since...

I didn't read the rest of this conversation, but do you mean flod and his dayvue?
 
If you want my opinion: it does not hurt to remove it, but only consider it if you use your FW900 in a room with no direct lights or sunlight that can fall on the monitor. [...]

Text is a bit blurrier than I would like, and that is one reason I am considering it. Am I right in taking from what Unkle Vito said that calibrations are more accurate on monitors with the coating removed?

Are reflections THAT bad on FW900s with the AR coating removed?
 
I didn't read the rest of this conversation, but do you mean flod and his dayvue?

no but i was considering that a while ago

the ar coating on the crt's film is way better than anything i've ever seen on a display.

here's a paper describing the film; if you can't access it, i can send it via pm
http://onlinelibrary.wiley.com/doi/10.1889/1.1832845/abstract

check this out
http://i.imgur.com/yvA8Tlr.jpg

the blue reflection is the crt, the white triangle is the nexus 7's bare screen, and the rest of the nexus 7 is covered by a dayvue film. looking at this, you might think the dayvue isn't effective. then look at this:
http://i.imgur.com/veivEel.jpg
 
Text is a bit blurrier than I would like, and that is one reason I am considering it. Am I right in taking from what Unkle Vito said that calibrations are more accurate on monitors with the coating removed?

Are reflections THAT bad on FW900s with the AR coating removed?

alright just don't take it off :D
the film (if clean) doesn't affect text sharpness at all. EDIT: well, actually, crt sharpness changes a little depending on how high the contrast is. without the film the display can be turned down a bit to achieve the same luminance, which could affect sharpness. anyway, if you want to check, just reduce the contrast in the osd and see if the image becomes any sharper or blurrier. if it becomes significantly sharper AND you find that setting to be too dark for you in a dark room, then you could consider removing the coating.

how it affects calibration doesn't matter. the white point may shift a bit if the film is removed, but with or without the film the display can be calibrated very well in windas. the important thing is that the gamut isn't affected by the film, at least on my g520p
 
check this out
http://i.imgur.com/yvA8Tlr.jpg

the blue reflection is the crt, the white triangle is the nexus 7's bare screen, and the rest of the nexus 7 is covered by a dayvue film. looking at this, you might think the dayvue isn't effective. then look at this:
http://i.imgur.com/veivEel.jpg

How should I interpret the second pic? The film-covered part looks glossier, if anything. Wouldn't that mean it's increasing reflection?
 
Am I right in taking from what Unkle Vito said that calibrations are more accurate on monitors with the coating removed?

yea, I fail to see how this makes any sense. Don't worry about calibration accuracy - removing the coating will not make your instrument read more accurately, nor will it create a more accurate white point.
 
yea, I fail to see how this makes any sense. Don't worry about calibration accuracy - removing the coating will not make your instrument read more accurately, nor will it create a more accurate white point.

We have observed different values with the AC on vs. AC off... The values with the AC off are better than with the AC on. The AC coating is stuck to the glass with glue in case you do not know... Also the AC is a thick plastic film. All these items alter the results.

UV!
 
We have observed different values with the AC on vs. AC off... The values with the AC off are better than with the AC on. The AC coating is stuck to the glass with glue in case you do not know... Also the AC is a thick plastic film. All these items alter the results.

UV!

this isn't really helpful. which values are better and by how much?
 
My PDC has been calibrating (dispcalGUI) about 500K off of my AG-fitted CPVAs??

dude, you're comparing two different displays. the pdc's AR coating doesn't affect the light of the monitor at all. if you removed the ar coating you would see the exact same image, except with more reflections from outside lighting.
 
dude, you're comparing two different displays. the pdc's AR coating doesn't affect the light of the monitor at all. if you removed the ar coating you would see the exact same image, except with more reflections from outside lighting.

So, the colorimeter's inaccuracy is not because of the coating, it's because of the panel type?

I've been wondering about the cause.
 
no problem ;d

How should I interpret the second pic? The film-covered part looks glossier, if anything. Wouldn't that mean it's increasing reflection?
everything is glossy. it's darker... i.e. less reflection.

it's a picture taken outside on an overcast day; the reflections are of the sky
 
We have observed different values with the AC on vs. AC off... The values with the AC off are better than with the AC on. The AC coating is stuck to the glass with glue in case you do not know... Also the AC is a thick plastic film. All these items alter the results.

UV!

Could you elaborate on the differences you have seen with AC on vs off? Are there any other benefits/drawbacks to removing the AC coating?
 
Could you elaborate on the differences you have seen with AC on vs off? Are there any other benefits/drawbacks to removing the AC coating?

You need to run the entire process using laboratory-grade equipment several times. After running the process with the proper instrumentation and assessing the results, and ONLY IF you did it correctly, you will find that all the readings and results of one process (AC on) are different from those of the other (AC off).

Lastly, I don't want to engage anyone on differences of opinion and/or process techniques, interpretation of results, testing methodology, etc... I stated this to Mr. Spacediver a long time ago.

UV!
 
well guys, i guess that means those of us without laboratory-grade equipment, or without the patience to repeatedly measure the display, won't see any differences
 
White point calibration accuracy, as measured by the delta E of the white point relative to D65, has nothing to do with the coating, and everything to do with how precisely you can control the voltages driving the guns (whether through an 8-bit WinDAS adjustment or a 10-bit LUT adjustment).

Suppose, on a unit without coating, you calibrated your white point and got a delta E of 0.1. Then you add coating and measure the white point. If it's different (say, a delta E of 0.5), you just need to readjust the guns to compensate for the difference.

Similarly, if you calibrated a tube that had coating on it and got a delta E of 0.1, then took the coating off, remeasured, and got a delta E of 0.5, you'd again readjust the guns to compensate.

The only way a coating could reduce white point calibration accuracy would be if it shifted the chromaticity so severely that you could never compensate enough to bring the white point to D65. If that were the case, however, the primaries would be so far off that the monitor wouldn't be usable.

There is the possibility that, through chance alone, a tube could be more precisely calibrated to D65 with the coating off (or with it on). This would be a situation where, for example, you could only reach, say, 0.5 delta E given the limitations of 8- or 10-bit adjustments, and removing or adding the coating happens to nudge it that tiny bit towards perfection. But this is purely a chance event, and you're equally likely to get this extra precision with or without the coating.
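
To make the delta E computation concrete, here is a minimal Python sketch (CIE76 for simplicity rather than delta E 2000, and the measured chromaticity is a made-up example):

```python
# Convert a measured white point (xyY) to Lab, then compute its CIE76
# delta E against the D65 target, which maps to Lab (100, 0, 0).
def xyY_to_XYZ(x, y, Y):
    return (x * Y / y, Y, (1 - x - y) * Y / y)

def XYZ_to_Lab(X, Y, Z, white=(95.047, 100.0, 108.883)):  # D65 reference white
    def f(t):
        return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29
    fx, fy, fz = (f(v / w) for v, w in zip((X, Y, Z), white))
    return (116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz))

# Hypothetical measured white, normalized to Y = 100 (luminance is factored
# out here, so this is purely a chromaticity error).
L, a, b = XYZ_to_Lab(*xyY_to_XYZ(0.3140, 0.3310, 100.0))

dE = ((L - 100) ** 2 + a ** 2 + b ** 2) ** 0.5
print(f"delta E vs D65: {dE:.2f}")   # ~1, a small but measurable deviation
```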
 
Yeah, I always wondered that. 6500K is 6500K, right? If it's different without the AG coating on, then you'd readjust it to match.

In any event, count me as one of the few who wishes he'd kept his AG coating on. That way I wouldn't have to be so damn picky about my lighting. But whatever - the monitor looks gorgeous either way.
 
White point calibration accuracy, as measured by the delta E of the white point relative to D65, has nothing to do with the coating, and everything to do with how precisely you can control the voltages driving the guns (whether through an 8-bit WinDAS adjustment or a 10-bit LUT adjustment). [...]

The only gun adjustment I know of is setting the color purity and the hard convergence, and that is done by moving the poles on the CRT's neck; it is not a procedure of a WinDAS/WinCAT white point balance. Setting BIAS and GAINS is not the proper definition of "readjusting the guns..."

In addition to this, altering the unit's voltages is like forcing an engine to run at higher RPMs. In the end you will wear out the engine faster than normal...

Any type of obstruction between the instrument and the glass of the CRT will cause erroneous readings. Any foreign material (glue, smudges, fingerprints, dirt) will also cause the instrument to produce erroneous readings. I don't want to get into physics and optics, but this is a fact.

Now, based on our results, we have achieved more accurate readings and better results with the AC off vs. the AC on.

UV!
 
Yeah, I always wondered that. 6500K is 6500K, right? If it's different without the AG coating on, then you'd readjust it to match.

In any event, count me as one of the few who wishes he'd kept his AG coating on. That way I wouldn't have to be so damn picky about my lighting. But whatever - the monitor looks gorgeous either way.


6500K = 6500K, and your instrument may get you there... but at what Delta E?

UV!
 
The only gun adjustment I know of is setting the color purity and the hard convergence, and that is done by moving the poles on the CRT's neck; it is not a procedure of a WinDAS/WinCAT white point balance. Setting BIAS and GAINS is not the proper definition of "readjusting the guns..."

In addition to this, altering the unit's voltages is like forcing an engine to run at higher RPMs. In the end you will wear out the engine faster than normal...


By guns, I meant the voltage driving the cathode of each gun, which is essentially what you are adjusting when you move the sliders in WinDAS to meet the chromaticity targets.

Similarly, when you make adjustments to the videoLUT, you are changing the resulting cathode voltage for any given input video level.
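
As a toy illustration of that kind of videoLUT adjustment (the per-channel gains here are invented; real calibration software derives the per-level values from measurements):

```python
# Build a hypothetical 8-bit-in / 10-bit-out videoLUT with per-channel gains.
# Scaling a channel's LUT lowers the drive (and hence the cathode voltage)
# produced by any given input video level, which shifts the white point.
import numpy as np

levels = np.arange(256) / 255.0               # normalized input video levels
gains = {"r": 1.00, "g": 0.97, "b": 0.97}     # made-up white point trims

lut = {ch: np.round(levels * g * 1023).astype(int) for ch, g in gains.items()}
print(lut["g"][255])   # full-scale green now drives at ~97% of maximum (992)
```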

Any type of obstruction between the instrument and the glass of the CRT will cause erroneous readings. Any foreign material (glue, smudges, fingerprints, dirt) will also cause the instrument to produce erroneous readings. I don't want to get into physics and optics, but this is a fact.

I'm not sure what you mean by "erroneous" readings. A higher delta E doesn't mean the reading is less accurate or erroneous: a reading of delta E 10 could be extremely accurate, and a reading of delta E 0 could be extremely inaccurate. Assuming your instrument is very accurate, a delta E of 10 simply means that the chromaticity of the light you are measuring differs from the target chromaticity by a delta E of 10.

This is, of course, assuming we're discussing delta E with respect to our calibration targets, and aren't discussing delta E in the context of inter-instrument agreement (where you measure two instruments reading the same light source and compare their readings to each other in units of delta E).
 