24" Widescreen CRT (FW900) From Ebay arrived,Comments.

Greetings to all and happy new year!

Back on pages 433 and 434 I described some problems with my HP A7217A (a differently branded FW900). Basically, the main problem was that it turned off the moment I plugged a VGA cable into it. After leaving the monitor at a repair shop for a few months, this issue is finally fixed. The technician said there were broken circuits on the D board. Now the monitor works; however, when I first turn it on, the image is extremely bright, and it takes about 30 minutes to come down to normal. The problem can't be fixed by adjusting the G2 voltage with WinDAS. I have the same problem with a Sony E530, except it warms up faster and the brightness goes down to normal in about 10 minutes.

Any ideas on how to fix this?

That's normal for CRTs. You need at least 30 min for the monitor to warm up and stabilize.
 
For the FW900, the overly bright warm-up period is normal. (I have been using these displays since they were still available in regular retail.)

Exactly. ALL CRTs have a warm-up period needed for proper colors, BUT the FW900 needs a MUCH longer warm-up (45 min to 1 hr), and it has much brighter/more washed-out colors than any other CRT monitor when first turned on. I think all Trinitron monitors are like this to an extent. I have a 19" Sony G420S, and while it doesn't take as long as the FW900 to warm up, it still exhibits the same bright/washed-out colors.
 
Lol, ok dude, whatever you say. I've been messing with an FW900 a lot longer than you have, but if you think you know everything and want to be belligerent, keep right on doing it.
Well, you're the one stating something obviously wrong. I wonder why you're persisting, BTW, as anyone with a filmless FW900 can witness it easily. It's as easy as turning the monitor on, putting your hand in front of the screen, and feeling the static discharge.

The "I was there first" or "you're belligerent" or "Mr. I-know-it-all" argument isn't going to give you any more credibility. :rolleyes:
 
Well, you're the one stating something obviously wrong. I wonder why you're persisting, BTW, as anyone with a filmless FW900 can witness it easily. It's as easy as turning the monitor on, putting your hand in front of the screen, and feeling the static discharge.

The "I was there first" or "you're belligerent" or "Mr. I-know-it-all" argument isn't going to give you any more credibility. :rolleyes:

There is an ever so SLIGHT static charge on the screen when the monitor is on, but it quickly goes away after it is turned off. Pretty minimal and normal for ANY CRT. I'll give you that, but it isn't as big as you make it out to be. Your claim that the monitor isn't OK without the film and that the film is required for a good-quality display is complete bullshit.

Being a dick and acting like you know everything isn't going to give you any more credibility. ;)
 
Regarding static electricity, you'll probably be a lot better off if the metallic tape on the display's face is left intact to help it discharge. (I made the mistake of removing it on mine, which was no problem on my F520, but the FW900 is a larger beast.)
 
Seeing that these beautiful monitors (FW900) are going on 20 years old, I'm looking at ways to prolong and preserve mine as long as possible. One thing that will kill electronics faster than anything is heat, and the FW900 puts off a TON of heat! I was thinking of adding cooling fans to the outside of the case, possibly both exhaust and intake fans. Has anyone done this, or have thoughts on it?
 
There is an ever so SLIGHT static charge on the screen when the monitor is on, but it quickly goes away after it is turned off. Pretty minimal and normal for ANY CRT. I'll give you that, but it isn't as big as you make it out to be. Your claim that the monitor isn't OK without the film and that the film is required for a good-quality display is complete bullshit.

Being a dick and acting like you know everything isn't going to give you any more credibility. ;)
Ah, what a way to minimise it when you're cornered. It's SLIGHT and it goes away when the monitor is turned off... Really? Who would be dumb enough to expect static electricity generated by a device to last when the device is off? :rolleyes:

Next point, then. You're pretending the blacks are black without a film? You're about the only one to state that. It's not even an opinion you might be entitled to; it's a fact that they are not. And daring to say people act like they know everything when they contradict you, just like most users with a filmless monitor AND the engineers who designed this do (they surely didn't know anything about it; hey, I even have the publications explaining their work...), THAT is pretty spicy. :cool:

Again, I wonder, what's the point in pretending that removing the AR film is OK, or even better? That is, unless someone would like to sell a filmless monitor at inflated prices, of course...
 
Speaking of the AR film, mine is in relatively good condition, but it still has some vertical marks that are visible on a white background. Would it be a good idea to put a clear window film on it, for example like the ones they put on car windshields for protection?
 
Ah, what a way to minimise it when you're cornered. It's SLIGHT and it goes away when the monitor is turned off... Really? Who would be dumb enough to expect static electricity generated by a device to last when the device is off? :rolleyes:

Next point, then. You're pretending the blacks are black without a film? You're about the only one to state that. It's not even an opinion you might be entitled to; it's a fact that they are not. And daring to say people act like they know everything when they contradict you, just like most users with a filmless monitor AND the engineers who designed this do (they surely didn't know anything about it; hey, I even have the publications explaining their work...), THAT is pretty spicy. :cool:

Again, I wonder, what's the point in pretending that removing the AR film is OK, or even better? That is, unless someone would like to sell a filmless monitor at inflated prices, of course...

It's only better if you're talking about making it easier on the electron gun: you don't have to drive it as hard, which can extend the life of the tube and even sharpen the focus on some aged sets, because over time the beam gets harder to focus with electrostatic focusing. My thoughts.

And please, can you tone down the rhetoric a little? There is no need to go on the offensive. I understand why you're frustrated, but I don't want moderators to have to step in here on this epic thread.
 
Speaking of the AR film, mine is in relatively good condition, but it still has some vertical marks that are visible on a white background. Would it be a good idea to put a clear window film on it, for example like the ones they put on car windshields for protection?
I don't think so. The best you can do to preserve it is to clean it as rarely as possible, very gently, with nothing other than water and a kitchen towel.

It's only better if you're talking about making it easier on the electron gun: you don't have to drive it as hard, which can extend the life of the tube and even sharpen the focus on some aged sets, because over time the beam gets harder to focus with electrostatic focusing. My thoughts.

And please, can you tone down the rhetoric a little? There is no need to go on the offensive. I understand why you're frustrated, but I don't want moderators to have to step in here on this epic thread.
Not sure if you followed everything, but I'm the one being called a dick. I consider myself pretty kind given the context. ;)

And it's not a matter of frustration. Frustration about what, exactly?
It's a matter of being sick of people using this thread to spread false information with mercantile and speculative goals in mind.

Before that bloody digitalfoundry video this was a place where everyone interested in a nice device could talk about it and share freely. Not anymore. Now it's just the first place potential buyers go, and *some* people seem to want the information to match their interests. I'm not willing to condone that. Speaking of moderation, these behaviours are probably not very compliant with the forum rules, BTW.
 
Ah, what a way to minimise it when you're cornered. It's SLIGHT and it goes away when the monitor is turned off... Really? Who would be dumb enough to expect static electricity generated by a device to last when the device is off? :rolleyes:

Next point, then. You're pretending the blacks are black without a film? You're about the only one to state that. It's not even an opinion you might be entitled to; it's a fact that they are not. And daring to say people act like they know everything when they contradict you, just like most users with a filmless monitor AND the engineers who designed this do (they surely didn't know anything about it; hey, I even have the publications explaining their work...), THAT is pretty spicy. :cool:

Again, I wonder, what's the point in pretending that removing the AR film is OK, or even better? That is, unless someone would like to sell a filmless monitor at inflated prices, of course...

You are incorrigible, lol. Why do you feel the need to jump people's shit when you disagree with them!? Can you not discuss something without attacking someone? You are one disgruntled person.

Yes, SOMETIMES, but not always, there is a SLIGHT static charge on the screen. NOTHING like some of the larger CRT TVs had. And who would be "dumb enough" not to know that a static charge persists on tube screens well after the set has been turned off!

And who would be "dumb enough" to think that, without external light sources reflecting off the screen, blacks wouldn't be "black"? Without the AR, you CAN achieve deep blacks IF you are in a dark room. That you are even arguing this point is absurd.

I never claimed that removing the AR was better or worse. It is worse in some ways and better in others. Worse because of reflections (obviously), obtaining blacks in a lit room, and protecting the screen. Better because of improved contrast and clarity.
 
Not sure if you followed everything, but I'm the one being called a dick. I consider myself pretty kind given the context. ;)

And it's not a matter of frustration. Frustration about what, exactly?
It's a matter of being sick of people using this thread to spread false information with mercantile and speculative goals in mind.

Before that bloody digitalfoundry video this was a place where everyone interested in a nice device could talk about it and share freely. Not anymore. Now it's just the first place potential buyers go, and *some* people seem to want the information to match their interests. I'm not willing to condone that. Speaking of moderation, these behaviours are probably not very compliant with the forum rules, BTW.

And yet you jump my shit for giving my opinion on the matter, which differs from yours. I was here at the beginning in 2006 when this topic first started, so I'm not spreading false information with mercantile and speculative goals in mind like you implied I was. So seriously, chill the F out.
 
Could you provide more information on the Kantek filter you used on your FW900, please? I removed my factory filter some years back and have had no negative results from doing so: better colors and contrast, and no static build-up. You just have to make sure there are no external light sources in front of the screen to get good blacks and image quality. I would like to replace the screen filter mainly to protect the glass.
I recently did a well-controlled experiment where I simultaneously did a WinDAS WPB on two FW900s side by side, using two laptops (one connected to each monitor) and a Startech VGA splitter to split the video signal from my PC, which was being used to generate the test images. I achieved the same luminance targets on each tube.

One of the FW900s had the antiglare removed, and the other didn't.

I went into this experiment expecting to see a clearer, cleaner image on the screen with the antiglare removed. This was based on years of experience using an FW900 with the antiglare removed. I remember the first time I removed the antiglare from an FW900: there was a glossiness to the image that I loved, and I was expecting to see that contrast in glossiness between the two monitors.

I was so wrong. The images in a dark room were virtually identical. With lights on, the one with the antiglare off was indeed glossier, but the one with antiglare on was much more usable.

I took photos and wrote up this experiment and the results in detail, and posted it on the forum. It was within the last 3 months. I can't find that post though! Perhaps it got deleted or something, or never made it through.

This experiment changed my mind completely. Antiglare is far superior to no antiglare, although, as JBL correctly points out, the tube has to be run harder to achieve the same luminance.
 
You are incorrigible, lol. Why do you feel the need to jump people's shit when you disagree with them!? Can you not discuss something without attacking someone? You are one disgruntled person.

Yes, SOMETIMES, but not always, there is a SLIGHT static charge on the screen. NOTHING like some of the larger CRT TVs had. And who would be "dumb enough" not to know that a static charge persists on tube screens well after the set has been turned off!

And who would be "dumb enough" to think that, without external light sources reflecting off the screen, blacks wouldn't be "black"? Without the AR, you CAN achieve deep blacks IF you are in a dark room. That you are even arguing this point is absurd.

I never claimed that removing the AR was better or worse. It is worse in some ways and better in others. Worse because of reflections (obviously) and obtaining blacks in a lit room. Better because of improved contrast and clarity.

And yet you jump my shit for giving my opinion on the matter, which differs from yours. I was here at the beginning in 2006 when this topic first started. I'm not spreading false information with mercantile and speculative goals in mind. So seriously, chill the F out.

You pretended there was no static without a film, then that there was some, then that there's some but it's nothing. There is, and it's pretty heavy. This is not something that can be witnessed on any other CRT computer monitor. It's a fact, not an opinion.
(BTW, in relation to this, I didn't even mention the missing electromagnetic shielding, since most people don't care about what they can't see or feel. Yet that probably makes the monitor fail the TCO norms.)

You're saying that the blacks are black in a dark room. I appreciate the insult slipped in along the way, by the way. Again, despite the fact that you've supposedly been here since 2006 (which, according to you, legitimates everything you say), you're wrong; it's a fact.
It's not as blatant as the ambient light phenomenon, but there are internal reflections inside the glass of the tube itself as well, and the darkening film also reduces those.
SH1 reported an improvement in the dark with his filter not long ago, BTW; it's related to this.

You said the contrast is better without a film, which is, again, obviously wrong, since contrast depends heavily on the black level, more than on the brightness level. A worse black level means worse contrast; it's a fact, not an opinion.
The part about better colors is questionable at best. At least that one could be called an opinion, but what exactly is a "better" color? One has to think hard to guess that.
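To put rough, purely hypothetical numbers on the contrast point, here is a quick Python sketch; both figures are made up for illustration, not measurements of any monitor:

    # Contrast ratio = white level / black level, so the black level dominates.
    white = 100.0                  # cd/m2, same peak luminance in both cases
    for black in (0.02, 0.10):     # deep black vs. black lifted by reflections
        print(f"black {black:.2f} cd/m2 -> contrast {white / black:,.0f}:1")

With the same peak, lifting the black level from 0.02 to 0.10 cd/m2 drops the ratio from 5,000:1 to 1,000:1.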

The only true advantage you stated is a sharpness improvement (though normally a slight one, unless there was some abnormal haze due to an aging film).

To sum up, you:
- listed mostly false supposed advantages of removing the AR film
- denied or minimized known drawbacks
- stated the film was not required for a good-quality display

Yet you're now claiming you never said it was better or worse. And you're trying to twist the story so that you come out as the victim of some kind of psycho?
 
I never claimed that removing the AR was better or worse. It is worse in some ways and better in others. Worse because of reflections (obviously) and obtaining blacks in a lit room. Better because of improved contrast and clarity.
Sort of. I don't remember if my FW-900 was sharper because of removing the anti-glare; then again, it was less used than others. So I would say that aged sets probably benefit. But better contrast? Maybe... but only in a really dark room.
 
You pretended there was no static without a film, then that there was some, then that there's some but it's nothing. There is, and it's pretty heavy. This is not something that can be witnessed on any other CRT computer monitor. It's a fact, not an opinion.
(BTW, in relation to this, I didn't even mention the missing electromagnetic shielding, since most people don't care about what they can't see or feel. Yet that probably makes the monitor fail the TCO norms.)

You're saying that the blacks are black in a dark room. I appreciate the insult slipped in along the way, by the way. Again, despite the fact that you've supposedly been here since 2006 (which, according to you, legitimates everything you say), you're wrong; it's a fact.
It's not as blatant as the ambient light phenomenon, but there are internal reflections inside the glass of the tube itself as well, and the darkening film also reduces those.
SH1 reported an improvement in the dark with his filter not long ago, BTW; it's related to this.

You said the contrast is better without a film, which is, again, obviously wrong, since contrast depends heavily on the black level, more than on the brightness level. A worse black level means worse contrast; it's a fact, not an opinion.
The part about better colors is questionable at best. At least that one could be called an opinion, but what exactly is a "better" color? One has to think hard to guess that.

The only true advantage you stated is a sharpness improvement (though normally a slight one, unless there was some abnormal haze due to an aging film).

To sum up, you:
- listed mostly false supposed advantages of removing the AR film
- denied or minimized known drawbacks
- stated the film was not required for a good-quality display

Yet you're now claiming you never said it was better or worse. And you're trying to twist the story so that you come out as the victim of some kind of psycho?

I digress...

I don't have the time or energy to argue with you anymore and pick apart all your inconsistencies. Besides, you'll twist whatever I say and take it out of context.

You are obviously a disgruntled person who feels the need to attack anyone who disagrees with you. Good luck with that. I hope it serves you well.
 
Sort of. I don't remember if my FW-900 was sharper because of removing the anti-glare; then again, it was less used than others. So I would say that aged sets probably benefit. But better contrast? Maybe... but only in a really dark room.

I noticed an overall sharper and more vibrant image with the AR removed in a dark room with no external light source in front of the screen. A brightly lit room is obviously a different story.

I would much rather have the AR on to protect the screen and not need to use the monitor in a dimly lit room. But overall, removing the AR is not the big deal some here want to lead people to believe it is.
 
I recently did a well-controlled experiment where I simultaneously did a WinDAS WPB on two FW900s side by side, using two laptops (one connected to each monitor) and a Startech VGA splitter to split the video signal from my PC, which was being used to generate the test images. I achieved the same luminance targets on each tube.

One of the FW900s had the antiglare removed, and the other didn't.

I went into this experiment expecting to see a clearer, cleaner image on the screen with the antiglare removed. This was based on years of experience using an FW900 with the antiglare removed. I remember the first time I removed the antiglare from an FW900: there was a glossiness to the image that I loved, and I was expecting to see that contrast in glossiness between the two monitors.

I was so wrong. The images in a dark room were virtually identical. With lights on, the one with the antiglare off was indeed glossier, but the one with antiglare on was much more usable.

I took photos and wrote up this experiment and the results in detail, and posted it on the forum. It was within the last 3 months. I can't find that post though! Perhaps it got deleted or something, or never made it through.

This experiment changed my mind completely. Antiglare is far superior to no antiglare, although, as JBL correctly points out, the tube has to be run harder to achieve the same luminance.

Interesting info. I believe what you are saying 100% (and thanks for not being an ass). If I had a choice, I would rather have the AR on. That being said, IMO, having the AR removed doesn't negatively affect the image quality when the monitor is used in the right lighting conditions.

I would love to read your experiment findings. Did you save a copy of your write-up and pictures?
 
I noticed an overall sharper and more vibrant image with the AR removed in a dark room with no external light source in front of the screen. A brightly lit room is obviously a different story.

I would much rather have the AR on to protect the screen and not need to use the monitor in a dimly lit room. But overall, removing the AR is not the big deal some here want to lead people to believe it is.
Back when I didn't have kids and had my own dedicated dark room, this would have been preferable. Now? :) My computer is in the "play room", so I could be gaming while one of my girls is reading something; an AR-less monitor would be awful for me.

When I removed my AR coating, I ultimately regretted it. My warning to folks is that they need to weigh the risks and decide for themselves if they want to go through with it, because it seems like there's no real replacement for it, and once removed, the change is permanent. If the AR is fine and not scratched, my recommendation would be to leave it alone. If it's scratched enough that it's visually distracting, then go ahead.
 
I am sorry if it has already been discussed, but what about protective films like the ones offered here and here, for example? Wouldn't they be good enough to cover light scratches and spots on the original AR coating? On the second link you can select the Sony CPD-G520, which makes me think these are suitable for CRTs.
 
I am sorry if it has already been discussed, but what about protective films like the ones offered here and here, for example? Wouldn't they be good enough to cover light scratches and spots on the original AR coating? On the second link you can select the Sony CPD-G520, which makes me think these are suitable for CRTs.
It's a common commercial trap. They list as many product references as they can find, more or less related to what they want to sell, to attract customers, but that doesn't mean at all that the product is suitable.

You have to understand that if you add another film on top of the existing one:
- it will inevitably add some haze
- it will lower the brightness of the screen (by an amount that may or may not be negligible, depending on the film)
- the film you add will need to have very good antireflection properties, otherwise you'll end up with worse performance in that respect
- the optical behaviour at the junction between the original film and the glue of the second film may be unpredictable; unwanted reflections could appear because the external layer is designed to sit next to ambient air, not glue
- it will be very risky to place, and any slight mistake would be unforgivable: trying to peel off even a small area of the new film would probably ruin the AR layer of the original film

As for the links you found, these are poor quality products in my opinion.
Anything advertised as anti-GLARE uses an uneven surface to break up reflections, but that generates levels of haze totally unacceptable for a high-resolution display (often > 5%); anyone should stay away from those.
A good quality anti-REFLECTIVE film will have haze below 1% and reflection levels roughly in the 0.2-1% range depending on wavelength; none of theirs comes close to this.
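For anyone who wants to see how those numbers interact, here is a very rough back-of-the-envelope sketch (Python). It assumes a simple diffuse model for room light reflecting off the screen, and every figure in it is a made-up illustration rather than a measurement of any particular film or tube:

    import math

    def effective_contrast(white_nits, black_nits, transmittance, reflectance, ambient_lux):
        # Emitted light crosses the film once; room light bouncing off the screen
        # is approximated as diffuse: reflected luminance ~ lux * reflectance / pi.
        white = white_nits * transmittance
        reflected = ambient_lux * reflectance / math.pi
        black = black_nits * transmittance + reflected
        return white / black

    # Hypothetical values: 100 nit peak, 0.05 nit tube black, 200 lux room light.
    bare = effective_contrast(100, 0.05, 1.00, 0.05, 200)   # bare glass, ~5% reflection
    film = effective_contrast(100, 0.05, 0.65, 0.005, 200)  # darkened AR film, 0.5% reflection
    print(f"bare glass ~{bare:.0f}:1, with AR film ~{film:.0f}:1")

Even though the film eats a chunk of the brightness in this toy example, the much lower reflectance wins as soon as there is any light in the room.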
 
Once again, thanks for the detailed info, Strat_84! Do you know where a good quality AR film could be obtained, preferably in Europe? I have around 10 CRT monitors, most of them Trinitrons or Diamondtrons. Some of them are in bad enough condition to be sacrificed for experiments. If there are any kinds of coatings that could replace the original AR coating or be added on top of it, I am willing to try.
 
Here is a summary of various adapters.
If a 375 MHz max pixel clock is enough for you, buy the Startech DP2VGAHD20: it doesn't cost too much, it's available everywhere, and it works well.
If you want more performance, the only solution is one of the Synaptics VMM2322-based adapters.
There are also the USB-C Lontium-based adapters, but they are difficult to find, we don't know their real performance, and you need a video card with a USB-C output or a special card to convert DisplayPort to USB-C.
As for the LT8612UX, the chipset is finished, but it has not been tested at 400 MHz, so we don't know its real performance; for now I don't know of any adapters using it.
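If you want to check whether the 375 MHz limit covers the mode you actually run, you can estimate a mode's pixel clock from its active resolution and refresh rate. The blanking fractions in this little Python sketch are rough assumptions (real GTF/CVT timings differ somewhat); for a definitive answer use the exact totals from your driver or EDID:

    def pixel_clock_mhz(h_active, v_active, refresh_hz, h_blank=0.30, v_blank=0.04):
        # pixel clock = horizontal total x vertical total x refresh rate
        h_total = h_active * (1 + h_blank)   # assumed ~30% horizontal blanking
        v_total = v_active * (1 + v_blank)   # assumed ~4% vertical blanking
        return h_total * v_total * refresh_hz / 1e6

    for w, h, hz in [(1920, 1200, 96), (2304, 1440, 80)]:
        print(f"{w}x{h}@{hz}Hz needs roughly {pixel_clock_mhz(w, h, hz):.0f} MHz")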
Thank you for replying, thanks for letting me know, and thank you for the list. I will be picking up one of the Synaptics VMM2322 boards, specifically the Delock 87685 one since I live in Germany. Much Love <3
 
Regarding static electricity, you'll probably be a lot better off if the metallic tape on the display's face is left intact to help it discharge. (I made the mistake of removing it on mine, which was no problem on my F520, but the FW900 is a larger beast.)

Interesting idea. It has been years since I removed my AR film (and now I wish I hadn't, considering it was perfect before I put a small scratch on it). I used the monitor for 7 years with the film on, and it has been 8 years since I took it off. I haven't noticed a significant difference with the AR on or off, except for the reflective nature of the screen when the lights are on.

I just let my monitor warm up again and there is ZERO static on the screen, so I wonder if I left the metallic tape on the display when I took off the AR. I don't remember, since it was so long ago. Also, I wonder if you could reapply some aluminum tape to the frame and screen to ground the monitor. From my understanding, glass isn't a good conductor of electricity, so having the aluminum tape directly on the glass might not ground the screen the way it did with the AR attached. Worth a try.
 
I *think*, but am not sure, that one of HD Fury's selling points was removal of Blu-ray encryption, but I may be wrong.

HD Fury's adapters are overpriced. The main selling point of their X3 and X4 is that they support 10-bit and 12-bit color (I'm not aware of any other VGA adapter that supports 10-bit). I talked with one of their reps, and they said they are looking into making one that supports a higher pixel clock of around 500 MHz.
 
HD Fury's adapters are overpriced. The main selling point of their X3 and X4 is that they support 10-bit and 12-bit color (I'm not aware of any other VGA adapter that supports 10-bit). I talked with one of their reps, and they said they are looking into making one that supports a higher pixel clock of around 500 MHz.
I respectfully disagree. The HD-Fury 3 is a nice unit that I would have loved to have for my CRT projector; that was their main market. I do think it's unfortunate that they never made high-end units to support the top-end CRT monitors, but I loved my HD-Fury.
 
I respectfully disagree. The HD-Fury 3 is a nice unit that I would have loved to have for my CRT projector; that was their main market. I do think it's unfortunate that they never made high-end units to support the top-end CRT monitors, but I loved my HD-Fury.

Don't misunderstand me; they are great adapters. I own the original HDFury and the HDFury X3. But $200 (for the X3) is a crazy amount compared to the Startech adapter, which offers almost double the pixel clock for $30.
 
When I removed my AR coating, I ultimately regretted it. My warning to folks is that they need to weigh the risks and decide for themselves if they want to go through with it, because it seems like there's no real replacement for it, and once removed, the change is permanent. If the AR is fine and not scratched, my recommendation would be to leave it alone. If it's scratched enough that it's visually distracting, then go ahead.
No, it is not permanent.
You can get a linear polarizer film, and it has far superior performance to the original AG, especially when it comes to worrying about ambient light and reflections. Unlike with any Trinitron I have ever used, I need a very strong light to even see the outline of my head, and even then I would not be able to recognize myself. In gaming conditions (no lights, or ambient light only) I cannot see myself at all. The screen appears black with the main ceiling light on (right now five light bulbs of different types, 90 W on average), and in this lighting the image looks good enough that I can sometimes forget to turn it off 🤯
Static buildup also seems vastly improved compared to not having any film. How it compares to the original AG I am not sure, but I'd say the polarizer is at about the same level as what I remember on my IBM P275.

Maximum luminance is literally cut in half... or even slightly more. This is, however, the only disadvantage of this film. The FW900 without any film was so ridiculously bright that I can still get a usable 120 cd/m2 out of it, which is for all intents and purposes sufficient.

Imho the best mod one can do to this monitor 😎
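For what it's worth, the trade-off described above can be sketched in a few lines of Python. The assumptions here are that the emitted (unpolarized) light crosses the film once while room light crosses it twice (in, off the faceplate, back out) and is roughly depolarized by the reflection; the transmittance and luminance figures are illustrative guesses, not measurements of this particular film:

    T = 0.45           # assumed single-pass transmittance of a typical linear polarizer
    peak_nits = 260    # hypothetical peak luminance of a filmless FW900
    print(f"peak luminance with film: ~{peak_nits * T:.0f} cd/m2")        # roughly halved
    print(f"room reflections with film: ~{T * T:.2f}x of the bare tube")  # about one fifth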
 
Will this thing (FW900) blow up if it is run from an ungrounded power outlet? Asking for a friend...
Grounding is used not so much to protect devices as to protect users from electric shock. When any conductive part of the device housing is exposed to the user's touch, running the device without grounding is risky. With grounding, a fault will blow your fuses and you will know that something is wrong; without it, the fault current can run through you instead, in which case you might realize there is an issue, but it will be too late...

The FW900 does not have many exposed metal parts, so it should be quite safe. However, there is always some risk associated with not having it grounded. An alternative to grounding is an isolation transformer: it doesn't prevent failures, but it should make it unlikely for a dangerous current to run through the user's body if a fault does occur.
 